New function: lsqnonlin
From: Bill Denney
Subject: New function: lsqnonlin
Date: Tue, 01 Apr 2008 00:32:55 -0400
User-agent: Thunderbird 2.0.0.12 (Windows/20080213)
Attached is lsqnonneg.  I will probably add optimset soon.
Have a good day,
Bill
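For anyone curious how the attached Lawson-Hanson loop behaves, here is a rough pure-Python sketch of the same active-set idea. The helper names `solve_normal_eqs` and `nnls` are mine, not from the patch, and a real implementation would use QR as the m-file does rather than normal equations:

```python
def solve_normal_eqs(C, d, cols):
    """Least squares of d against the columns of C listed in `cols`,
    via the normal equations and Gaussian elimination (fine for the
    small, well-conditioned examples here)."""
    k = len(cols)
    m = len(C)
    A = [[row[j] for j in cols] for row in C]
    # normal matrix A'A and right-hand side A'd
    M = [[sum(A[i][a] * A[i][b] for i in range(m)) for b in range(k)]
         for a in range(k)]
    rhs = [sum(A[i][a] * d[i] for i in range(m)) for a in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            for c2 in range(col, k):
                M[r][c2] -= f * M[col][c2]
            rhs[r] -= f * rhs[col]
    y = [0.0] * k
    for r in range(k - 1, -1, -1):
        y[r] = (rhs[r] - sum(M[r][c2] * y[c2]
                             for c2 in range(r + 1, k))) / M[r][r]
    return y

def nnls(C, d, maxiter=1000):
    """Lawson-Hanson NNLS: minimize ||C*x - d|| subject to x >= 0."""
    n = len(C[0])
    m = len(C)
    x = [0.0] * n
    P, Z = [], list(range(n))       # passive (positive) and zero sets

    def grad():
        # gradient of the objective: w = C'(d - C*x)
        r = [d[i] - sum(C[i][j] * x[j] for j in range(n)) for i in range(m)]
        return [sum(C[i][j] * r[i] for i in range(m)) for j in range(n)]

    w = grad()
    it = 0
    while Z and max(w[j] for j in Z) > 1e-12 and it < maxiter:
        # move the most promising zero variable into the passive set
        t = max(Z, key=lambda j: w[j])
        Z.remove(t)
        P.append(t)
        while it < maxiter:
            it += 1
            y = solve_normal_eqs(C, d, P)   # LS over passive columns only
            xt = [0.0] * n
            for j, v in zip(P, y):
                xt[j] = v
            if all(xt[j] >= 0 for j in P):
                x = xt
                break
            # step toward xt as far as feasibility allows
            alpha = min(x[j] / (x[j] - xt[j]) for j in P if xt[j] < 0)
            x = [x[j] + alpha * (xt[j] - x[j]) for j in range(n)]
            # variables driven to (near) zero go back to the zero set
            for j in [j for j in P if abs(x[j]) < 1e-12]:
                P.remove(j)
                Z.append(j)
        w = grad()
    return x

C = [[1, 0], [0, 1], [2, 1]]
d = [1, 3, -2]
print(nnls(C, d))   # → [0.0, 0.5]
```

This mirrors the patch's outer loop (pick the zeroed coefficient with the most positive gradient) and inner loop (solve the restricted least-squares problem, then backtrack if any passive coefficient would go negative).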
# HG changeset patch
# User address@hidden
# Date 1207024191 14400
# Node ID 7c261c6207ae37c0ab7443c3d87a71030a6332cf
# Parent 899bee11acd154a4309020f84cf5331820fb88e1
[mq]: optimization
diff -r 899bee11acd1 -r 7c261c6207ae scripts/ChangeLog
--- a/scripts/ChangeLog Sat Mar 29 09:36:10 2008 -0400
+++ b/scripts/ChangeLog Tue Apr 01 00:29:51 2008 -0400
@@ -1,3 +1,7 @@ 2008-03-27 Jaroslav Hajek <address@hidden
+2008-03-31 Bill Denney <address@hidden>
+
+ * optimization/lsqnonneg.m: New function.
+
2008-03-27 Jaroslav Hajek <address@hidden>
* general/lookup.m: Remove (lookup moved to DLD-FUNCTIONS).
diff -r 899bee11acd1 -r 7c261c6207ae scripts/optimization/lsqnonneg.m
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/scripts/optimization/lsqnonneg.m Tue Apr 01 00:29:51 2008 -0400
@@ -0,0 +1,147 @@
+## Copyright (C) 2008 Bill Denney
+##
+## This file is part of Octave.
+##
+## Octave is free software; you can redistribute it and/or modify it
+## under the terms of the GNU General Public License as published by
+## the Free Software Foundation; either version 3 of the License, or (at
+## your option) any later version.
+##
+## Octave is distributed in the hope that it will be useful, but
+## WITHOUT ANY WARRANTY; without even the implied warranty of
+## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+## General Public License for more details.
+##
+## You should have received a copy of the GNU General Public License
+## along with Octave; see the file COPYING. If not, see
+## <http://www.gnu.org/licenses/>.
+
+## -*- texinfo -*-
+## @deftypefn {Function File} {@var{x} =} lsqnonneg (@var{c}, @var{d})
+## @deftypefnx {Function File} {@var{x} =} lsqnonneg (@var{c}, @var{d}, @var{x0})
+## @deftypefnx {Function File} {[@var{x}, @var{resnorm}] =} lsqnonneg (@dots{})
+## @deftypefnx {Function File} {[@var{x}, @var{resnorm}, @var{residual}] =} lsqnonneg (@dots{})
+## @deftypefnx {Function File} {[@var{x}, @var{resnorm}, @var{residual}, @var{exitflag}] =} lsqnonneg (@dots{})
+## @deftypefnx {Function File} {[@var{x}, @var{resnorm}, @var{residual}, @var{exitflag}, @var{output}] =} lsqnonneg (@dots{})
+## @deftypefnx {Function File} {[@var{x}, @var{resnorm}, @var{residual}, @var{exitflag}, @var{output}, @var{lambda}] =} lsqnonneg (@dots{})
+## Minimize @code{norm (@var{c}*@var{x} - @var{d})} subject to
+## @code{@var{x} >= 0}.  @var{c} and @var{d} must be real.  @var{x0} is
+## an optional initial guess for @var{x}.
+##
+## Outputs:
+## @itemize @bullet
+## @item resnorm
+##
+## The squared 2-norm of the residual: @code{norm (@var{c}*@var{x}-@var{d})^2}
+## @item residual
+##
+## The residual: @code{@var{d}-@var{c}*@var{x}}
+## @item exitflag
+##
+## An indicator of convergence. 0 indicates that the iteration count
+## was exceeded, and therefore convergence was not reached; >0 indicates
+## that the algorithm converged. (The algorithm is stable and will
+## converge given enough iterations.)
+## @item output
+##
+## A structure with two fields:
+## @itemize @bullet
+## @item "algorithm": The algorithm used ("nnls")
+## @item "iterations": The number of iterations taken.
+## @end itemize
+## @item lambda
+##
+## Not implemented.
+## @end itemize
+## @seealso{optimset}
+## @end deftypefn
+
+## This is implemented from Lawson and Hanson's 1973 NNLS algorithm.
+
+function [x, resnorm, residual, exitflag, output, lambda] = lsqnonneg (c, d, x = [], options = [])
+
+ if isempty (x)
+ ## initial guess is 0s
+ x = zeros (columns (c), 1);
+ endif
+ if isempty (options)
+ ## FIXME: Optimset should be updated
+ ## options = optimset ();
+ options = struct ("maxiter", 1e5, "tolx", 1e-8);
+ endif
+
+ ## Initialize the values
+ p = [];
+ z = 1:numel (x);
+ ## compute the gradient
+ w = c'*(d - c*x);
+
+ iter = 0;
+  while ((! isempty (z)) && any (w(z) > 0) && (iter < options.maxiter))
+    ## pick the most positive gradient among the zeroed coefficients
+    wz = w(z);
+    idx = find (wz == max (wz));
+    if (numel (idx) > 1)
+      warning ("lsqnonneg:nonunique",
+               "A non-unique solution may be returned due to equal gradients.");
+      idx = idx(1);
+    endif
+    p(end+1) = z(idx);
+    z(idx) = [];
+
+ newx = false ();
+ while ((! newx) && (iter < options.maxiter))
+ iter++;
+
+ cpos = c;
+ cpos(:,z) = 0;
+ ## find min norm solution to the positive matrix
+ [cpos_q cpos_r] = qr (cpos, 0);
+ xtmp = (cpos_r\cpos_q')*d;
+ xtmp(z) = 0;
+ if all (xtmp >= 0)
+ ## complete the inner loop
+ newx = true ();
+ x = xtmp;
+ else
+ mask = (xtmp < 0);
+ alpha = min(x(mask)./(x(mask) - xtmp(mask)));
+ x = x + alpha*(xtmp - x);
+        ## move the coefficients driven to zero back into the zero set
+        idx = find (x == 0)';
+        p = setdiff (p, idx);
+        z = union (z, idx);
+ endif
+ endwhile
+ w = c'*(d - c*x);
+ endwhile
+
+ ## generate the additional output arguments
+  if (nargout > 1)
+    resnorm = norm (c*x - d) ^ 2;
+  endif
+  if (nargout > 2)
+    residual = d - c*x;
+  endif
+  exitflag = iter;
+  if (iter >= options.maxiter)
+    exitflag = 0;
+  endif
+ if (nargout > 4)
+ output = struct ("algorithm", "nnls", "iterations", iter);
+ endif
+ if (nargout > 5)
+ ## FIXME
+ error ("lsqnonneg: lambda is not yet implemented");
+ endif
+
+endfunction
+
+## Tests
+%!test
+%! C = [1 0;0 1;2 1];
+%! d = [1;3;-2];
+%! assert (lsqnonneg (C, d), [0;0.5], 100*eps)
+
+%!test
+%! C = [0.0372 0.2869;0.6861 0.7071;0.6233 0.6245;0.6344 0.6170];
+%! d = [0.8587;0.1781;0.0747;0.8405];
+%! xnew = [0;0.6929];
+%! assert (lsqnonneg (C, d), xnew, 0.0001)
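As a sanity check on the expected test values: the NNLS optimality (KKT) conditions say that at the solution the gradient w = C'(d - C*x) must vanish on the positive components of x and be nonpositive on the zero ones. A quick Python check against the first test case (the helper name `kkt_check` is mine, not part of the patch):

```python
def kkt_check(C, d, x, tol=1e-6):
    """Return (ok, w): whether x satisfies the NNLS KKT conditions
    for min ||C*x - d|| s.t. x >= 0, and the gradient w = C'(d - C*x)."""
    m, n = len(C), len(x)
    r = [d[i] - sum(C[i][j] * x[j] for j in range(n)) for i in range(m)]
    w = [sum(C[i][j] * r[i] for i in range(m)) for j in range(n)]
    ok = all(x[j] >= -tol for j in range(n))                    # feasibility
    ok = ok and all(abs(w[j]) < tol for j in range(n) if x[j] > tol)   # w = 0 on P
    ok = ok and all(w[j] <= tol for j in range(n) if x[j] <= tol)      # w <= 0 on Z
    return ok, w

# first test case from the patch: expected solution [0; 0.5]
print(kkt_check([[1, 0], [0, 1], [2, 1]], [1, 3, -2], [0, 0.5]))
# → (True, [-4.0, 0.0])
```

The zero component carries a strictly negative gradient (so pushing it positive would only increase the residual), and the positive component sits at a stationary point, which is exactly what the expected value `[0; 0.5]` should satisfy.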