matlab, numerical-methods

'HessPattern' not working in MATLAB?


I have this options object for setting up a solver for an optimization problem:

option = optimoptions(@fminunc,...
    'Display','iter','GradObj','on','MaxIter',30,...
    'ObjectiveLimit',10e-10,'Algorithm','quasi-newton','HessPattern',sparseH);

This setup seems fine to me, but when I run my solver with the call

[P, FVAL, INFO, OUTPUT, GRAD, HESS] = fminunc (@myFunc,X0(:),option);

(literally the next call), I get the error:

Requested 254016x254016 (480.7GB) array exceeds maximum array size preference. Creation of arrays greater than this limit may
take a long time and cause MATLAB to become unresponsive. See array size limit or preference panel for more information.

However, my sparseH is:

>> whos sparseH
  Name              Size                   Bytes  Class     Attributes

  sparseH      254016x254016            87043112  double    sparse    
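
For context, a rough back-of-the-envelope check (just a sketch, assuming 8 bytes per double entry) reproduces the figure in the error message: a dense matrix of this size cannot fit in memory, while the sparse version above is only about 87 MB.

n = 254016;
dense_bytes = n^2 * 8;    % a full double matrix needs 8 bytes per entry
dense_bytes / 2^30        % roughly 480.7 (GiB), the size quoted in the error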

Moreover, if I set trust-region instead of quasi-newton, the algorithm runs, although for small inputs I can see that quasi-newton is actually faster.

Am I setting something wrong?

As a further check I did:

>> A = sparseH(1:100,1:100);
>> sum(A(:))

ans =

   (1,1)      880

>> size(A)

ans =

   100   100

>> 

So I have fewer than 1000 ones in a submatrix with 10000 entries in total. Also, over the whole matrix:

>> sum(sparseH(:))

ans =

   (1,1)        5313186

>> prod(size(sparseH))

ans =

   6.4524e+10

>> 
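
A more direct way to quantify the pattern (a small sketch, assuming the nonzero entries of sparseH are all ones, as the sums above suggest) is nnz, which counts the stored nonzeros:

nnz(sparseH)                    % stored nonzeros; equals the sum above if all nonzeros are 1
nnz(sparseH) / numel(sparseH)   % density of the pattern, roughly 8.2e-5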

Solution

  • As discussed in the comments: it appears that the quasi-Newton approach needs the entire (dense) Hessian, and that HessPattern is only available for trust-region type algorithms (see the sketch at the end of this answer).

    Indeed, the quasi-Newton approach is a faster algorithm, but faster algorithms commonly need more memory, and very big problems can often only be solved with simpler optimization methods by waiting longer (e.g. neural networks are typically trained with gradient descent-type algorithms).
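
    A minimal sketch of the resulting setup, using the asker's own sparseH, myFunc and X0: the only change is the algorithm, since quasi-newton ignores HessPattern while trust-region uses it (trust-region also requires the user-supplied gradient, which 'GradObj','on' already provides):

    option = optimoptions(@fminunc,...
        'Display','iter','GradObj','on','MaxIter',30,...
        'ObjectiveLimit',10e-10,...
        'Algorithm','trust-region',...   % quasi-newton ignores HessPattern
        'HessPattern',sparseH);

    [P, FVAL, INFO, OUTPUT, GRAD, HESS] = fminunc(@myFunc, X0(:), option);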