
How to define nonlinear vector constraints in Julia


I'm trying to minimize a function that takes a vector as input and is subject to some nonlinear constraints. I'm very new to Julia. I'm trying to implement pseudospectral methods using Ipopt. My issue is that the optimizer I'm using needs the gradient of the cost function and of the constraints, and packages like "ForwardDiff, ReverseDiff" are not helping me find the gradient of my vector-valued function. I found that a similar issue was faced by @acauligi. So far I haven't found any solution.

using LinearAlgebra, DiffEqOperators, ForwardDiff, ApproxFun, FFTW, ToeplitzMatrices
using ModelingToolkit,DifferentialEquations,NLPModels,ADNLPModels,NLPModelsIpopt
using DCISolver,JSOSolvers

# Number of collocation points
N=31  # This number can go up to 200

function Dmatrix(N::Integer)
    h=2*pi/N;
    ns=range(1,N-1,step=1);
    col1(nps)=0.5*((-1)^nps)/sin(nps*h/2);
    col=[0,col1.(ns)...];
    row=[0,col[end:-1:2]...];
    D=Toeplitz(col,row)
end

Dmat=Dmatrix(N);

function dzdt(x,y,t,a)
    u=(1-(x^2)/4)-y^2;
    dx=-4*y+x*u+a*x;
    dy=x+y*u+a*y;
    [dx,dy]
end
# initial guess
tfinal=1.1*pi;
tpoints=collect(range(1,N,step=1))*tfinal/N;
xguess=sin.((2*pi/tfinal)*tpoints)*2.0
yguess=-sin.((2*pi/tfinal)*tpoints)*0.5

function dxlist(xs,ys,tf,a)
    nstates=2
    ts=collect(range(1,N,step=1))*tf/N;
    xytsZip=zip(xs,ys,ts);
    dxD0=[dzdt(x,y,t,a) for (x,y,t) in xytsZip];
    dxD=reduce(hcat, dxD0)';
    xlyl=reshape([xs;ys],N,nstates);
    dxF=(Dmat*xlyl)*(2.0*pi/tf);
    err=dxD-dxF;
    [vcat(err'...).-10^(-10);-vcat(err'...).+10^(-10)]
end


function cons(x)
    tf=x[end-1];
    a=x[end];
    xs1=x[1:N];
    ys1=x[N+1:2*N];
    dxlist(xs1,ys1,tf,a)

end
a0=10^-3;
x0=vcat([xguess;yguess;[tfinal,a0]]);

obj(x)=0.0;

xlower1=push!(-3*ones(2*N),pi);
xlower=push!(xlower1,-10^-3)
xupper1=push!(3*ones(2*N),1.5*pi);
xupper=push!(xupper,10^-3)
consLower=-ones(4*N)*Inf;
consUpper=zeros(4*N)


# println("constraints vector = ",cons(x0))

model=ADNLPModel(obj,x0,xlower,xupper,cons,consLower,consUpper; backend=ADNLPModels.ReverseDiffAD)
output=ipopt(model)
xstar=output.solution
fstar=output.objective

I got the solution to this same problem in 3 minutes in MATLAB. (The solution: the time period of the system is "pi" when a = 0.)

I was hoping I could get the same result much faster in Julia. I have asked on the Julia Discourse, but so far I haven't got any suggestions. Any suggestion on how to fix this issue is highly appreciated. Thank you all.


Solution

  • I think there were two issues with your code. First, the second push! must target xupper1 (the original code references xupper before it is defined):

    xupper1=push!(3*ones(2*N),1.5*pi);
    xupper=push!(xupper1,10^-3)
    

    Second, for some reason the product of the Toeplitz matrix with another matrix gives an error with the automatic differentiation. Multiplying it column by column, however, works:

    function dxlist(xs,ys,tf,a)
        nstates=2
        ts=collect(range(1,N,step=1))*tf/N;
        xytsZip=zip(xs,ys,ts);
        dxD0=[dzdt(x,y,t,a) for (x,y,t) in xytsZip];
        dxD=reduce(hcat, dxD0)';
        xlyl=reshape([xs;ys],N,nstates);
        dxF=vcat((Dmat*xlyl[:,1])*(2.0*pi/tf), (Dmat*xlyl[:,2])*(2.0*pi/tf));
        err=vcat(dxD...) - dxF;
        [err.-10^(-10);-err.+10^(-10)]
    end
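
    To see why this rewrite is equivalent, here is a standalone sketch (my own illustration, with a small dense matrix standing in for Dmat and made-up sizes): stacking the two column products with vcat gives exactly the column-major vectorization of the original matrix product, which is the shape vcat(dxD...) produces on the other side of the subtraction.

```julia
using LinearAlgebra

# Stand-ins for the real objects: a small dense matrix instead of the
# Toeplitz differentiation matrix, and random state columns.
N = 4
D = randn(N, N)
xlyl = randn(N, 2)
tf = 1.1 * pi

full = (D * xlyl) * (2pi / tf)                # original matrix product
cols = vcat((D * xlyl[:, 1]) * (2pi / tf),    # column-by-column version
            (D * xlyl[:, 2]) * (2pi / tf))

vec(full) ≈ cols   # true: stacking the columns matches vec's column-major order
```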
    

    With these fixes, Ipopt returns the right results:

    model=ADNLPModel(obj,x0,xlower,xupper,cons,consLower,consUpper)
    output=ipopt(model)
    xstar=output.solution
    fstar=output.objective
    

    I also noticed that using Percival.jl is faster:

    using Percival
    output=percival(model, max_time = 300.0)
    xstar=output.solution
    fstar=output.objective
    

    Note that ADNLPModels.jl is receiving some attention and will improve significantly.
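
    As a side note (my own suggestion, not part of the original answer): building each bound vector with a single vcat call avoids the kind of push! slip fixed above, since there is no intermediate variable to mistype or mutate.

```julia
# Number of collocation points, as in the question.
N = 31

# One expression per bound vector: the 2N state bounds first, then tf and a.
# (1e-3 is the same value the question writes as 10^-3.)
xlower = vcat(-3 * ones(2 * N), [pi, -1e-3])
xupper = vcat(3 * ones(2 * N), [1.5 * pi, 1e-3])

length(xlower) == 2 * N + 2   # 64 entries: 2N states plus tf and a
```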