optimization · scipy · least-squares

Equation constraints with scipy least_squares


I'm trying to use least squares to minimize a loss function by changing x, y, z. My problem is nonlinear, which is why I chose scipy's least_squares. The general structure is:

from scipy.optimize import least_squares
def loss_func(x, *arguments):
    # plug x's and args into an arbitrary equation and return the loss
    return loss  # loss here is an array of residuals

# x_arr contains x,y,z
res = least_squares(loss_func, x_arr, args=arguments)

I am trying to constrain x, y, z by: x - y = some value, z - y = some value. How do I go about doing so? The scipy least_squares documentation only provides bounds. I understand I can create bounds like 0 < x < 5; however, my constraints are equations, not constant bounds. Thank you in advance!


Solution

  • If anyone ever stumbles on this question, I've figured out how to overcome this issue. Since least_squares does not support general constraints, it is best to switch to constrained minimization with scipy.optimize.minimize. Since loss_func returns an array of residuals, we can minimize its L1 norm (the sum of the absolute residuals), which turns the residual array into a scalar objective.

    from scipy.optimize import minimize
    import numpy as np

    def loss_func(x, *arguments):
        # plug x's and args into an arbitrary equation to get the residuals
        loss = ...  # array of residuals
        # return a scalar: the L1 norm, i.e. the sum of absolute residuals
        return np.linalg.norm(loss, 1)
    

    The equality constraints and the bounds can then be passed to scipy.optimize.minimize fairly easily :) (see the sketch below).
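
    For completeness, here is a minimal sketch of how the pieces can fit together. The residual equation, the constraint targets (2.0 and 1.0), the extra arguments, the bounds, and the starting point are all made-up placeholders, and SLSQP is just one of the minimize methods that accepts both equality constraints and bounds:

    from scipy.optimize import minimize
    import numpy as np

    def loss_func(params, *arguments):
        x, y, z = params
        # placeholder residuals; replace with the real equation
        residuals = np.array([x + y - arguments[0], z - arguments[1]])
        # scalar objective: L1 norm, i.e. the sum of absolute residuals
        return np.linalg.norm(residuals, 1)

    # equality constraints: each 'fun' must return 0 at a feasible point
    # (x - y = 2.0 and z - y = 1.0 are hypothetical target values)
    constraints = [
        {"type": "eq", "fun": lambda p: p[0] - p[1] - 2.0},
        {"type": "eq", "fun": lambda p: p[2] - p[1] - 1.0},
    ]

    # optional simple bounds on x, y, z
    bounds = [(0, 5), (0, 5), (0, 5)]

    x0 = np.array([1.0, 1.0, 1.0])  # initial guess for x, y, z
    res = minimize(loss_func, x0, args=(3.0, 4.0),
                   constraints=constraints, bounds=bounds, method="SLSQP")
    print(res.x)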