I need to do some optimization, but I have very little experience and I'm not 100% sure where to start.
Say I have a regression equation:
y = exp(intercept + beta1 * x1 + beta2 * x2)
How could I use scipy.optimize to find the values of x1 and x2 that maximize y, given:
intercept = 10
beta1 = 1.1
beta2 = 1.2
Constraints:
(x1 + x2) has to be less than 10
x1 has to be greater than x2
I don't want to brute-force it (i.e., calculate y for every combination of x1 and x2 manually). This is an extremely simple example that I hope to learn from and build on.
I was able to find a package called GEKKO that can do the nonlinear optimization I needed. Full docs for the package can be found here: Gekko Documentation
Here's the code I used to solve my simple example problem:
from gekko import GEKKO
m = GEKKO() # Initialize gekko
# Decision variables with bounds 0 <= x <= 10
x1 = m.Var(value=1, lb=0, ub=10)
x2 = m.Var(value=1, lb=0, ub=10)
# Coefficients from the question
intercept = m.Const(value=10)
beta1 = m.Const(value=1.1)
beta2 = m.Const(value=1.2)
# Inequality constraints
m.Equation(x1 + x2 <= 10)
m.Equation(x2 < x1)  # Gekko treats strict and non-strict inequalities the same
m.Obj(-1 * m.exp(intercept + x1*beta1 + x2*beta2))  # Gekko minimizes, so minimize -y to maximize y
m.options.IMODE = 3  # IMODE 3 = steady-state optimization
m.solve() # Solve
print('Results')
print('x1: ' + str(x1.value))
print('x2: ' + str(x2.value))
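With these constraints the optimum pushes x1 + x2 to its upper bound of 10 and, since beta2 > beta1, makes x2 as large as the x1 > x2 constraint allows, so the solver converges toward x1 ≈ x2 ≈ 5.
Since the question specifically asks about scipy.optimize, here is a minimal sketch of the same problem using scipy.optimize.minimize with the SLSQP method (an illustration, not a drop-in replacement for the Gekko model above). The constraint dictionaries use SciPy's fun(x) >= 0 convention, and the sketch maximizes the exponent directly because exp is monotonically increasing, which keeps the objective values well-scaled:
import numpy as np
from scipy.optimize import minimize
intercept, beta1, beta2 = 10, 1.1, 1.2
def neg_exponent(x):
    # exp() is monotonically increasing, so maximizing the exponent maximizes y
    return -(intercept + beta1 * x[0] + beta2 * x[1])
constraints = [
    {'type': 'ineq', 'fun': lambda x: 10 - (x[0] + x[1])},  # x1 + x2 <= 10
    {'type': 'ineq', 'fun': lambda x: x[0] - x[1]}           # x1 >= x2
]
bounds = [(0, 10), (0, 10)]  # same bounds as the Gekko variables
res = minimize(neg_exponent, x0=[1.0, 1.0], method='SLSQP',
               bounds=bounds, constraints=constraints)
x1_opt, x2_opt = res.x
print('x1: ' + str(x1_opt))
print('x2: ' + str(x2_opt))
print('y: ' + str(np.exp(intercept + beta1 * x1_opt + beta2 * x2_opt)))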