
Can a SubmodelComp have its own driver?


I'm checking out the new OpenMDAO SubmodelComp, and I want to make sure I understand how it is meant to be used.

Is it possible for a SubmodelComp to have its own Driver and optimizer? In the scheme I am implementing, the submodel would use gradient-based optimization, while the top-level problem would have a gradient-free optimizer working on discrete inputs to the subproblem, so getting gradients of the submodel is of no concern.

Some experimentation suggests that either the submodel is not meant to be used this way, I am doing something wrong, or there is a bug. I have made two test cases using the paraboloid problem to keep the code short.

First example:

import openmdao.api as om

sub_prob = om.Problem()

sub_prob.model.add_subsystem("paraboloid", om.ExecComp("f = (x-3)**2 + x*y + (y+4)**2 - 3"))

sub_prob.driver = om.ScipyOptimizeDriver()
sub_prob.driver.options["optimizer"] = "SLSQP"

sub_prob.model.add_design_var("paraboloid.x", lower=-50, upper=50)
sub_prob.model.add_design_var("paraboloid.y", lower=-50, upper=50)
sub_prob.model.add_objective("paraboloid.f")

top_prob = om.Problem()
submodel = om.SubmodelComp(problem=sub_prob, outputs=[("paraboloid.f", "g")])
top_prob.model.add_subsystem("submodel", submodel, promotes=["*"])
top_prob.setup()
top_prob.run_model()

print(sub_prob.get_val("paraboloid.f"))
print(sub_prob.get_val("paraboloid.x"))
print(sub_prob.get_val("paraboloid.y"))

The expected result is returned:

Optimization terminated successfully    (Exit mode 0)
            Current function value: -27.333333074220675
            Iterations: 6
            Function evaluations: 6
            Gradient evaluations: 6
Optimization Complete
-----------------------------------
[-27.33333307]
[6.66712855]
[-7.33324946]

Now, if I add an input to the submodel by modifying the paraboloid equation, and simply hold it constant at the top level, the optimization fails:

import openmdao.api as om

sub_prob = om.Problem()

# modify paraboloid eqn with parameter "a"
sub_prob.model.add_subsystem("paraboloid", om.ExecComp("f = (x-3)**2 + x*y + (y+4)**2 + a"))

sub_prob.driver = om.ScipyOptimizeDriver()
sub_prob.driver.options["optimizer"] = "SLSQP"

sub_prob.model.add_design_var("paraboloid.x", lower=-50, upper=50)
sub_prob.model.add_design_var("paraboloid.y", lower=-50, upper=50)
sub_prob.model.add_objective("paraboloid.f")

top_prob = om.Problem()
# submodel = om.SubmodelComp(problem=sub_prob, outputs=[("paraboloid.f", "g")])
submodel = om.SubmodelComp(problem=sub_prob, inputs=[("paraboloid.a", "a")], outputs=[("paraboloid.f", "g")])
top_prob.model.add_subsystem("submodel", submodel, promotes=["*"])
top_prob.setup()
top_prob.set_val("a", val=-3)

top_prob.run_model()

print(sub_prob.get_val("paraboloid.f"))
print(sub_prob.get_val("paraboloid.x"))
print(sub_prob.get_val("paraboloid.y"))

The output is:

Inequality constraints incompatible    (Exit mode 4)
            Current function value: -1e+30
            Iterations: 49
            Function evaluations: 49
            Gradient evaluations: 49
Optimization FAILED.
Inequality constraints incompatible
-----------------------------------
[-1.e+30]
[-50.]
[-50.]

This is not the expected result; the answer should not change. Strangely enough, if I modify the paraboloid equation like so:

sub_prob.model.add_subsystem("paraboloid", om.ExecComp("f = (x-3)**2 + x*y + (y+4)**2 + a -a"))

I get the following output:

Optimization terminated successfully    (Exit mode 0)
            Current function value: -24.333333074220675
            Iterations: 6
            Function evaluations: 6
            Gradient evaluations: 6
Optimization Complete
-----------------------------------
[-24.33333307]
[6.66712855]
[-7.33324946]
.......\openmdao\core\total_jac.py:1788: DerivativesWarning:Design variables [('paraboloid.a', inds=[0])] have no impact on the constraints or objective.

The last warning indicates that the input "a" is being treated as a design variable, even though it is never designated as one.
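One way to probe this (a small diagnostic sketch of mine; get_design_vars() is the standard call for listing a system's declared design variables) is to ask the subproblem what it thinks its design variables are after setup:

# After building sub_prob and top_prob as above and calling top_prob.setup(),
# list what the subproblem has registered as design variables. add_design_var
# was only called for paraboloid.x and paraboloid.y, so if "paraboloid.a"
# shows up here, SubmodelComp itself must be adding it internally.
print(list(sub_prob.model.get_design_vars()))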

To restate my question: is the behavior I'm seeing due to the fact that a SubmodelComp is not meant to have a driver attached, is my problem setup wrong, or is this a bug?


Solution

  • This is a timely question. We're currently undergoing a refactor that will clean up the subproblem implementation.

    Currently, SubmodelComp was designed for the subproblem to use run_model rather than run_driver.

    Once that cleanup is done, which should be released in the next week or so, we plan on making it easier to drive sub-optimizations in each subproblem. Currently we have to place a "dummy" driver on that problem, because derivatives are coupled to the driver (this will change in our upcoming release).

    This will, as you say, enable you to drive a gradient-based optimization with a DOE or some other gradient-free driver. We don't currently have a plan to compute gradients "through" a sub-optimization, but as our capability expands that may happen someday. One interim way to wrap a sub-optimization yourself is sketched below.
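    A minimal sketch of that wrapping (my own illustration, not an official SubmodelComp feature; the SubOptComp name and its ports are made up): an ExplicitComponent builds the subproblem once, and its compute() sets the input, calls run_driver(), and reports the optimized objective. Since the outer driver is gradient-free, no derivatives are ever requested through the inner optimization.

    import openmdao.api as om

    class SubOptComp(om.ExplicitComponent):

        def setup(self):
            self.add_input("a", val=0.0)
            self.add_output("g", val=0.0)

            # Build the inner problem once; it owns its own gradient-based driver.
            self._sp = sp = om.Problem()
            sp.model.add_subsystem(
                "paraboloid", om.ExecComp("f = (x-3)**2 + x*y + (y+4)**2 + a"))
            sp.driver = om.ScipyOptimizeDriver(optimizer="SLSQP")
            sp.model.add_design_var("paraboloid.x", lower=-50, upper=50)
            sp.model.add_design_var("paraboloid.y", lower=-50, upper=50)
            sp.model.add_objective("paraboloid.f")
            sp.setup()

        def compute(self, inputs, outputs):
            sp = self._sp
            sp.set_val("paraboloid.a", inputs["a"])
            sp.run_driver()  # inner gradient-based optimization
            outputs["g"] = sp.get_val("paraboloid.f")

    top_prob = om.Problem()
    top_prob.model.add_subsystem("subopt", SubOptComp(), promotes=["*"])
    top_prob.setup()
    top_prob.set_val("a", -3)
    top_prob.run_model()
    print(top_prob.get_val("g"))  # about -27.33 for a = -3, matching the first example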