Tags: sparse-matrix, derivative, openmdao

How can I prevent OpenMDAO components from allocating large dense arrays for derivatives?


I have some OpenMDAO components that do math on 1D arrays of x-coordinates and 1D arrays of y-coordinates, outputting arrays of distance and angle for each pair. Each output element depends on only a single x and y.

They are similar to

x = np.cos(t)
y = np.sin(t)
d2 = x**2 + y**2

where in the setup method I've configured

self.add_input("t", shape_by_conn=True)
self.add_output("x", copy_shape="t")
self.add_output("y", copy_shape="t")
self.add_output("d2", copy_shape="t")

and the compute method works as expected. Typically t is a 1D array from np.linspace().
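
For concreteness, the compute method is essentially this (a sketch of the pattern, not my exact code):

def compute(self, inputs, outputs):
    t = inputs["t"]
    outputs["x"] = np.cos(t)
    outputs["y"] = np.sin(t)
    outputs["d2"] = outputs["x"] ** 2 + outputs["y"] ** 2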

In setup_partials I have

size = self._get_var_meta("t", "size")
self.declare_partials("x", ["t"],
    rows=range(size), cols=range(size))
self.declare_partials("y", ["t"],
    rows=range(size), cols=range(size))
self.declare_partials("d2", ["t"],
    rows=range(size), cols=range(size))

and in compute_partials I pass a 1D numpy array to J["x", "t"] and so on. check_partials reports no errors when I set t to be of length 10, and I do see it printing 10x10 arrays in its output if I do not pass out_stream=None.
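
Each Jacobian is purely diagonal, so compute_partials looks roughly like this (again a sketch):

def compute_partials(self, inputs, J):
    t = inputs["t"]
    # rows/cols were declared as the diagonal, so each assignment
    # below is a 1D array of diagonal values, not a dense matrix
    J["x", "t"] = -np.sin(t)
    J["y", "t"] = np.cos(t)
    J["d2", "t"] = 2 * np.cos(t) * -np.sin(t) + 2 * np.sin(t) * np.cos(t)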

Yet, if I set t to have size 100000, when I run prob.check_partials(out_stream=None) it tries to allocate a large array: Unable to allocate 74.5 GiB for an array with shape (100000, 100000) and data type float64. I would have expected that only arrays of size ~100000 would be allocated, not 100000^2.

The same error occurs if I do not run check_partials, but instead attach an ExecComp:

self.add_subsystem("hypotsq", om.ExecComp("d_sq = x**2 + y**2",
    has_diag_partials=True,
    x={"shape_by_conn":True},
    y={"copy_shape": "x"},
    d_sq={"copy_shape": "y"}))

The doc page on sparse partial derivatives mentions a "sparse AssembledJacobian" and also a "sparse global jacobian". Might I need one of those? Is there something else I've left out?


Solution

  • During a normal run, OpenMDAO will only allocate sparse Jacobians. During check_partials, however, it allocates a second dense array for the FD partials to be put into.

    OpenMDAO does this because, in general, getting the rows/cols right is tricky and it's often helpful to see the full Jacobian matrix. It's clearly causing problems for your code, though, and there are a few workarounds I can suggest:

    1. When you run check_partials, set your component up with a much smaller size. You can use the full size at run time, but for checking it is excessive.

    2. If checking with a large t is important to you, then consider using directional derivative checking instead (see the sketch below).
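
    For option 2, the idea is that a directional check probes each Jacobian along a single random vector, so only arrays of size ~100000 get allocated. A minimal sketch (assuming `comp` is a handle to your component instance):

# Check each partial along one random direction instead of
# building the full (size x size) FD matrix.
comp.set_check_partial_options(wrt="*", directional=True)
prob.run_model()
prob.check_partials(compact_print=True)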