I'm trying to use the ExpmGrad
function built into Theano.
However, when I build a theano.function
out of the ExpmGrad
output, I get an error saying that the outputs must be Theano variables.
I'm not sure what the correct way to use this ExpmGrad
function is, as I couldn't find any examples of its usage online.
This is what I tried:
import numpy as np
import theano
import theano.tensor as T
J1 = T.dscalar('J1')
H = np.arange(16).reshape(4, 4) * J1
gJ = theano.tensor.slinalg.ExpmGrad(H)
f = theano.function([J1], gJ)
and this is the error I get:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-122-2e2976e72a77> in <module>()
4 # gJ = theano.gradient.jacobian(H[0], J1)
5 gJ = theano.tensor.slinalg.ExpmGrad(H)
----> 6 f = theano.function([J1], gJ)
//anaconda/lib/python3.5/site-packages/theano/compile/function.py in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
318 on_unused_input=on_unused_input,
319 profile=profile,
--> 320 output_keys=output_keys)
321 # We need to add the flag check_aliased inputs if we have any mutable or
322 # borrowed used defined inputs
//anaconda/lib/python3.5/site-packages/theano/compile/pfunc.py in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
440 rebuild_strict=rebuild_strict,
441 copy_inputs_over=True,
--> 442 no_default_updates=no_default_updates)
443 # extracting the arguments
444 input_variables, cloned_extended_outputs, other_stuff = output_vars
//anaconda/lib/python3.5/site-packages/theano/compile/pfunc.py in rebuild_collect_shared(outputs, inputs, replace, updates, rebuild_strict, copy_inputs_over, no_default_updates)
225 raise TypeError('Outputs must be theano Variable or '
226 'Out instances. Received ' + str(v) +
--> 227 ' of type ' + str(type(v)))
228 # computed_list.append(cloned_v)
229 else:
TypeError: Outputs must be theano Variable or Out instances. Received ExpmGrad of type <class 'theano.tensor.slinalg.ExpmGrad'>
What am I doing wrong?
ExpmGrad
is not a function; it's a subclass of theano.Op
. "Calling" the class just creates an Op instance, not the result you want.
To use it properly, first instantiate the Op and then call that instance like a function:
expm_grad = theano.tensor.slinalg.ExpmGrad()
gJ = expm_grad(H, gw)
For the above code to work, you also need to define the gw
argument, which is the upstream gradient, as shown in the sketch below.
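For example, here is a minimal sketch that calls the Op directly; gw is a hypothetical extra symbolic input standing in for whatever upstream gradient you actually have:
import numpy as np
import theano
import theano.tensor as T

J1 = T.dscalar('J1')
H = np.arange(16).reshape(4, 4) * J1
gw = T.dmatrix('gw')  # upstream gradient, same shape as expm(H)

expm_grad = theano.tensor.slinalg.ExpmGrad()  # instantiate the Op
gJ = expm_grad(H, gw)  # apply the instance to symbolic inputs

f = theano.function([J1, gw], gJ)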
Note: a gradient Op is normally not meant to be used directly; it's recommended to use theano.grad
, which applies it for you:
import numpy as np
import theano
import theano.tensor as T

J1 = T.dscalar('J1')
H = np.arange(16).reshape(4, 4) * J1
expH = theano.tensor.slinalg.expm(H)
e = some_scalar_function(expH)  # any differentiable scalar function of expH
gJ = theano.grad(e, J1)
f = theano.function([J1], gJ)
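For instance, taking the sum of the entries as an assumed example of the scalar function, the compiled function can then be evaluated at a concrete value of J1:
e = expH.sum()  # assumed example: sum of all entries of expm(H)
gJ = theano.grad(e, J1)  # Theano inserts ExpmGrad into the graph for you
f = theano.function([J1], gJ)
print(f(0.01))  # gradient of sum(expm(H)) with respect to J1 at J1 = 0.01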