
Can the Torch optim package support multiple inputs?


I'm trying to use the Adam implementation in the torch7 optim package to optimize a neural network that takes two independent inputs. Can this be done? The code seems to support only a single input vector. Is there some other implementation that can take a generic table of inputs? The reference usage I based my code on is here


Solution

  • In principle, yes. optim never sees your network's inputs; it only operates on a single flat parameter tensor and its gradient. So a multi-input network works fine: flatten all of the network's parameters and gradients into two tensors with getParameters(), and pass the (table of) inputs to forward() inside your evaluation closure.
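
    A minimal sketch of this pattern, assuming a toy two-input network built with nn.ParallelTable and nn.JoinTable (the layer sizes and data here are made up for illustration):

    ```lua
    require 'nn'
    require 'optim'

    -- Toy two-input network: each input passes through its own linear
    -- layer, the results are joined, then reduced to a single output.
    local net = nn.Sequential()
      :add(nn.ParallelTable()
        :add(nn.Linear(10, 5))
        :add(nn.Linear(20, 5)))
      :add(nn.JoinTable(1))
      :add(nn.Linear(10, 1))
    local criterion = nn.MSECriterion()

    -- Flatten all parameters and gradients into two tensors.
    -- optim only ever sees these, never the inputs themselves.
    local params, gradParams = net:getParameters()

    -- Example data: a table of two independent inputs, plus a target.
    local inputs = { torch.randn(10), torch.randn(20) }
    local target = torch.randn(1)

    -- Evaluation closure: optim.adam calls this with the flat
    -- parameter tensor and expects (loss, gradient) back.
    local function feval(x)
      if x ~= params then params:copy(x) end
      gradParams:zero()
      local output = net:forward(inputs)   -- table input is fine here
      local loss = criterion:forward(output, target)
      local dloss = criterion:backward(output, target)
      net:backward(inputs, dloss)
      return loss, gradParams
    end

    local optimState = { learningRate = 1e-3 }
    optim.adam(feval, params, optimState)
    ```

    The key point is that the multiple inputs only appear inside feval; as far as optim.adam is concerned, it is optimizing one flat vector of parameters.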