
Combine module and list of torch.nn.Parameters in one optimizer


I have the following code:

optimizer = torch.optim.Adam([self.model.parameters()] + [self.latent_params_class.latent_params], lr=lr)

self.model is a BoTorch SingleTaskGP model (https://botorch.org/tutorials/fit_model_with_torch_optimizer) and self.latent_params_class.latent_params is just a list of torch.nn.Parameter. The above line throws the following error:

TypeError: optimizer can only optimize Tensors, but one of the params is Module.parameters

How do I put self.model.parameters() and self.latent_params_class.latent_params into the optimizer?


Solution

  • The issue doesn't originate from the nn.Parameter objects themselves but from the fact that you are passing a list whose first element is a parameter *generator* (the return value of model.parameters()), not a tensor. A solution is to convert the generator to a list before concatenating:

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 10)
    params = nn.Parameter(torch.rand(10))
    optimizer = torch.optim.Adam(list(model.parameters()) + [params], lr=1)
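    Applied to the original question, this might look like the sketch below (a small nn.Linear stands in for the BoTorch SingleTaskGP, since any nn.Module exposes parameters() the same way). It also shows parameter groups, a standard PyTorch alternative that lets you give the latent parameters their own learning rate:

    ```python
    import torch
    import torch.nn as nn

    # Stand-in for self.model (a SingleTaskGP in the question)
    model = nn.Linear(10, 10)
    # Stand-in for self.latent_params_class.latent_params (a list of nn.Parameter)
    latent_params = [nn.Parameter(torch.rand(10))]

    # Option 1: flatten everything into a single list of tensors.
    optimizer = torch.optim.Adam(list(model.parameters()) + latent_params, lr=1e-3)

    # Option 2: parameter groups. Each group is a dict; a generator is
    # accepted inside a group, and groups can override per-group options
    # such as the learning rate.
    optimizer = torch.optim.Adam(
        [
            {"params": model.parameters()},
            {"params": latent_params, "lr": 1e-2},  # separate lr for latents
        ],
        lr=1e-3,  # default lr for groups that don't override it
    )

    # One optimization step updates both the model and the latent parameters.
    optimizer.zero_grad()
    loss = model(torch.rand(10)).sum() + latent_params[0].sum()
    loss.backward()
    optimizer.step()
    ```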