
PyTorch custom optimizer got an empty parameter list


I'm new here. I am trying to create a custom optimizer in PyTorch, where backpropagation takes place in a meta-RL policy: the policy receives the model parameters and outputs the desired model parameters. However, I am getting the error below. My models work fine with Adam and SGD, but not with my optimizer.

Code:

import numpy as np
import torch

class MetaBackProp(torch.optim.Optimizer):
    def __init__(self, params):

        self.param_shape_list = np.array([])
        for param in list(params):
            np.append(self.param_shape_list, list(param.size()))

        pseudo_lr = 1e-4
        pseudo_defaults = dict(lr=pseudo_lr)
        length = 100 #TODO: get shape, flatten, multiply...
        self.policy = AEPolicy(length)
        self.policy_optim = torch.optim.Adam(self.policy.parameters(), lr=pseudo_lr)
        super(MetaBackProp, self).__init__(params, pseudo_defaults)

    def step(self, closure=None):
        params = torch.cat([p.view(-1) for p in self.param_groups])
        self.policy_optim.zero_grad()
        quit()

Traceback:

Traceback (most recent call last):
  File "main.py", line 6, in <module>
    gan = CycleGAN()
  File "/home/ai/Projects_v2/R/cycle_gan.py", line 32, in __init__
    self.discriminator2_optim = MetaBackProp(self.discriminator2.parameters())
  File "/home/ai/Projects_v2/R/lr_schedule.py", line 34, in __init__
    super(MetaBackProp, self).__init__(params, pseudo_defaults)
  File "/home/ai/anaconda3/lib/python3.7/site-packages/torch/optim/optimizer.py", line 46, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list

Solution

  • You retrieve the parameters with self.discriminator2.parameters(), which returns an iterator. In your constructor you are converting them to a list for the for loop:

    for param in list(params):
    

    This consumes the iterator, but you are passing that same, now exhausted, iterator to the constructor of the base class, hence it no longer yields any parameters.

    super(MetaBackProp, self).__init__(params, pseudo_defaults)
    

    Instead of passing the iterator, you can use the list you created from the iterator, since the parameters just need to be iterable, which lists are.

    # Convert the iterator to a list so it can be iterated more than once
    params = list(params)
    for param in params:
        np.append(self.param_shape_list, list(param.size()))
    # ...
    super(MetaBackProp, self).__init__(params, pseudo_defaults)

    As a side note, np.append does not modify an array in place; it returns a new array, so self.param_shape_list stays empty as written. Collecting the shapes in a plain Python list is simpler.
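
A minimal, self-contained sketch of the fixed constructor (the AEPolicy and meta-RL logic from the question are omitted, and the shape bookkeeping uses a plain list; this only demonstrates that materializing the iterator makes the "empty parameter list" error go away):

```python
import torch
import torch.nn as nn

class MetaBackProp(torch.optim.Optimizer):
    def __init__(self, params):
        # Materialize the iterator once so it can be reused:
        # first for the shape bookkeeping, then for the base class.
        params = list(params)
        self.param_shape_list = [list(p.size()) for p in params]
        super(MetaBackProp, self).__init__(params, dict(lr=1e-4))

model = nn.Linear(4, 2)
opt = MetaBackProp(model.parameters())  # no ValueError anymore
print(len(opt.param_groups[0]['params']))  # weight and bias -> 2
```

Passing `model.parameters()` directly still works for the caller; the conversion happens once inside the constructor.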