
How to change default optimization in spotlight from pytorch e.g. torch.optim.SGD?


I'm currently using spotlight (https://github.com/maciejkula/spotlight/tree/master/spotlight) to implement matrix factorization in a recommender system. Spotlight is built on PyTorch and provides an integrated platform for implementing recommender systems. In spotlight/factorization/explicit it uses torch.optim.Adam as the optimizer, and I want to change it to torch.optim.SGD. I tried:

emodel = ExplicitFactorizationModel(n_iter=15,
                                embedding_dim=32, 
                                use_cuda=False,
                                loss='regression',
                                l2=0.00005,
                                optimizer_func=optim.SGD(lr=0.001, momentum=0.9))

but it gives: `TypeError: __init__() missing 1 required positional argument: 'params'`.

Any suggestions?


Solution

  • You could use partial from functools to bind the learning rate and momentum first, and then pass the resulting callable to ExplicitFactorizationModel. Calling optim.SGD(lr=0.001, momentum=0.9) directly fails because SGD's constructor requires the model parameters as its first argument, and those don't exist until the model is built; partial defers the call instead of making it. Something like:

    from functools import partial
    import torch

    # Pre-bind the hyperparameters; spotlight will supply `params` later.
    SGD_fix_lr_momentum = partial(torch.optim.SGD, lr=0.001, momentum=0.9)
    emodel = ExplicitFactorizationModel(n_iter=15,
                                        embedding_dim=32,
                                        use_cuda=False,
                                        loss='regression',
                                        l2=0.00005,
                                        optimizer_func=SGD_fix_lr_momentum)
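To see why this works without needing PyTorch installed, here is a minimal sketch of the same pattern using a hypothetical `DummySGD` class (not part of PyTorch) that mimics the `torch.optim.SGD` constructor signature:

```python
from functools import partial

# Hypothetical stand-in for torch.optim.SGD: it requires `params` as its
# first positional argument, plus keyword hyperparameters.
class DummySGD:
    def __init__(self, params, lr=0.01, momentum=0.0):
        self.params = list(params)
        self.lr = lr
        self.momentum = momentum

# Calling DummySGD(lr=0.001) directly would raise a TypeError because
# `params` is missing -- the same error seen in the question.
# partial pre-binds lr/momentum and leaves `params` open, so the
# library can supply it later once the model's parameters exist.
make_sgd = partial(DummySGD, lr=0.001, momentum=0.9)

opt = make_sgd([1.0, 2.0])  # the library passes params at this point
print(opt.lr, opt.momentum)  # 0.001 0.9
```

The key point is that `optimizer_func` must be a callable that accepts the parameters, not an already-constructed optimizer instance.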