
How to implement regularization / weight decay in R


I'm surprised at the number of R neural network packages that don't appear to have a parameter for regularization/lambda/weight decay. I'm assuming I'm missing something obvious. When I use a package like mlr and look at its integrated learners, I don't see parameters for regularization.

For example: nnTrain from the deepnet package:
[screenshot: list of nn.train parameters]

I see parameters for just about everything - even dropout - but not lambda or anything else that looks like regularization.

My understanding of both caret and mlr is that they basically organize other ML packages and try to provide a consistent way to interact with them. I'm not finding L1/L2 regularization in any of them.

I've also done 20 Google searches looking for R packages with regularization but found nothing. What am I missing? Thanks!


Solution

  • I looked through more of the models within mlr (a daunting task) and eventually found the h2o package learners. In mlr, the classif.h2o.deeplearning model has every parameter I could think of, including L1 and L2.

    Installing h2o is as simple as:
    install.packages('h2o')
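    A minimal sketch of using that learner through mlr with L1/L2 penalties, assuming the standard mlr workflow; the l1, l2, hidden, and epochs hyperparameters come from the underlying h2o::h2o.deeplearning() function, and the values here are illustrative, not tuned:

    ```r
    library(mlr)
    library(h2o)

    h2o.init()  # start the local H2O backend that the learner talks to

    # Example task on a built-in dataset
    task <- makeClassifTask(data = iris, target = "Species")

    # Pass the regularization strengths straight through makeLearner()
    lrn <- makeLearner(
      "classif.h2o.deeplearning",
      hidden = c(10, 10),  # two hidden layers of 10 units
      epochs = 50,
      l1     = 1e-4,       # L1 (lasso) weight penalty
      l2     = 1e-4        # L2 (ridge) penalty, i.e. weight decay
    )

    mod  <- train(lrn, task)
    pred <- predict(mod, task)
    performance(pred, measures = acc)
    ```

    Increasing l1 drives more weights to exactly zero, while l2 shrinks all weights smoothly toward zero, which is the classic weight-decay behavior the question asks about.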