I'd like to regularise the weights of a network with both L1 and L2 regularisation. However, I can't find a way to vary the strengths of the two penalties independently, and the Keras documentation doesn't mention one either.
So, is there a way to use different strengths in the l1_l2 regulariser? Or perhaps an alternative method to achieve the same result?
My current model is simply:
from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers as reg

stren = 0.001  # single strength shared by both layers

model = Sequential()
model.add(Dense(64, input_dim=148, activation='relu', kernel_regularizer=reg.l2(stren)))
model.add(Dense(1, activation='sigmoid', kernel_regularizer=reg.l2(stren)))
And I'd like to be able to have something along the lines of:
kernel_regularizer=reg.l1_l2(l1_str, l2_str)
Yes, you can vary the strengths of the two regularizers independently; l1_l2 takes separate keyword arguments for each penalty:
from keras import regularizers
regularizers.l1_l2(l1=0.001, l2=0.1)  # L1 strength 0.001, L2 strength 0.1
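For example, applied to the model from the question (the strengths here are just illustrative, pick whatever values suit your problem):

from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers as reg

model = Sequential()
# each kernel is penalised by l1 * sum(|w|) + l2 * sum(w**2)
model.add(Dense(64, input_dim=148, activation='relu',
                kernel_regularizer=reg.l1_l2(l1=0.001, l2=0.01)))
model.add(Dense(1, activation='sigmoid',
                kernel_regularizer=reg.l1_l2(l1=0.001, l2=0.01)))

Setting either argument to 0 recovers a pure L1 or pure L2 penalty, so l1_l2 subsumes both of the single-penalty regularizers.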