I want to add a custom constraint on the parameters of a layer.
I am writing a custom activation layer with two trainable parameters a and b such that:
activation_fct = a*fct1() + b*fct2()
I need the sum of the parameters (a+b) to be equal to 1, but I don't know how to write such a constraint.
Can you give me some advice?
Thanks in advance.
You can have a single weight instead of two, and use this custom constraint:
import keras
import keras.backend as K

class Between_0_1(keras.constraints.Constraint):
    # clips the weight back into [0, 1] after every optimizer update
    def __call__(self, w):
        return K.clip(w, 0, 1)
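As a quick sanity check, the constraint simply clips values into [0, 1] (the sample values below are arbitrary):

print(K.eval(Between_0_1()(K.variable([-0.3, 0.7, 1.4]))))
# -> [0.  0.7 1. ]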
Then, when building the weights, create only a and apply the constraint:
def build(self, input_shape):
    self.a = self.add_weight(name='weight_a',
                             shape=(1,),
                             initializer='uniform',
                             constraint=Between_0_1(),
                             trainable=True)

    # if you want to start at 0.5:
    K.set_value(self.a, [0.5])

    self.built = True
In call, use b = 1 - a:
def call(self, inputs, **kwargs):
    # do stuff
    ....
    return (self.a * something) + ((1 - self.a) * another_thing)
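For reference, here is a minimal sketch of the whole layer under this approach, reusing the Between_0_1 constraint defined above. BlendedActivation is a hypothetical name, and K.relu / K.tanh stand in for whichever two functions you actually want to blend:

import keras
import keras.backend as K
from keras.layers import Layer

class BlendedActivation(Layer):  # hypothetical name
    # computes a*f(x) + (1-a)*g(x), with a kept in [0, 1]
    def build(self, input_shape):
        self.a = self.add_weight(name='weight_a',
                                 shape=(1,),
                                 initializer='uniform',
                                 constraint=Between_0_1(),
                                 trainable=True)
        K.set_value(self.a, [0.5])  # start with an equal blend
        self.built = True

    def call(self, inputs, **kwargs):
        # example functions only -- replace with your fct1/fct2
        return self.a * K.relu(inputs) + (1 - self.a) * K.tanh(inputs)

    def compute_output_shape(self, input_shape):
        return input_shape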
You can also try @MatusDubrava's softmax approach, but in this case your weights need to have shape (2,), and no constraint is needed:
def build(self, input_shape):
    self.w = self.add_weight(name='weights',
                             shape=(2,),
                             initializer='zeros',
                             trainable=True)
    self.built = True  # note: self.built, not self.build
def call(self, inputs, **kwargs):
    w = K.softmax(self.w)  # w[0] + w[1] == 1 by construction
    # do stuff
    ....
    return (w[0] * something) + (w[1] * another_thing)
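With the 'zeros' initializer, the softmax starts as an even 0.5/0.5 split, and the normalized weights always sum to 1; a quick check (the input values are just illustrative):

print(K.eval(K.softmax(K.variable([0.0, 0.0]))))   # [0.5 0.5]
print(K.eval(K.softmax(K.variable([1.3, -0.4]))))  # two values summing to 1.0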