Tags: keras, neural-network, deep-learning, keras-layer

Make a "non-fully connected" (singly connected?) neural network in keras


I don't know the name of what I'm looking for, but I want to make a layer in Keras where each input has its own, independent weight and bias. E.g. if there were 10 inputs, there would be 10 weights and 10 biases, and each input would be multiplied by its weight and summed with its bias to give 10 outputs.

For example here is a simple Dense network:

from keras.layers import Input, Dense
from keras.models import Model
N = 10
input = Input((N,))
output = Dense(N)(input)
model = Model(input, output)
model.summary()

As you can see, this model has 110 parameters (10 × 10 kernel weights plus 10 biases), because it is fully connected:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_2 (InputLayer)         (None, 10)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 10)                110       
=================================================================
Total params: 110
Trainable params: 110
Non-trainable params: 0
_________________________________________________________________

I want to replace output = Dense(N)(input) with something like output = SinglyConnected()(input), such that the model now has only 20 parameters: 10 weights and 10 biases.
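Concretely, the operation I'm after is just an element-wise scale and shift, e.g. in NumPy (purely illustrative, with made-up values):

import numpy as np

N = 10
x = np.arange(N, dtype=float)     # 10 inputs
w = np.random.uniform(size=N)     # one weight per input
b = np.zeros(N)                   # one bias per input
y = w * x + b                     # 10 outputs, no cross-connections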


Solution

  • Create a custom layer:

    from keras.layers import Layer

    class SingleConnected(Layer):

        # constructor
        def __init__(self, **kwargs):
            super(SingleConnected, self).__init__(**kwargs)

        # creates the weights: one weight and one bias per input feature
        def build(self, input_shape):

            weight_shape = (1,) * (len(input_shape) - 1)
            weight_shape = weight_shape + (input_shape[-1],)  # (1, ..., inputs)

            self.kernel = self.add_weight(name='kernel',
                                          shape=weight_shape,
                                          initializer='uniform',
                                          trainable=True)

            self.bias = self.add_weight(name='bias',
                                        shape=weight_shape,
                                        initializer='zeros',
                                        trainable=True)

            self.built = True

        # the operation: element-wise multiplication, then add the bias
        def call(self, inputs):
            return (inputs * self.kernel) + self.bias

        # output shape is the same as the input shape
        def compute_output_shape(self, input_shape):
            return input_shape

        # for saving the model - only necessary if you have parameters in __init__
        def get_config(self):
            config = super(SingleConnected, self).get_config()
            return config


    Use the layer:

    model.add(SingleConnected())
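
    For completeness, here is a sketch of wiring the layer into the functional-API model from the question (variable names are reused from the question; the exact summary formatting depends on the Keras version). It should report 20 trainable parameters: 10 weights and 10 biases.

    from keras.layers import Input
    from keras.models import Model

    N = 10
    input = Input((N,))
    output = SingleConnected()(input)   # element-wise weight and bias per input
    model = Model(input, output)
    model.summary()                     # expect 20 trainable params (10 + 10)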