Tags: python, tensorflow, theano, pycaffe, pytorch

Fixing a subset of weights in a neural network during training


I am considering building a customized neural network. The basic structure is the usual one, but I want to remove some of the connections between layers. For example, in a network with two hidden layers, I would like to delete some weights and keep the others, like so:

[Figure: a two-hidden-layer network in which some connections between layers have been removed and the rest kept]

This is not conventional dropout (which removes connections at random to avoid overfitting), since the remaining weights (connections) should be specified in advance and kept fixed.

Is there any way to do this in Python, with TensorFlow, PyTorch, Theano, or any other module?


Solution

  • Yes, you can do this in TensorFlow.

    You would have a layer in your TensorFlow code that looks something like this:

    # tf.Variable takes an initial value, not a shape, so initialize
    # with a random tensor of the desired shape [width, height]
    m = tf.Variable(tf.truncated_normal([width, height]), dtype=tf.float32)
    b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
    h = tf.sigmoid(tf.matmul(x, m) + b)
    

    What you want is a new matrix, let's call it k for kill. It is going to kill specific neural connections, the connections being the entries of m. This would be your new configuration:

    k = tf.constant(kill_matrix, dtype=tf.float32)  # fixed mask, never trained
    m = tf.Variable(tf.truncated_normal([width, height]), dtype=tf.float32)
    b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
    h = tf.sigmoid(tf.matmul(x, tf.multiply(m, k)) + b)
    

    Your kill_matrix is a matrix of 1s and 0s: insert a 1 for every neural connection you want to keep and a 0 for every one you want to kill. Because the mask sits inside the graph, the gradient flowing back to m is also multiplied elementwise by k, so the killed entries receive zero gradient from the loss, and the active connections are exactly the ones you specified.
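
    Putting the pieces together, here is a minimal runnable sketch in the same TF 1.x style as the snippets above. The layer sizes, the random training data, and the particular kill_matrix pattern are made up purely for illustration:

    import numpy as np
    import tensorflow as tf

    width, height = 3, 2  # input and hidden sizes (illustrative)

    # 1 = keep the connection, 0 = kill it (illustrative pattern)
    kill_matrix = np.array([[1, 0],
                            [0, 1],
                            [1, 1]], dtype=np.float32)

    x = tf.placeholder(tf.float32, [None, width])
    y = tf.placeholder(tf.float32, [None, height])

    k = tf.constant(kill_matrix, dtype=tf.float32)
    m = tf.Variable(tf.truncated_normal([width, height]), dtype=tf.float32)
    b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
    h = tf.sigmoid(tf.matmul(x, tf.multiply(m, k)) + b)

    loss = tf.reduce_mean(tf.square(h - y))
    train = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        xs = np.random.rand(8, width).astype(np.float32)
        ys = np.random.rand(8, height).astype(np.float32)
        for _ in range(100):
            sess.run(train, feed_dict={x: xs, y: ys})
        # The killed entries of m never received gradient; even if an
        # optimizer were to move them, they would have no effect, because
        # the forward pass always multiplies them by 0.
        print(sess.run(tf.multiply(m, k)))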
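
    The same trick carries over to PyTorch, which the question also asks about: store the mask as a non-trainable buffer and multiply it into the weight in forward. A sketch, with the MaskedLinear name and the mask pattern being my own illustrative choices; note that nn.Linear stores its weight as [out_features, in_features], so the mask is transposed relative to the TensorFlow version:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MaskedLinear(nn.Module):
        """Linear layer whose weight is multiplied elementwise by a fixed 0/1 mask."""
        def __init__(self, in_features, out_features, mask):
            super().__init__()
            self.linear = nn.Linear(in_features, out_features)
            # register_buffer: saved and moved with the module, never trained
            self.register_buffer("mask", mask)

        def forward(self, x):
            return F.linear(x, self.linear.weight * self.mask, self.linear.bias)

    # mask has shape [out_features, in_features]
    mask = torch.tensor([[1., 0., 1.],
                         [0., 1., 1.]])
    layer = MaskedLinear(3, 2, mask)
    out = layer(torch.randn(8, 3))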