Tags: python, tensorflow, pytorch, conv-neural-network, pruning

How to remove/prune near-zero parameters in a neural network?


I need to remove the near-zero weights of a neural network so that the distribution of parameters moves away from the zero point. [Figure: the distribution of weights after removing near-zero weights and weight scaling.]

I encountered this problem in this paper: https://ieeexplore.ieee.org/document/7544366

How can I achieve this in my PyTorch/TensorFlow program? For example, should I use a customized activation layer, or define a loss function that penalizes near-zero weights?

Thank you for any help you can provide.


Solution

  • You're looking for L1 regularization; see the tf.keras.regularizers docs.

    import tensorflow as tf
    
    # An L1 penalty on the kernel pushes small weights toward exactly zero.
    tf.keras.layers.Dense(units=128,
                          kernel_regularizer=tf.keras.regularizers.L1(0.1))
    

    Smaller coefficients will be driven to zero.
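    The regularizer alone only shrinks weights; to actually remove the near-zero weights after training, you can threshold them yourself. A minimal TensorFlow sketch (the input size 64 and the cut-off 1e-2 are assumptions to adapt):

    import numpy as np
    import tensorflow as tf
    
    layer = tf.keras.layers.Dense(units=128,
                                  kernel_regularizer=tf.keras.regularizers.L1(0.1))
    layer.build(input_shape=(None, 64))  # stand-in input size
    
    # Zero out every kernel weight whose magnitude falls below the cut-off.
    threshold = 1e-2
    kernel, bias = layer.get_weights()
    kernel[np.abs(kernel) < threshold] = 0.0
    layer.set_weights([kernel, bias])

    The question also mentions PyTorch, which has no kernel_regularizer argument; the usual equivalent is to add the L1 term to the loss yourself, and torch.nn.utils.prune can then zero out the smallest-magnitude weights. A sketch, with model, criterion, x, and y standing in for your own objects and the coefficients as illustrative assumptions:

    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune
    
    model = nn.Linear(128, 10)  # stand-in for your network
    criterion = nn.MSELoss()
    x, y = torch.randn(32, 128), torch.randn(32, 10)
    
    # L1 penalty added to the task loss.
    l1_penalty = sum(p.abs().sum() for p in model.parameters())
    loss = criterion(model(x), y) + 1e-3 * l1_penalty
    loss.backward()
    
    # After training: zero out the 30% smallest-magnitude weights,
    # then make the pruning permanent.
    prune.l1_unstructured(model, name="weight", amount=0.3)
    prune.remove(model, "weight")

    Once the near-zero weights are exactly zero, the remaining non-zero weights form the away-from-zero distribution the question asks for.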