machine-learning, deep-learning, neural-networks, seal, fhe

What is the multiplicative depth of a single ReLU layer for encrypted inference?


For a single square activation (x^2), the multiplicative depth is 1, and for a polynomial activation like x^3 + x, the multiplicative depth is 2.

For a convolution (wx + b), the multiplicative depth is 1.
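To make this depth bookkeeping concrete, here is a minimal plain-Python sketch. The `DepthTracked` wrapper is hypothetical, not part of SEAL or any FHE library; it counts one level per multiplication (including multiplications by plaintext constants, as leveled schemes such as CKKS typically do) and zero per addition:

```python
class DepthTracked:
    """Wraps a value and records the multiplicative depth used to compute it."""

    def __init__(self, value, depth=0):
        self.value = value
        self.depth = depth

    def __mul__(self, other):
        if isinstance(other, DepthTracked):
            # Ciphertext-ciphertext multiply: one level past the deeper operand.
            return DepthTracked(self.value * other.value,
                                max(self.depth, other.depth) + 1)
        # Plaintext multiply: also counted as one level here, matching the
        # convention that makes wx + b depth 1 (true of CKKS-style rescaling).
        return DepthTracked(self.value * other, self.depth + 1)

    def __add__(self, other):
        if isinstance(other, DepthTracked):
            # Addition does not consume a level.
            return DepthTracked(self.value + other.value,
                                max(self.depth, other.depth))
        return DepthTracked(self.value + other, self.depth)


x = DepthTracked(1.5)

square = x * x          # x^2
cubic = x * x * x + x   # x^3 + x, evaluated left to right
w, b = 0.7, 0.1
conv = x * w + b        # wx + b

print(square.depth, cubic.depth, conv.depth)  # 1 2 1
```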

What's the depth for a single ReLU layer?


Solution

  • ReLU is not a polynomial function over the reals, so computing it exactly with additions and multiplications would require infinite multiplicative depth. This problem can be circumvented in several ways:
    • Replace ReLU with another activation, such as the square activation.
    • Approximate ReLU with a polynomial of bounded degree (a sketch follows below).
    • Use the base-2 representation of numbers and compute ReLU as a logic function instead of an arithmetic function.
    Other options may exist.
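As an illustration of the polynomial-approximation option, here is a hedged NumPy sketch; the interval [-5, 5] and degree 4 are illustrative assumptions, not recommendations, and the fit is a least-squares sketch rather than the minimax approximations often used in practice:

```python
import numpy as np

# Fit over the range the encrypted inputs are expected to occupy;
# both the interval and the degree are assumptions for illustration.
xs = np.linspace(-5.0, 5.0, 1001)
relu = np.maximum(xs, 0.0)

degree = 4
coeffs = np.polynomial.polynomial.polyfit(xs, relu, degree)

approx = np.polynomial.polynomial.polyval(xs, coeffs)
print("max abs error on [-5, 5]:", np.max(np.abs(approx - relu)))
```

Evaluating the fitted degree-4 polynomial on a ciphertext then costs only ceil(log2(4)) = 2 ciphertext-ciphertext multiplications in depth: x^2 = x·x at level 1, then x^4 = x^2·x^2 and x^3 = x^2·x at level 2. Leveled schemes such as CKKS may spend additional levels on the plaintext coefficient multiplications, so the usable budget should be sized accordingly.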