I've been profiling an application all day and, having optimized a couple of bits of code, I'm left with this on my todo list. It's the activation function for a neural network, which gets called over 100 million times. According to dotTrace, it accounts for about 60% of the overall function time.
How would you optimize this?
public static float Sigmoid(double value) {
    return (float) (1.0 / (1.0 + Math.Pow(Math.E, -value)));
}
Try:
public static float Sigmoid(double value) {
    return 1.0f / (1.0f + (float) Math.Exp(-value));
}
EDIT: I did a quick benchmark. On my machine, the above code is about 43% faster than your method, and this mathematically-equivalent code is the teeniest bit faster (46% faster than the original):
public static float Sigmoid(double value) {
    float k = (float) Math.Exp(value);
    return k / (1.0f + k);
}
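
For reference, here's a rough sketch of the kind of micro-benchmark I ran to compare the two. The iteration count, the inputs, and the SigmoidPow/SigmoidExp names are just for illustration, not my actual harness:

using System;
using System.Diagnostics;

static class SigmoidBenchmark {
    public static float SigmoidPow(double value) {
        return (float) (1.0 / (1.0 + Math.Pow(Math.E, -value)));
    }

    public static float SigmoidExp(double value) {
        float k = (float) Math.Exp(value);
        return k / (1.0f + k);
    }

    static void Main() {
        const int iterations = 100000000;
        float sink = 0f; // accumulate the results so the JIT can't drop the calls

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            sink += SigmoidPow(i % 21 - 10);
        sw.Stop();
        Console.WriteLine("Math.Pow version: " + sw.ElapsedMilliseconds + " ms");

        sw.Reset();
        sw.Start();
        for (int i = 0; i < iterations; i++)
            sink += SigmoidExp(i % 21 - 10);
        sw.Stop();
        Console.WriteLine("Math.Exp version: " + sw.ElapsedMilliseconds + " ms (" + sink + ")");
    }
}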
EDIT 2: I'm not sure how much overhead C# function calls have, but C's <math.h> has a single-precision expf, and a version built around that might be a little faster than the double-precision Math.Exp, if you can get at it from managed code (e.g. from native code or via P/Invoke).
public static float Sigmoid(double value) {
    float k = expf((float) value); // expf is C's single-precision exp; it isn't part of the .NET base library
    return k / (1.0f + k);
}
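
As a sketch, calling it from C# could look something like the following. This assumes you have a native library that exports expf (the "fastmath.dll" name is hypothetical; you'd have to build a small C DLL yourself or check what your system's C runtime actually exports):

using System;
using System.Runtime.InteropServices;

static class NativeMath {
    // "fastmath.dll" is a hypothetical native library built from a one-line
    // C file that exports expf; adjust the name to whatever you actually use.
    [DllImport("fastmath.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern float expf(float value);

    public static float Sigmoid(double value) {
        float k = expf((float) value);
        return k / (1.0f + k);
    }
}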
Also, if you're doing millions of calls, the function-call overhead might be a problem. Try getting the method inlined (the JIT usually inlines small methods on its own) and see if that's any help.
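
On newer runtimes (.NET 4.5 and later, which may or may not apply to your setup) you can hint the JIT explicitly; a minimal sketch:

using System;
using System.Runtime.CompilerServices;

static class Activation {
    // AggressiveInlining asks the JIT to inline this call site where possible
    // (available from .NET 4.5 onward; earlier runtimes just rely on the
    // JIT's own heuristics for small methods).
    [MethodImpl(MethodImplOptions.AggressiveInlining)]
    public static float Sigmoid(double value) {
        float k = (float) Math.Exp(value);
        return k / (1.0f + k);
    }
}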