What are "ih" and "ho" in this function? It is a softmax activation function, but I am not able to understand the reason for the string check.
public double sim(double x, string layer)
{
    // Determine max
    double max = double.MinValue;
    if (layer == "ih")
        max = (ihSum0 > ihSum1) ? ihSum0 : ihSum1;
    else if (layer == "ho")
        max = (hoSum0 > hoSum1) ? hoSum0 : hoSum1;

    // Compute scale
    double scale = 0.0;
    if (layer == "ih")
        scale = Math.Exp(ihSum0 - max) + Math.Exp(ihSum1 - max);
    else if (layer == "ho")
        scale = Math.Exp(hoSum0 - max) + Math.Exp(hoSum1 - max);

    return Math.Exp(x - max) / scale;
}
The function is not too hard to understand. You may want to take some time to look at how it implements a neural network activation function, in this case a softmax.
A softmax activation receives a set of inputs (the sums feeding a layer) and converts them into values between 0 and 1 that add up to 1. Before calling Math.Exp, the maximum of those inputs is subtracted from each one; that does not change the result, it simply keeps the exponentials from overflowing for large sums (a standard numerical-stability trick). The same thing is happening in your case.
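To make the max-subtraction idea concrete, here is a minimal, self-contained softmax sketch for an arbitrary array of sums (the names Softmax and sums are illustrative, not from your code):

using System;
using System.Linq;

static class SoftmaxDemo
{
    // Numerically stable softmax: subtracting the max before Exp
    // leaves the ratios unchanged but avoids overflow.
    static double[] Softmax(double[] sums)
    {
        double max = sums.Max();
        double[] exps = sums.Select(s => Math.Exp(s - max)).ToArray();
        double scale = exps.Sum();
        return exps.Select(e => e / scale).ToArray();
    }

    static void Main()
    {
        // These sums would overflow Math.Exp without the max trick.
        double[] result = Softmax(new[] { 1000.0, 1002.0 });
        Console.WriteLine(string.Join(", ", result)); // ~0.119, ~0.881
    }
}

Without the subtraction, Math.Exp(1000.0) evaluates to infinity and the division produces NaN.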
It seems like there are two sets of inputs (each "set" called a "layer", thus there are two layers) codenamed "ih" and "ho". Each set further has two elements, Sum0 and Sum1, making four inputs in total:
1. ihSum0 and ihSum1 (for the "ih" layer)
2. hoSum0 and hoSum1 (for the "ho" layer)
Whatever ih, ho, and layer mean in your context, you would understand better (in feed-forward network code they typically stand for input-to-hidden and hidden-to-output). But the function simply checks which input set (or "layer") is to be used, "ih" or "ho", to determine two variables, max and scale:
if (layer == "ih")
    max = (ihSum0 > ihSum1) ? ihSum0 : ihSum1;
else if (layer == "ho")
    max = (hoSum0 > hoSum1) ? hoSum0 : hoSum1;
which ultimately (together with x) will be used to determine the final output of your function:
return Math.Exp(x - max) / scale;
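If it helps, here is a small, self-contained check of what the two calls for one layer produce (the sums are made-up numbers, not from your program). Calling sim(ihSum0, "ih") and sim(ihSum1, "ih") would reproduce out0 and out1 below, and the two outputs always add up to 1:

using System;

class SimCheck
{
    static void Main()
    {
        // Made-up example sums for the "ih" layer (not from your program).
        double ihSum0 = 0.5, ihSum1 = 1.5;

        // Same steps as sim(x, "ih"):
        double max = (ihSum0 > ihSum1) ? ihSum0 : ihSum1;
        double scale = Math.Exp(ihSum0 - max) + Math.Exp(ihSum1 - max);

        double out0 = Math.Exp(ihSum0 - max) / scale; // ~0.269
        double out1 = Math.Exp(ihSum1 - max) / scale; // ~0.731
        Console.WriteLine($"{out0} {out1} sum = {out0 + out1}"); // sum = 1
    }
}

The "ho" branch does exactly the same thing for the other pair of sums.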