I want to make a model using TensorFlow which will return the two parameters of a Weibull distribution. In order to build it I need a loss function that fits the Weibull distribution. I found online how to write the negative log likelihood for a negative binomial distribution (k is the dispersion parameter, l is the probability parameter and y_true is the current value that the loss function gets):
nll = (
tf.math.lgamma(k)
+ tf.math.lgamma(y_true + 1)
- tf.math.lgamma(k + y_true)
- k * tf.math.log(l)
- y_true * tf.math.log(1 - l)
)
but I don't know how to calculate the negative log likelihood for a Weibull distribution.
You need to take the negative of the sum of the log probabilities under the Weibull PDF:

    f(x; k, l) = (k / l) * (x / l)^(k - 1) * exp(-(x / l)^k),  for x >= 0

Now, ignoring the part where x < 0 (which I hope your dataset fulfills, otherwise you have chosen the wrong distribution), the transformation goes as follows:

    prod p(x)
    log(prod p(x))
    sum log(p(x))

Then expand each log(p(x)) with the PDF above and apply the properties of the logarithm:

    log p(x) = log(k) - log(l) + (k - 1) * (log(x) - log(l)) - (x / l)^k

Now, your NN should predict both k and lambda (I'll call it l). Supposing that the output layer respects the domain of the parameters (both must be positive), the loss of a single element is the negative of the expression above:
def loss(x, k, l, epsilon=1e-7):
    # negative log likelihood of a Weibull(k, l) sample x;
    # epsilon guards the division when l is predicted close to 0
    return -(tf.math.log(k) - tf.math.log(l)
             + (k - 1) * (tf.math.log(x) - tf.math.log(l))
             - tf.pow(x / (l + epsilon), k))
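Since this supposes the output layer keeps both parameters positive, a minimal sketch of such a two-headed network could use softplus activations on both heads (the layer sizes and the 8-feature input dimension here are assumptions, not something from the question):

```python
import tensorflow as tf

# Hypothetical network with two output heads; softplus keeps
# k > 0 and lambda > 0, matching the Weibull parameter domain.
inputs = tf.keras.Input(shape=(8,))
h = tf.keras.layers.Dense(32, activation="relu")(inputs)
k_out = tf.keras.layers.Dense(1, activation="softplus", name="k")(h)
l_out = tf.keras.layers.Dense(1, activation="softplus", name="lambda")(h)
model = tf.keras.Model(inputs, [k_out, l_out])
```

Softplus is a common choice here because, unlike ReLU, it never outputs exactly zero, which would make the logs in the loss blow up.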
And to make it work for multiple predictions, you can just wrap the same expression in tf.reduce_mean:

def loss(targets, k_predictions, lambda_predictions, epsilon=1e-7):
    # mean negative log likelihood over the batch
    return tf.reduce_mean(
        -(tf.math.log(k_predictions) - tf.math.log(lambda_predictions)
          + (k_predictions - 1) * (tf.math.log(targets) - tf.math.log(lambda_predictions))
          - tf.pow(targets / (lambda_predictions + epsilon), k_predictions))
    )
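As a sanity check, here is a NumPy sketch (the sample values and parameters are made up for illustration) that compares the expanded loss against -log of a direct evaluation of the Weibull PDF; the two should agree up to the epsilon term:

```python
import numpy as np

def weibull_nll(x, k, l, epsilon=1e-7):
    # NumPy mirror of the TensorFlow loss above
    return np.mean(-(np.log(k) - np.log(l)
                     + (k - 1) * (np.log(x) - np.log(l))
                     - (x / (l + epsilon)) ** k))

def weibull_pdf(x, k, l):
    # Weibull density evaluated directly, for comparison
    return (k / l) * (x / l) ** (k - 1) * np.exp(-(x / l) ** k)

x = np.array([0.5, 1.0, 2.0])
k, l = 1.5, 2.0
assert np.isclose(weibull_nll(x, k, l),
                  np.mean(-np.log(weibull_pdf(x, k, l))), atol=1e-5)
```

If this check passes, the algebraic expansion in the loss matches the density, and the remaining difference is only the epsilon guard.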