I am working on a maximum likelihood estimator, and one of the parameters is estimated using the digamma function. I'm trying to use uniroot to solve the equation but am unable to do so. Here's my code:
dig <- function(alpha) {
  digamma(2 + alpha) - digamma(alpha) - (1 / (2 + alpha)) + (2 / (2 + alpha))
}
curve(dig, from = 0, to = 10)
uniroot(dig, lower = 0, upper = 10)
This produces the following error:
Error in uniroot(dig, lower = 0, upper = 10) : f.lower = f(lower) is NA
In addition: Warning messages:
1: In digamma(alpha) : NaNs produced
2: In digamma(alpha) : NaNs produced
The error itself sort of makes sense based on the curve, but the NaN warnings have me stuck. It's entirely possible that I'm misunderstanding how to find the roots of an expression involving the digamma function, or that there's a numerical package in R (maybe rootSolve?) that could help. Not sure what I'm missing here; any tips would be appreciated. Thanks!
Consider the following. digamma(x) is undefined for x <= 0, and R's digamma() returns NaN there; that is where both the warnings and the "f.lower = f(lower) is NA" error come from, since your lower endpoint is 0. Move the lower bound off zero:
curve(dig, from = 0.01, to = 10)
uniroot(dig, lower = 0.01, upper = 10)
Error in uniroot(dig, lower = 0.01, upper = 10) :
  f() values at end points not of opposite sign
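That second error is the real answer: your function never crosses zero. By the recurrence digamma(x + 1) = digamma(x) + 1/x, applied twice, dig(alpha) collapses to 1/alpha + 1/(alpha + 1) + 1/(alpha + 2), which is strictly positive for every alpha > 0, so uniroot correctly reports that no root is bracketed. A quick numerical cross-check of that simplification (sketched in Python only so it runs without an R session; the digamma here is hand-rolled from the standard recurrence plus asymptotic series rather than taken from a library):

```python
import math

def digamma(x):
    """Digamma via the recurrence psi(x) = psi(x + 1) - 1/x plus an
    asymptotic series -- hand-rolled so no scipy is needed."""
    if x <= 0:
        raise ValueError("digamma undefined for x <= 0 (R returns NaN here)")
    acc = 0.0
    while x < 10:  # push the argument into the asymptotic regime
        acc -= 1.0 / x
        x += 1.0
    inv2 = 1.0 / (x * x)
    # psi(x) ~ ln x - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)
    return acc + math.log(x) - 0.5 / x - inv2 * (1/12 - inv2 * (1/120 - inv2 / 252))

def dig(alpha):
    # same expression as the R function in the question
    return digamma(2 + alpha) - digamma(alpha) - 1/(2 + alpha) + 2/(2 + alpha)

# Applying the recurrence twice collapses dig() to a closed form that is
# strictly positive, so there is no root anywhere on alpha > 0.
for a in (0.01, 0.5, 1.0, 5.0, 10.0, 100.0):
    closed = 1/a + 1/(a + 1) + 1/(a + 2)
    assert abs(dig(a) - closed) < 1e-8
    assert dig(a) > 0
```

So no root-finding package will help here; if the MLE derivation expects a root, the equation itself is worth rechecking (for instance, the sign of one of the 1/(2 + alpha) terms).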