I'm trying to estimate a LAD (least absolute deviations) regression, but nlminb gives me the message "false convergence (8)". What does this mean, and why are the nlminb estimates equal to the lm estimates?
Sample generation step
dgp=function(){
x=c(sample(0:9,10),sample(0:9,10));  # 20 values of the regressor
b0=2;                                # true intercept
b1=-6;                               # true slope
eps=rbinom(20,1,0.05)*rnorm(20,0,1)+rbinom(20,1,0.95)*rnorm(20,0,1);  # noise built from Bernoulli-weighted normal draws
eps=eps/sd(eps);                     # rescale to unit standard deviation
y=b0+b1*x+eps;
return(data.frame(y=y,x=x))
}
z=dgp()
Estimation step
LAD=function(z){
y=z[[1]]
x=z[[2]]
LADf=function(par) {(sum(y-par[1]-par[2]*x)^2)}
outLS=lm(y~x);  # OLS fit, used only to get starting values
b0=as.numeric(outLS$coefficients[1])
b1=as.numeric(outLS$coefficients[2])
out=nlminb(c(b0,b1),LADf)  # minimise LADf starting from the OLS estimates
return(list(out$par,out$message))
}
LAD(z)
Your LAD function:
LADf=function(par) {(sum(y-par[1]-par[2]*x)^2)}
looks like least squares rather than least absolute deviations. In fact, because the ^2 sits outside sum(), it is the square of the sum of the residuals. At the lm estimates you use as starting values, the residuals of a regression with an intercept sum to zero, so this objective is already exactly zero: nlminb has nothing left to minimise, reports "false convergence (8)", and simply returns the starting values, which is why you get the lm coefficients back. For a proper LAD criterion you need something like
LADf <- function(par) { sum(abs(y - par[1] - par[2]*x)) }
Note that this function is not differentiable, so you have to use an optimizer that can handle that (Nelder-Mead or SANN for instance).
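For example, a minimal sketch of this with optim (assuming z is the data frame produced by dgp() above, and using the lm fit only for starting values):

y <- z[[1]]                                            # response
x <- z[[2]]                                            # regressor
LADf <- function(par) sum(abs(y - par[1] - par[2]*x))  # sum of absolute deviations
start <- coef(lm(y ~ x))                               # OLS estimates as starting values
fit <- optim(start, LADf, method = "Nelder-Mead")      # derivative-free optimisation
fit$par                                                # LAD estimates of (b0, b1)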
The LAD estimator is also equivalent to median regression, so alternatively you can fit the model with the quantreg package (quantile regression at the median).
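A sketch of that route, assuming the quantreg package is installed and z is the same data frame as above:

library(quantreg)                         # provides rq() for quantile regression
fit_rq <- rq(y ~ x, tau = 0.5, data = z)  # tau = 0.5 is the median, i.e. the LAD fit
coef(fit_rq)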