
Manually calculating the confidence interval of a multiple linear regression (OLS)


I am trying to understand how to manually calculate a confidence interval for a multiple linear regression (OLS). My problem is that I don't know how to calculate the standard error for each of the individual coefficients.

For a regression with just one independent variable, I followed this tutorial: http://stattrek.com/regression/slope-confidence-interval.aspx. The tutorial gives the following formula for the standard error of the slope:

SE(b1) = sqrt[ Σ(yi − ŷi)² / (n − 2) ] / sqrt[ Σ(xi − x̄)² ]

As it turns out, the formula works. However, I did not fully understand it; for example, why is the (n − 2) in the numerator term? To check its correctness I wrote the following R code, which already shows the standard errors:

library(quantreg)  # rq() comes from the quantreg package

x <- 1:50
y <- c(x[1:48] + rnorm(48, 0, 5), rnorm(2, 150, 5))

QR <- rq(y ~ x, tau = 0.5)
summary(QR, se = 'boot')

LM <- lm(y ~ x)
summary(LM)

alligator = data.frame(
      lnLength = c(3.78, 3.71, 3.73, 3.78),
      lnWeight = c(4.43, 4.38, 4.42, 4.25)
)

alli.mod1 = lm(lnWeight ~ ., data = alligator)

newdata = data.frame(
      lnLength = c(3.78, 3.71, 3.73, 3.78)
)

y_predicted = predict(alli.mod1, newdata, interval="predict")[,1]
length_mean = mean(alligator$lnLength)
> summary(alli.mod1)

 Call:
 lm(formula = lnWeight ~ ., data = alligator)

 Residuals:
        1        2        3        4 
  0.08526 -0.02368  0.03316 -0.09474 

 Coefficients:
             Estimate Std. Error t value Pr(>|t|)
 (Intercept)   7.5279     5.7561   1.308    0.321
 lnLength     -0.8421     1.5349  -0.549    0.638

 Residual standard error: 0.09462 on 2 degrees of freedom
 Multiple R-squared:  0.1308,   Adjusted R-squared:  -0.3038 
 F-statistic: 0.301 on 1 and 2 DF,  p-value: 0.6383

Then I manually computed the SE with the following R code (according to the formula above):

rss = (alligator$lnWeight[1] - y_predicted[1])^2 + 
      (alligator$lnWeight[2] - y_predicted[2])^2 +
      (alligator$lnWeight[3] - y_predicted[3])^2 + 
      (alligator$lnWeight[4] - y_predicted[4])^2

a = sqrt(rss/(length(y_predicted)-2))

b = sqrt((alligator$lnLength[1] - length_mean)^2 + 
         (alligator$lnLength[2] - length_mean)^2 +
         (alligator$lnLength[3] - length_mean)^2 + 
         (alligator$lnLength[4] - length_mean)^2)

a/b
1.534912

This gives the same value as the SE reported by summary(alli.mod1). So I thought it might also work with two variables. Unfortunately, that produced an incorrect answer, as shown in the code below:

alligator = data.frame(
    lnLength = c(3.78, 3.71, 3.73, 3.78),
    lnColor = c(2.43, 2.59, 2.46, 2.22),
    lnWeight = c(4.43, 4.38, 4.42, 4.25)
  )

alli.mod1 = lm(lnWeight ~ ., data = alligator)

newdata = data.frame(
  lnLength = c(3.78, 3.71, 3.73, 3.78),
  lnColor = c(2.43, 2.59, 2.46, 2.22)
)


y_predicted = predict(alli.mod1, newdata, interval="predict")[,1]
length_mean = mean(alligator$lnLength)
color_mean = mean(alligator$lnColor)


rss = (alligator$lnWeight[1] - y_predicted[1])^2 + 
      (alligator$lnWeight[2] - y_predicted[2])^2 +
      (alligator$lnWeight[3] - y_predicted[3])^2 + 
      (alligator$lnWeight[4] - y_predicted[4])^2

a = sqrt(rss/(length(y_predicted)-2))

b = sqrt((alligator$lnColor[1] - color_mean)^2 + 
         (alligator$lnColor[2] - color_mean)^2 +
         (alligator$lnColor[3] - color_mean)^2 + 
         (alligator$lnColor[4] - color_mean)^2)

b1 = sqrt((alligator$lnLength[1] - length_mean)^2 + 
          (alligator$lnLength[2] - length_mean)^2 +
          (alligator$lnLength[3] - length_mean)^2 + 
          (alligator$lnLength[4] - length_mean)^2)

> summary(alli.mod1)
Call:
lm(formula = lnWeight ~ ., data = alligator)

Residuals:
        1         2         3         4 
 0.006725 -0.041534  0.058147 -0.023338 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  -3.5746     8.8650  -0.403    0.756
lnLength      1.6569     2.1006   0.789    0.575
lnColor       0.7140     0.4877   1.464    0.381

Residual standard error: 0.07547 on 1 degrees of freedom
Multiple R-squared:  0.7235,    Adjusted R-squared:  0.1705 
F-statistic: 1.308 on 2 and 1 DF,  p-value: 0.5258

> a/b
             1 
     0.2009918 
> a/b1
             1 
     0.8657274 

Is there a general approach I could follow to compute the standard errors?


Solution

  • I would suggest some general reading on OLS, including multiple regression. There are several freely available sources of information; one starting point might be Penn State's STAT 501 website. You can find a derivation of the formula for OLS β standard errors on slides 8 and 9 of these slides from MIT OpenCourseWare.

    Essentially, the standard error is the square root of the variance of β. As you can see in the slides I linked, the coefficients' variance-covariance matrix is σ²(X'X)⁻¹, where σ² is the variance of the error term, and the variance of each β_j is the j-th diagonal element of that matrix. Since we don't know the true σ², we estimate it as you did above: the sum of squared residuals divided by n − p, where p is the number of explanatory variables including the intercept (in simple regression p = 2, which is where the −2 in your formula comes from); its square root is the residual standard error S. In simple regression the relevant diagonal of (X'X)⁻¹ works out to 1 / Σ(xᵢ − x̄)², which is why the denominator of your formula is enough there, but in multiple regression there is no such shortcut; you'll need to do the matrix algebra. Fortunately this is very easy in R:

    # First we make the example data
    alligator = data.frame(
        lnLength = c(3.78, 3.71, 3.73, 3.78),
        lnColor = c(2.43, 2.59, 2.46, 2.22),
        lnWeight = c(4.43, 4.38, 4.42, 4.25)
    )
    # Then we use lm() for a check on our answers later
    alli.mod1 = lm(lnWeight ~ ., data = alligator)
    # Find the sum of the squared residuals
    rss <- sum(alli.mod1$residuals^2)
    # And use that to find S, the estimate of sigma (so S^2 estimates sigma^2)
    S <- sqrt(rss / (length(alli.mod1$residuals) - length(alli.mod1$coefficients)))
    # Make the X matrix; a column of 1s for the intercept and one for each variable
    X <- cbind(rep(1, nrow(alligator)), alligator$lnLength, alligator$lnColor)
    # We can multiply matrices using %*%, transpose them with t(),
    # and invert them with solve(); so we directly apply the formula above with:
    std.errors <- S * sqrt(diag(solve(t(X) %*% X)))
    # Now we check our answers:
    summary(alli.mod1)$coefficients[ , 2] # the second column is the std. errors
    # (Intercept)    lnLength     lnColor 
    #   8.8650459   2.1005738   0.4876803 
    std.errors
    # [1] 8.8650459 2.1005738 0.4876803
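
    Since the question asked about confidence intervals: once you have the standard errors, the 95% interval for each coefficient is estimate ± t(0.025, n − p) × SE, using the residual degrees of freedom n − p. Here is a minimal sketch continuing from the objects defined above; base R's confint() serves as the check:

    # 95% confidence intervals by hand: estimate +/- t * SE
    est <- coef(alli.mod1)
    df.resid <- length(alli.mod1$residuals) - length(alli.mod1$coefficients)
    t.crit <- qt(0.975, df = df.resid)   # two-sided 95% critical value
    cbind(lower = est - t.crit * std.errors,
          upper = est + t.crit * std.errors)
    # Check against R's built-in:
    confint(alli.mod1, level = 0.95)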