Tags: r, parallel-processing, regression, mlm

Fit many linear models in R with identical design matrices


For a neuroimaging application, I'm trying to fit many linear models by least squares in R (standard call to lm). Imagine I have a design matrix X. This design matrix will be the same across all of the models. The data (Y) being fit will change, and as a result so will all of the fitted quantities (betas, p-values, residuals, and so on).

At present, I'm just sticking it in a for loop, so it's doing hundreds of thousands of calls to lm. It seems like there's got to be a better way.
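
For concreteness, the baseline looks something like this sketch (X and Y are hypothetical stand-ins here: X is the shared design matrix and Y holds one response per column):

    # Naive baseline: one lm() call per response column
    fits <- vector("list", ncol(Y))
    for (j in seq_len(ncol(Y))) {
      fits[[j]] <- lm(Y[, j] ~ X)  # redoes the same decomposition every iteration
    }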

I believe the most computationally expensive piece is the matrix inversion. It looks like this gets handled with a Fortran call in lm.fit.

If I were doing this regression by hand, I'd do the matrix inversion, then just multiply it by the various datasets. In fact, I've coded up a function to do that when I have well-behaved design matrices (e.g. all continuously valued covariates). However, I really like all of the work that lm does, like recoding my factors appropriately, etc, and the output of lm is really nice, too.
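
A minimal sketch of that hand-rolled route, assuming X is a full-rank, all-numeric design matrix (intercept column included) and Y is the matrix of responses:

    # Solve the normal equations once, then reuse for every response:
    # (X'X)^{-1} X' is computed a single time and applied to all columns of Y.
    XtX_inv_Xt <- solve(crossprod(X), t(X))
    betas <- XtX_inv_Xt %*% Y  # one coefficient column per response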

Is there any way to have my cake and eat it too? That is, to keep the friendliness of lm, but still fit many models with identical design matrices efficiently?


Solution

  • From the help page for lm:

    If ‘response’ is a matrix a linear model is fitted separately by least-squares to each column of the matrix.

    So it would seem that a simple approach would be to combine all the different y vectors into a matrix and pass that as the response in a single call to lm. For example:

    (fit <- lm(cbind(Sepal.Width, Sepal.Length) ~ Petal.Width + Petal.Length + Species,
               data = iris))
    summary(fit)                   # one summary.lm per response column
    summary(fit)[2]                # summary for the second response (Sepal.Length) only
    coef(summary(fit)[2])          # its coefficient table, via the single-element sub-list
    coef(summary(fit))[2]          # the same table, subsetting after extraction
    sapply(summary(fit), function(x) x$r.squared)  # R-squared of each fit
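
    The multivariate fit also returns its pieces as matrices with one column per response, which is convenient for pulling results out in bulk; for instance:

    coef(fit)             # coefficient matrix, one column per response
    head(residuals(fit))  # residual matrix, same shape as the response matrix
    head(fitted(fit))     # fitted values, one column per response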