
Least square optimization (of matrices) in R


Yesterday I asked a question about least-squares optimization in R, and it turned out that the lm function was what I was looking for.

Now I have another least-squares optimization question, and I am wondering whether lm can also solve this problem or, if not, how it can be handled in R.

I have fixed matrices B (of dimension n x m) and V (of dimension n x n), and I am looking for an m-long vector u such that

       sum( ( V - ( B %*% diag(u) %*% t(B)) )^2 )

is minimized.


Solution

  • 1) lm.fit Use the fact that

    vec(AXA') = (A ⊗ A) vec(X)

    (here A plays the role of B from the question). Since X = diag(u), only the k columns of A ⊗ A that correspond to the diagonal positions of vec(X) are needed, so:

    k <- ncol(A)
    AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
    lm.fit(AA1, c(V))
    

    Here is a self-contained example:

    # test data
    set.seed(123)
    A <- as.matrix(BOD)
    u <- 1:2
    V <- A %*% diag(u) %*% t(A) + rnorm(36)
    
    # solve
    k <- ncol(A)
    AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
    fm1 <- lm.fit(AA1, c(V))
    

    giving roughly the original coefficients 1:2:

    > coef(fm1)
          x1       x2 
    1.011206 1.999575 
    
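    To see why selecting those columns works, one can check numerically that AA1 %*% u reproduces the vectorized product for an arbitrary u (a quick sanity check, rebuilding the same A and AA1 as above):

    ```r
    # Sanity check of the identity vec(A diag(u) A') = AA1 %*% u
    A <- as.matrix(BOD)        # same 6 x 2 test matrix as above
    k <- ncol(A)
    AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
    u <- c(3, -2)              # arbitrary test vector
    lhs <- c(A %*% diag(u) %*% t(A))
    rhs <- c(AA1 %*% u)
    all.equal(lhs, rhs)        # TRUE
    ```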

    2) nls We can alternatively use nls like this:

    fm2 <- nls(c(V) ~ c(A %*% diag(x) %*% t(A)), start = list(x = numeric(k)))
    

    giving the following for the above example:

    > fm2
    Nonlinear regression model
      model: c(V) ~ c(A %*% diag(x) %*% t(A))
       data: parent.frame()
       x1    x2 
    1.011 2.000 
     residual sum-of-squares: 30.52
    
    Number of iterations to convergence: 1 
    Achieved convergence tolerance: 1.741e-09
    

    Update: Corrections and second solution.
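    As a further cross-check (not part of the original answer), the objective from the question can also be handed directly to a general-purpose optimizer such as optim; this is slower than the linear-algebra approaches above but should recover the same coefficients:

    ```r
    # Directly minimize the original objective sum((V - A diag(u) A')^2)
    set.seed(123)
    A <- as.matrix(BOD)
    V <- A %*% diag(1:2) %*% t(A) + rnorm(36)

    obj <- function(u) sum((V - A %*% diag(u) %*% t(A))^2)
    fm3 <- optim(c(0, 0), obj, method = "BFGS")
    fm3$par   # roughly 1.011 and 2.000, agreeing with fm1 and fm2
    ```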