
PCA: why do I get so different results from princomp() and prcomp()?


In the code below what is the difference between pc3$loadings and pc4$rotation?

Code:

pc3<-princomp(datadf, cor=TRUE)
pc3$loadings

pc4<-prcomp(datadf,cor=TRUE)
pc4$rotation

Data:

datadf <-
structure(list(gVar4 = c(11, 14, 17, 5, 5, 5.5, 8, 5.5, 
6.5, 8.5, 4, 5, 9, 10, 11, 7, 6, 7, 7, 5, 6, 9, 9, 6.5, 9, 3.5, 
2, 15, 2.5, 17, 5, 5.5, 7, 6, 3.5, 6, 9.5, 5, 7, 4, 5, 4, 9.5, 
3.5, 5, 4, 4, 9, 4.5), gVar1 = c(0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 0L, 1L, 0L, 0L, 
0L, 0L, 0L, 0L, 0L, 1L, 0L, 0L, 0L, 0L, 0L), gVar2 = c(0L, 
1L, 0L, 1L, 0L, 0L, 0L, 0L, 0L, 1L, 1L, 1L, 1L, 0L, 1L, 1L, 0L, 
2L, 3L, 0L, 0L, 1L, 0L, 0L, 1L, 1L, 0L, 1L, 0L, 0L, 0L, 1L, 0L, 
0L, 1L, 0L, 1L, 1L, 0L, 0L, 0L, 0L, 1L, 1L, 1L, 0L, 1L, 0L, 0L
), gVar3 = c(2L, 4L, 1L, 3L, 3L, 2L, 1L, 2L, 3L, 6L, 5L, 
2L, 7L, 4L, 2L, 7L, 5L, 6L, 1L, 3L, 3L, 6L, 3L, 2L, 3L, 1L, 1L, 
1L, 1L, 1L, 2L, 5L, 4L, 5L, 6L, 5L, 5L, 6L, 7L, 6L, 2L, 5L, 8L, 
5L, 5L, 0L, 2L, 4L, 2L)), .Names = c("gVar4", "gVar1", 
"gVar2", "gVar3"), row.names = c(1L, 2L, 3L, 4L, 
5L, 6L, 7L, 9L, 10L, 11L, 12L, 13L, 14L, 15L, 16L, 17L, 18L, 
19L, 20L, 21L, 22L, 23L, 24L, 25L, 26L, 27L, 28L, 29L, 30L, 31L, 
32L, 33L, 34L, 35L, 36L, 37L, 38L, 39L, 40L, 41L, 42L, 43L, 44L, 
45L, 46L, 47L, 48L, 49L, 50L), class = "data.frame", na.action = structure(8L, .Names = "8", class = "omit"))

Solution

  • Didn't you receive a warning when you ran pc4 <- prcomp(datadf, cor = TRUE)? You should have been told that prcomp has no cor argument and that it is being ignored. I will first show the correct call, then explain why.

    The correct way

    You should do:

    pc3 <- princomp(datadf, cor = TRUE)
    pc4 <- prcomp(datadf, scale = TRUE)
    

    Then both give you the same standard deviations (the square roots of the eigenvalues / singular values) in pc3$sdev and pc4$sdev, as well as the same eigenvectors (loadings/rotations) in pc3$loadings and pc4$rotation, possibly up to a sign flip in some columns.
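    A quick check of this equivalence, as a minimal sketch on simulated toy data (toydf and set.seed are illustrative assumptions here, not the asker's datadf). Since eigenvectors are only determined up to sign, the loadings are compared by absolute value:

```r
# Hypothetical toy data; any numeric data frame behaves the same way.
set.seed(1)
toydf <- data.frame(a = rnorm(50), b = rnorm(50), c = rnorm(50))

pc3 <- princomp(toydf, cor = TRUE)   # eigen decomposition of cor(toydf)
pc4 <- prcomp(toydf, scale = TRUE)   # equivalent, via SVD of the scaled data

# Standard deviations agree: both are sqrt of the correlation eigenvalues.
all.equal(unname(pc3$sdev), unname(pc4$sdev))

# Eigenvectors agree up to the sign of each column.
all.equal(abs(unclass(pc3$loadings)), abs(pc4$rotation),
          check.attributes = FALSE)
```

    Both all.equal() calls should return TRUE; comparing absolute values sidesteps the arbitrary sign choice each routine makes for its eigenvectors.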

    Why

    When you do pc3 <- princomp(datadf, cor = TRUE), you are performing an eigen decomposition of the correlation matrix:

    foo <- eigen(cor(datadf))  ## cor()
    foo$values <- sqrt(foo$values)
    foo
    #$values
    #[1] 1.1384921 1.0614224 0.9249764 0.8494921
    
    #$vectors
    #           [,1]       [,2]        [,3]       [,4]
    #[1,]  0.3155822 -0.6186905  0.70263064  0.1547260
    #[2,] -0.4725640  0.4633071  0.68652912 -0.3011769
    #[3,] -0.4682583 -0.6040654 -0.18558974 -0.6175724
    #[4,] -0.6766279 -0.1940969 -0.02333235  0.7098991
    

    These are what you will get from pc3$sdev and pc3$loadings.

    However, when you do pc4 <- prcomp(datadf, cor = TRUE), cor = TRUE is ignored, and R will do:

    pc4 <- prcomp(datadf)  ## with default, scale = FALSE
    

    so it will perform an eigen decomposition of the covariance matrix instead (prcomp actually computes this via an SVD of the centered data, which is equivalent):

    bar <- eigen(cov(datadf))  ## cov()
    bar$values <- sqrt(bar$values)
    bar
    #$values
    #[1] 3.440363 2.048703 0.628585 0.196056
    
    #$vectors
    #             [,1]        [,2]        [,3]         [,4]
    #[1,]  0.997482373 -0.06923771  0.01349921  0.007268119
    #[2,] -0.008316998 -0.01265655  0.01132874  0.999821133
    #[3,]  0.007669026 -0.08271789 -0.99649018  0.010307681
    #[4,] -0.070006635 -0.99408435  0.08183363 -0.014093521
    

    These are what you will see in pc4$sdev and pc4$rotation.
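    This relationship can be verified with a short sketch on simulated toy data (again, toydf is an illustrative assumption): the default prcomp standard deviations equal the square roots of the covariance eigenvalues.

```r
# Hypothetical toy data; names are illustrative only.
set.seed(42)
toydf <- data.frame(a = rnorm(30, sd = 3), b = rnorm(30), c = rnorm(30))

pc4 <- prcomp(toydf)        # default: centered but NOT scaled
bar <- eigen(cov(toydf))    # eigen decomposition of the covariance matrix

# prcomp's sdev are the square roots of the covariance eigenvalues
# (both use the n - 1 divisor, so they match exactly).
all.equal(unname(pc4$sdev), sqrt(bar$values))
```

    The all.equal() call should return TRUE, confirming that the unscaled prcomp fit corresponds to the covariance matrix, not the correlation matrix.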

    But if you do pc4 <- prcomp(datadf, scale = TRUE), it will operate on the correlation matrix, the same as pc3 <- princomp(datadf, cor = TRUE).