Tags: r, response, predict, gam, mgcv

Using gratia::fitted_values() to get response predictions for the global smooth in an HGAM


This question is a follow-up to this post: Using gratia::draw() in R to display partial effect plots within an HGAM that are not relative to the global smooth

I have a dataset that looks like this:

df <- data.frame(
  Lake = structure(c(1L, 1L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 1L, 2L, 1L, 
                     2L, 2L, 1L, 2L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 
                     1L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 2L, 2L, 1L, 2L, 
                     1L, 2L, 1L, 2L, 2L, 1L, 1L, 1L, 2L, 2L, 1L, 2L, 2L, 1L, 1L, 1L, 
                     1L, 1L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 
                     2L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 2L, 
                     2L, 2L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 1L, 
                     2L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 
                     1L, 2L, 2L, 1L, 1L, 2L, 1L, 2L, 1L, 1L, 2L, 2L, 1L, 2L, 2L, 1L, 
                     1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 2L, 
                     2L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 1L, 1L, 1L, 2L, 2L, 1L, 1L, 1L, 
                     2L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 1L, 2L, 2L, 2L, 2L, 
                     1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 1L, 2L, 2L, 2L, 1L, 1L, 2L, 
                     2L, 2L, 2L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 2L, 1L, 
                     1L, 1L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 
                     2L, 2L, 1L, 2L, 2L, 2L, 1L, 2L, 2L, 1L, 2L, 2L, 1L, 2L, 1L, 1L, 
                     2L, 1L, 2L, 2L, 1L, 2L, 2L, 1L, 2L, 1L, 1L, 2L, 1L, 2L, 1L, 1L, 
                     1L, 1L, 2L, 1L, 1L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 1L, 1L, 1L, 1L, 
                     2L, 2L, 1L, 1L, 2L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 
                     2L, 2L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 2L, 
                     2L, 2L, 2L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 1L, 1L, 2L, 2L, 2L, 
                     2L, 1L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 1L, 1L, 1L, 
                     1L, 2L, 2L, 2L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 2L, 2L, 1L, 1L, 1L, 
                     2L, 1L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 2L, 1L, 2L, 2L, 2L, 
                     2L, 1L, 2L, 1L, 2L, 2L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 
                     2L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 2L, 1L, 2L, 2L, 
                     2L, 1L, 1L, 2L, 1L, 1L, 1L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 1L, 1L, 
                     1L, 1L, 2L, 2L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 1L, 1L, 2L, 1L, 1L, 
                     1L, 1L, 1L, 2L, 1L, 2L, 2L, 1L, 1L, 2L, 1L, 2L, 2L, 1L, 1L, 2L, 
                     1L, 1L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 
                     1L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 2L, 2L, 2L, 1L, 2L, 1L, 2L, 
                     2L, 2L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 2L, 1L, 1L, 
                     1L, 1L, 1L, 1L, 2L, 2L, 1L, 2L, 1L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 
                     2L, 1L, 1L, 2L, 1L, 2L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 2L, 1L, 
                     2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 
                     2L, 2L, 2L, 1L, 2L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 1L, 2L, 1L, 
                     1L, 2L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 1L, 2L, 2L, 1L, 1L, 2L, 1L, 
                     1L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 1L, 1L, 1L, 1L, 1L, 2L, 
                     2L, 1L, 2L, 1L, 1L, 2L, 2L, 1L, 1L, 2L, 2L, 2L, 1L, 2L, 1L, 1L, 
                     1L, 1L, 1L, 2L, 2L, 1L, 1L, 1L, 2L, 1L, 1L, 2L, 1L, 2L, 2L, 1L, 
                     1L, 1L, 2L, 2L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 1L, 2L, 2L, 1L, 
                     1L, 1L, 1L, 2L, 1L, 2L, 2L, 1L, 1L, 2L, 2L, 2L, 2L, 1L, 1L, 2L, 
                     1L, 2L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 1L, 
                     1L, 1L, 1L, 1L, 1L, 2L, 2L, 1L, 1L, 2L, 1L, 2L, 2L, 2L, 1L, 2L, 
                     2L, 2L, 1L, 1L, 1L, 1L, 2L, 1L, 2L, 1L, 2L, 2L, 1L, 1L, 2L, 1L, 
                     1L, 2L, 1L, 1L, 2L), .Label = c("F", "T"), class = "factor"),
  D = c(1.63, 3, 10, 3, 10, 4, 13, 17, 14, 2.81, 20, 3, 28, 24, 6, 
            1.81999999999999, 7, 25, 2.20999999999998, 10, 15, 7.25999999999999, 
            4, 4, 6.64999999999998, 8.83999999999997, 6, 2.20999999999998, 
            22.96, 5.63, 11, 30, 32.31, 25, 1, 3, 4, 7.41000000000003, 2, 
            6, 17, 7, 5, 4.20999999999998, 3, 22, 5, 4.74000000000001, 7, 
            10, 3, 11, 14, 2, 24, 1, 7, 15, 16, 2.68000000000001, 12, 11, 
            5, 10, 10, 6, 12, 4, 4, 4.64999999999998, 18, 7.5, 13, 3, 15, 
            10, 22, 19, 4, 12, 2, 3, 5.41000000000003, 6, 19, 6, 3, 3, 34, 
            3.63, 11, 6, 7, 25, 4, 2.81, 4.70999999999998, 3, 12.31, 5, 17, 
            28, 3.63, 8, 9, 3, 30, 20, 11, 4, 12, 3, 4, 16, 5, 10, 2, 14, 
            58, 10, 2.06, 15, 2.74000000000001, 7, 10.74, 2.81, 11, 6, 5, 
            7.25999999999999, 10, 2.68000000000001, 9, 2.83999999999997, 
            5.5, 15, 7, 6.56, 14, 6, 3.25999999999999, 2.31, 1, 7, 3, 4, 
            2, 3, 9, 28, 18.84, 5, 5, 2.75999999999999, 7.63, 8.20999999999998, 
            18, 3, 11, 1, 24, 4, 22, 2, 3, 4.20999999999998, 14.65, 16, 9, 
            5, 3, 7, 1, 2, 4.5, 2, 20, 1, 10, 17, 4, 2, 1, 23, 5, 11, 12, 
            17, 10, 3, 18, 6, 7, 5, 3, 32, 16, 5, 7, 9, 29, 2, 12, 4, 23, 
            14, 4, 5, 11, 11.82, 6.20999999999998, 7, 12, 3, 6, 4, 17, 4, 
            24, 6, 12, 11.63, 4, 2, 25, 2, 54, 7, 8, 9.25999999999999, 14, 
            15, 11, 6, 21, 1, 3, 8, 1, 2.83999999999997, 19, 6, 19, 2.06, 
            3, 3, 4, 8, 6, 9.41000000000003, 4, 8.64999999999998, 3, 3, 2.5, 
            30, 12, 14, 15, 16, 10.56, 24, 12, 16.71, 25, 1, 10, 17, 1, 1.25999999999999, 
            12, 4, 24, 15, 8.68000000000001, 8, 3, 15.82, 17, 5, 3, 6.70999999999998, 
            5.63, 10, 10.68, 8, 3, 8.81, 5.25999999999999, 22, 12, 5.81999999999999, 
            6, 6, 3.5, 1.52999999999997, 4, 22, 15, 4, 23, 12, 25, 4, 22, 
            5.41000000000003, 9, 19, 8, 4, 8.56, 20, 10.21, 24, 1, 6, 3, 
            10, 3, 28, 12, 6, 17, 1, 3.41000000000003, 6.16000000000003, 
            4, 20.68, 4, 2.74000000000001, 5, 12, 1, 45, 4.74000000000001, 
            18, 15, 1, 8, 20, 21, 3, 16, 1, 3, 30, 10, 6.06, 4, 10.84, 25, 
            26, 12, 2.56, 2, 6, 10.56, 10.31, 16, 29.26, 5, 6, 3.81999999999999, 
            15, 1, 8, 3, 2, 22, 5, 2.95999999999998, 4.5, 1, 18, 2.66000000000003, 
            19, 12, 4, 14, 3, 7, 28, 4, 23, 6, 5, 3, 22, 1, 4, 12, 7, 1.63, 
            12.21, 15, 4, 3, 9, 20.65, 4.74000000000001, 22, 8.81, 5.81999999999999, 
            4.16000000000003, 7, 10, 24, 4.95999999999998, 30, 2, 10, 5, 
            9, 5, 12, 29.82, 2, 6.5, 6.20999999999998, 1, 1, 22, 22, 6.64999999999998, 
            32, 11, 15, 1, 18, 1.81999999999999, 4, 8, 20, 15, 4, 7, 22, 
            2, 2, 1, 1, 15, 20, 3, 5, 1.63, 4.66000000000003, 22, 6, 2, 31, 
            20, 5, 9.5, 30, 18, 13, 12, 12, 4.20999999999998, 12, 10.06, 
            2.68000000000001, 2, 1, 5, 2, 9, 2, 4, 1, 6, 1, 1, 2.16000000000003, 
            7, 8.95999999999998, 2.74000000000001, 5, 4, 5, 15, 20, 5.41000000000003, 
            29.41, 7, 32, 4, 14, 2.74000000000001, 4, 15, 8, 21, 32, 13.41, 
            3, 14, 4, 3, 18, 2.31, 25, 3.5, 4.74000000000001, 19, 21, 5.25999999999999, 
            10.21, 12.84, 2.95999999999998, 2, 4.31, 7, 7, 2.31, 17, 10.71, 
            23.41, 3, 3.41000000000003, 4.68000000000001, 22, 3, 13, 15, 
            8.74000000000001, 14.81, 5, 1, 4, 16, 1.41000000000003, 13, 3, 
            2, 6.06, 7, 3, 22, 4.83999999999997, 7, 2.81, 21, 3, 19, 6, 14, 
            2, 1, 10, 7.5, 8.70999999999998, 30, 14, 20, 1, 18, 30, 28, 1.41000000000003, 
            20, 5, 1.41000000000003, 3.5, 4.64999999999998, 5, 9.5, 3, 1.63, 
            11, 21, 2.66000000000003, 20.74, 15, 15, 14, 5, 14, 4.5, 4, 6.06, 
            4.20999999999998, 12, 18, 10.16, 7.81999999999999, 1, 2.95999999999998, 
            15, 2.5, 2.70999999999998, 11, 13.63, 18, 6, 18, 11, 6, 12, 7.5, 
            4.56, 1.38, 2.95999999999998, 17, 4, 1, 15, 4.74000000000001, 
            5.5, 11, 4, 1, 3, 25, 3, 9, 15, 11, 29, 8.56, 23, 14.65, 1, 7, 
            8, 14.06, 2, 3, 26, 2.56, 2.5, 25, 2.74000000000001, 1, 3, 8.56, 
            9.38, 2, 18, 3, 30, 16.96, 4, 22, 11, 6, 4, 3, 8.83999999999997, 
            22, 18, 7, 2.68000000000001, 6, 14.76, 7, 5, 1, 21, 3.81999999999999, 
            10, 3, 5, 7, 6, 20, 6.81, 7, 19, 24, 5, 1, 21.41, 3, 1.81999999999999, 
            10, 11.41, 6, 30, 3, 4, 4, 4, 1.5, 10.5, 18, 10, 2, 25, 14, 4, 
            5.63, 4.20999999999998, 2, 10.84, 10, 7, 30, 1, 17, 3, 3, 22, 
            2.74000000000001, 1, 8, 7, 32.65, 4, 3, 5, 4, 5, 1, 5, 10.76, 
            4, 2, 3.41000000000003, 4, 17),
  OrdDay = c(254, 271, 286, 88, 181, 209, 246, 259, 218, 324, 230, 181, 
             271, 351, 364, 224, 268, 232, 210, 215, 260, 281, 286, 351, 195, 
             167, 248, 54, 308, 254, 322, 125, 33, 248, 336, 319, 322, 238, 
             181, 304, 195, 181, 273, 210, 153, 230, 28, 349, 195, 78, 286, 
             41, 355, 109, 78, 187, 31, 286, 41, 336, 187, 146, 305, 70, 290, 
             129, 290, 160, 83, 195, 147, 7, 159, 195, 146, 195, 181, 11, 
             349, 230, 140, 146, 268, 305, 181, 244, 299, 124, 155, 254, 232, 
             218, 12, 78, 286, 324, 177, 131, 33, 304, 56, 211, 254, 218, 
             60, 167, 147, 167, 322, 181, 299, 167, 215, 351, 230, 334, 25, 
             63, 11, 246, 5, 281, 349, 209, 91, 324, 246, 63, 203, 281, 167, 
             336, 63, 167, 88, 248, 153, 184, 237, 28, 281, 33, 195, 167, 
             109, 260, 56, 268, 248, 259, 187, 11, 124, 75, 254, 54, 218, 
             319, 322, 91, 12, 204, 195, 211, 125, 54, 195, 271, 364, 83, 
             335, 75, 75, 109, 75, 299, 160, 124, 334, 7, 146, 153, 184, 129, 
             146, 181, 131, 364, 31, 124, 11, 304, 290, 181, 204, 195, 322, 
             290, 305, 28, 336, 101, 174, 335, 109, 322, 273, 304, 364, 224, 
             210, 246, 25, 305, 349, 319, 83, 160, 28, 224, 187, 254, 124, 
             7, 167, 195, 12, 12, 187, 281, 101, 336, 304, 195, 244, 75, 232, 
             322, 246, 167, 237, 167, 336, 5, 125, 232, 187, 204, 286, 268, 
             131, 195, 322, 155, 104, 325, 28, 215, 195, 224, 184, 224, 174, 
             177, 167, 21, 363, 244, 268, 281, 286, 286, 335, 286, 336, 286, 
             109, 224, 181, 322, 299, 177, 254, 124, 336, 268, 218, 324, 281, 
             12, 119, 224, 248, 187, 215, 234, 159, 7, 204, 167, 78, 167, 
             325, 244, 290, 238, 305, 322, 246, 334, 184, 195, 210, 335, 160, 
             248, 218, 299, 78, 322, 167, 41, 211, 184, 238, 21, 281, 336, 
             322, 349, 268, 363, 273, 334, 349, 83, 78, 75, 204, 25, 237, 
             104, 232, 195, 319, 363, 355, 5, 335, 167, 237, 349, 286, 184, 
             75, 91, 184, 33, 215, 281, 28, 78, 224, 215, 116, 268, 124, 248, 
             7, 70, 308, 160, 336, 237, 105, 195, 273, 305, 273, 155, 248, 
             281, 160, 209, 259, 63, 101, 143, 67, 187, 203, 11, 254, 210, 
             31, 167, 363, 70, 195, 91, 41, 324, 224, 21, 351, 146, 268, 308, 
             28, 334, 259, 56, 12, 232, 174, 224, 101, 335, 54, 195, 143, 
             25, 171, 195, 167, 336, 281, 203, 25, 224, 75, 218, 248, 160, 
             181, 237, 195, 133, 172, 146, 75, 143, 260, 215, 56, 254, 105, 
             271, 319, 88, 364, 12, 230, 271, 125, 203, 248, 211, 286, 54, 
             63, 5, 336, 259, 105, 28, 299, 224, 172, 125, 75, 299, 177, 105, 
             21, 28, 308, 91, 88, 63, 281, 167, 349, 238, 238, 204, 12, 237, 
             349, 91, 364, 174, 237, 63, 363, 268, 167, 28, 181, 155, 160, 
             33, 304, 244, 349, 248, 28, 281, 54, 167, 308, 116, 33, 224, 
             181, 33, 364, 177, 268, 268, 238, 336, 281, 181, 299, 246, 349, 
             324, 56, 75, 273, 271, 268, 195, 246, 181, 5, 248, 146, 322, 
             167, 140, 324, 286, 286, 174, 322, 60, 187, 260, 335, 104, 177, 
             167, 203, 304, 177, 232, 336, 209, 238, 125, 260, 268, 203, 195, 
             363, 88, 232, 254, 203, 246, 105, 349, 268, 160, 336, 336, 260, 
             88, 56, 5, 54, 363, 31, 21, 224, 260, 308, 355, 25, 177, 167, 
             254, 224, 70, 349, 281, 119, 7, 75, 184, 124, 308, 273, 146, 
             202, 167, 349, 88, 218, 70, 210, 160, 147, 155, 181, 244, 195, 
             56, 184, 41, 195, 160, 260, 101, 5, 116, 230, 351, 184, 25, 224, 
             349, 91, 67, 184, 124, 355, 237, 167, 209, 308, 167, 268, 31, 
             218, 101, 155, 167, 12, 125, 143, 336, 286, 75, 167, 187, 260, 
             304, 224, 203, 290, 125, 195, 290, 355, 324, 153, 187, 349, 355, 
             324, 238, 260, 224, 281, 238, 140, 290, 273, 119, 181, 153, 129, 
             271, 75, 230, 116, 41, 91, 167, 254, 54, 290, 167, 11, 237, 336, 
             105, 181, 11, 286, 244, 349, 91, 230, 336, 195, 119, 230, 349, 
             349, 203, 238, 63, 75, 335, 91, 268, 322, 83),
  stringsAsFactors = FALSE)

I am running a GAM that looks like this:

library(mgcv)
library(gratia)
knots <- list(OrdDay = c(0.5, 366.5))
m <- gam(D ~ Lake +
           s(OrdDay, bs = "cc") +
           s(OrdDay, by = Lake, bs = "cc", m = 1),
         data = df, family = nb, knots = knots)

I am then using this GAM to get predictions for each lake with the following code:

ds <- data_slice(m, OrdDay = evenly(OrdDay, n = 100), Lake = evenly(Lake))

fv <- fitted_values(m, data = ds, scale = "response")

I know that the code above gives me the individual predicted responses for each lake; however, I am wondering whether there is a way to use similar code to get the fitted values for just the global term (s(OrdDay)), so that I can get mean predictions for all lakes together.

I've tried troubleshooting by excluding certain terms, but I always get predictions for one lake or the other from fitted_values().

Thanks!


Solution

  • This is an additive model, so you can select which of the additive terms to include in the fitted values. Note that you'll need the latest CRAN version of {mgcv} to get this to work, as prior versions had some inconsistencies in how terms and exclude worked in predict.gam().
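
    To confirm what you have installed (a minimal check, not part of the original answer):

    # Report the installed {mgcv} version; update with
    # install.packages("mgcv") if it predates the current CRAN release
    packageVersion("mgcv")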

    For your example, it is easiest to just specify the values of OrdDay that you want fitted values for. data_slice() will add any other variables needed to get past the checks in predict.gam(). So even though the data slice below contains a variable Lake, we won't use it when actually generating predictions:

    ds <- data_slice(m, OrdDay = evenly(OrdDay, n = 100))
    fv <- fitted_values(m, data = ds, scale = "response",
                        terms = c("(Intercept)", "s(OrdDay)"))
    

    The terms argument takes labels as per those shown in the output from summary(m), so here I'm asking for the intercept and the global smooth.
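
    If you're unsure of the exact labels, you can pull them from the fitted model directly (a quick sketch using standard {mgcv} accessors, not part of the original answer):

    # Smooth term labels, e.g. "s(OrdDay)", "s(OrdDay):LakeF", "s(OrdDay):LakeT"
    vapply(m$smooth, `[[`, character(1), "label")
    # Parametric term labels, e.g. "Lake"
    attr(m$pterms, "term.labels")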

    Note, however, that because of the way R parameterises these models, the intercept is still technically for the reference level of the Lake factor.
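
    You can verify which level that is (a quick check, not part of the original answer):

    # "F" is listed first, so the model intercept corresponds to Lake == "F"
    levels(df$Lake)
    coef(m)["(Intercept)"]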

    So perhaps it would be better to fit Lake as a random effect so it can be excluded altogether:

    m2 <- gam(D ~ s(Lake, bs = "re") +
                s(OrdDay, bs = "cc") +
                s(OrdDay, by = Lake, bs = "cc", m = 1),
              data = df, family = nb, knots = knots)
    ds <- data_slice(m2, OrdDay = evenly(OrdDay, n = 100))
    fv <- fitted_values(m2, data = ds, scale = "response",
                        terms = c("(Intercept)", "s(OrdDay)"))
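
    Equivalently, you could blacklist the lake-specific terms rather than whitelist the global ones. This is a sketch that assumes your gratia version forwards exclude on to predict.gam() (as it does terms above) and that the smooth labels match those reported by summary(m2):

    # Drop the random effect and the lake-specific smooths, keeping
    # everything else (the intercept plus the global smooth)
    fv2 <- fitted_values(m2, data = ds, scale = "response",
                         exclude = c("s(Lake)",
                                     "s(OrdDay):LakeF",
                                     "s(OrdDay):LakeT"))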
    

    Then we have

    library("ggplot2")
    
    fv |>
      ggplot(aes(x = OrdDay, y = fitted)) +
      geom_ribbon(aes(ymin = lower, ymax = upper, y = NULL), alpha = 0.2) +
      geom_line()
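
    If you want the raw observations behind the global trend, you can overlay them (a sketch; note that recent gratia releases name the returned columns .fitted, .lower_ci, and .upper_ci rather than fitted, lower, and upper):

    # Same plot with the observed D values added on the response scale
    fv |>
      ggplot(aes(x = OrdDay, y = fitted)) +
      geom_point(data = df, aes(x = OrdDay, y = D), alpha = 0.3) +
      geom_ribbon(aes(ymin = lower, ymax = upper, y = NULL), alpha = 0.2) +
      geom_line()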
    

    [Plot: fitted values for the global smooth of OrdDay on the response scale, with the credible interval shown as a ribbon]