The full model for plant growth is as follows:
lmer(log(growth) ~ nutrition + fertilizer + season + (1 | block))
where nutrition (nitrogen/phosphorus), fertilizer (none/added), and season (dry/wet) are two-level factors.
The summary of the model is as follows:
REML criterion at convergence: 71.9
Scaled residuals:
Min 1Q Median 3Q Max
-1.82579 -0.59620 0.04897 0.62629 1.54639
Random effects:
Groups Name Variance Std.Dev.
block (Intercept) 0.06008 0.2451
Residual 0.48633 0.6974
Number of obs: 32, groups: block, 16
Fixed effects:
Estimate Std. Error df t value Pr(>|t|)
(Intercept) 3.5522 0.2684 19.6610 13.233 3.02e-11 ***
nutritionP 0.2871 0.2753 13.0000 1.043 0.31601
fertilizeradded -0.3513 0.2753 13.0000 -1.276 0.22436
seasonwet 1.0026 0.2466 15.0000 4.066 0.00101 **
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Plant growth here depends only on season, and the increase in growth is 1.0026 on the log scale. How do I interpret this on the scale of the original data, if I want to know what the increase in actual plant height was? Is it simply exp(1.0026) ≈ 3 cm, or is there another way to interpret this?
exp(1.0026)
is indeed about 2.73, but this value is a proportional change, not an absolute increase in centimetres. Because the model is fitted on the log scale, back-transformed coefficients are multiplicative: growth is roughly 2.7 times higher in the wet season than in the dry season, all other things being equal.
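To make the multiplicative interpretation concrete, here is a small sketch of the arithmetic (in Python for illustration; the estimate and standard error are taken from the summary above, and the normal-approximation interval is a simplification; in R you would instead call confint() on the fitted lmer model):

```python
import math

# Fixed-effect estimate for seasonwet from the model summary above
beta = 1.0026  # estimate on the log scale
se = 0.2466    # its standard error

# Back-transforming exponentiates the coefficient, giving a ratio
# (multiplicative effect) on the original growth scale, not a difference in cm
ratio = math.exp(beta)
print(f"growth ratio (wet/dry): {ratio:.2f}")  # → 2.73

# A rough 95% interval on the ratio scale, using a simple normal
# approximation (an assumption here; R's confint() is the proper tool)
lo = math.exp(beta - 1.96 * se)
hi = math.exp(beta + 1.96 * se)
print(f"approx. 95% CI for the ratio: {lo:.2f} to {hi:.2f}")  # → 1.68 to 4.42
```

Note that exponentiating the endpoints of a log-scale interval gives an asymmetric interval on the ratio scale, which is expected for multiplicative effects.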