I read a paper in which the authors chose the standard deviation (SD) of log(relative risk) as follows:
For example, given a normal distribution with mean log(0.66), they chose the SD as
SD = log(0.66)/qnorm(0.05) = -0.4155154 / -1.644854 = 0.2526155
What I know is that
qnorm(0.05) = -1.644854, i.e., P(X < -1.644854) = 0.05 for a standard normal X.
Ultimately, they drew samples from a
normal distribution with mean = log(0.66) and SD = 0.2526155.
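(In R, the arithmetic itself checks out:)
mu <- log(0.66)    # -0.4155154
z  <- qnorm(0.05)  # -1.644854
mu / z             # 0.2526155, the SD they used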
My question is: what is the purpose/meaning of dividing the mean by qnorm(0.05)? Can anyone explain what this choice of SD means? Thank you!!
It's probably because a standard deviation equal to the mean divided by qnorm(0.05) gives the random variable a 95% chance of falling below 0. With mean mu = log(0.66) and SD sigma = mu/qnorm(0.05), standardizing 0 gives (0 - mu)/sigma = -qnorm(0.05) = qnorm(0.95), so P(X < 0) = 0.95. On the log(RR) scale, that means the distribution puts only a 5% probability on the relative risk exceeding 1.
# Simulate one million draws and estimate P(X < 0)
mean(rnorm(1e6, log(0.66), log(0.66)/qnorm(0.05)) < 0)
#> [1] 0.950121
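You can also confirm it without simulation: by construction, 0 is the 95th percentile of this distribution, so pnorm returns the probability directly (a minimal check using only the values above):
# P(X < 0) under Normal(log(0.66), log(0.66)/qnorm(0.05))
pnorm(0, mean = log(0.66), sd = log(0.66)/qnorm(0.05))
#> [1] 0.95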