
Entropy output is NaN for some class solutions and not others


I am running a latent class analysis in R and computing an entropy-based R². I want to understand why the output produces a value for solutions with fewer classes but NaN for solutions with more classes.

I am a beginner to the software!

For reference, here is the code and output:

> entropy<-function (p) sum(-p*log(p))
> error_prior <- entropy(France_2class$P) # Class proportions
> error_post <- mean(apply(France_2class$posterior, 1, entropy))
> R2_entropy <- (error_prior - error_post) / error_prior
> R2_entropy
[1] 0.8121263
> 
> entropy<-function (p) sum(-p*log(p))
> error_prior <- entropy(France_3class$P) # Class proportions
> error_post <- mean(apply(France_3class$posterior, 1, entropy))
> R2_entropy <- (error_prior - error_post) / error_prior
> R2_entropy
[1] 0.8139903
> 
> entropy<-function (p) sum(-p*log(p))
> error_prior <- entropy(France_4class$P) # Class proportions
> error_post <- mean(apply(France_4class$posterior, 1, entropy))
> R2_entropy <- (error_prior - error_post) / error_prior
> R2_entropy
[1] NaN
> 
> entropy<-function (p) sum(-p*log(p))
> error_prior <- entropy(France_5class$P) # Class proportions
> error_post <- mean(apply(France_5class$posterior, 1, entropy))
> R2_entropy <- (error_prior - error_post) / error_prior
> R2_entropy
[1] NaN
> 
> entropy<-function (p) sum(-p*log(p))
> error_prior <- entropy(France_6class$P) # Class proportions
> error_post <- mean(apply(France_6class$posterior, 1, entropy))
> R2_entropy <- (error_prior - error_post) / error_prior
> R2_entropy
[1] NaN

Can anyone help? Thank you


Solution

  • The problem most likely comes from the definition of entropy. More precisely, if p contains a 0, then -p*log(p) evaluates 0 * log(0), which is 0 * -Inf = NaN in R, and that NaN propagates through sum, e.g.,

    > entropy(p1)
    [1] 1.279854
    
    > entropy(p2)
    [1] NaN
    
    > entropy(p3)
    [1] 0.5004024
    

    To fix it, you can wrap the summand in na.omit inside entropy, like below:

    entropy<-function(p) sum(na.omit(-p*log(p)))
    

    then you can see

    > entropy(p1)
    [1] 1.279854
    
    > entropy(p2)
    [1] 0.5004024
    
    > entropy(p3)
    [1] 0.5004024
    

    DATA

    p1 <- c(0.1,0.2,0.3,0.4)
    p2 <- c(0,0.2,0.8)
    p3 <- c(0.2,0.8)
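
    Since the usual convention for Shannon entropy is to treat 0·log(0) as 0, an equivalent and slightly more direct fix is to pass na.rm = TRUE to sum, which drops the NaN terms produced by zero proportions. A minimal sketch, reusing the p2 and p3 vectors from the DATA above:

    ```r
    # Entropy with the convention 0 * log(0) = 0:
    # zero proportions yield NaN terms, which na.rm = TRUE discards
    entropy <- function(p) sum(-p * log(p), na.rm = TRUE)

    p2 <- c(0, 0.2, 0.8)   # contains a zero proportion
    p3 <- c(0.2, 0.8)

    entropy(p2)  # 0.5004024, same as entropy(p3)
    entropy(p3)  # 0.5004024
    ```

    With either version, R2_entropy for the 4-, 5-, and 6-class solutions should return a numeric value rather than NaN, assuming the NaN was caused by class proportions or posterior probabilities that are exactly 0.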