conditional-statements, theory, entropy

Conditional Entropy if outcome is known


I have a question about entropy and information flow. Suppose that X = {-1, 1}, meaning it can be either -1 or 1, and consider the following assignment for Y:

Y := X * X

My question is this: the value of Y after the assignment will always be 1. If X = -1 then Y = 1, and if X = 1 then Y = 1. Knowing this, can I still assume that the conditional entropy H(X/Y) = 0, because knowing X will always tell you the value of Y? On the other hand, the conditional entropy H(Y/X) = 1.0, because knowing Y will not give me the value of X. Am I thinking in the right direction? Please help.


Solution

  • You are partially correct, but you have your notation and your reasoning swapped.

    H(X|Y) is the entropy of X given Y, not the entropy of Y given X.
    

    Also, look at the conditioning more carefully. Since there is a deterministic relationship between X and Y, we have Y = f(X). Whenever Y is a function of X, knowing X determines Y exactly, so the conditional entropy of Y given X is 0 (just as you reasoned, but with the notation swapped). Thus it should be

    H(Y|X) = 0
    

    On the other hand, if you observe Y, you have no clue what X is: both -1 and 1 remain equally likely (assuming X is uniform on {-1, 1}). So in this case

    H(X|Y) = 1
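
    You can verify both values numerically. The sketch below (an illustration, not part of the original answer) builds the joint distribution of (X, Y) assuming X is uniform on {-1, 1}, then computes the conditional entropies via the chain rule H(A|B) = H(A, B) - H(B):

    ```python
    from math import log2

    # Joint distribution of (X, Y) with Y = X * X.
    # Assumption: X is uniform on {-1, 1} (the question does not state a prior).
    joint = {(-1, 1): 0.5, (1, 1): 0.5}

    def marginal(joint, axis):
        """Sum the joint distribution down to one coordinate (0 = X, 1 = Y)."""
        m = {}
        for outcome, p in joint.items():
            m[outcome[axis]] = m.get(outcome[axis], 0.0) + p
        return m

    def entropy(dist):
        """Shannon entropy in bits of a distribution given as {outcome: prob}."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    h_xy = entropy(joint)                 # H(X, Y) = 1 bit
    h_x = entropy(marginal(joint, 0))     # H(X) = 1 bit
    h_y = entropy(marginal(joint, 1))     # H(Y) = 0 bits, since Y is always 1

    print("H(Y|X) =", h_xy - h_x)  # 0.0 -- Y is a function of X
    print("H(X|Y) =", h_xy - h_y)  # 1.0 -- observing Y reveals nothing about X
    ```

    The chain rule makes the asymmetry explicit: H(Y|X) = H(X, Y) - H(X) = 1 - 1 = 0, while H(X|Y) = H(X, Y) - H(Y) = 1 - 0 = 1.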