The strength of the relationship between two variables can be quantified (in bits)
by the following measure, known as ``information transmission''
[28, pg. 164]:
T(X,Y) = H(X) + H(Y) - H(X,Y)        (11)
       = H(X) - H(X|Y)
       = H(Y) - H(Y|X)
From the discussion of Shannon's entropy we know that this measure
equals 0 if X and Y are independent, and that it increases as the
relationship becomes stronger. In this sense, transmission serves as a
nominal-scale analogue of correlation.
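As a small sketch (the function names and example distributions are illustrative, not from the source), equation (11) can be evaluated directly from a joint probability table by computing the three entropies it combines:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a sequence of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def transmission(joint):
    """T(X,Y) = H(X) + H(Y) - H(X,Y), where joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    pxy = [p for row in joint for p in row]     # joint, flattened
    return entropy(px) + entropy(py) - entropy(pxy)

# Independent X and Y: transmission is 0 bits.
indep = [[0.25, 0.25],
         [0.25, 0.25]]

# X determines Y completely: transmission equals H(X) = 1 bit.
dep = [[0.5, 0.0],
       [0.0, 0.5]]
```

Here `transmission(indep)` is 0 and `transmission(dep)` is 1, matching the two endpoints described above: independence gives no transmission, and a deterministic relationship transmits the full entropy of X.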
Thomas Prang
1998-06-07