Instead of talking about known or unknown states selected out of a finite set $S_n$, we can also see the situation from the viewpoint of a random variable $X$, which induces a probability measure $F$ on $\mathcal{P}(S_n)$ and a probability distribution $f$ on $S_n$, where $\mathcal{P}(S_n)$ denotes the powerset of $S_n$:

$F(A) := F_\Omega(X^{-1}(A)) \quad \forall A \in \mathcal{P}(S_n), \qquad f(x) := F(\{x\}) \quad \forall x \in S_n,$

where $X$ is a measurable function from the probability space $(\Omega, \mathcal{A}(\Omega), F_\Omega)$ to $S_n$, and $F_A$ denotes the probability measure defined on the sigma algebra $\mathcal{A}(A)$ over a set $A$.
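As a concrete illustration of these definitions: let $X$ be the outcome of a fair six-sided die, so that $S_6 = \{1,\dots,6\}$. The induced distribution is $f(x) = F(\{x\}) = 1/6$ for every $x \in S_6$, and the induced measure of the event "even outcome" is $F(\{2,4,6\}) = 1/2$.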
Further mathematical details will be omitted; the interested reader may refer to any textbook on probability theory.
In the following discussion, variables will be treated as equivalent to random variables, and a state set $S_n$ will be denoted as the domain of its corresponding variable $X$, $\mathrm{dom}(X) := S_n$; a lowercase $x \in \mathrm{dom}(X)$ denotes a state or value of the variable $X$.
The important point is that we can see the Hartley measure as defined on a special case of random variables, namely those which assign equal probabilities to all values:

$f(x) := \frac{1}{|\mathrm{dom}(X)|} \quad \forall x \in \mathrm{dom}(X).$

We can now define Hartley information again on the space of such equal-probability distributions (denoted $P'$):

$I(f) := \log_2 |\mathrm{dom}(X)| = -\log_2 f(x), \qquad f \in P'. \qquad (7)$
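As a minimal sketch of equation (7) (in Python, with names chosen here for illustration), the Hartley information of a uniform distribution can be computed directly from the domain size and checked against $-\log_2 f(x)$:

    import math

    def hartley_information(domain_size: int) -> float:
        """Hartley information I(f) = log2 |dom(X)| in bits (equation (7))."""
        return math.log2(domain_size)

    # A uniform distribution f in P' over a domain of 8 states:
    n = 8
    f_x = 1.0 / n  # equal probability f(x) = 1/|dom(X)| for every state x

    # Equation (7): log2 |dom(X)| = -log2 f(x) for the uniform f.
    assert math.isclose(hartley_information(n), -math.log2(f_x))
    print(hartley_information(n))  # -> 3.0 bits

For eight equally likely states this gives $\log_2 8 = 3$ bits.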
Hartley information now means the information we gain when we learn the state of the random variable $X$, or, equivalently, the uncertainty we have as long as we do not know it.
This different viewpoint may seem pointless, since we are still only interested in the size of the domain. I introduced this interpretation to show the context of a more general information measure introduced in 1948 by Shannon. Note, though, that this is only one of many interpretations; the Hartley measure can be seen in this probability context, but by no means implies it. Hartley's measure also has other useful interpretations, e.g. as a measure of nonspecificity [28].