
Probabilistic Interpretation of the Hartley Measure:

Instead of talking about known or unknown states selected out of a finite set $S_n$, we can also view the situation in terms of a random variable $X$ which induces a probability measure $F$ and a probability distribution $f$ on $S_n$; here $\mathcal{P}(S_n)$ denotes the power set of $S_n$:

\begin{displaymath}\func{X}{A}{S_n} \end{displaymath}


\begin{displaymath}\func{F}{\mathcal{P}(S_n)}{[0,1]}, \quad F(U)= F_A(X^{-1}(U)), \quad
U \subseteq S_n \Rightarrow X^{-1}(U) \in \Omega_A \end{displaymath}


\begin{displaymath}\func{f}{S_n}{[0,1]}, \quad f(s)= F(\{s\}), \quad s \in S_n \end{displaymath}

where $X$ is a measurable function from the probability space $(A,\Omega_A,F_A)$ to $S_n$, and $F_A$ is the probability measure defined on the sigma-algebra $\Omega_A$ over the set $A$. Further mathematical details will be ignored; the interested reader may refer to any book on probability theory. In the following discussion variables will be treated as equivalent to random variables, and a state-set $S_n$ will be referred to as the domain of its corresponding variable $X$, $dom(X) := S_n$; $x \in dom(X)$ denotes a state or value of the variable $X$.
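
For concreteness, consider a simple illustrative example, a fair die: let $A = \{a_1,\ldots,a_6\}$ with the uniform probability measure $F_A$, let $S_6 = \{1,\ldots,6\}$, and set $X(a_i) = i$. The induced measure and distribution are then

\begin{displaymath}F(U) = F_A(X^{-1}(U)) = \frac{\vert U \vert}{6}, \quad U \subseteq S_6,
\qquad f(s) = F(\{s\}) = \frac{1}{6}, \quad s \in S_6, \end{displaymath}

so that, for instance, $F(\{2,4,6\}) = \frac{1}{2}$.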

The important point is that we can see the Hartley measure as defined on a special class of random variables, namely those which assign equal probability to all of their values:

\begin{displaymath}f(x) = \frac{1}{\vert dom(X)\vert}, \quad x \in dom(X), \end{displaymath}


\begin{displaymath}\vert dom(X)\vert = \mbox{number of values in the domain of }X \end{displaymath}

We can now define Hartley information again, this time on the space of probability distributions with equal probabilities, denoted $P'$:

\begin{displaymath}P' := \Big\{ f \;\Big\vert\; \func{f}{S_n}{[0,1]},\; f(s)= \frac{1}{\vert S_n\vert}=\frac{1}{n},
\; s\in S_n,\;n=1,2,\ldots \Big\} \end{displaymath}


\begin{displaymath}\func{I}{P'}{[0,\infty)} \end{displaymath}


\begin{displaymath}I(X) = I(f(s) \vert s \in S_n) := \log_2( \vert S_n\vert ) \mbox{ bits}
\end{displaymath} (7)
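
For instance, for a variable with eight equally probable states ($n=8$, $f(s)=\frac{1}{8}$) this yields

\begin{displaymath}I(X) = \log_2(8) = 3 \mbox{ bits}, \end{displaymath}

i.e. exactly the number of binary digits needed to distinguish the eight states, in agreement with the original, non-probabilistic definition of the Hartley measure.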

Hartley information can now be read as the information we gain when we learn the state of the random variable X, or equivalently as the uncertainty we face as long as we do not know it.

At first sight this change of viewpoint seems pointless, since we are still only interested in the size of the domain. I introduced this interpretation to provide the context for a more general information measure introduced by Shannon in 1948. Note, though, that this is only one of many interpretations; the Hartley measure can be seen in this probabilistic context, but by no means implies it. Hartley's measure also has other useful interpretations, e.g. as a measure of nonspecificity [28].

