
   
Information and Uncertainty Measures

In this section some classical information and uncertainty measures based on probability distributions are introduced. As discussed in the last section, structure is connected with the size and randomness of the underlying distribution: the more random a distribution, the higher our uncertainty and the lower the amount of structure. Likewise, the more distinct values occur in a distribution, the higher the uncertainty. The first measure introduced, the Hartley information, captures the general properties of uncertainty depending only on the domain size. The properties of uncertainty for different probability assignments are discussed in connection with Shannon's famous entropy measure. Joint and conditional uncertainties are shown to be important structure and association indicators. Finally, some ``distinctiveness'' measures between two distributions, useful e.g. for outlier detection and rule finding (3.3.5), are introduced. Altogether these measures form a significant repertoire for evaluating structure and patterns within probability distributions.
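The connection stated above, that a more random distribution carries higher uncertainty, up to a bound set by the domain size, can be illustrated with a small sketch. This is not code from the text; the function names and example distributions are chosen here for illustration only.

```python
import math

def hartley_information(n):
    # Hartley information: uncertainty from domain size alone,
    # assuming all n values are possible (measured in bits).
    return math.log2(n)

def shannon_entropy(p):
    # Shannon entropy of a probability distribution p (in bits);
    # terms with probability 0 contribute nothing.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A uniform distribution over 4 values is maximally random: its
# entropy reaches the Hartley information of the domain.
uniform = [0.25, 0.25, 0.25, 0.25]

# A skewed distribution over the same domain is less random, hence
# lower uncertainty and, in the sense of this section, more structure.
skewed = [0.7, 0.1, 0.1, 0.1]

print(hartley_information(4))       # 2.0 bits
print(shannon_entropy(uniform))     # 2.0 bits (equals the Hartley bound)
print(shannon_entropy(skewed))      # < 2.0 bits
```

The uniform case shows why the Hartley measure depends only on the domain size: it is the entropy when no probability assignment favors any value.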



 

Thomas Prang
1998-06-07