
Neural Network Terminology


Connectionism is a current focus of research in a number of disciplines, among them artificial intelligence (or, more generally, computer science), physics, psychology, linguistics, biology, and medicine. Connectionism represents a special kind of information processing: connectionist systems consist of many primitive cells (units) which work in parallel and are connected via directed links (links, connections). The main processing principle of these cells is the distribution of activation patterns across the links, similar to the basic mechanism of the human brain, where information processing is based on the transfer of activation from one group of neurons to others through synapses. This kind of processing is also known as parallel distributed processing (PDP).
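
To make this processing principle concrete, the following is a minimal sketch in C of activation flowing from a group of source units to a group of target units over weighted links. The unit counts, the weight values, and the choice of a logistic activation function are arbitrary illustrative assumptions; this is not SNNS code and does not correspond to any particular network.

/* Sketch: a layer of target units receives activation from source
 * units over weighted links and computes its own activation as a
 * squashed weighted sum.  Illustrative only; all values are made up. */
#include <stdio.h>
#include <math.h>

#define N_IN  3   /* number of source units */
#define N_OUT 2   /* number of target units */

/* logistic activation function */
static double act_logistic(double net) {
    return 1.0 / (1.0 + exp(-net));
}

int main(void) {
    double in_act[N_IN] = { 1.0, 0.0, 1.0 };   /* source activations */
    double w[N_OUT][N_IN] = {                  /* link weights */
        {  0.5, -0.4,  0.9 },
        { -0.7,  0.3,  0.2 }
    };
    double out_act[N_OUT];

    /* each target unit sums the activation arriving over its links,
     * then applies its activation function to the net input */
    for (int j = 0; j < N_OUT; j++) {
        double net = 0.0;
        for (int i = 0; i < N_IN; i++)
            net += w[j][i] * in_act[i];
        out_act[j] = act_logistic(net);
        printf("unit %d: net = %5.2f, activation = %5.3f\n",
               j, net, out_act[j]);
    }
    return 0;
}

All units of a layer can compute their net input independently of one another, which is what makes this style of processing naturally parallel.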

The high performance of the human brain in highly complex cognitive tasks like visual and auditory pattern recognition has always been a great motivation for modeling the brain. Because of this historical motivation, connectionist models are also called neural nets. However, most current neural network architectures do not try to imitate their biological model closely; rather, they can be regarded simply as a class of parallel algorithms.

In these models, knowledge is usually distributed throughout the net and is stored in the structure of the topology and the weights of the links. The networks are organized by (automated) training methods, which greatly simplify the development of specific applications. The classical logic of ordinary AI systems is replaced by vague conclusions and associative recall (best match instead of exact match). This is a big advantage in all situations where no clear set of logical rules can be given. The inherent fault tolerance of connectionist models is another advantage. Furthermore, neural nets can be made tolerant of noise in the input: with increasing noise, the quality of the output usually degrades only slowly (graceful performance degradation).
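
As a toy illustration of best-match recall and its tolerance to noise, the following sketch stores a few binary patterns and, given a probe with one flipped bit, returns the stored pattern with the smallest Hamming distance. The patterns and the distance-based lookup are illustrative assumptions only, not the mechanism of any specific network model.

/* Sketch: associative recall as best match instead of exact match.
 * A noisy probe still retrieves the closest stored pattern. */
#include <stdio.h>

#define N_PAT 3   /* number of stored patterns */
#define LEN   8   /* bits per pattern */

int main(void) {
    int stored[N_PAT][LEN] = {
        { 1,1,1,1,0,0,0,0 },
        { 0,0,0,0,1,1,1,1 },
        { 1,0,1,0,1,0,1,0 }
    };
    /* first stored pattern with one bit flipped */
    int probe[LEN] = { 1,1,0,1,0,0,0,0 };

    int best = 0, best_dist = LEN + 1;
    for (int p = 0; p < N_PAT; p++) {
        int dist = 0;
        for (int i = 0; i < LEN; i++)
            dist += (stored[p][i] != probe[i]);  /* Hamming distance */
        if (dist < best_dist) { best_dist = dist; best = p; }
    }
    printf("best match: pattern %d (Hamming distance %d)\n",
           best, best_dist);
    return 0;
}

Even with a corrupted probe, the nearest stored pattern is still recovered; as more bits are flipped, recall fails only gradually, which is the graceful degradation described above.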







