
Generalization of Neural Networks

 

One of the major advantages of neural nets is their ability to generalize: a trained net can correctly classify data from the same class as the learning data even though it has never seen these patterns before. In real-world applications, developers normally have only a small part of all possible patterns available for building the net. To reach the best generalization, the available data should therefore be split into three parts (a sketch of such a split is given after the list):

- a training set, used to train the net; the error on this set is minimized during learning;
- a validation set, used to determine the performance of the net on patterns that are not trained;
- a test set, used only once, for a final check of the overall performance of the net.
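For illustration, a minimal Python sketch of such a three-way split follows. The 70/15/15 fractions, the function name and the use of numpy are assumptions made for this example only; they are not part of SNNS.

    # Minimal sketch of a three-way split of a pattern set.
    # The 70/15/15 fractions are illustrative, not prescribed by SNNS.
    import numpy as np

    def split_patterns(patterns, targets, seed=0):
        """Shuffle the patterns and divide them into training,
        validation and test sets."""
        patterns = np.asarray(patterns)
        targets = np.asarray(targets)

        rng = np.random.default_rng(seed)
        order = rng.permutation(len(patterns))
        patterns, targets = patterns[order], targets[order]

        n_train = int(0.70 * len(patterns))   # used for learning
        n_valid = int(0.15 * len(patterns))   # used to watch generalization

        train = (patterns[:n_train], targets[:n_train])
        valid = (patterns[n_train:n_train + n_valid],
                 targets[n_train:n_train + n_valid])
        test  = (patterns[n_train + n_valid:],   # only for the final check
                 targets[n_train + n_valid:])
        return train, valid, test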

The figure below shows a typical error development for a training set (lower curve) and a validation set (upper curve).

  
Figure: Error development of a training and a validation set

Learning should be stopped at the minimum of the validation set error; at this point the net generalizes best. If learning is not stopped there, overtraining occurs and the performance of the net on the data as a whole decreases, even though the error on the training data still gets smaller. After the learning phase has finished, the net should finally be checked with the third data set, the test set.
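This stopping rule can be pictured as a small wrapper around the training loop. The following sketch is not SNNS code; train_one_epoch, validation_error, the weights attribute and the patience threshold are hypothetical stand-ins for the simulator's training and testing steps.

    # Sketch of early stopping: remember the weights at the lowest
    # validation error and stop once it has not improved for a while.
    # `train_one_epoch`, `validation_error` and `net.weights` are
    # hypothetical stand-ins, not SNNS calls.
    import copy

    def train_with_early_stopping(net, train_one_epoch, validation_error,
                                  max_epochs=1000, patience=20):
        best_error = float("inf")
        best_weights = None
        waited = 0

        for epoch in range(max_epochs):
            train_one_epoch(net)              # training error keeps shrinking
            error = validation_error(net)     # may eventually rise again

            if error < best_error:
                best_error, waited = error, 0
                best_weights = copy.deepcopy(net.weights)
            else:
                waited += 1
                if waited >= patience:        # overtraining detected
                    break

        if best_weights is not None:
            net.weights = best_weights        # restore the best net seen
        return net, best_error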

SNNS performs one validation cycle every n training cycles. Just like training, validation is controlled from the remote panel.
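The validation schedule itself can be sketched in the same way: run n training cycles between two validation cycles. The helpers and the default value of n below are again illustrative assumptions, not the actual remote panel interface.

    # Sketch of validating only every n-th training cycle, using the
    # same hypothetical helpers as above.
    def train_with_validation_interval(net, train_one_epoch, validation_error,
                                       cycles=500, n=10):
        history = []
        for cycle in range(1, cycles + 1):
            train_one_epoch(net)
            if cycle % n == 0:
                history.append((cycle, validation_error(net)))
        return history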


