
The Algorithm

The aim of Pruned-Cascade-Correlation (PCC) is to minimize the expected test set error instead of the actual training error [Weh94]. PCC tries to determine the optimal number of hidden units and to remove unneeded weights after a new hidden unit has been installed. As pointed out by Wehrfritz, selection criteria or a hold-out set, as used in ``stopped-learning'', may be applied to prune away unneeded weights. In this release of SNNS, however, only selection criteria for linear models are implemented.
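For linear models, such criteria can be computed directly from the training error and the number of free parameters. The following C sketch shows two widely used criteria of this kind, Akaike's AIC and Schwarz's SBC; the function names and the choice of these two particular criteria are illustrative assumptions, not necessarily the exact formulas used by SNNS.

#include <math.h>

/* Sketch of two common selection criteria for linear models.
 * sse : sum of squared errors on the training set
 * n   : number of training patterns
 * k   : number of free parameters (weights) in the model
 * The formulas are illustrative; SNNS may use different variants. */

double aic(double sse, int n, int k)
{
    /* Akaike's information criterion for a Gaussian error model */
    return n * log(sse / n) + 2.0 * k;
}

double sbc(double sse, int n, int k)
{
    /* Schwarz's Bayesian criterion; penalizes extra parameters more strongly */
    return n * log(sse / n) + k * log((double)n);
}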

The algorithm interleaves the standard Cascade-Correlation steps with pruning steps and works as follows (a sketch of the pruning loop in steps 5 and 6 is given after the list):

  1. Train the connections to the output layer
  2. Compute the selection criterion
  3. Train the candidates
  4. Install the new hidden neuron
  5. Compute the selection criterion
  6. Set each weight of the last inserted unit to zero in turn and compute the selection criterion. If there is a weight whose removal would decrease the selection criterion, remove the link that decreases the criterion most; repeat from step 5 until any further removal would increase the selection criterion.
  7. Compute the selection criterion. If it is greater than the value computed before the new hidden unit was inserted, notify the user that the net is getting too big.
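Steps 5 and 6 amount to a greedy backward elimination of the links of the newly installed unit. The following C sketch illustrates that pruning loop; the weight array, the active-link mask, and the criterion callback are hypothetical placeholders, not actual SNNS data structures or functions.

#include <stddef.h>

/* Greedy elimination of the last inserted unit's links (steps 5 and 6).
 * `criterion` evaluates the selection criterion of the whole net given the
 * unit's weight vector `w` and the mask `active` (active[i] == 0 marks a
 * removed link).  Illustrative sketch only, not SNNS code. */
void prune_last_unit(double *w, int *active, size_t n,
                     double (*criterion)(const double *w, const int *active,
                                         size_t n, void *ctx),
                     void *ctx)
{
    for (;;) {
        double best   = criterion(w, active, n, ctx);  /* step 5 */
        size_t best_i = n;                             /* n means "no candidate" */

        /* step 6: tentatively remove each remaining link and re-evaluate */
        for (size_t i = 0; i < n; i++) {
            if (!active[i])
                continue;
            active[i] = 0;                             /* weight treated as zero */
            double c = criterion(w, active, n, ctx);
            active[i] = 1;                             /* restore the link */
            if (c < best) {                            /* removal would improve criterion */
                best   = c;
                best_i = i;
            }
        }

        if (best_i == n)
            break;                  /* any further removal would increase the criterion */
        active[best_i] = 0;         /* remove the most beneficial link for good */
        w[best_i] = 0.0;
    }
}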

