The aim of Pruned-Cascade-Correlation (PCC) is to minimize the
expected test set error instead of the actual training error
[Weh94]. PCC tries to determine the optimal number of hidden
units and to remove unneeded weights after a new hidden unit has
been installed. As pointed out by Wehrfritz, either selection
criteria or a hold-out set, as used in ``stopped-learning'', may be
applied to prune away unneeded weights. In this release of SNNS,
however, only selection criteria for linear models are implemented.
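To make the idea of a selection criterion concrete, the following sketch uses an Akaike-style information criterion for a linear model: the training error is penalized by the number of free parameters, which approximates the expected test set error. This is only an illustrative example of such a criterion; the exact criteria implemented in SNNS may differ.

```python
import math

def aic(sse, n_samples, n_params):
    """Akaike-style selection criterion for a linear model:
    log of the mean squared training error plus a penalty that
    grows with the number of free parameters (weights)."""
    return n_samples * math.log(sse / n_samples) + 2 * n_params

# A model with slightly lower training error but many more weights
# can still score worse on the criterion:
small = aic(sse=10.0, n_samples=100, n_params=5)   # modest fit, few weights
large = aic(sse=9.5, n_samples=100, n_params=40)   # better fit, many weights
assert small < large
```

The penalty term is what lets the criterion detect that a network is getting too big even while its training error keeps shrinking.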
The algorithm works as follows (steps taken over from standard CC,
which are printed italic in the original manual, are marked with ``*''):

1. Train the connections to the output layer. (*)
2. Compute the selection criterion.
3. Train the candidates. (*)
4. Install the new hidden neuron. (*)
5. Compute the selection criterion.
6. Tentatively set each weight of the last inserted unit to zero and
   compute the selection criterion; if there exists a weight whose
   removal would decrease the selection criterion, remove the link
   that decreases the selection criterion most and go back to step 5.
   Repeat until any further removal would increase the selection
   criterion.
7. Compute the selection criterion; if it is greater than the value
   computed before the new hidden unit was inserted (step 2), notify
   the user that the net is getting too big.
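The pruning loop (steps 5 and 6 above) can be sketched as a greedy search: each remaining link of the last inserted unit is tentatively removed, the criterion is recomputed, and the single most-beneficial removal is kept. The names `prune_last_unit`, `weights`, and `criterion` below are placeholders for illustration, not SNNS API.

```python
def prune_last_unit(weights, criterion):
    """Greedy pruning sketch: `weights` maps link names of the last
    inserted unit to their values; `criterion(weights)` returns the
    selection criterion of the network with exactly those links kept.
    Repeatedly remove the link whose removal lowers the criterion the
    most, until every further removal would raise it."""
    best = criterion(weights)
    while True:
        # Tentatively zero (drop) each link in turn and score the result.
        trials = {}
        for name in weights:
            reduced = {k: v for k, v in weights.items() if k != name}
            trials[name] = criterion(reduced)
        improving = {n: c for n, c in trials.items() if c < best}
        if not improving:            # no removal decreases the criterion
            return weights, best
        name = min(improving, key=improving.get)  # most-decreasing link
        del weights[name]
        best = trials[name]

# Toy criterion: a fixed "fit" term that worsens when the essential
# link w0 is dropped, plus a complexity penalty per remaining link.
def crit(w):
    fit = 1.0 if "w0" in w else 5.0
    return fit + 0.4 * len(w)

kept, score = prune_last_unit({"w0": 0.9, "w1": 0.01, "w2": -0.02}, crit)
# The two near-zero links are pruned; only w0 survives.
```

Because each iteration re-evaluates the criterion on the reduced network, the loop stops exactly when step 6's condition fails, i.e. when every remaining link earns its complexity penalty.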
Niels Mache
Wed May 17 11:23:58 MET DST 1995