
Remote Panel

     

With this window the simulator is operated, much like with a remote control. The figure below shows the window; the table below lists all input options with their types and value ranges. The meaning of the five learning parameters depends upon the learning function selected with the select learning function menu, which is invoked by the corresponding button of the remote panel.

  
Figure: Remote Panel

  
Table: Input fields of the remote panel

There are the following text fields, buttons and menu buttons:

  1. STEPS: This text field specifies the number of update steps of the network. With Topological_Order selected as update mode (chosen with the menu select update function from the button OPTIONS in the remote panel), one step is sufficient to propagate information from input to output. With other update modes or with recurrent networks, several steps might be needed.

  2. STEP: After clicking this button, the simulator kernel executes the number of update steps specified in the text field STEPS. If STEPS is zero, the units are only redrawn. The update mode selected with the corresponding menu button is used (see the chapter on update modes). The first update step in topological mode takes longer than the following ones, because the net is sorted topologically first. Afterwards, all units are redrawn.

  3. COUNT: The text field next to the STEP button displays the number of steps executed so far.

  4. Initializes the network with values according to the initialization function and parameters given in the last line of the panel (the INIT row).

  5. Resets the step counter and assigns the units their initial activation values.

  6. ERROR: Pressing the error button in the remote panel makes SNNS print out several statistics. The formulas were contributed by Warren Sarle from the SAS Institute. Note that these criteria are intended for linear models; they can sometimes be applied directly to nonlinear models if the sample size is large. A recommended reference for linear model selection criteria is [JGHL80].

    Notation: in the formulas below, n denotes the number of observations used to fit the model, p the number of adjustable parameters (for a network, essentially the number of weights and biases), SSE the observed sum of squared errors, and SST the total sum of squares of the target values about their mean. The formulas are reproduced here in their standard textbook forms; [JGHL80] gives the derivations.

    Criteria for adequacy of the estimated model in the sample:
    Pearson's R^2, the proportion of variance explained or accounted for by the model, is R^2 = 1 - SSE/SST.

    Criteria for adequacy of the true model in the population:
    The mean square error [JGHL80] is defined as MSE = SSE/(n-p), the root mean square error as RMSE = sqrt(MSE).
    The adjusted R^2, i.e. R^2 [JGHL80] adjusted for degrees of freedom, is defined as R^2_adj = 1 - (1 - R^2)(n-1)/(n-p).

    Criteria for adequacy of the estimated model in the population:
    Amemiya's prediction criterion [JGHL80] is similar to the adjusted R^2: PC = (1 - R^2)(n+p)/(n-p).
    The estimated mean square error of prediction (J_p), assuming that the values of the regressors are fixed and that the model is correct, is J_p = MSE (n+p)/n.
    A conservative estimate of the mean square error of prediction is described in [Weh94].
    The generalized cross validation (GCV) is given by Wahba [GHW79] as GCV = SSE / (n (1 - p/n)^2).
    The estimated mean square error of prediction assuming that both independent and dependent variables are multivariate normal (S_p) is S_p = MSE/(n-p-1).
    Shibata's criterion can be found in [Shi68].

    Finally, there is Akaike's information criterion [JGHL80], AIC = n ln(SSE/n) + 2p, and Schwarz's Bayesian criterion [JGHL80], SBC = n ln(SSE/n) + p ln(n). Obviously, most of these selection criteria only make sense if n >> p.
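
    To make the arithmetic concrete, the following C sketch computes the criteria above from n, p, SSE, and SST. It merely illustrates the standard formulas; the function name and its interface are hypothetical and are not part of the SNNS kernel.

    #include <math.h>
    #include <stdio.h>

    /* Illustrative computation of the model selection criteria listed above.
     * n   - number of observations used to fit the model
     * p   - number of adjustable parameters (weights and biases)
     * sse - observed sum of squared errors
     * sst - total sum of squares of the target values about their mean
     */
    static void print_selection_criteria(double n, double p, double sse, double sst)
    {
        double r2     = 1.0 - sse / sst;                           /* Pearson's R^2                  */
        double mse    = sse / (n - p);                             /* mean square error              */
        double rmse   = sqrt(mse);                                 /* root mean square error         */
        double r2_adj = 1.0 - (1.0 - r2) * (n - 1.0) / (n - p);    /* adjusted R^2                   */
        double pc     = (1.0 - r2) * (n + p) / (n - p);            /* Amemiya's prediction criterion */
        double jp     = mse * (n + p) / n;                         /* est. MSE of prediction (J_p)   */
        double gcv    = sse / (n * (1.0 - p / n) * (1.0 - p / n)); /* generalized cross validation   */
        double sp     = mse / (n - p - 1.0);                       /* MSEP, multivariate normal case */
        double aic    = n * log(sse / n) + 2.0 * p;                /* Akaike's information criterion */
        double sbc    = n * log(sse / n) + p * log(n);             /* Schwarz's Bayesian criterion   */

        printf("R^2 = %g  adj. R^2 = %g  MSE = %g  RMSE = %g\n", r2, r2_adj, mse, rmse);
        printf("PC = %g  Jp = %g  GCV = %g  Sp = %g\n", pc, jp, gcv, sp);
        printf("AIC = %g  SBC = %g\n", aic, sbc);
    }

    int main(void)
    {
        /* Arbitrary example values, chosen only for illustration. */
        print_selection_criteria(1000.0, 50.0, 12.5, 400.0);
        return 0;
    }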

  7. Writes information about the net to the text window.

  8. CYCLES: This text field specifies the number of learning cycles. It is mainly used in conjunction with the next two buttons. A cycle (sometimes also called an epoch) is a unit of training in which all patterns of a pattern file are presented to the network once.

  9. Trains the net with a single pattern for the number of training cycles defined in the field CYCLES. The text window reports the error of the network every CYCLES/10 cycles, i.e. independent of the number of training cycles only 10 numbers are generated. (This prevents flooding the user with network performance data and slowing down the training by file I/O.)

      The error reported in the text window is the sum of the squared differences between the teaching input and the actual output over all output units. The average error per output unit is given after ave.

  10. ALL: The net is trained with all patterns for the number of training cycles specified in the field CYCLES. This is the usual way to train networks from the graphical user interface. Note that if CYCLES has a value of, say, 100, the button ALL causes SNNS to train all patterns once (one cycle = one epoch) and to repeat this 100 times (NOT training each pattern 100 times in a row before applying the next pattern).

    The error reported in the text window is the sum of the squared differences between the teaching input and the actual output over all output units, summed over all patterns presented. The average error per output unit is given after ave. (The resulting training order and error computation are sketched below.)
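
    The following C sketch illustrates this training order. The helpers shuffle_patterns(), train_single_pattern(), and output_error() are hypothetical placeholders and not the actual SNNS kernel interface; only the loop structure (one cycle presents every pattern exactly once, repeated CYCLES times) and the error definition follow the description above.

    /* Hypothetical placeholders standing in for SNNS kernel calls. */
    void   shuffle_patterns(void);                /* create a random pattern order      */
    void   train_single_pattern(int pattern_no);  /* one training pass for one pattern  */
    double output_error(int pattern_no);          /* squared error summed over outputs  */

    /* Trains the whole pattern set for 'cycles' epochs, as the ALL button does. */
    void train_all(int cycles, int num_patterns, int num_output_units, int shuffle_on)
    {
        for (int cycle = 0; cycle < cycles; ++cycle) {     /* one cycle = one epoch      */
            if (shuffle_on)
                shuffle_patterns();                        /* SHUFFLE: new random order  */

            double sse = 0.0;
            for (int p = 0; p < num_patterns; ++p) {       /* every pattern exactly once */
                train_single_pattern(p);
                sse += output_error(p);                    /* sum of squared differences */
            }
            double ave = sse / num_output_units;           /* average per output unit    */
            (void)ave;  /* sse and ave correspond to the values reported in the text window */
        }
    }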

  11. Stops the training. The simulation is halted as soon as the current step or teaching cycle has been completed.

  12. With this button, the user can test the behavior of the net with all patterns loaded. The activation values of the input and output units are copied into the net (for output units see also item 15). Then the number of update steps specified in STEPS is executed.

  13. SHUFFLE: For optimal learning it is important that the patterns are presented in a different order in each cycle. If SHUFFLE is switched on, a random pattern sequence is created automatically.

  14. This menu button offers the following menu:

    If jog weights is selected, a popup window appears in which the value range (low limit .. high limit) of the random noise to be added to all links in the network can be specified (a small sketch of this operation follows below).
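
    As an illustration of what jogging the weights does, the sketch below adds uniform random noise from [low, high] to every link weight. The flat weight array and the function name are hypothetical; SNNS stores links in its own internal data structures.

    #include <stdlib.h>

    /* Illustrative sketch of "jog weights": add uniform noise in [low, high]
     * to every link weight of the network.
     */
    static void jog_weights(float *weights, int num_links, float low, float high)
    {
        for (int i = 0; i < num_links; ++i) {
            float noise = low + (high - low) * ((float)rand() / (float)RAND_MAX);
            weights[i] += noise;
        }
    }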

  15. With this menu button, the user specifies how the activation values of the output units are changed when a pattern is applied with the test button (item 12). The following table gives the three possible alternatives:

    The label of this button always displays the item selected from the menu.

  16. PATTERN: This text field displays the current pattern number.

  17. Deletes the pattern whose number is displayed in the text field PATTERN from the pattern file.

  18. Modifies in place the pattern whose number is displayed in the text field PATTERN.

    The current activation of the input units and the current output values of output units of the network loaded make up the input and output pattern. These values might have been set with the network editor and the Info panel before.

  19. Defines a new pattern, which is appended after the existing patterns. Input and output values are defined as above. This button is disabled whenever the current pattern set has variable dimensions.

  20. Advances the simulator to the pattern whose number is displayed in the text field PATTERN.

  21. Arrow buttons (first, previous, next, last): With these buttons, the user can navigate through all loaded patterns, as well as jump directly to the first and last pattern. Unlike with the test button (item 12), no update steps are performed here.

  22. Opens the panel for sub-pattern handling. The button is inactive when the current pattern set has no variable dimensions. The sub-pattern panel is described in a separate section.

  23. Opens the menu of loaded pattern sets; the pattern set of the selected entry is removed from main memory. The corresponding pattern file remains untouched. When the current pattern set is deleted, the last set in the list becomes current. When the last remaining pattern set is deleted, the current pattern set becomes undefined and the menu shows the entry No Files.

  24. Also opens the menu of loaded pattern sets; the pattern set of the selected entry becomes the current set. All training, testing, and propagation actions always refer to the current pattern set. The name of the corresponding pattern file is displayed next to the button in the Current Pattern Set field.

  25. Current Pattern Set: This field displays the name of the pattern set currently used for training. When no current pattern set is defined, the entry "Training Pattern File ?" is displayed.

  26. VALID: Specifies the interval (in cycles) at which the training process is interrupted to compute the error on the validation pattern set. A value of 0 disables validation. The validation error is printed in the shell window and plotted in the graph display (the interleaving is sketched below).
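
    The interplay between training and validation can be pictured as in the following sketch; train_one_cycle() and compute_validation_error() are hypothetical placeholders, and only the interleaving scheme follows the description above.

    /* Hypothetical placeholders. */
    void train_one_cycle(void);            /* one epoch over the training patterns  */
    void compute_validation_error(void);   /* error on the validation pattern set   */

    /* Training is interrupted every 'valid' cycles for a validation pass;
     * valid == 0 disables validation entirely.
     */
    void train_with_validation(int cycles, int valid)
    {
        for (int cycle = 1; cycle <= cycles; ++cycle) {
            train_one_cycle();
            if (valid > 0 && cycle % valid == 0)
                compute_validation_error();
        }
    }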

  27. Opens the menu of loaded pattern sets; the pattern set of the selected entry becomes the current validation set. The name of the corresponding pattern file is displayed next to the button in the Validation Pattern Set field.

  28. Validation Pattern Set: This field displays the name of the pattern set currently used for validation. When no validation pattern set is defined, the entry "Validation Pattern File ?" is displayed.

  29. LEARN: Five fields to specify the parameters of the learning function. The number of parameters required and their respective meanings depend upon the learning function used. A description of the learning functions built into SNNS is given in the corresponding section.

  30. The menu button in the LEARN row invokes a menu to select a learning function (learning procedure). The following learning functions are currently implemented:

  31. UPDATE: Five fields to specify the parameters of the update function. The number of parameters required and their respective meanings depend upon the update function used.

  32. The menu button in the UPDATE row invokes a menu to select an update function. A list of the update functions built into SNNS, together with their descriptions, is given in the corresponding section.

  33. INIT: Five fields to specify the parameters of the initialization function. The number of parameters required and their respective meanings depend upon the init function used.

  34. The menu button in the INIT row invokes a menu to select an initialization function. See the corresponding section for a list of the available init functions and their descriptions.





