PNN Learning



(Calibration Set to Iterative or None)

Use this module to train PNN networks. Unlike backpropagation networks, which require many presentations of the training patterns with error feedback and weight adjustment after each one, training a PNN network is very fast because each pattern needs to be presented to the network only once.  Click on the Learning Events box to display the number of learning events completed during training.  Because training is almost instantaneous, it can be done in real time.  When data is sparse, some studies suggest that PNN networks may outperform other network types.
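The one-pass behavior described above follows from how a probabilistic neural network works: "training" is essentially storing the patterns, and classification sums a Gaussian kernel over them for each class. The sketch below is a generic, minimal PNN classifier for illustration only; the function name and data layout are assumptions, not NeuroShell 2's internal implementation.

```python
import math

def pnn_classify(x, patterns, sigma):
    """Classify x with a minimal probabilistic neural network (PNN).

    patterns: list of (input_vector, class_label) pairs.  "Training" a
    PNN is just storing these pairs, which is why each pattern needs to
    be presented only once.
    sigma: the smoothing factor applied to the Gaussian kernel.
    """
    sums = {}    # summed kernel activation per class
    counts = {}  # number of stored patterns per class
    for p, label in patterns:
        d2 = sum((a - b) ** 2 for a, b in zip(x, p))
        k = math.exp(-d2 / (2.0 * sigma ** 2))
        sums[label] = sums.get(label, 0.0) + k
        counts[label] = counts.get(label, 0) + 1
    # Pick the class with the highest average activation
    # (a Parzen-window density estimate per class).
    return max(sums, key=lambda c: sums[c] / counts[c])
```

For example, a pattern near the stored examples of one class is assigned to that class, with the smoothing factor controlling how far each stored pattern's influence extends.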


Training Graphics

NeuroShell 2 allows you to view the Number of Incorrect Patterns Graphed Against the Smoothing Factor as learning continues.  Click on the icon to view the graph.


Smoothing Factor Update

Save Best Factor for the Test Set:  Use the smoothing factor computed by Calibration during learning (see below for details).


Keep Present Factor:  Use the smoothing factor that was set in the PNN Architecture module.


Maximum Upper Value for Search

The PNN learning module allows you to specify an upper limit that is used in the search for a smoothing factor.  The objective is to find the smoothing factor that produces the fewest incorrect classifications.  Use the mouse to click on one of the radio buttons to select one of the following maximum values:  .8, 1.6, 3.2, 6.4, or 12.8.
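The search described above can be sketched as a simple sweep: try candidate smoothing factors up to the chosen maximum and keep the one that misclassifies the fewest test patterns. This is a self-contained illustration under assumed names and a plain grid search; it is not NeuroShell 2's exact Calibration algorithm.

```python
import math

def count_incorrect(sigma, train, test):
    """Number of test patterns misclassified by a minimal PNN
    at a given smoothing factor."""
    wrong = 0
    for x, label in test:
        sums = {}
        for p, c in train:
            d2 = sum((a - b) ** 2 for a, b in zip(x, p))
            sums[c] = sums.get(c, 0.0) + math.exp(-d2 / (2.0 * sigma ** 2))
        if max(sums, key=sums.get) != label:
            wrong += 1
    return wrong

def search_smoothing(train, test, upper=1.6, steps=20):
    """Sweep evenly spaced smoothing factors in (0, upper] and keep
    the one with the fewest incorrect classifications -- a sketch of
    the kind of search Calibration performs, with `upper` playing the
    role of the maximum value selected by the radio buttons."""
    best_sigma, best_wrong = None, None
    for i in range(1, steps + 1):
        sigma = upper * i / steps
        wrong = count_incorrect(sigma, train, test)
        if best_wrong is None or wrong < best_wrong:
            best_sigma, best_wrong = sigma, wrong
    return best_sigma, best_wrong
```

Each full sweep of the test set at one candidate factor corresponds to one smoothing test epoch; the returned pair corresponds to the "current best smoothing factor" and "minimum number incorrect" statistics shown during learning.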



By clicking on the appropriate box, you can display a variety of statistics for the Training and Test Patterns as learning progresses.


Training Patterns

Learning Events: This box displays the number of training patterns that have been propagated through the network.  Unlike backpropagation networks, which propagate the training patterns through the network many times while seeking a lower mean squared error between the network's output and the actual output or answer, a PNN propagates each training pattern through the network only once.


Test Patterns

Smoothing Test Events: This box displays the number of test patterns that have been propagated through the network.

Current best smoothing factor: If you're using Calibration, this box displays the overall smoothing factor that results in the highest number of correct classifications.

Smoothing Test Epochs: This box displays the number of times the entire set of test patterns has been passed through the network.

Last Number Incorrect: This box displays the number of incorrect classifications on the last pass through the test set.

Minimum number incorrect: This box displays the minimum number of incorrect classifications for the best smoothing factor found to this point.

Epochs since min incorrect: This box displays the number of times the entire set of test patterns has passed through the network since the minimum number of incorrect patterns was reached.


Use the Run Menu to Start Training the network. You may also select options to Continue or Interrupt training.


Note: When using Calibration with PNN networks, the mean squared error shown on the screen is computed on outputs after they have been mapped into the interval [0,1], and therefore will not match the mean squared error displayed in the Apply PNN Network module.

Note:  If you change the number of input or output neurons, you must retrain the network.  You cannot continue training the original network.

File Note: This module defaults to training on a .TRN file, if it exists, or the .PAT file if there is no .TRN file.  This module uses the .FIG file created in the Design module and the .MMX file created in the Define Inputs and Outputs module.  All of these files for a given problem must reside in the same directory.


This module can be minimized and run in the background while you do other things.