Like PNN networks, General Regression Neural Networks (GRNN) are known for their ability to train quickly on sparse data sets. GRNN is a type of supervised network. Rather than categorizing data as PNN does, however, GRNN produces continuous-valued outputs. In our tests we found that GRNN responds much better than backpropagation to many types of problems (but not all). It is especially useful for continuous function approximation. GRNN can have multidimensional input, and it will fit multidimensional surfaces through data.
GRNN is a three-layer network where there must be one hidden neuron for each training pattern. There are no training parameters such as learning rate and momentum as in Backpropagation, but there is a smoothing factor, described below, that is applied after the network is trained.
Note: If you have more than 2000 patterns in your training set, then GRNN may become too slow to be feasible unless you have a very fast machine. The reason is that applying a GRNN network requires a comparison between the new pattern and each of the training patterns.
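The manual does not spell out the computation, but the standard GRNN estimator is a kernel-weighted average over the training patterns: each hidden neuron measures the distance between the new pattern and one stored pattern, and the output is the distance-weighted average of the stored outputs. The sketch below is a minimal, hypothetical illustration of that idea (the function name, the Gaussian kernel, and the toy data are our own, not NeuroShell's implementation); note how every prediction loops over the full training set, which is why large pattern files make GRNN slow.

```python
import math

def grnn_predict(x, train_x, train_y, sigma):
    """Kernel-weighted average over all training patterns
    (one 'hidden neuron' per stored pattern)."""
    num = den = 0.0
    for xi, yi in zip(train_x, train_y):
        # squared Euclidean distance from the new pattern to a stored one
        d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        # sigma is the smoothing factor: larger sigma -> smoother surface
        w = math.exp(-d2 / (2 * sigma ** 2))
        num += yi * w
        den += w
    return num / den

# Toy training set: noiseless samples of y = x^2 on [0, 1]
train_x = [[i / 10] for i in range(11)]
train_y = [xi[0] ** 2 for xi in train_x]
print(grnn_predict([0.55], train_x, train_y, sigma=0.1))
```

With a small smoothing factor the prediction is dominated by the nearest stored patterns; a very large one averages over the whole training set.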
Click on each Slab to set or inspect the number of neurons. Change the default settings by typing a new value in the text box.
The number of neurons in the hidden layer is set automatically, but you may want to change the default settings. For GRNN networks, the number is usually the number of patterns in the training set because the hidden layer consists of one neuron for each pattern in the training set. You can make it larger if you expect to add more patterns later, but don't make it smaller. If you are designing a network without an existing .PAT file, the default is 0 and you must specify a value.
The number of neurons in the input layer (Slab 1) is the number of inputs in your problem, and the number of neurons in the output layer (Slab 3) corresponds to the number of outputs.
Use the mouse to select a scaling function for the input layer from the list box.
Connection Arrows (Links)
You can click on the Connection Arrows to set or inspect a smoothing factor for each link. The same smoothing factor applies to all links. The smoothing factor that you set in the design stage is a default setting. You may change it in the Apply a Trained Network module. It's a good idea to experiment with different smoothing factors to discover which works best for your problem. Apply the trained network to your training set, and perhaps a test set, using different smoothing factors, and see which one gives the best answers.
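The experiment described above can be sketched as a simple sweep: apply the same trained network to a held-out test set once per candidate smoothing factor and keep the factor with the lowest error. The code below is an illustration only, using a minimal kernel-average predictor to stand in for the trained network (the function, the candidate values, and the toy data are assumptions, not part of the product).

```python
import math

def grnn_predict(x, train_x, train_y, sigma):
    # Gaussian-weighted average over all training patterns;
    # sigma is the smoothing factor under test
    num = den = 0.0
    for xi, yi in zip(train_x, train_y):
        d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        w = math.exp(-d2 / (2 * sigma ** 2))
        num += yi * w
        den += w
    return num / den

# Toy problem: training and test samples of y = sin(x) on [0, 3]
train = [([i * 0.2], math.sin(i * 0.2)) for i in range(16)]
test = [([i * 0.2 + 0.1], math.sin(i * 0.2 + 0.1)) for i in range(15)]
train_x, train_y = zip(*train)

best_sigma, best_mse = None, float("inf")
for sigma in (0.05, 0.1, 0.2, 0.5, 1.0):
    mse = sum((grnn_predict(x, train_x, train_y, sigma) - y) ** 2
              for x, y in test) / len(test)
    if mse < best_mse:
        best_sigma, best_mse = sigma, mse
print("best smoothing factor:", best_sigma, "test MSE:", best_mse)
```

Too small a smoothing factor overfits the stored patterns; too large a one flattens the surface toward the overall mean, so the best value usually lies in between.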
If you're using Calibration, the smoothing factor will be automatically computed and the default setting will be ignored.