This section details the changes between NeuroShell 2 Release 3.0 and Release 2.0.
GMDH (Polynomial Nets)
Release 3 adds a new network architecture: GMDH (Group Method of Data Handling), also known as polynomial nets. This architecture computes a mathematical formula, a nonlinear polynomial expression, that relates the values of the most important inputs to the output variable it predicts. For more information, refer to GMDH Architecture.
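As an illustration of the kind of formula a polynomial net builds, the classic GMDH building block is a quadratic "partial description" of two inputs. The sketch below is ours, not NeuroShell 2's actual output, and the coefficients are invented; a trained net fits its own coefficients from the data.

```python
def partial_description(x1, x2, a):
    """Evaluate the classic two-input GMDH polynomial:
    y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2
    """
    a0, a1, a2, a3, a4, a5 = a
    return a0 + a1 * x1 + a2 * x2 + a3 * x1 * x2 + a4 * x1 ** 2 + a5 * x2 ** 2


# Hypothetical coefficients, as a GMDH layer might fit them by least squares:
coeffs = (1.0, 0.5, -0.25, 0.1, 0.0, 0.02)
y = partial_description(2.0, 4.0, coeffs)
```

A full GMDH net composes many such partial descriptions in layers, keeping only the ones that predict the output best.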
Source Code Generator
Once you have trained a network with NeuroShell 2, you can generate Visual Basic or C source code to compile into other programs. For large networks, the generated source code may need to be broken into several pieces before it will compile. The Source Code Generator is found in the Runtime module. For more information, refer to Source Code Generator.
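Generated network source code essentially hard-codes the trained network's forward pass. The following rough illustration is in Python rather than the Visual Basic or C that NeuroShell 2 actually emits, and the weights and scaling ranges are invented; a real generated file embeds the values of your trained network.

```python
import math

# Invented values standing in for what a generated file would hard-code:
IN_MIN, IN_MAX = [0.0, 0.0], [10.0, 5.0]   # input scaling ranges
W_HIDDEN = [[0.8, -0.4], [0.3, 0.9]]        # hidden-layer weights
B_HIDDEN = [0.1, -0.2]                      # hidden-layer biases
W_OUT = [0.6, -0.7]                         # output-layer weights
B_OUT = 0.05                                # output bias


def fire_net(inputs):
    """Scale the inputs, run the feedforward pass, return the raw output."""
    scaled = [(x - lo) / (hi - lo)
              for x, lo, hi in zip(inputs, IN_MIN, IN_MAX)]
    hidden = [1.0 / (1.0 + math.exp(-(b + sum(w * s for w, s in zip(ws, scaled)))))
              for ws, b in zip(W_HIDDEN, B_HIDDEN)]
    return B_OUT + sum(w * h for w, h in zip(W_OUT, hidden))
```

Because everything is hard-coded, the compiled result needs no runtime library, which is why very large nets may have to be split into several source files.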
Training in Either 16- or 32-Bit Mode
Under Windows 95 or Windows NT, you may train your networks in either 16- or 32-bit mode. In 32-bit mode, training speed varies depending on whether you are running Windows 95 or Windows NT. Under Windows 3.1, you can train networks only in 16-bit mode.
Omega File Import Module
This module imports an Omega Downloader data file and converts it to NeuroShell 2's internal file format. An Omega Downloader file is a data file created by the Wall Street Analyst, TradeStation, or SuperCharts programs from Omega Research, Inc.
NET-PERFECT is Now Calibration
The term Calibration replaces NET-PERFECT in the NeuroShell 2 screens and help files. Calibration more accurately reflects the concept of NET-PERFECT, which optimizes the network by applying it to an independent test set during training.
Test Set Extract
Release 3 adds a new extraction method to the Test Set Extract module that lets you select a random test set and, at the same time, a production file taken from the end of the .PAT file. The default extraction percentage for a test set also changes to 20 percent, from 10 percent in previous releases. Finally, the extraction percentage is computed more accurately than before.
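The combined extraction can be sketched as follows, under the assumption that the production rows are taken from the end of the file and the test rows are then drawn at random from what remains. The function and parameter names are ours for illustration, not NeuroShell 2's.

```python
import random


def extract_sets(rows, test_pct=0.20, production_count=10, seed=0):
    """Reserve a production block from the end of the file, then draw a
    random test set from the remaining rows at the given rate."""
    production = rows[len(rows) - production_count:]
    remaining = rows[:len(rows) - production_count]
    rng = random.Random(seed)
    test_idx = set(rng.sample(range(len(remaining)),
                              round(len(remaining) * test_pct)))
    test = [r for i, r in enumerate(remaining) if i in test_idx]
    train = [r for i, r in enumerate(remaining) if i not in test_idx]
    return train, test, production


train, test, production = extract_sets(list(range(100)))
```

With 100 rows and the defaults above, the last 10 rows become the production set and 18 of the remaining 90 rows (20 percent) become the test set.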
Calling the Predict Function from Excel 7 or Excel 5 for NT
If you want to use the OpenNet, FireNet, CloseNet, or Predict functions with either Excel 7 or Excel 5 for NT, you need to substitute the file NS2-32.DLL for NSHELL2.DLL in your call statement (refer to DLL Server for more information on how to use these functions). NS2-32.DLL, the 32-bit version of the 16-bit NSHELL2.DLL, is installed in the \WINDOWS\SYSTEM directory during NeuroShell 2 setup.
Calls in Excel 7 are case sensitive, so the upper- and lower-case letters must match the function names exactly: OpenNet, FireNet, CloseNet, and Predict.
Note: If you intend to execute a trained network on a computer other than the one used to train it, you must determine whether that computer runs Excel 7 or Excel 5 for NT and supply the correct .DLL.
The Predict function uses the list separator that you specified in Windows. Refer to International Decimal Separator and List Separator, described below, for details.
Refer to DLL Server for more information on using the Predict function.
International Decimal Separator and List Separator
Release 3 honors your Windows choice of either a comma or a period as the decimal separator. In Windows 95, this choice is made in the Control Panel under Regional Settings, Number. In Windows 3.1, open the Control Panel, choose the International icon, and select Number.
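To illustrate what honoring the separator setting means, here is a minimal sketch in which the decimal separator is passed as a parameter; NeuroShell 2 itself reads the setting from Windows, and these helper names are ours.

```python
def format_number(value, decimal_sep="."):
    """Render a number using the chosen decimal separator."""
    return f"{value:.2f}".replace(".", decimal_sep)


def parse_number(text, decimal_sep="."):
    """Parse a number written with the chosen decimal separator."""
    return float(text.replace(decimal_sep, "."))
```

The same value thus appears as "3.14" for a period setting and "3,14" for a comma setting, and both forms parse back to the same number.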
Architecture Default Setting Changes
The Architectures Module now defaults to the second Ward Network rather than a simple three-layer backpropagation network, because the Ward Net generally gives better results.
File Compatibility
The .MMX and .FIG files from Release 3 are not compatible with previous releases: you cannot run Release 3 networks in previous releases. Networks created in previous releases may, however, be run in Release 3.
Training with Files in Memory
Release 3 automatically loads training and test files into memory. The option to specify whether to place the training and/or test set(s) in memory has been removed from the Training and Stop Training Criteria module.
Contribution Factors
Release 3 adds a more sophisticated method of measuring contribution factors than previous releases. We believe the contribution factors generated by this new method are more reliable. However, we still believe the best factors are generated by PNN and GRNN genetic adaptive nets.
Show Weights
When training a network, you may view the weights of either the best network or the last network. The weights are displayed with more precision than in previous releases of NeuroShell 2. You may also prune weights against a threshold that you set; pruning may be applied to either the best network or the last network. Note: you should interrupt training prior to using the pruning option. If you plan to continue training, prune the last network (otherwise the pruned weights will be overwritten); if you prune the weights of the best network, you should not continue training. Refer to Show Weights for more information. You can no longer view neuron values on this screen, but you can still print a file of neuron values in the Apply Module.
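Pruning a weight amounts to zeroing it so its connection no longer contributes to the network's output. The sketch below shows magnitude pruning against a user-set threshold; for illustration it zeroes connections whose weight magnitude falls at or below the cutoff, the common convention, while NeuroShell 2's exact rule is described in Show Weights.

```python
def prune_weights(weights, threshold):
    """Zero every weight whose magnitude is at or below the threshold.
    Returns the pruned weight list and a count of pruned connections."""
    pruned = [0.0 if abs(w) <= threshold else w for w in weights]
    removed = sum(1 for w, p in zip(weights, pruned) if w != p)
    return pruned, removed
```

For example, with a threshold of 0.05 the near-zero weights 0.01 and 0.02 would be removed while larger weights survive untouched.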
Although Release 3 offers the capability of pruning weights, we still believe that using Calibration is a better way of ensuring that a network will generalize well on new data.
Long File Names
Although Release 3 works with both Windows 95 and Windows NT, it does not support problem file names longer than 8 characters. It will, however, allow you to import data from files with names longer than 8 characters.
AUTOEXEC.BAT Path Statement
Release 3 does not require that the NeuroShell 2 directory be placed in the path statement of the AUTOEXEC.BAT file.
NeuroShell 2 Release 3 No Longer Supports NeuroBoard
Release 3.0 no longer has a NeuroBoard item on the Options Menu: today's faster computers and NeuroShell 2's faster algorithms now equal or surpass the speed of training networks on a NeuroBoard. Users may still run a NeuroBoard with previous releases of NeuroShell 2.
Batch Processor
The Batch Processor may be used simply to apply a trained network to data files, rather than training the network and then applying it to specified files. The Batch Processor skips training if you do not include any file names in the column labeled "Use these data files to train with". You must, however, include an .MMX file in the "Define the networks inputs & outputs" column and a .FIG file in the "Architectures & training criteria" column. You may include these files by clicking on the cell beneath the appropriate column and then closing the NeuroShell 2 module that is displayed.