Advanced System - Backpropagation - XOR


 

Problem

This is the famous Exclusive Or problem (XOR) which, though trivial, has long been a benchmark test for neural networks.  The reason for this is that early neural networks (perceptrons) in the 1960s could not solve it.  That deficiency in the neural networks of the time led researchers to concentrate Artificial Intelligence research on symbolic processing methods instead of neural networks.  By the late 1980s, networks, especially multilayer feedforward Backpropagation networks, could easily solve the problem.

 

Inputs

There are two inputs which are either ON or OFF.

 

Outputs

The single output is the logical exclusive or of the two inputs. It is ON if one or the other input is ON, but not if both inputs are ON.

 

Training Patterns

Training patterns are all four combinations of inputs, as follows:

 

 Input1    Input2    Output
 ON        OFF       ON
 OFF       ON        ON
 ON        ON        OFF
 OFF       OFF       OFF
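
For reference, the same four patterns could be encoded numerically outside the package, as in this small Python sketch (the ON = 1, OFF = 0 encoding is an assumption that matches the symbol translation described below):

# Four XOR training patterns encoded as ((Input1, Input2), Output),
# using ON = 1 and OFF = 0.
xor_patterns = [
    ((1, 0), 1),
    ((0, 1), 1),
    ((1, 1), 0),
    ((0, 0), 0),
]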

 

Processing

This example has not been entered or trained, so it gives you the complete experience of doing both.  Use Advanced Neural Networks so you can translate ON and OFF to numbers, or simply enter 1 and 0 so you can use Beginner's Networks.

 

All inputs are strings and will be entered with the datagrid.  Put appropriate column names at the tops of the columns; these names will be used as variable names.

 

The next step is to pass the data through symbol translate and change ON to 1 and OFF to 0 in all columns.  Do not extract a test set since we are training the entire universe of patterns.
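
Done outside the package, that translation might look like this small Python sketch (the row values and the ON = 1, OFF = 0 mapping follow the table above):

# Map the ON/OFF strings to 1/0 in every column of every pattern row.
symbol_map = {"ON": 1, "OFF": 0}

raw_rows = [
    ("ON", "OFF", "ON"),
    ("OFF", "ON", "ON"),
    ("ON", "ON", "OFF"),
    ("OFF", "OFF", "OFF"),
]

numeric_rows = [tuple(symbol_map[value] for value in row) for row in raw_rows]
# numeric_rows is now [(1, 0, 1), (0, 1, 1), (1, 1, 0), (0, 0, 0)]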

 

Use Backpropagation and train to a threshold of .01. (Later you can try PNN if you like.)  The problem trains best with a standard 3 layer network of 3 hidden neurons.  Use a high learning rate (.6) and momentum (.9) since this is a fairly simple "toy" problem.  Use rotational pattern selection, not random.
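
If you want to see what such a run looks like in code, here is a minimal sketch of standard backpropagation (not the package's own implementation) on a 2-3-1 sigmoid network with learning rate .6, momentum .9, rotational pattern selection, and training until the mean squared error falls below .01; treating the threshold as a mean squared error is an assumption:

import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR patterns with ON = 1, OFF = 0: ((Input1, Input2), Output)
patterns = [((1, 0), 1), ((0, 1), 1), ((1, 1), 0), ((0, 0), 0)]

# Weights: 3 hidden neurons x (2 inputs + bias), 1 output x (3 hidden + bias)
w_h = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(3)]
w_o = [random.uniform(-0.5, 0.5) for _ in range(4)]
dw_h = [[0.0] * 3 for _ in range(3)]   # previous weight changes, for momentum
dw_o = [0.0] * 4

lr, momentum = 0.6, 0.9
mse = 1.0
while mse > 0.01:                      # train until the error threshold is reached
    mse = 0.0
    for (x1, x2), target in patterns:  # rotational selection: always the same order
        inp = [x1, x2, 1.0]            # inputs plus a bias of 1
        hid = [sigmoid(sum(w * v for w, v in zip(w_h[j], inp))) for j in range(3)]
        hid_b = hid + [1.0]            # hidden activations plus a bias of 1
        out = sigmoid(sum(w * h for w, h in zip(w_o, hid_b)))

        err = target - out
        mse += err * err
        delta_o = err * out * (1.0 - out)
        delta_h = [hid[j] * (1.0 - hid[j]) * delta_o * w_o[j] for j in range(3)]

        for j in range(4):             # update hidden-to-output weights
            dw_o[j] = lr * delta_o * hid_b[j] + momentum * dw_o[j]
            w_o[j] += dw_o[j]
        for j in range(3):             # update input-to-hidden weights
            for k in range(3):
                dw_h[j][k] = lr * delta_h[j] * inp[k] + momentum * dw_h[j][k]
                w_h[j][k] += dw_h[j][k]
    mse /= len(patterns)

A bare-bones loop like this is not guaranteed to converge from every random starting point; rerun with a different seed if the error stalls.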

 

XOR will train with 2 hidden neurons, but does not always converge to a solution. It is even possible to train with 1 hidden neuron, but a "jump" connection must be added, so use a complex three layer network with every slab connected to each previous slab if you want to try this.
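
To make the "jump" connection concrete, here is a sketch of the forward pass for a 2-1-1 network in which the output neuron receives the raw inputs as well as the single hidden neuron (the weight layout is illustrative, not the package's internal representation):

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward_with_jump(x1, x2, w_hid, w_out):
    # w_hid = [w_x1, w_x2, bias] feeding the lone hidden neuron
    # w_out = [w_from_hidden, w_from_x1, w_from_x2, bias] feeding the output
    hid = sigmoid(w_hid[0] * x1 + w_hid[1] * x2 + w_hid[2])
    return sigmoid(w_out[0] * hid + w_out[1] * x1 + w_out[2] * x2 + w_out[3])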

 

After training completes, execute the trained network on the pattern file you trained with.  Attach the resulting file to the pattern file to form an output file (XOR.OUT).
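
Outside the package, forming such a combined file might look like this sketch (the raw outputs shown are illustrative values of the kind mentioned below, and the file layout is an assumption):

# Attach illustrative raw network answers to the original patterns
# and write them out as a combined file.
rows = [("ON", "OFF", "ON"), ("OFF", "ON", "ON"), ("ON", "ON", "OFF"), ("OFF", "OFF", "OFF")]
raw_answers = [0.97, 0.98, 0.03, 0.02]   # illustrative, not actual results

with open("XOR.OUT", "w") as f:
    f.write("Input1 Input2 Output Network\n")
    for (i1, i2, out), net in zip(rows, raw_answers):
        f.write(f"{i1} {i2} {out} {net:.2f}\n")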

 

Now postprocess the file with rules.  The answers the network gives will not be exactly 0 or 1; you will likely get answers like .02 or .98.  So use two rules on the output column:

 

if output >= .5 then output=1

if output < .5 then output=0
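
Applied outside the package, the two rules amount to this (a Python sketch; in the package itself you apply them with the rules facility):

# Apply the .5 cutoff to a column of raw network outputs.
def apply_cutoff(raw_outputs, cutoff=0.5):
    return [1 if value >= cutoff else 0 for value in raw_outputs]

print(apply_cutoff([0.97, 0.98, 0.03, 0.02]))   # -> [1, 1, 0, 0]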

 

In the future when your outputs are categories, you will want to determine the output cutoff (.5 above) based upon test results for your particular problem, so that you minimize false positives and false negatives.
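 
One simple way to pick that cutoff from test results is to scan candidate values and keep the one that misclassifies the fewest test patterns, as in this sketch (false positives and false negatives can also be weighted differently if one kind of error costs more):

# Pick the cutoff that gives the fewest misclassifications
# (false positives plus false negatives) on a test set.
def best_cutoff(raw_outputs, targets):
    candidates = [c / 100 for c in range(1, 100)]
    best, fewest_errors = 0.5, len(targets) + 1
    for c in candidates:
        errors = sum((1 if raw >= c else 0) != t
                     for raw, t in zip(raw_outputs, targets))
        if errors < fewest_errors:
            best, fewest_errors = c, errors
    return best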

 

Then postprocess the file through symbol translate to change the 0s and 1s in the new fourth column back to OFF and ON.
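
The reverse translation is the mirror image of the earlier one (again a Python sketch of the same idea):

# Map the thresholded 1/0 answers back to ON/OFF labels.
reverse_map = {1: "ON", 0: "OFF"}
answers = [1, 1, 0, 0]
labels = [reverse_map[a] for a in answers]   # -> ["ON", "ON", "OFF", "OFF"]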

 

Finally, view your answers in the datagrid.  You may want to print the output file (XOR.OUT) using the Printouts icon in Problem Export.