Learning Rate

Each time a pattern is presented to the network, the weights leading to an output node are adjusted slightly in the direction that produces a smaller error the next time the same pattern is presented.  The size of each weight change is the learning rate times the error; with a learning rate of 0.5, for example, the weight change is half the error.  A larger learning rate produces larger weight changes and faster learning, but if the learning rate is too large, the weights can oscillate or fail to converge.
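
To make the update concrete, here is a minimal sketch in Python of one learning step for a single linear output node.  It uses the standard delta rule, in which the learning-rate-times-error change described above is additionally scaled by each input (the simpler statement in the text corresponds to an input of 1).  The function and variable names are illustrative only, not taken from any particular package.

    def update_weights(weights, inputs, target, learning_rate=0.5):
        """One learning step for a single linear output node (delta rule)."""
        output = sum(w * x for w, x in zip(weights, inputs))
        error = target - output
        # Each weight moves in the direction that reduces the error,
        # by (learning rate x error), scaled by its input.
        return [w + learning_rate * error * x for w, x in zip(weights, inputs)]

    weights = [0.0, 0.0]
    for _ in range(20):  # repeated presentations of the same pattern
        weights = update_weights(weights, inputs=[1.0, 0.5], target=1.0)
    print(weights)  # the output sum(w * x) approaches the target of 1.0

With a learning rate of 0.5 the error in this toy example shrinks on every presentation.  Raising the learning rate enough (here, past about 1.6 for these particular inputs) instead makes the error oscillate and grow, illustrating the nonconvergence warned about above.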