Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns


1 Computer Engineering, Reliance Ind. Ltd

2 ICIS, Dr. B.R.Ambedkar University


The purpose of this study is to analyze the performance of the backpropagation algorithm in feed-forward neural networks as the training patterns are varied and a second momentum term is introduced. The analysis is conducted on 250 different three-letter words composed of lowercase English letters. These words are presented to two vertical segmentation programs, written in MATLAB and based on portions (1/2 and 2/3) of the average word height, which segment them into characters. After binarization, these characters are combined to form the training patterns for the neural network. The network was trained by adjusting the connection strengths on each iteration with the second momentum term included; this term makes the adaptation of the connection strengths faster and more efficient. The conjugate gradient descent of each presented training pattern was computed to locate the error minimum for that pattern. The network learned by being presented each of the 5 samples (a final input set of 26 × 5 = 130 letters) 100 times; the resulting 500 trials show a significant difference between the two momentum variants on the data sets presented to the network. The results indicate that segmentation based on 2/3 of the height yields better segmentation, and that learning with the newly introduced momentum term made the network's convergence faster and its recognition more accurate.
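The vertical segmentation step can be sketched as follows. This is a minimal illustration, not the paper's MATLAB program: it assumes that a column of the binarized word image is a cut point when the upper portion of the image (1/2 or 2/3 of the height, per the two programs) contains no ink in that column; the exact criterion used in the paper is not stated in the abstract.

```python
import numpy as np

def vertical_segment(binary_word, height_fraction=2/3):
    """Split a binarized word image into character segments.

    binary_word: 2-D array, 1 = ink, 0 = background.
    height_fraction: portion of the image height (1/2 or 2/3) that is
    scanned for ink-free columns -- an illustrative assumption, since the
    abstract only says the two programs are based on these portions of
    the average word height.
    Returns a list of (start_col, end_col) pairs, end exclusive.
    """
    band_rows = int(round(height_fraction * binary_word.shape[0]))
    band = binary_word[:band_rows, :]          # upper portion only
    empty = band.sum(axis=0) == 0              # candidate cut columns

    segments, start = [], None
    for x, is_empty in enumerate(empty):
        if not is_empty and start is None:     # segment begins
            start = x
        elif is_empty and start is not None:   # segment ends at a cut
            segments.append((start, x))
            start = None
    if start is not None:                      # word ends mid-segment
        segments.append((start, len(empty)))
    return segments

# Toy word image: two 'characters' separated by an empty column 3.
word = np.zeros((6, 7), dtype=int)
word[1:5, 0:3] = 1
word[1:5, 4:7] = 1
print(vertical_segment(word))  # → [(0, 3), (4, 7)]
```

Each returned column range can then be cropped out and binarized into a fixed-size pattern for the network.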
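The weight update with a second momentum term can be sketched as below. The abstract does not give the exact update rule, so this is an assumed form: the usual momentum update extended with a term proportional to the weight change from two iterations back. The coefficients `eta`, `alpha`, and `beta` and the quadratic test error are illustrative choices, not values from the paper.

```python
import numpy as np

def train_with_two_momenta(grad, w0, eta=0.1, alpha=0.5, beta=0.2, steps=200):
    """Gradient descent with first and second momentum terms (assumed form):

        dw(t) = -eta * grad(w) + alpha * dw(t-1) + beta * dw(t-2)

    grad: function returning the error gradient at the current weights.
    alpha weights the previous change, beta the change before that;
    beta = 0 recovers ordinary momentum backpropagation.
    """
    w = np.asarray(w0, dtype=float)
    dw_prev = np.zeros_like(w)    # dw(t-1)
    dw_prev2 = np.zeros_like(w)   # dw(t-2)
    for _ in range(steps):
        dw = -eta * grad(w) + alpha * dw_prev + beta * dw_prev2
        w = w + dw
        dw_prev2, dw_prev = dw_prev, dw
    return w

# Toy check: minimize E(w) = ||w||^2 / 2, whose gradient is w itself;
# the weights should converge toward the minimum at the origin.
w_final = train_with_two_momenta(lambda w: w, w0=[2.0, -1.5])
print(w_final)
```

In a full backpropagation implementation, `grad` would be the error gradient computed by the backward pass for each presented training pattern; the extra `beta` term lets the previous two weight changes carry the update across shallow regions of the error surface.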