Vertical Segmentation Technique Based on Image Density and Role of Modified Gradient Descent Technique of Neural Networks For Handwritten Character Recognition
Published: 2012
Authors: Vijaypal Singh Dhaka, Mukta Rao
Abstract
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feedforward neural networks. The analysis is conducted on 250 different three-letter lowercase English words. These words are presented to two vertical segmentation programs, implemented in MATLAB and based on portions (1/2 and 2/3) of the average word height, which split them into characters. After binarization, the characters are assembled into training patterns for the neural network. The network was trained by adjusting the connection strengths on every iteration with a newly introduced second momentum term, which makes the weight updates faster and more efficient. The conjugate gradient descent of each presented training pattern was computed to identify the error minimum for that pattern. The network was trained on 5 samples of each letter (a final input set of 26 letters (a-z) x 5 = 130 characters), each sample presented 100 times; the resulting 500 trials show a significant difference between the two momentum variants on the presented data sets. The results indicate that segmentation based on 2/3 of the average height yields better character boundaries, and that learning with the newly introduced momentum term is more convergent and accurate.
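The abstract's density-based vertical segmentation can be sketched as follows. The paper's exact criterion is not given here, so this is a minimal illustration under one plausible reading: column ink density is computed over only the top 2/3 of the image height, and cuts are made where a run of columns has zero density (the threshold and fraction are assumptions, not the authors' published values).

```python
import numpy as np

def vertical_segments(binary_img, height_fraction=2/3):
    """Split a binarized word image (1 = ink, 0 = background) into
    character slices by cutting at zero-density column runs.

    Density is measured over only the top `height_fraction` of the
    rows, a sketch of a 2/3-height segmentation criterion.
    """
    rows = max(1, int(binary_img.shape[0] * height_fraction))
    density = binary_img[:rows].sum(axis=0)  # ink pixels per column
    segments, start = [], None
    for col, d in enumerate(density):
        if d > 0 and start is None:
            start = col                       # a character begins
        elif d == 0 and start is not None:
            segments.append(binary_img[:, start:col])
            start = None                      # a character ends
    if start is not None:                     # character touches right edge
        segments.append(binary_img[:, start:])
    return segments

# Tiny synthetic word image: two ink blobs separated by blank columns.
img = np.zeros((6, 10), dtype=int)
img[1:5, 1:3] = 1
img[1:5, 6:9] = 1
segs = vertical_segments(img)   # two character slices
```

A real pipeline would also merge over-segmented fragments and normalize each slice before building training patterns.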
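The "second momentum term" in training can be illustrated with a common formulation in which the weight change depends on the two previous changes; the paper's exact update rule is not reproduced here, so the formula and the values of `lr`, `alpha`, and `beta` below are illustrative assumptions:

```python
import numpy as np

def update_weights(w, grad, prev_delta, prev_delta2, lr=0.05, alpha=0.6, beta=0.2):
    """One gradient-descent step with a second momentum term:

        delta(t) = -lr * grad + alpha * delta(t-1) + beta * delta(t-2)

    alpha weights the previous step, beta (the second momentum term)
    the step before that. All coefficients are illustrative, not the
    paper's values.
    """
    delta = -lr * grad + alpha * prev_delta + beta * prev_delta2
    return w + delta, delta, prev_delta

# Demo: minimize E(w) = w^2, whose gradient is 2w.
w = np.array([5.0])
d1 = d2 = np.zeros_like(w)
for _ in range(200):
    grad = 2 * w
    w, d1, d2 = update_weights(w, grad, d1, d2)
```

The extra history term smooths successive updates in consistent gradient directions, which is the effect the abstract attributes to the new momentum term; with these coefficients the iteration above converges toward the minimum at w = 0.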