Performance Analysis of Gradient Descent with Momentum & Adaptive Back Propagation
Published: 2011
Authors: Rajesh Lavania, Jawahar Thakur, Manu Pratap Singh
Abstract
In this paper we analyze the performance of multilayer feedforward neural networks trained with gradient descent with momentum and adaptive back propagation (TRAINGDX) and with BFGS quasi-Newton back propagation (TRAINBFG) for handwritten Hindi characters (SWARS). For this analysis, five handwritten Hindi SWAR characters are collected from different people and stored as images. A MATLAB function is used to determine the pixel densities of these scanned images after partitioning each image into 16 portions. The 16 densities of each character are used as the input pattern for two different neural network architectures. The two learning rules, both variants of the back propagation learning algorithm, are used to train these neural networks. The performance of the two networks is analyzed with respect to convergence and, in the non-convergent cases, with respect to the trends of the error. Some interesting and important observations emerge from these error trends in the case of non-convergence. The local minima problem inherited from the back propagation algorithm also affects both of the proposed learning algorithms severely.
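As a concrete illustration of the pipeline the abstract describes, the following is a minimal MATLAB sketch of the 16-zone density feature extraction followed by training with the two learning rules. The 4 x 4 grid, the image file name, the hidden-layer size, the transfer functions, and the training parameters are all illustrative assumptions; the paper does not specify them, and the placeholder patterns must be replaced with the real collected samples.

```matlab
% Sketch of 16-zone density features + TRAINGDX/TRAINBFG training.
% Grid size, file name, layer sizes, and goals are assumptions.

% --- Feature extraction: 16 zone densities from a scanned character ---
img = imread('swar_sample.png');        % hypothetical scanned image
bw  = im2bw(img, graythresh(img));      % binarize with Otsu's threshold
bw  = ~bw;                              % assume ink = 1, background = 0

[rows, cols] = size(bw);
rStep = floor(rows / 4);
cStep = floor(cols / 4);
features = zeros(16, 1);
k = 1;
for i = 1:4
    for j = 1:4
        block = bw((i-1)*rStep+1 : i*rStep, (j-1)*cStep+1 : j*cStep);
        features(k) = sum(block(:)) / numel(block);  % zone pixel density
        k = k + 1;
    end
end

% --- Training: the same patterns under both learning rules ---
% Placeholders only: in practice P holds one 16-density column per
% collected sample and T the corresponding target code per character.
P = repmat(features, 1, 5);   % placeholder 16-by-5 pattern matrix
T = eye(5);                   % placeholder one-hot targets for 5 SWARS

net1 = newff(minmax(P), [10 5], {'tansig','purelin'}, 'traingdx');
net2 = newff(minmax(P), [10 5], {'tansig','purelin'}, 'trainbfg');

net1.trainParam.epochs = 1000;  net1.trainParam.goal = 1e-3;
net2.trainParam.epochs = 1000;  net2.trainParam.goal = 1e-3;

net1 = train(net1, P, T);   % gradient descent w/ momentum + adaptive LR
net2 = train(net2, P, T);   % BFGS quasi-Newton back propagation
```

The old-style `newff` call shown here matches the Neural Network Toolbox interface contemporary with the paper; convergence behavior and error trends for the two networks can then be compared from the training records returned by `train`.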
Keywords: Back Propagation Algorithm, Multilayer Neural Networks, Gradient Descent, Pattern Recognition.