Volume 14, Issue 1, Summer and Autumn 2014, Pages 1-199


Learning rate for the back propagation algorithm based on a modified secant equation

Dr. Khalil K. Abbo; Marwa S. Jaborry

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2014, Volume 14, Issue 1, Pages 1-11
DOI: 10.33899/iqjoss.2014.89207

The classical back propagation method (CBP) is the simplest algorithm for training feed-forward neural networks. It uses the steepest descent direction with a fixed learning rate to minimize the error function E; because the learning rate is held fixed at every iteration, the CBP algorithm converges slowly. In this paper we suggest a new formula for computing the learning rate, based on a modified secant equation, to accelerate the convergence of the CBP algorithm. Simulation results are presented and compared with other training algorithms.
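The abstract does not reproduce the paper's modified secant formula, so the sketch below is only illustrative: it contrasts steepest descent with a fixed learning rate (as in CBP) against an adaptive learning rate derived from the classical, unmodified secant equation (the Barzilai-Borwein step). The toy quadratic error function and all parameter values are assumptions for illustration, not taken from the paper.

```python
# Illustrative sketch only: the paper's modified secant formula is not given in
# the abstract, so the adaptive step here is the classical Barzilai-Borwein rate
# eta_k = (s^T s) / (s^T y), with s = w_k - w_{k-1} and y = grad_k - grad_{k-1}.
import numpy as np

A = np.diag([1.0, 10.0, 100.0])          # toy ill-conditioned quadratic (assumed)

def error(w):
    """Stand-in for the network error function E(w)."""
    return 0.5 * w @ A @ w

def grad(w):
    return A @ w

def train_fixed(w, eta=0.005, iters=200):
    """CBP-style update: steepest descent with a fixed learning rate."""
    for _ in range(iters):
        w = w - eta * grad(w)
    return w

def train_secant(w, eta0=0.005, iters=200):
    """Steepest descent with a secant-equation-based (Barzilai-Borwein) rate."""
    g_prev, w_prev = grad(w), w.copy()
    w = w - eta0 * g_prev                 # first step still uses the fixed rate
    for _ in range(iters - 1):
        g = grad(w)
        s, y = w - w_prev, g - g_prev
        eta = (s @ s) / (s @ y) if abs(s @ y) > 1e-12 else eta0
        w_prev, g_prev = w.copy(), g
        w = w - eta * g
    return w

w0 = np.array([1.0, 1.0, 1.0])
print("fixed learning rate :", error(train_fixed(w0.copy())))
print("secant-based rate   :", error(train_secant(w0.copy())))
```

On this toy problem the adaptive step reaches a much smaller error in the same number of iterations, which is the kind of acceleration over fixed-rate CBP that the paper targets.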

A New Sufficient Descent Conjugate Gradient Method for Nonlinear Optimization

Dr. Basim A. Hassan; Omer M. Esmaeel

IRAQI JOURNAL OF STATISTICAL SCIENCES, 2014, Volume 14, Issue 1, Pages 12-24
DOI: 10.33899/iqjoss.2014.89208

In this paper, a new conjugate gradient method based on an exact step size, which produces a sufficient descent search direction at every iteration, is introduced. We prove its global convergence and give numerical results that illustrate its efficiency in comparison with the Polak-Ribière method.
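The abstract does not state the paper's own beta formula or its exact step size, so the following sketch is only a generic illustration of the two ideas it mentions: a Polak-Ribière-type conjugate gradient iteration and a sufficient descent condition d_k^T g_k <= -c ||g_k||^2 enforced at every iteration. The Rosenbrock test function, the backtracking line search, and all constants are assumptions, not the paper's method.

```python
# Illustrative sketch: Polak-Ribiere (PR+) conjugate gradient with a
# sufficient-descent safeguard. The paper's actual beta and exact step size
# are not reproduced in the abstract; this uses standard, assumed choices.
import numpy as np

def f(x):  # Rosenbrock test function (assumed example problem)
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad_f(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

def backtracking(x, d, g, alpha=1.0, rho=0.5, c1=1e-4):
    """Armijo backtracking line search along the descent direction d."""
    while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
        alpha *= rho
    return alpha

def cg_sufficient_descent(x, iters=500, c=1e-4, tol=1e-8):
    g = grad_f(x)
    d = -g                                            # first direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(x, d, g)
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ formula
        d_new = -g_new + beta * d
        if d_new @ g_new > -c * (g_new @ g_new):        # sufficient descent test
            d_new = -g_new                              # restart if it fails
        x, g, d = x_new, g_new, d_new
    return x

x_star = cg_sufficient_descent(np.array([-1.2, 1.0]))
print("final point:", x_star, " f =", f(x_star))
```

The restart step is what guarantees a sufficient descent direction at every iteration; a method such as the one proposed in the paper builds this property into the search direction itself rather than enforcing it by restarts.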