Two-point Stepsize Gradient Algorithms for Unconstrained Optimization
IRAQI JOURNAL OF STATISTICAL SCIENCES,
2007, Volume 7, Issue 1, Pages 1-26
Abstract
In this paper we investigate three algorithms. In the first, we derive a new optimal-stepsize gradient algorithm that is preferable to the classical steepest-descent (SD) algorithm both in theory and in practical computation. In the second, we derive and implement a new formula for the non-quadratic model with a new …. In the third, we construct a hybrid algorithm that combines the three different step sizes above.
Our numerical results, obtained on ten different nonlinear test functions of various dimensions, are generally promising.
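The abstract does not give the paper's exact formulas, but the two-point stepsize named in the title is conventionally the Barzilai–Borwein rule, in which the stepsize is built from the last two iterates and gradients rather than from an exact line search. As a minimal sketch (function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def bb_gradient(f_grad, x0, max_iter=200, tol=1e-8, alpha0=1e-3):
    """Gradient descent with the two-point (Barzilai-Borwein) stepsize.

    BB1 rule: alpha_k = (s^T s) / (s^T y), where
      s = x_k - x_{k-1}  (iterate difference)
      y = g_k - g_{k-1}  (gradient difference)
    """
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    alpha = alpha0                      # small safeguarded first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = f_grad(x_new)
        s = x_new - x
        y = g_new - g
        sy = s @ y
        # Fall back to the safeguard step when s^T y is not safely positive
        alpha = (s @ s) / sy if sy > 1e-16 else alpha0
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x, minimizer at 0
A = np.diag([1.0, 10.0])
x_star = bb_gradient(lambda x: A @ x, [5.0, 5.0])
```

On ill-conditioned quadratics such as this one, the BB stepsize typically needs far fewer iterations than exact-line-search SD, which is the comparison the abstract alludes to.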