New Projection Matrix for the Standard Conjugate Gradient Method

In this paper, we derive a new algorithm for the conjugate gradient method based on a projection matrix. The algorithm satisfies the sufficient descent condition and is globally convergent. Numerical comparisons with a standard conjugate gradient algorithm show that the new algorithm is very effective in terms of the number of iterations and the number of function evaluations.


Let us consider the nonlinear unconstrained optimization problem

    \min_{x \in \mathbb{R}^n} f(x),   (1)

where f is smooth and its gradient g is available.
Conjugate gradient methods are efficient for solving (1), especially when the dimension n is large. The iterates of conjugate gradient methods for solving (1) are obtained by

    x_{k+1} = x_k + \alpha_k d_k,   (2)

where \alpha_k is a steplength, which is computed by carrying out some line search, and d_k is the search direction, defined by

    d_{k+1} = -g_{k+1} + \beta_k d_k,   d_0 = -g_0,   (3)

where \beta_k is a scalar. Some well-known conjugate gradient methods include the Hestenes-Stiefel (HS) method, the Fletcher-Reeves (FR) method, the Polak-Ribière-Polyak (PRP) method, the Dai-Yuan (DY) method, and the method of Al-Bayati and Al-Assady. With y_k = g_{k+1} - g_k, the parameters \beta_k of these methods are specified as follows:

    \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}   (Hestenes-Stiefel, 1952),

    \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}   (Dai-Yuan, 1999).

The stepsize \alpha_k is usually chosen to satisfy certain line search conditions. The weak Wolfe conditions require that

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k,
    g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,

while the strong Wolfe conditions replace the second inequality by

    |g(x_k + \alpha_k d_k)^T d_k| \le -\sigma g_k^T d_k,

where 0 < \delta < \sigma < 1.
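The general CG iteration (2)-(3) with a Wolfe line search can be sketched as follows. This is an illustrative, generic implementation of the classical framework, not the paper's projection-matrix method; the bisection line search and restart safeguard are standard textbook devices added here for robustness.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, delta=1e-4, sigma=0.4, max_iter=50):
    """Bisection search for a steplength satisfying the weak Wolfe conditions."""
    lo, hi = 0.0, np.inf
    alpha = 1.0
    gd = grad(x) @ d                      # directional derivative g_k^T d_k
    for _ in range(max_iter):
        if f(x + alpha * d) > f(x) + delta * alpha * gd:
            hi = alpha                    # Armijo condition fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * gd:
            lo = alpha                    # curvature condition fails: grow
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            break
    return alpha

def cg_minimize(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=500):
    """Nonlinear CG: x_{k+1} = x_k + alpha_k d_k, d_{k+1} = -g_{k+1} + beta_k d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "FR":             # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PRP":          # Polak-Ribiere-Polyak
            beta = (g_new @ y) / (g @ g)
        elif beta_rule == "HS":           # Hestenes-Stiefel
            beta = (g_new @ y) / (d @ y)
        else:                             # Dai-Yuan
            beta = (g_new @ g_new) / (d @ y)
        d = -g_new + beta * d
        if g_new @ d >= 0.0:              # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic, any of the four beta rules drives the gradient norm below the tolerance; the restart safeguard keeps every search direction a descent direction even with an inexact line search.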

2. New proposed method
In this paper we obtain a new projection matrix from a three-term CG algorithm as follows. If we use the exact line search, then our method (12) reduces to the nonlinear conjugate gradient method (3).

The New Algorithm:
Step 1: Given an initial point x_0, set d_0 = -g_0 and k = 0.
Step 2: If ||g_k|| = 0, stop.
Step 3: Find \alpha_k > 0 satisfying the strong Wolfe conditions.
Step 4: Let x_{k+1} = x_k + \alpha_k d_k.
Step 5: Compute the search direction d_{k+1} by (13).
Step 6: Set k := k + 1 and go to Step 2.

Global Convergence Properties of the New Proposed Algorithm:
In this section we study the convergence of the new proposed method under the following assumption.

Assumption (A): (i) The level set S = {x \in \mathbb{R}^n : f(x) \le f(x_0)} is bounded. (ii) In some neighborhood N of S, f is continuously differentiable and its gradient is Lipschitz continuous, i.e., there exists a constant L > 0 such that ||g(x) - g(y)|| \le L ||x - y|| for all x, y \in N.

From Assumption (A) it follows that there exists a positive constant \gamma > 0 such that ||g(x)|| \le \gamma for all x \in S.

Lemma (1). Suppose that Assumption (A) holds and consider any conjugate gradient method of the form (2) and (3), where d_k is a descent direction and \alpha_k satisfies the Wolfe conditions. Then the Zoutendijk condition holds:

    \sum_{k \ge 0} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty.
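Lemma (1) is the classical Zoutendijk condition. As a quick numerical illustration (not part of the paper's analysis), the sketch below runs steepest descent, d_k = -g_k, with an exact line search on a convex quadratic; in that case each term of the series reduces to ||g_k||^2, and the partial sums stay bounded while the gradient vanishes:

```python
import numpy as np

# Zoutendijk condition: sum_k (g_k^T d_k)^2 / ||d_k||^2 < infinity.
# For steepest descent (d_k = -g_k) each term equals ||g_k||^2.
A = np.diag([1.0, 10.0])           # f(x) = 0.5 x^T A x, so g(x) = A x
x = np.array([9.0, 1.0])
total = 0.0
for _ in range(200):
    g = A @ x
    if g @ g == 0.0:               # exact minimizer reached
        break
    d = -g
    total += (g @ d) ** 2 / (d @ d)
    alpha = (g @ g) / (g @ A @ g)  # exact minimizer of f along d
    x = x + alpha * d
print("partial Zoutendijk sum:", total)
```

The sum settles to a finite value even though 200 terms are accumulated, which is exactly what the lemma asserts for any descent method under the Wolfe conditions.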

Lemma (2):
Suppose Assumption (A) holds, let the sequence {x_k} be generated by (2), and let the step length \alpha_k satisfy the Wolfe conditions. Then the direction defined in (13) satisfies the sufficient descent condition.

Proof: Multiplying both sides of (13) by g_{k+1}^T yields the sufficient descent inequality, and by using Lemma (1) we get

    \lim_{k \to \infty} \|g_k\| = 0.
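Although equation (13) itself is not reproduced here, the mechanism behind the lemma can be illustrated generically (the matrix below is an illustrative assumption, not the paper's specific projection matrix): any direction of the form d = -P g with a symmetric positive definite matrix P satisfies the sufficient descent condition g^T d \le -c ||g||^2 with c = \lambda_{min}(P), since g^T P g \ge \lambda_{min}(P) ||g||^2.

```python
import numpy as np

def sufficient_descent_constant(P):
    """Smallest eigenvalue of a symmetric positive definite P: for d = -P g,
    g^T d = -g^T P g <= -lambda_min(P) * ||g||^2."""
    return np.linalg.eigvalsh(P)[0]

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
P = M @ M.T + np.eye(4)               # symmetric positive definite test matrix
c = sufficient_descent_constant(P)
assert c > 0.0
for _ in range(100):                  # spot-check the inequality on random gradients
    g = rng.standard_normal(4)
    d = -P @ g
    assert g @ d <= -c * (g @ g) + 1e-10
print("sufficient descent holds with c =", c)
```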

5. Numerical experiments
In this section, we test the feasibility and effectiveness of Algorithm 2.1. The algorithms are implemented in FORTRAN, and all the algorithms in this paper use the same line search strategy. All the results are obtained on a Pentium 4 computer, and the same stopping criterion is used in all cases. The starting point is x_0 = (1, 2, 2, 2, ...)^T.

Wolfe Function: x_0 = (-1, ...)^T.

6. Rosenbrock Function: The three-term restart conjugate gradient method is a well-known three-term conjugate gradient method in which d_k is a descent direction and \alpha_k is obtained by the strong Wolfe line search (Dai, Y.H., et al., 1999). The comparative performance of all of these algorithms is evaluated by considering the number of function evaluations (NOF) and the number of iterations (NOI).
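For context on NOI/NOF counting (this is not the paper's FORTRAN experiment, and the dimension and starting point are illustrative assumptions), SciPy's built-in nonlinear CG reports both counters on the Rosenbrock test function:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Run nonlinear CG (Polak-Ribiere variant in SciPy) on the 4-dimensional
# Rosenbrock function and report iterations (NOI) and function evaluations (NOF).
x0 = np.full(4, -1.0)                  # illustrative starting point
res = minimize(rosen, x0, jac=rosen_der, method="CG", options={"gtol": 1e-6})
print("NOI =", res.nit, "NOF =", res.nfev)
```

Comparing these two counters across algorithms on a suite of test functions is the standard way to build tables like Table (1).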

The 6th Scientific Conference of the College of Computer Sciences & Mathematics

All computations are carried out in double-precision arithmetic. Table (1) compares our new algorithm with the standard three-term CG algorithm over all the calculations and for different dimensions.

Table (1): Comparison of our new algorithm with the standard FR CG algorithm.