Generalized Implicit-Update in Multi-Step QN Methods

Abstract:
In this paper, we generalize the implicit update of the quasi-Newton (QN) condition. We investigate four-, five- and n-step update algorithms, apply the four- and five-step cases numerically, and compare them with other QN algorithms. The numerical results show that the proposed algorithm outperforms the others.


Introduction:
A widely used class of methods is the quasi-Newton (QN) methods, in which an approximation to the Hessian (or to its inverse) is updated at each iteration while only gradients are supplied. The basic requirement for the updating formula is the QN condition (secant condition), i.e.

B_{k+1} y_k = s_k,

where s_k and y_k are defined as

s_k = x_{k+1} - x_k,   y_k = g_{k+1} - g_k,

x_k is the point at iteration k, g_k is the gradient at iteration k, and B_{k+1} is the inverse-Hessian approximation. We consider QN methods for the unconstrained optimization problem min f(x), x ∈ R^n, where the basic idea behind the QN formulas is to update B_{k+1} from B_k in a computationally cheap way while ensuring the secant condition.
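As a concrete check of the secant condition, the following sketch verifies that for a quadratic objective the exact inverse Hessian maps y_k back to s_k. The test function, its coefficients, and the two iterates are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Hypothetical quadratic test problem f(x) = 0.5 x^T A x - b^T x, so the
# exact inverse Hessian is A^{-1} (A, b and the iterates are illustrative).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

def grad(x):
    return A @ x - b            # gradient g(x) of the quadratic objective

# Two hypothetical successive iterates x_k and x_{k+1}
x_k  = np.array([0.0, 0.0])
x_k1 = np.array([0.5, 0.25])

s_k = x_k1 - x_k                # s_k = x_{k+1} - x_k
y_k = grad(x_k1) - grad(x_k)    # y_k = g_{k+1} - g_k

# Secant condition: the updated inverse Hessian must map y_k back to s_k.
B_true = np.linalg.inv(A)       # for a quadratic, A^{-1} satisfies it exactly
print(np.allclose(B_true @ y_k, s_k))  # True
```

For a quadratic, y_k = A s_k, so any matrix satisfying the secant condition acts like A^{-1} along the step just taken; QN updates enforce exactly this property on the approximation.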

One Step Method (BFGS Method):
The BFGS method is one of the most efficient QN methods for unconstrained optimization. The algorithm was proposed independently by Broyden, Fletcher, Goldfarb and Shanno in 1970. The BFGS method has a search direction computed by

d_k = -H_k g_k,   (1)

where H_k is a symmetric and positive definite matrix at the k-th iteration. The next iterate is given by

x_{k+1} = x_k + α_k d_k,   (2)

where α_k is the step size, which satisfies the strong Wolfe conditions [Ahmed, 2005]. The approximation matrix is updated by

H_{k+1} = H_k + (1 + (y_k^T H_k y_k)/(s_k^T y_k)) (s_k s_k^T)/(s_k^T y_k) - (s_k y_k^T H_k + H_k y_k s_k^T)/(s_k^T y_k).   (3)

We call eq. (3) the one-step method; see [Dai, 2002] and [Nocedal et al., 1987].
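The update (3) can be written equivalently in the product form H_{k+1} = (I - ρ s_k y_k^T) H_k (I - ρ y_k s_k^T) + ρ s_k s_k^T with ρ = 1/(y_k^T s_k). The sketch below uses this standard form with illustrative data only, and checks that the updated matrix satisfies the secant condition and stays symmetric:

```python
import numpy as np

def bfgs_update(H, s, y):
    """Inverse-Hessian BFGS update in product form, algebraically equal to
    eq. (3). Requires the curvature condition y^T s > 0, which the strong
    Wolfe line search guarantees."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative data only: check that H_{k+1} satisfies the secant condition.
H = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.0])
H1 = bfgs_update(H, s, y)
print(np.allclose(H1 @ y, s))   # True: H_{k+1} y_k = s_k
print(np.allclose(H1, H1.T))    # True: symmetry is preserved
```

The product form is usually preferred in implementations because it makes symmetry and positive definiteness of H_{k+1} (given y^T s > 0) explicit.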
The BFGS (one-step) algorithm:
Step 1: Set k = 1; choose an initial point x_1 and set H_1 = I.
Step 2: Compute the search direction d_k = -H_k g_k and the next iterate x_{k+1} = x_k + α_k d_k, where α_k is obtained by the line search.
Step 3: Compute s_k and y_k.
Step 4: If ||g_{k+1}|| < ε then stop, else continue.
Step 5: Update H_k by the correction matrix (3) to get H_{k+1}.
Step 6: Set k = k+1 and go to Step 2.

The Implicit Update Method:
Ford (2001) developed a two-step implicit algorithm, denoted the two-step QN method, which is similar to the standard (one-step) method in every respect except that the Hessian approximation H_{k+1}, which in the standard method is constrained to satisfy the relation

H_{k+1} y_k = s_k,   (4)

must in the two-step method satisfy a modified relation of the form

H_{k+1} w_k = r_k,   (6)

where the vectors r_k and w_k, defined in (7) and (8), combine the step and gradient-difference data of the two most recent iterations, weighted by a positive scalar δ. The relations (4) and (6) arise from interpolating the iterates and their gradients,

x(τ_j) = x_{i+j-1},   j = 0, 1,   (9)
g(τ_j) = g(x_{i+j-1}),   j = 0, 1,   (10)

and a suitable matrix B_{k+1} satisfying (4) or (6) may then be obtained by using the BFGS formula [Tharmlikit, 2001].

For general s_k, [Ford and Moghrabi, 1993; 1994] proved that two-step iterations can be alternated with standard one-step iterations; substituting the values of r_k and w_k from (7) and (8) into (6) then yields the implicit two-step update [Ford and Moghrabi, 1996].

Iraqi Journal of Statistical Science (12) 2007
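A minimal sketch of the modified relation (6): the BFGS formula is applied to a combined pair (r_k, w_k) in place of (s_k, y_k). Since the exact definitions in (7)-(8) are not recoverable from this copy, a δ-weighted combination of the last two (s, y) pairs is assumed here, with illustrative data:

```python
import numpy as np

def bfgs_like_update(H, r, w):
    # BFGS formula applied to the combined pair (r_k, w_k) instead of (s_k, y_k)
    rho = 1.0 / (w @ r)
    I = np.eye(len(r))
    V = I - rho * np.outer(r, w)
    return V @ H @ V.T + rho * np.outer(r, r)

# Hypothetical data from two consecutive iterations
s_km1, y_km1 = np.array([0.4, 0.1]), np.array([0.9, 0.3])
s_k,   y_k   = np.array([1.0, 0.5]), np.array([2.0, 1.0])
delta = 0.3   # positive scalar; its defining formula is not given in this copy

r_k = s_k + delta * s_km1     # assumed form of the combined step vector (7)
w_k = y_k + delta * y_km1     # assumed form of the combined gradient difference (8)

H1 = bfgs_like_update(np.eye(2), r_k, w_k)
print(np.allclose(H1 @ w_k, r_k))  # True: modified relation H_{k+1} w_k = r_k
```

By construction, the update enforces H_{k+1} w_k = r_k exactly, so the two-step method satisfies (6) in the same mechanical way that standard BFGS satisfies (4).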

The 3-Step Implicit Update Algorithm:
The algorithm:
Step 1: Set k = 1; choose an initial point x_1 and set H_1 = I.
Step 2: Compute the search direction d_k = -H_k g_k and the next iterate x_{k+1} = x_k + α_k d_k.
Step 3: Compute s_k and y_k.
Step 4: If ||g_{k+1}|| < ε then stop, else continue.
Step 5: If k = 1 then set r_k = s_k and w_k = y_k, i.e. use the standard BFGS formula; otherwise form the three-step vectors r_k and w_k.
Step 6: Update H_k by using the BFGS formula with (r_k, w_k) in place of (s_k, y_k) to get H_{k+1}.
Step 7: Set k = k+1 and go to Step 2.

Generalized Implicit-Update in QN Methods:
Here we extend the three-step update to a four-step update by using four terms, and the four-step update to a five-step update by using five terms; we then generalize the process to n terms as follows:

N-Step Implicit Update Forms:
We establish the n-step implicit form by mathematical induction: assume that the result extends to the (n-1)-step form as: ...

Step 5: If k = 1 then set r_k = s_k and w_k = y_k, i.e. use the standard BFGS formula; otherwise form the n-step vectors r_k and w_k.
Step 6: Update H_k by using the BFGS formula with (r_k, w_k) to get H_{k+1}.
Step 7: Set k = k+1 and go to Step 2.
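The construction of the n-step vectors in Step 5 can be sketched as follows. The paper's exact weighting formulas are not recoverable from this copy, so a δ-weighted linear combination of the n most recent (s, y) pairs is assumed; the data are illustrative:

```python
import numpy as np

def n_step_vectors(s_hist, y_hist, deltas):
    """Form the combined n-step vectors (r_k, w_k) from the n most recent
    (s, y) pairs. The weights `deltas` stand in for the paper's delta
    parameters; a linear combination is an assumption of this sketch."""
    r = s_hist[-1].copy()                     # start from the latest s_k
    w = y_hist[-1].copy()                     # and the latest y_k
    for j, d in enumerate(deltas, start=1):   # add j-steps-back contributions
        r += d * s_hist[-1 - j]
        w += d * y_hist[-1 - j]
    return r, w

# A 3-step example (n = 3) with hypothetical step data
s_hist = [np.array([0.2, 0.1]), np.array([0.4, 0.2]), np.array([1.0, 0.5])]
y_hist = [np.array([0.5, 0.2]), np.array([0.8, 0.4]), np.array([2.0, 1.0])]
r, w = n_step_vectors(s_hist, y_hist, deltas=[0.3, 0.1])
print(r)  # s_k + 0.3*s_{k-1} + 0.1*s_{k-2}
```

With an empty `deltas` list the function returns (s_k, y_k) unchanged, recovering the standard one-step BFGS pair used in Step 5 when k = 1.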

Numerical Results:
In order to assess the performance of the new implicit n-step QN methods for the cases n = 4 and n = 5, we tested these cases on five nonlinear test functions of dimension 100.

All results were obtained on a Pentium 3. All programs are written in FORTRAN 90, and in all cases the stopping criterion is ||g_{k+1}|| < ε. The comparative performance of all methods is evaluated by considering NOF and NOI, where NOF is the number of function evaluations and NOI is the number of iterations.
All the methods in this study use the same exact line search strategy, namely the quadratic interpolation technique adapted directly from [Bunday, 1984].
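The quadratic-interpolation idea can be sketched generically (this is a standard version of the technique, not the exact routine from [Bunday, 1984]): fit a parabola to φ(0), φ'(0) and φ(α) along the search direction and step to its minimizer.

```python
def quadratic_interp_step(phi0, dphi0, alpha, phi_alpha):
    """One quadratic-interpolation step for the line search phi(a) = f(x + a d):
    fit a parabola to phi(0), phi'(0) and phi(alpha), return its minimizer.
    A generic sketch of the technique, assuming dphi0 < 0 (descent direction)
    and a convex fit (positive denominator)."""
    denom = 2.0 * (phi_alpha - phi0 - dphi0 * alpha)
    return -dphi0 * alpha**2 / denom

# On an exactly quadratic phi the interpolated step is the exact minimizer:
# phi(a) = (a - 2)^2  =>  phi(0) = 4, phi'(0) = -4, phi(1) = 1
alpha_star = quadratic_interp_step(4.0, -4.0, 1.0, 1.0)
print(alpha_star)  # 2.0
```

In practice the step is iterated (re-fitting around the new trial point) until the line-search accuracy needed for an "exact" search is reached, which is why a quadratic interpolation routine can serve as the exact line search used by all the methods compared here.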
In Table (1), we compare our new 4-step and 5-step methods with the BFGS and 3-step methods. The numerical results show that the new methods outperform the others; that is, as the number of steps increases, the results improve.