A family of scaled conjugate gradient methods under a new modified weak-Wolfe-Powell line search for large-scale optimization

Sanaz Bojari, Mohammad Reza Eslahchi

Abstract


In this paper, a family of three-term conjugate gradient methods is proposed for solving large-scale unconstrained optimization problems. Using suitable properties of the new family, such as its sufficient descent directions, a strong global convergence theorem is established for uniformly convex functions under the weak Wolfe-Powell line search technique. Furthermore, a new well-defined modification of the weak Wolfe-Powell line search is presented, and a strong global convergence theorem is obtained for general smooth functions. Numerical experiments, comprising two comparisons that involve two line search techniques, five well-behaved conjugate gradient methods, and 200 standard test problems, demonstrate the efficiency of the new methods.
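For context, the convergence results mentioned above are tied to the weak Wolfe-Powell (WWP) conditions. The paper's modified rule is not detailed in the abstract; the standard WWP conditions for accepting a step size \alpha_k along a descent direction d_k, with parameters 0 < \delta < \sigma < 1, read

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k \nabla f(x_k)^{\mathsf T} d_k,
    \nabla f(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge \sigma \nabla f(x_k)^{\mathsf T} d_k.

The first inequality enforces sufficient decrease of the objective, while the second rules out steps that are too short; the modification proposed in the paper alters these conditions in a way specified in the full text.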
