Dai Y.-H., Liao L.-Z., and Li D. On Restart Procedures for the Conjugate Gradient Method: Theory and Practice in Optimization, *Numerical Algorithms*, Volume 35, pp. 249–260, 2004.

**Abstract.** The conjugate gradient method is a powerful scheme for solving unconstrained optimization problems, especially large-scale ones. However, without restarts the method converges only linearly. In this paper, we present a new restart technique for this method. Given an arbitrary descent direction *d*_{t} and the gradient *g*_{t}, our key idea is to use the BFGS updating formula to provide a symmetric positive definite matrix *P*_{t} such that *d*_{t} = -*P*_{t}*g*_{t}, and then to define the conjugate gradient iteration in the transformed space. Two conjugate gradient algorithms are designed based on the new restart technique. Their global convergence is proved under mild assumptions on the objective function. Numerical experiments are also reported, showing that the two algorithms are comparable to the Beale–Powell restart algorithm.
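The abstract's key ingredient is the BFGS updating formula, which, given a step *s* and gradient change *y* with *y*ᵀ*s* > 0, updates a symmetric positive definite matrix into another symmetric positive definite matrix; applying such a matrix to the negative gradient always yields a descent direction. The sketch below illustrates only this standard property (the paper's specific construction of *P*_{t} from *d*_{t} and *g*_{t} is not reproduced from the abstract); the function name and the toy quadratic are illustrative assumptions.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One standard BFGS update of an inverse-Hessian approximation H,
    using step s and gradient change y (requires y @ s > 0).
    Preserves symmetry and positive definiteness of H."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Toy quadratic f(x) = 0.5 x^T A x with gradient g(x) = A x (illustrative).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: A @ x

x0 = np.array([1.0, 2.0])
x1 = np.array([0.5, 1.0])
s = x1 - x0               # step taken
y = grad(x1) - grad(x0)   # change in gradient (here y @ s > 0)

# Update from the identity, as one might at a restart point.
H = bfgs_inverse_update(np.eye(2), s, y)

g = grad(x1)
d = -H @ g  # since H is symmetric positive definite, d is a descent direction

assert np.allclose(H, H.T)                 # symmetry
assert np.all(np.linalg.eigvalsh(H) > 0)   # positive definiteness
assert d @ g < 0                           # descent: g^T d < 0
```

With such a matrix in hand, conjugate gradient iterations can be carried out in the transformed (preconditioned) space, which is the mechanism the restart technique exploits.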