Abstract
Conjugate gradient methods are well suited to large-scale optimization problems because they require no matrix storage. Motivated by the construction of the conjugate gradient parameter in several existing methods, we propose four modified conjugate gradient methods, denoted NVLS, NVPRP*, NVHS*, and NVLS*. We prove that, with the strong Wolfe line search, these methods possess the sufficient descent property and are globally convergent when the line search parameter is restricted to a suitable interval. Preliminary numerical results show that the NVPRP*, NVHS*, and NVLS* methods are more efficient than many existing conjugate gradient methods on a large set of test problems from the CUTEr collection.
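The abstract does not state the update formulas of the proposed NVLS, NVPRP*, NVHS*, and NVLS* methods, so the following is only an illustrative sketch of the generic framework they share: a nonlinear conjugate gradient iteration driven by a strong Wolfe line search. The classical PRP+ parameter is used here purely as a stand-in for the paper's modified parameters, and the test function is a toy quadratic, not one of the CUTEr problems.

```python
# Generic nonlinear CG with a strong Wolfe line search (sketch).
# The PRP+ parameter below is an assumption standing in for the
# paper's NVLS/NVPRP*/NVHS*/NVLS* formulas, which are not given here.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    """Return a*x + y for Python-list vectors."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Bisection/expansion search for a step satisfying the strong Wolfe conditions."""
    phi0, dphi0 = f(x), dot(grad(x), d)
    lo, hi, alpha = 0.0, float("inf"), 1.0
    for _ in range(max_iter):
        xa = axpy(alpha, d, x)
        if f(xa) > phi0 + c1 * alpha * dphi0:       # Armijo fails: step too long
            hi = alpha
        else:
            dphi = dot(grad(xa), d)
            if dphi < c2 * dphi0:                   # slope too negative: step too short
                lo = alpha
                if hi == float("inf"):
                    alpha *= 2.0                    # expand until an upper bound is found
                    continue
            elif dphi > -c2 * dphi0:                # slope too positive: step too long
                hi = alpha
            else:                                   # both strong Wolfe conditions hold
                return alpha
        alpha = 0.5 * (lo + hi)
    return alpha

def cg_prp_plus(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear CG with the PRP+ parameter (illustrative stand-in)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:
            break
        if dot(g, d) >= 0.0:                        # safeguard: restart if not a descent direction
            d = [-gi for gi in g]
        alpha = strong_wolfe(f, grad, x, d)
        x = axpy(alpha, d, x)
        g_new = grad(x)
        y = [a - b for a, b in zip(g_new, g)]
        beta = max(0.0, dot(g_new, y) / dot(g, g))  # PRP+ (stand-in parameter)
        d = axpy(beta, d, [-gi for gi in g_new])    # d = -g_new + beta * d
        g = g_new
    return x

# Toy convex quadratic with minimizer (1, -2)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]
sol = cg_prp_plus(f, grad, [0.0, 0.0])
```

The sufficient descent safeguard and the restriction on the curvature parameter `c2` mirror, at a high level, the role the paper's interval restriction on the line search parameter plays in its convergence analysis; the actual conditions proved for the NV* methods are stated in the paper itself.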