Abstract
A class of efficient conjugate gradient algorithms (ACGSSV) was proposed for solving large-scale unconstrained optimization problems; however, its global convergence was established only under the assumption that the objective function is uniformly convex. This paper establishes the global convergence of the ACGSSV method without any convexity assumption on the objective function.
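The ACGSSV scheme itself is not reproduced in this record. For orientation, the general shape of a nonlinear conjugate gradient iteration, the family ACGSSV belongs to, can be sketched as follows. This is a minimal illustrative sketch using the standard Polak-Ribière+ direction with an Armijo backtracking line search, not the ACGSSV update; all function names and parameter values are chosen for illustration only.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient sketch (NOT the ACGSSV method):
    Polak-Ribiere+ direction with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ parameter; truncation at 0 acts as a restart
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic with minimizer (1, -2):
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
x_star = conjugate_gradient(f, grad, np.array([0.0, 0.0]))
```

The convergence question studied in the paper concerns exactly this kind of iteration: whether the gradient norms are driven to zero when `f` is merely smooth and bounded below, rather than uniformly convex as in the quadratic example above.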