Research on Sparse Quasi-Newton Methods and Their Applications
Abstract
This thesis studies quasi-Newton methods for unconstrained optimization and proposes three new quasi-Newton algorithms. The main contents are as follows:
     In Chapter 2, based on a new quasi-Newton equation combined with a BFGS-type update formula, we construct a new quasi-Newton algorithm. Under suitable assumptions, we prove its global convergence and superlinear rate of convergence. Numerical results show that the method is effective.
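As background for Chapter 2, the classical BFGS iteration can be sketched as follows. This is a minimal baseline, assuming a plain Armijo backtracking line search and the standard secant vector y_k = g_{k+1} - g_k; the thesis's new quasi-Newton equation modifies this secant condition (cf. [6]), and that modification is not reproduced here.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Classical BFGS with a backtracking Armijo line search.

    Baseline sketch only: uses the standard secant vector
    y = grad(x_new) - grad(x), not the thesis's new quasi-Newton
    equation, which modifies this condition.
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                      # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        # Backtracking Armijo line search
        alpha, c1 = 1.0, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:                 # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

On a smooth convex quadratic this iteration drives the gradient norm below the tolerance in a handful of steps; the BFGS update is skipped whenever the curvature condition s^T y > 0 fails, so H stays positive definite.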
     In Chapter 3, based on the quasi-Newton equation together with a nonmonotone line search technique, we design a new diagonal-sparse quasi-Newton algorithm with a modified Grippo nonmonotone line search rule for unconstrained optimization, and prove its global convergence and superlinear rate of convergence. At each line search the modified rule accepts a larger step size than the original Grippo rule while still guaranteeing global convergence. Numerical experiments show that the method is effective and well suited to large-scale problems.
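The Grippo-type rule used in Chapter 3 relaxes the usual Armijo test by comparing the trial value against the maximum of the last few objective values rather than the current one, which is what permits larger steps. A minimal sketch of the acceptance test of [1] follows (the thesis's modified rule differs in detail):

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_history, c1=1e-4, tau=0.5, max_halvings=50):
    """Grippo-Lampariello-Lucidi nonmonotone Armijo step [1].

    Accepts alpha when
        f(x + alpha * d) <= max(f_history) + c1 * alpha * g.d,
    where f_history holds the most recent objective values.
    Testing against the max of recent values (instead of f(x) alone)
    lets the method accept larger steps than a monotone rule would.
    """
    fmax = max(f_history)
    gd = g @ d                      # directional derivative, must be < 0
    alpha = 1.0
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fmax + c1 * alpha * gd:
            return alpha
        alpha *= tau
    return alpha
```

In a full method `f_history` would be a sliding window of the last M objective values (e.g. a `collections.deque(maxlen=M)`); with M = 1 the rule reduces to the monotone Armijo condition.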
     In Chapter 4, based on a generalized quasi-Newton equation together with a modified nonmonotone line search technique, we design a new diagonal-sparse generalized quasi-Newton algorithm with a modified Grippo nonmonotone line search rule for unconstrained optimization, and prove its global convergence and superlinear rate of convergence. Numerical experiments show that the method is effective and well suited to large-scale problems.
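The diagonal-sparse approximations of Chapters 3 and 4 store only a diagonal matrix, so memory and per-iteration cost stay O(n), which is why these methods scale to large problems. As an illustration (not the exact update of [3]), a least-change diagonal update satisfying the weak secant condition s^T B s = s^T y can be sketched as:

```python
import numpy as np

def diag_update(b, s, y, eps=1e-10):
    """Least-change update of a diagonal Hessian approximation diag(b).

    Solves  min ||diag(b_new) - diag(b)||_F
            s.t. s^T diag(b_new) s = s^T y    (weak secant equation),
    whose closed-form solution is
        b_new_i = b_i + lam * s_i^2,
        lam     = (s^T y - s^T diag(b) s) / sum_i s_i^4.
    Storage and work are O(n), the key property of diagonal-sparse methods.
    """
    s2 = s * s
    denom = s2 @ s2                # sum of s_i^4
    if denom < eps:                # step too small: keep current approximation
        return b
    lam = (s @ y - b @ s2) / denom
    return b + lam * s2
```

A practical method must additionally safeguard positivity of the diagonal entries (e.g. by projecting them onto a safe interval) so that the search direction remains a descent direction.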
References
[1] Grippo L, Lampariello F, Lucidi S. A nonmonotone line search technique for Newton’s method [J]. SIAM Journal on Numerical Analysis,1986,23 (4):707-716
    [2] Barzilai J and Borwein J M. Two-point step size gradient methods [J], IMA Journal of Numerical Analysis, 1988,8:141-148
    [3] Shi Z. J., Sun G. A diagonal-sparse quasi-Newton method for unconstrained optimization problems [J]. Journal of Systems Science and Mathematical Sciences, 2006, 26(1): 101-112
    [4] Z. Wei, L. Qi, X. Chen. An SQP-type method and its application in stochastic programs [J]. J. Optim. Theory Appl. 2003, 116(1): 205–228
    [5] Z. Wei, L. Qi, S. Ito. New step-size rules for optimization problems. Technical report, Department of Mathematics and Information Science, Guangxi University, Nanning, Guangxi, PR China, October 2000
    [6] Zengxin Wei, Guoyin Li, Liqun Qi, New quasi-Newton methods for unconstrained optimization problems [J]. Applied Mathematics and Computation, 2006,175:1156-1188
    [7] J. E. Dennis, J. J. Moré. A characterization of superlinear convergence and its application to quasi-Newton methods [J]. Math. Comp. 1974, 28: 549–560
    [8] Lampariello F. and Sciandrone M. Global convergence technique for the Newton method with periodic Hessian evaluation [J]. Journal of Optimization Theory and Applications, 2001, 111(2): 341-358
    [10] Sun Qingying. Global Convergence Results of a Three Term Memory Gradient Method with a Non-monotone Line Search Technique [J]. Acta Mathematica Scientia, 2005, 25B(1): 170-178
    [11] Y. H. Dai. On the nonmonotone line search [J]. Journal of Optimization Theory and Applications, 2002, 112: 315-330
    [12] Y. H. Dai. A nonmonotone conjugate gradient algorithm for unconstrained optimization [J]. J. Syst. Sci. Complex. 2002, 15: 139-145
    [13] Grippo L, Lampariello F, Lucidi S. A truncated Newton method with non-monotone line search for unconstrained optimization [J]. Journal of Optimization Theory and Applications, 1989, 60(3): 401-419
    [14] Touati-Ahmed D. and Storey C. Efficient hybrid conjugate gradient techniques [J]. Journal of Optimization Theory and Applications, 1990, 64(2): 379-397
    [15] J. J. Moré, B. S. Garbow, K. E. Hillstrom. Testing unconstrained optimization software [J]. ACM Trans. Math. Software, 1981, 7: 17–41
    [16] Yuan Yaxiang, Sun Wenyu. Optimization Theory and Methods [M]. Beijing: Science Press, 1997
    [17] Lai Yanlian, Wei Zengxin. A globally convergent gradient projection method with arbitrary initial point [J]. Chinese Science Bulletin, 1990, 20: 1536-1539
    [18] Gao Ziyou, He Guoping. A generalized gradient projection method for constrained optimization problems [J]. Chinese Science Bulletin, 1991, 19: 1444-1447
    [19] Gao Ziyou, Yu Wei. A generalized gradient projection method with arbitrary initial point [J]. Chinese Science Bulletin, 1992, 20: 1832-1836
    [20] Zhang Xiangsun. On the convergence proof of Rosen's gradient method [J]. Acta Mathematicae Applicatae Sinica, 1985, 8(1): 125-128
    [24] Raydan M. On the Barzilai and Borwein choice of steplength for the gradient method [J], IMA Journal of Numerical Analysis, 1993,13: 321-326
    [25] Raydan M and Svaiter B F. Relaxed steepest descent and Cauchy-Barzilai-Borwein method [J]. Computational Optimization and Applications, 2002, 21: 155-167
    [26] Raydan M. The Barzilai and Borwein gradient method for large scale unconstrained minimization problems [J]. SIAM J. Optim. 1997, 7(1): 26-33
    [27] Shi Z. J. Convergence of quasi-Newton method with new inexact line search [J]. Journal of Mathematical Analysis and Applications, 2006, 315(1): 120-131
    [28] Y. Dai, Q. Ni. Testing different conjugate gradient methods for large-scale unconstrained optimization [J]. J. Comput. Math. 2003, 21(3): 311–320
    [29] C. G. Broyden, J. E. Dennis, J. J. Moré. On the local and superlinear convergence of quasi-Newton methods [J]. J. Inst. Math. Appl. 1973, 12: 223–246
    [30] R. Byrd, J. Nocedal. A tool for the analysis of quasi-Newton methods with application to unconstrained minimization [J]. SIAM J. Numer. Anal. 1989, 26: 727–739
    [31] R. Byrd, J. Nocedal, Y. Yuan. Global convergence of a class of quasi-Newton methods on convex problems [J]. SIAM J. Numer. Anal. 1987, 24: 1171–1189
    [32] W. C. Davidon, Variable metric methods for minimization, Argonne National Labs Report, ANL-5990
    [33] A. Griewank. On automatic differentiation, in: M. Iri, K. Tanabe (Eds.), Mathematical Programming: Recent Developments and Applications, Kluwer Academic Publishers, 1989, pp. 84–108
    [34] A. Griewank, Ph. L. Toint, Local convergence analysis for partitioned quasi-Newton updates, Numer. Math. 1982, 39: 429–448
    [35] D. Li, M. Fukushima. A modified BFGS method and its global convergence in nonconvex minimization [J]. J. Comput. Appl. Math. 2001, 129: 15–35
    [36] D. Li, M. Fukushima. A globally and superlinearly convergent Gauss–Newton method for symmetric nonlinear equations [J]. SIAM J. Numer. Anal. 1999, 37: 152-172
    [37] J. M. Perry, A class of conjugate algorithms with a two step variable metric memory, Discussion paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University, 1977
    [38] M. J. D. Powell. A new algorithm for unconstrained optimization, in: J. B. Rosen, O. L. Mangasarian, K. Ritter (Eds.), Nonlinear Programming, Academic Press, New York, 1970
    [39] D. F. Shanno, On the convergence of a new conjugate gradient algorithm, SIAM J. Numer. Anal. 1978, 15: 1247–1257
    [40] J. Stoer, R. Bulirsch. Introduction to Numerical Analysis, second ed. Springer-Verlag, 1996
    [41] Z. Wei, Q. Li, G. Yuan. New BFGS-trust region methods for unconstrained minimization problems. Technical report, Department of Mathematics and Information Science, Guangxi University, Nanning, Guangxi, PR China, October 2003
    [42] Y. Yuan, W. Sun, Theory and Methods of Optimization, Science Press of China, 1999
