Novel Limited Memory Quasi-Newton Methods Based On Optimal Matrix Approximation

Erik Berglund, Mikael Johansson

arXiv (2024)

Abstract
Update formulas for the Hessian approximations in quasi-Newton methods such as BFGS can be derived as analytical solutions to certain nearest-matrix problems. In this article, we propose a similar idea for deriving new limited memory versions of quasi-Newton methods. Most limited memory quasi-Newton methods make use of Hessian approximations that can be written as a scaled identity matrix plus a symmetric matrix with limited rank. We derive a way of finding the nearest matrix of this type to an arbitrary symmetric matrix, in either the Frobenius norm, the induced l^2 norm, or a dissimilarity measure for positive definite matrices in terms of trace and determinant. In doing so, we lay down a framework for more general matrix optimization problems with unitarily invariant matrix norms and arbitrary constraints on the set of eigenvalues. We then propose a trust region method in which the Hessian approximation, after having been updated by a Broyden class formula and used to solve a trust-region problem, is replaced by one of its closest limited memory approximations. We propose to store the Hessian approximation in terms of its eigenvectors and eigenvalues in a way that completely defines its eigenvalue decomposition, as this simplifies both the solution of the trust region subproblem and the nearest limited memory matrix problem. Our method is compared to a reference trust region method with the usual limited memory BFGS updates, and is shown to require fewer iterations and the storage of fewer vectors for a variety of test problems.
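The nearest-matrix idea in the Frobenius norm can be illustrated with a small sketch. Assuming the structure described above (a scaled identity plus a symmetric matrix of rank at most r), the Frobenius-nearest such matrix to a symmetric A keeps r eigenvalues of A exactly and replaces the remaining n − r by their mean σ; since a minimum-variance subset of a sorted sequence is contiguous, a sliding window over the sorted spectrum finds the optimal choice. This is an illustrative reconstruction, not the paper's exact algorithm; the function name and interface are hypothetical.

```python
import numpy as np

def nearest_identity_plus_low_rank(A, r):
    """Frobenius-nearest matrix of the form sigma*I + (rank <= r symmetric)
    to a symmetric matrix A.  Illustrative sketch: eigendecompose A, keep r
    eigenvalues exactly, and replace the remaining n - r eigenvalues by their
    mean sigma.  The squared Frobenius error is the sum of squared deviations
    of the replaced eigenvalues from sigma, minimized over contiguous windows
    of the sorted spectrum."""
    n = A.shape[0]
    lam, V = np.linalg.eigh(A)   # eigenvalues in ascending order
    m = n - r                    # number of eigenvalues collapsed into sigma
    if m <= 0:
        return A.copy(), 0.0
    best = None
    for start in range(n - m + 1):
        window = lam[start:start + m]
        sigma = window.mean()
        err = np.sum((window - sigma) ** 2)
        if best is None or err < best[0]:
            best = (err, start, sigma)
    err, start, sigma = best
    lam_new = lam.copy()
    lam_new[start:start + m] = sigma   # collapse the chosen window to sigma
    B = (V * lam_new) @ V.T            # reassemble V diag(lam_new) V^T
    return B, np.sqrt(err)
```

For example, with A = diag(1, 2, 10) and r = 1, the window {1, 2} has the smaller variance, so both are replaced by σ = 1.5 while the eigenvalue 10 is kept exactly, giving a Frobenius distance of √0.5.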