Abstract
This paper concerns the memoryless quasi-Newton method, that is, the quasi-Newton method for which the approximation to the inverse of the Hessian is, at each step, updated from the identity matrix. Hence its search direction can be computed without storing any matrices. In this paper, a scaled memoryless symmetric rank one (SR1) method for solving large-scale unconstrained optimization problems is developed. The basic idea is to incorporate the SR1 update within the framework of the memoryless quasi-Newton method. However, it is well known that the SR1 update may not preserve positive definiteness even when updated from a positive definite matrix. Therefore we propose a memoryless SR1 method that is updated from a positive scaling of the identity, where the scaling factor is derived in such a way that positive definiteness of the updating matrices is preserved and, at the same time, the conditioning of the scaled memoryless SR1 update is improved. Under very mild conditions it is shown that, for strictly convex objective functions, the method is globally convergent with a linear rate of convergence. Numerical results show that the optimally scaled memoryless SR1 method is very encouraging.
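The abstract does not state the update formula, but a memoryless SR1 step of the kind described rebuilds the inverse-Hessian approximation each iteration from a scaled identity θI plus a single rank-one correction, and applies it to the gradient without forming any matrix. The sketch below is a minimal, matrix-free illustration of that idea, assuming the Oren–Luenberger-style scaling θ = sᵀy / yᵀy and simple safeguard thresholds; it is not the optimal scaling derived in the paper.

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-10):
    """Illustrative memoryless SR1 search direction.

    The inverse-Hessian approximation is rebuilt from a scaled identity
    via one SR1 update:

        H = theta*I + (s - theta*y)(s - theta*y)^T / ((s - theta*y)^T y)

    g : current gradient, s : previous step x_{k+1} - x_k,
    y : gradient difference g_{k+1} - g_k.
    theta = s^T y / y^T y is an assumed (illustrative) scaling, not the
    paper's derived optimal scaling.
    """
    sy = float(s @ y)
    yy = float(y @ y)
    if sy <= eps or yy <= eps:
        return -g                # fall back to steepest descent
    theta = sy / yy              # hypothetical positive scaling of the identity
    v = s - theta * y            # SR1 correction vector
    vy = float(v @ y)
    if abs(vy) <= eps:
        return -theta * g        # skip the rank-one term if denominator is tiny
    # Apply H to g without ever storing a matrix (memoryless / matrix-free)
    Hg = theta * g + (float(v @ g) / vy) * v
    return -Hg

# Example: one step on the strictly convex quadratic f(x) = 0.5 * x^T A x
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
A = A @ A.T + np.eye(100)                      # symmetric positive definite
x_prev, x = rng.standard_normal(100), rng.standard_normal(100)
g_prev, g = A @ x_prev, A @ x
d = memoryless_sr1_direction(g, x - x_prev, g - g_prev)
```

In a full method, d would be combined with a line search satisfying standard (e.g. Wolfe) conditions; the safeguards above only sketch how positive definiteness concerns can force a fallback step.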
Official URL or Download Paper: http://www.elsevier.com/
Additional Metadata

| Field | Value |
|---|---|
| Item Type | Article |
| Divisions | Faculty of Science; Institute for Mathematical Research |
| DOI Number | https://doi.org/10.1016/j.amc.2011.05.080 |
| Publisher | Elsevier |
| Keywords | Large-scale optimization; Memoryless quasi-Newton method; Optimal scaling; Symmetric rank one update |
| Depositing User | Nur Farahin Ramli |
| Date Deposited | 03 Sep 2013 08:25 |
| Last Modified | 14 Jan 2016 03:42 |
| Altmetrics | http://www.altmetric.com/details.php?domain=psasir.upm.edu.my&doi=10.1016/j.amc.2011.05.080 |
| URI | http://psasir.upm.edu.my/id/eprint/24641 |