UPM Institutional Repository

An improved multi-step gradient-type method for large scale optimization


Citation

Farid, Mahboubeh and Leong, Wah June (2011) An improved multi-step gradient-type method for large scale optimization. Computers and Mathematics with Applications, 61 (11). pp. 3312-3318. ISSN 0898-1221; ESSN: 1873-7668

Abstract

In this paper, we propose an improved multi-step diagonal updating method for large scale unconstrained optimization. Our approach is based on constructing a new gradient-type method by means of interpolating curves. We measure the distances required to parameterize the interpolating polynomials via a norm defined by a positive-definite matrix. By developing an implicit updating approach we obtain an improved Hessian approximation in diagonal matrix form, while avoiding the computational expense of explicitly forming the improved approximation matrix. The effectiveness of our proposed method is evaluated by computational comparison with the BB method and its variants. We show that our method is globally convergent and requires only O(n) memory.
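The paper itself develops the multi-step, interpolating-curve construction; as a rough illustration of the class of diagonal-updating gradient-type methods it belongs to, the Python sketch below implements a single-step diagonal update that enforces the weak secant condition s'Bs = s'y while storing only the diagonal of B, so memory stays at O(n). The function name, the fixed unit step, and the safeguarding constants are illustrative assumptions and are not taken from the paper.

import numpy as np

def diag_weak_secant_minimize(grad, x0, max_iter=500, tol=1e-6):
    """Illustrative diagonal-updating gradient-type method (not the paper's multi-step scheme).

    The diagonal Hessian approximation B is corrected at each iteration so that
    the weak secant condition s'Bs = s'y holds, keeping storage at O(n).
    """
    x = np.asarray(x0, dtype=float).copy()
    B = np.ones_like(x)              # diagonal of the Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g / B                   # gradient-type step scaled by diag(B)^{-1}
        x_new = x + d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        E = s * s                    # diagonal correction direction, entries s_i^2
        denom = np.dot(E, E)         # equals s' diag(s^2) s
        if denom > 1e-12:
            # least-change correction enforcing s' B_new s = s' y
            B = B + ((np.dot(s, y) - np.dot(s, B * s)) / denom) * E
            B = np.maximum(B, 1e-8)  # keep the diagonal approximation positive
        x, g = x_new, g_new
    return x

# Usage sketch: minimize a well-conditioned convex quadratic f(x) = 0.5 * x' D x
grad = lambda x: np.linspace(1.0, 2.0, x.size) * x
x_star = diag_weak_secant_minimize(grad, np.ones(50))

This sketch omits the paper's key ingredients (the multi-step interpolation and the matrix-weighted parameterization of the interpolating polynomials) and any line search; it only shows how a diagonal, weak-secant-based Hessian approximation can be maintained at O(n) cost.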


Download File

PDF (Abstract): An improved multi.pdf (85 kB)

Additional Metadata

Item Type: Article
Divisions: Institute for Mathematical Research
DOI Number: https://doi.org/10.1016/j.camwa.2011.04.030
Publisher: Elsevier
Keywords: Diagonal updating; Generalized weak secant equation; Global convergence; Multi-step gradient method
Depositing User: Nur Farahin Ramli
Date Deposited: 03 Sep 2013 03:18
Last Modified: 11 Oct 2019 00:43
Altmetrics: http://www.altmetric.com/details.php?domain=psasir.upm.edu.my&doi=10.1016/j.camwa.2011.04.030
URI: http://psasir.upm.edu.my/id/eprint/24643
