Citation
Modarres, Farzin; Abu Hassan, Malik; Leong, Wah June (2009). Memoryless modified symmetric rank-one method for large-scale unconstrained optimization. American Journal of Applied Sciences, 6(12), pp. 2054-2059. ISSN 1546-9239; ESSN 1554-3641.
Abstract
Problem statement: Memoryless quasi-Newton (QN) methods, which can be viewed as one-step limited-memory QN methods, have been regarded as effective techniques for solving large-scale problems. In this study, we present a scaled memoryless modified Symmetric Rank-One (SR1) algorithm and investigate its numerical performance on large-scale unconstrained optimization problems.
Approach: The basic idea is to apply the modified QN equations, which use both the gradients and the function values at two successive points, within the framework of the scaled memoryless SR1 update, in which the modified SR1 matrix is reset at every iteration to a positive multiple of the identity matrix. The scaling of the identity is chosen so that the positive definiteness of the memoryless modified SR1 update is preserved.
Results: Under suitable conditions, global convergence and the rate of convergence are established. Computational results on a test set of 73 unconstrained optimization problems show that the proposed algorithm is very encouraging.
Conclusion/Recommendations: In this study, a memoryless QN method is developed for solving large-scale unconstrained optimization problems, in which the SR1 update based on the modified QN equation is applied. An important feature of the proposed method is that it preserves the positive definiteness of the updates. The presented method possesses global and R-linear convergence. Numerical results showed that the proposed method is encouraging compared with the MMBFGS and FRCG methods.
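To make the update described in the Approach more concrete, the following is a minimal Python sketch of one scaled memoryless SR1 step. The particular modified secant vector (a common form built from function values as well as gradients), the safeguard for small denominators, the choice of scaling gamma, and the function name are illustrative assumptions and are not taken verbatim from the paper.

```python
import numpy as np

def memoryless_modified_sr1_step(g_new, g_old, s, f_new, f_old, gamma):
    """One scaled memoryless modified SR1 step (illustrative sketch).

    Assumptions not drawn from the paper: the modified secant vector uses a
    common function-value-based correction, and the update acts on the
    inverse Hessian approximation H = gamma * I with gamma > 0 supplied by
    the caller. Returns the search direction d = -H_new @ g_new without
    forming the n-by-n matrix H_new, which is what makes the method memoryless.
    """
    y = g_new - g_old
    # Modified secant vector: incorporates function values as well as gradients.
    theta = 6.0 * (f_old - f_new) + 3.0 * np.dot(g_old + g_new, s)
    y_mod = y + (theta / np.dot(s, y)) * y

    # Memoryless SR1 update of H = gamma*I:
    #   H_new = gamma*I + v v^T / (v^T y_mod),  where v = s - gamma*y_mod.
    v = s - gamma * y_mod
    denom = np.dot(v, y_mod)

    # Standard SR1 safeguard: skip the rank-one correction when the
    # denominator is too small relative to ||v|| * ||y_mod||.
    if abs(denom) < 1e-8 * np.linalg.norm(v) * np.linalg.norm(y_mod):
        return -gamma * g_new

    # d = -H_new @ g_new, assembled from vectors only (O(n) work and storage).
    return -(gamma * g_new + (np.dot(v, g_new) / denom) * v)
```

A typical caller would choose gamma from a self-scaling rule such as gamma = s^T y / (y^T y) and follow the returned direction with a line search; both of these choices are assumptions for illustration rather than the paper's prescribed settings.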