UPM Institutional Repository

Implementation of Symmetric Rank-One Methods for Unconstrained Optimization


Citation

Khiyaban, Farzin Modarres (2010) Implementation of Symmetric Rank-One Methods for Unconstrained Optimization. PhD thesis, Universiti Putra Malaysia.

Abstract

The focus of this thesis is on analyzing the theoretical and computational aspects of some quasi-Newton (QN) methods for locating a minimum of a real-valued function f over all vectors x ∈ R^n. In many practical applications, the Hessian of the objective function may be too expensive to calculate or may even be unavailable in explicit form. QN methods endeavor to circumvent the deficiencies of Newton's method (while retaining its basic structure and thus preserving, as far as possible, its advantages) by constructing approximations to the Hessian iteratively. Among QN updates, the symmetric rank-one (SR1) update has been shown to be an effective and reliable method. However, SR1 is an awkward method, even though its performance is in general better than that of well-known QN updates. The problem is that the SR1 update may not retain positive definiteness and may become undefined because the denominator becomes zero. In recent years considerable attention has been directed towards preserving and ensuring the positive definiteness of the SR1 update, but improving the quality of the estimates has rarely been studied in depth. Our purpose in this thesis is to improve the Hessian approximation updates and to study the computational performance and convergence properties of these updates. First, we briefly give some mathematical background. A review of different minimization methods that can be used to solve unconstrained optimization problems is also given. We consider a modification of the secant equation for the SR1 update. In this method, the Hessian approximation is updated based on a modified secant equation, which uses both gradient and function-value information in order to achieve higher-order accuracy in approximating the curvature of the objective function. We then examine a new scaled memoryless SR1 method based on the modified secant equation for solving large-scale unconstrained optimization problems. We prove that the new method possesses global convergence.
The rate of convergence of such algorithms is also discussed. Because of the SR1 deficiencies, we introduce a restarting procedure using the eigenvalues of the SR1 update. We also introduce a variety of techniques to improve the Hessian approximations of the SR1 method for small- to large-sized problems, including multi-step and extra-updating methods along with the structured method, which uses partial information on the Hessian. Variants of the SR1 update are tested numerically and compared to several other well-known minimization methods. Finally, we comment on some achievements of our research. Possible extensions are also given to conclude this thesis.


Download File

PDF: FS_2010_41_F.pdf (394kB)

Additional Metadata

Item Type: Thesis (PhD)
Subject: Newton-Raphson method.
Call Number: FS 2010 41
Chairman Supervisor: Prof. Malik Hj. Abu Hassan, Ph.D
Divisions: Faculty of Science
Depositing User: Haridan Mohd Jais
Date Deposited: 22 May 2013 06:06
Last Modified: 27 May 2013 08:02
URI: http://psasir.upm.edu.my/id/eprint/19590
