UPM Institutional Repository

Multiple Alternate Steps Gradient Methods For Unconstrained Optimization


Citation

Lee, Sui Fong (2009) Multiple Alternate Steps Gradient Methods For Unconstrained Optimization. Masters thesis, Universiti Putra Malaysia.

Abstract

The focus of this thesis is on finding the unconstrained minimizer of a function by using alternate steps gradient methods. Specifically, we focus on two well-known classes of gradient methods: the steepest descent (SD) method and the Barzilai-Borwein (BB) method. First we briefly give some mathematical background on unconstrained optimization as well as on gradient methods. We then discuss the SD and BB methods, the fundamental gradient methods that are applied alternately to solve optimization problems. Some general and local convergence analyses of the SD and BB methods are given, together with the related line search method. A review of the alternate step (AS) gradient method, with brief numerical results and convergence analyses, is also presented. The main practical deficiency of the SD method is that the search directions it generates tend to alternate between two directions; this zigzagging causes the SD method to perform poorly and to require more computational work. Although the BB method does not guarantee a descent in the objective function at each iteration, owing to its nonmonotone behavior, it performs better than the SD method. Motivated by these limitations, we introduce a new gradient method that improves on the SD and BB methods, namely the Multiple Alternate Steps (MAS) gradient method. The convergence of the MAS method is investigated, and its behavior is analyzed. Furthermore, we present numerical results on quadratic test problems in order to compare the numerical performance of the MAS method with the SD, BB and AS methods. The purpose of this research is to develop a working knowledge of optimization theory and methods. We hope that the new MAS gradient method can make a significant contribution to everyday applications, for example in maximizing the profit of a manufacturing operation or in improving a system to reduce its effective runtime in computer science. Finally, we comment on some achievements of our research; possible extensions are also given to conclude this thesis.
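To make the two base step sizes concrete, the sketch below (our illustration, not code from the thesis) alternates the SD and BB steps on a convex quadratic, in the spirit of alternate-step gradient methods; all function and variable names here are hypothetical.

    import numpy as np

    # Illustrative sketch only (not the thesis's MAS code): on a convex
    # quadratic f(x) = 0.5 x^T A x - b^T x with gradient g(x) = A x - b,
    # alternate the steepest-descent (SD) step with the Barzilai-Borwein
    # (BB) step, in the spirit of alternate-step gradient methods.
    def alternate_step_gradient(A, b, x0, tol=1e-8, max_iter=5000):
        x = x0.astype(float).copy()
        g = A @ x - b
        x_prev, g_prev = None, None
        for k in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            if k % 2 == 0 or x_prev is None:
                # SD step: exact minimizer along -g for a quadratic,
                # alpha = g^T g / (g^T A g).
                alpha = (g @ g) / (g @ A @ g)
            else:
                # BB step: alpha = s^T s / (s^T y), with s = x_k - x_{k-1}
                # and y = g_k - g_{k-1}; may be nonmonotone in f.
                s, y = x - x_prev, g - g_prev
                alpha = (s @ s) / (s @ y)
            x_prev, g_prev = x, g
            x = x - alpha * g
            g = A @ x - b
        return x

    # Ill-conditioned 2-D quadratic, where pure SD zigzags between two
    # directions and converges slowly.
    A = np.diag([1.0, 100.0])
    b = np.array([1.0, 1.0])
    x_star = alternate_step_gradient(A, b, np.zeros(2))

This only illustrates the alternation idea on quadratics, where the SD step has a closed form; the MAS scheme itself, which uses multiple alternate steps, is defined and analyzed in the thesis.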


Download File

IPM_2009_11A.pdf (PDF, 729kB)

Additional Metadata

Item Type: Thesis (Masters)
Subject: Conjugate gradient methods
Subject: Mathematical optimization
Call Number: IPM 2009 11
Chairman Supervisor: Dr. Leong Wah June, PhD
Divisions: Institute for Mathematical Research
Depositing User: Mohd Nezeri Mohamad
Date Deposited: 22 Jul 2011 07:16
Last Modified: 27 May 2013 07:51
URI: http://psasir.upm.edu.my/id/eprint/12367
