Citation
Ibrahim, Arzuka (2015). Scaled three-term conjugate gradient method via Davidon-Fletcher-Powell update for unconstrained optimization. Masters thesis, Universiti Putra Malaysia.
Abstract
This thesis focuses on the development of a scaled three-term conjugate gradient method via the Davidon-Fletcher-Powell (DFP) quasi-Newton update for unconstrained optimization. The DFP method possesses the merits of Newton's method and the steepest descent method while overcoming their disadvantages. Over the years, the DFP update has been neglected because it lacks the self-correcting property for bad Hessian approximations. In this thesis, we propose a scaled three-term conjugate gradient method that utilizes the DFP update for the inverse Hessian approximation via a memoryless quasi-Newton method, and that satisfies both the sufficient descent and the conjugacy conditions. The basic philosophy is to restart the DFP update with a multiple of the identity matrix in every iteration. An acceleration scheme is incorporated into the proposed method to enhance the reduction in function value. Numerical results from an implementation of the proposed method on standard unconstrained optimization problems show that the proposed method is promising and exhibits superior numerical performance in comparison with other well-known conjugate gradient methods.
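To illustrate the construction the abstract describes, the following is a minimal Python sketch, not the thesis's exact algorithm: applying the standard DFP inverse-Hessian update to a restarted H_k = theta_k * I and multiplying by the negative gradient yields a three-term direction d = -theta*g + theta*(y'g / y'y)*y - (s'g / y's)*s. The spectral scaling theta_k = s's / s'y, the backtracking Armijo line search, and the omission of the acceleration scheme are all assumptions of this sketch; the thesis's specific choices may differ.

    import numpy as np

    def memoryless_dfp_direction(g_new, s, y, theta):
        """Three-term direction from the DFP update of H = theta * I.

        DFP: H+ = H - (H y y' H)/(y' H y) + (s s')/(y' s); with H = theta*I,
        d = -H+ g reduces to the three-term form used below.
        """
        ys = y @ s
        yy = y @ y
        if ys <= 1e-12 or yy <= 1e-12:  # safeguard: fall back to steepest descent
            return -g_new
        return (-theta * g_new
                + theta * (y @ g_new) / yy * y
                - (s @ g_new) / ys * s)

    def scaled_three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
        """Sketch of the iteration loop; the thesis's acceleration scheme is omitted."""
        x = x0.astype(float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Backtracking line search enforcing the Armijo condition.
            alpha, c1 = 1.0, 1e-4
            while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            # Spectral scaling theta_k (one common choice, assumed here).
            theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
            d = memoryless_dfp_direction(g_new, s, y, theta)
            x, g = x_new, g_new
        return x

    # Example: minimize the Rosenbrock function, a standard test problem.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(scaled_three_term_cg(f, grad, np.array([-1.2, 1.0])))

Because y' s > 0 is enforced before the three-term formula is used, the resulting direction satisfies d'g <= 0 (by Cauchy-Schwarz on the y-term), which is why the backtracking search above terminates.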