Citation
Lim, Keat Hee
(2021)
Quasi-Newton type method via weak secant equations for unconstrained optimization.
Doctoral thesis, Universiti Putra Malaysia.
Abstract
In this thesis, variants of quasi-Newton methods are developed for solving unconstrained
optimization problems. These quasi-Newton type methods differ from the standard
quasi-Newton methods in the calculation and storage of the approximate Hessian matrix
at every iteration. Using the least-change updating strategy, two updating
formulas are derived by means of a variational problem, via the weak secant equation and
some other non-secant equations. The convergence analysis for these methods is
presented under an inexact line search with the Armijo condition.
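As background, the weak secant equation referred to above relaxes the classical secant condition so that the updated Hessian approximation need only match curvature along the most recent step. A sketch in standard quasi-Newton notation follows; the symbols are the conventional ones, not taken verbatim from the thesis:

```latex
% Standard secant equation imposed on the Hessian approximation B_{k+1}:
%   B_{k+1} s_k = y_k,  with  s_k = x_{k+1} - x_k  and  y_k = g_{k+1} - g_k.
% The weak secant equation requires only the scalar curvature condition
\[
  s_k^{\top} B_{k+1} s_k \;=\; s_k^{\top} y_k ,
\]
% a single constraint, which leaves enough freedom for least-change
% (variational) derivations of the kind described in this abstract.
```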
Motivated by the idea of a memoryless scheme, the proposed formulas are further
modified such that only vector computations and storage are required at every iteration.
In other words, these memoryless updating formulas provide an approximation to
the quasi-Newton direction in a matrix-free setting. The Armijo condition is implemented
in the algorithms to guarantee monotone descent at each iteration. The convergence analysis
for these memoryless methods is presented under some standard assumptions. Numerical
experiments carried out on standard test problems show that the proposed
methods are superior to some existing conjugate gradient methods in terms of the
iterations and function evaluations required to reach the optimal solutions.
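The Armijo (sufficient decrease) condition used above is a standard backtracking test. The sketch below is a generic illustration of it, not the thesis's own algorithm; the function names and parameter values (`rho`, `c`, `alpha0`) are assumptions:

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Backtracking line search enforcing the Armijo condition
    f(x + alpha*d) <= f(x) + c*alpha*grad_f(x)^T d, which yields the
    monotone decrease in f mentioned in the abstract."""
    fx = f(x)
    slope = grad_f(x) @ d  # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho  # shrink the step and try again
    return alpha

# Example: one step on f(x) = x^T x from x = (1, 1) along steepest descent
f = lambda x: x @ x
g = lambda x: 2 * x
x = np.array([1.0, 1.0])
alpha = armijo_backtracking(f, g, x, -g(x))
```

Any step accepted by this test strictly decreases the objective, which is the "monotone property" the abstract refers to.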
Further variants of matrix-free quasi-Newton methods are then explored, using
the weak secant equation. A diagonal updating formula is derived by
minimizing the magnitude of the updating matrix under the Frobenius norm. This
formula is then generalized by using a weighted Frobenius norm in the derivation,
yielding a weighted diagonal updating formula in which the previous diagonal matrix
is chosen as the weighting matrix. The
convergence analysis is established for these formulas under two line search
strategies, namely monotone and non-monotone line searches. Numerical experiments are
conducted to test their effectiveness on a standard test set. The results obtained
indicate that the diagonal updating formula is superior to its weighted version and to some
conjugate gradient methods. The formula works best when a monotone line search is
implemented, often requiring fewer iterations and function evaluations
to obtain the solutions.
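To illustrate the least-change idea behind the diagonal formula, the sketch below solves the variational problem of minimally perturbing a diagonal matrix, in the Frobenius norm, subject to the weak secant equation. This is a generic reconstruction in standard notation and is not necessarily the exact formula derived in the thesis:

```python
import numpy as np

def weak_secant_diagonal_update(D, s, y):
    """Least-change diagonal update: among diagonal corrections Delta,
    minimize ||Delta||_F subject to the weak secant equation
    s^T (D + Delta) s = s^T y.  D is stored as a 1-D array of diagonal
    entries.  A Lagrange-multiplier argument gives the closed form
    Delta_i = gap * s_i^2 / sum_j s_j^4, with gap = s^T y - s^T D s."""
    s2 = s * s
    gap = s @ y - D @ s2       # residual in the weak secant equation
    return D + (gap / (s2 @ s2)) * s2

# Tiny check: the updated diagonal satisfies the weak secant equation
D = np.ones(3)
s = np.array([1.0, 2.0, -1.0])
y = np.array([0.5, 3.0, -2.0])
D_new = weak_secant_diagonal_update(D, s, y)
```

Only vectors are stored and updated, which is what makes such diagonal schemes matrix-free.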
Overall, the numerical results show that the proposed methods are superior in terms of the
number of iterations and function evaluations. Furthermore, the diagonal updating
formulas with non-monotone line search are more efficient than some classical conjugate
gradient methods in all three performance measures, including CPU time.