Citation
Sim, Hong Sen (2011) Preconditioning Subspace Quasi-Newton Method for Large Scale Unconstrained Optimization. Masters thesis, Universiti Putra Malaysia.
Abstract
The subspace quasi-Newton (SQN) method has been widely used in large scale unconstrained optimization problems. Its popularity is due to the fact that it constructs subproblems in low dimensions, so that both the storage requirement and the computational cost are reduced. Besides this, it offers a practical way to handle large scale optimization problems, which have vast applications in almost every branch of science and technology, such as tomography, signal and image denoising with Basis Pursuit, pattern recognition with Support Vector Machines, and many others. The method can be implemented extremely fast when the objective function is a combination of mappings with computationally cheap nonlinear functions, for example quadratic functions. However, the main drawback of the SQN method is that it can be very slow on certain types of nonlinear problems, such as ill-conditioned problems. Thus, the focus of this thesis is to overcome this deficiency by preconditioning the SQN method. In practice, preconditioners are often adopted to speed up the convergence of quasi-Newton methods. Hence, we propose a preconditioned SQN method which is generally more effective than the SQN method. For this purpose, we construct a preconditioner that is computationally cheap and yet a good approximation to the actual Hessian, since evaluating the actual Hessian is impractical and costly. To do this, we propose to use a diagonal updating matrix derived from the weak quasi-Newton relation, instead of the identity matrix, to approximate the initial inverse Hessian. Numerical experiments are performed on quadratic test problems to compare the efficiency and performance of the preconditioned SQN method with the standard SQN method. Our computational results show that the proposed preconditioned SQN method performs better than the SQN method without preconditioning. In addition, the convergence of this method is also presented. Finally, some possible future extensions are given to conclude this thesis.
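For readers unfamiliar with the weak quasi-Newton relation mentioned in the abstract, the sketch below illustrates one common way a diagonal matrix B can be updated to satisfy it (s^T B_new s = s^T y, via a least-change argument) and then used, in place of the identity, as the initial inverse Hessian approximation. This is an illustrative sketch only: the function name, the positivity safeguard, and the test vectors are assumptions, and the exact diagonal update derived in the thesis may differ.

import numpy as np

def weak_secant_diagonal_update(B_diag, s, y):
    """Least-change update of a diagonal Hessian approximation so that the
    weak quasi-Newton (weak secant) relation  s^T B_new s = s^T y  holds.

    B_diag : current diagonal entries of the approximation (1-D array)
    s      : step vector  x_{k+1} - x_k
    y      : gradient difference  g_{k+1} - g_k
    """
    s2 = s * s                              # elementwise s_i^2
    gap = s @ y - s2 @ B_diag               # s^T y - s^T B s
    denom = s2 @ s2                         # sum of s_i^4
    if denom <= 1e-12:                      # guard against a negligible step (assumed safeguard)
        return B_diag
    B_new = B_diag + (gap / denom) * s2     # minimises ||B_new - B||_F subject to the weak secant relation
    return np.maximum(B_new, 1e-8)          # keep the diagonal positive (assumed safeguard)

# Example use: take the inverse of the updated diagonal as the initial inverse
# Hessian approximation H_0 in place of the identity matrix (the preconditioning idea).
B = np.ones(5)                              # start from the identity diagonal
s = np.array([0.5, -0.2, 0.1, 0.0, 0.3])   # illustrative step (assumed values)
y = np.array([1.0, -0.5, 0.3, 0.1, 0.6])   # illustrative gradient difference (assumed values)
B = weak_secant_diagonal_update(B, s, y)
H0 = 1.0 / B                                # diagonal preconditioner
print(H0)

Because the update touches only the diagonal, the cost per iteration is O(n) in storage and arithmetic, which is consistent with the abstract's requirement that the preconditioner be computationally cheap.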