Scaled memoryless symmetric rank one method for large-scale unconstrained optimization
Leong, Wah June and Abu Hassan, Malik (2008) Scaled memoryless symmetric rank one method for large-scale unconstrained optimization. In: 4th Sino-Japanese Optimization Meeting (SJOM 2008), 27-31 Aug. 2008, Tainan, Taiwan. pp. 1-10.
The memoryless quasi-Newton method is precisely the quasi-Newton method in which the initial approximation to the inverse Hessian is taken, at each step, to be the identity matrix. Hence the memoryless quasi-Newton direction can be computed without storing any matrices, i.e. without the n^2 storage locations a full matrix would require. In this paper, a scaled memoryless symmetric rank-one (SR1) method for solving very large (with dimensions up to 10^6) unconstrained optimization problems is presented. The idea is to incorporate the SR1 update within the framework of the memoryless quasi-Newton method. However, it is well known that the SR1 update may not preserve positive definiteness even when updating a positive definite matrix. Therefore, instead of the identity matrix, the memoryless SR1 method is started from a positive multiple of the identity (hence our choice of terminology). The scaling factor is derived so as to improve the conditioning and preserve the positive definiteness of the scaled memoryless SR1 update. Under very mild conditions it is shown that, for strictly convex objective functions, the method is globally and R-linearly convergent. Numerical results show that the scaled memoryless SR1 method is very encouraging.
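The matrix-free search direction described above can be sketched as follows. Starting from a scaled identity theta*I, one SR1 update with the step pair (s, y) gives H = theta*I + v v^T / (v^T y) with v = s - theta*y, so the direction d = -H g needs only vector operations. This is a minimal illustrative sketch, not the authors' implementation: the abstract does not state the exact scaling factor they derive, so `theta` is left as an input (any positive value), and the function name and the denominator safeguard are assumptions.

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, theta):
    # Memoryless SR1 direction: d = -H g where
    #   H = theta*I + v v^T / (v^T y),  v = s - theta*y.
    # Computed with O(n) storage only; no n x n matrix is formed.
    # `theta` is a positive scaling factor (the paper derives a specific
    # choice to preserve positive definiteness; here it is just an input).
    v = s - theta * y
    denom = v @ y
    # Safeguard (an assumption, in the spirit of standard SR1 skipping
    # rules): fall back to the scaled steepest-descent direction when
    # the SR1 denominator is too small to be numerically reliable.
    if abs(denom) < 1e-12 * np.linalg.norm(v) * np.linalg.norm(y):
        return -theta * g
    return -theta * g - ((v @ g) / denom) * v
```

For a small problem the result agrees with forming H explicitly and computing -H @ g, which confirms the vector-only formula.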