Abstract
Gradient methods are popular because they require only the gradient of the objective function. On the other hand, they can be very slow when the objective function is ill-conditioned. One possible reason for this inefficiency is that the steplength has been chosen by a fixed criterion that aims only at reducing the function value, which leads to a stable dynamical system with slow convergence. To overcome this, we propose a new gradient method with multiple damping, which works on the objective function and the norm of the gradient vector simultaneously. That is, the proposed method combines damping with line search strategies: an individual adaptive parameter is introduced to damp the gradient vector, while line searches are used to reduce the function value. Global convergence of the proposed method is established under both backtracking and nonmonotone line searches. Finally, numerical results show that the proposed algorithm performs better than some well-known CG-based methods.
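To make the idea concrete, here is a minimal sketch of a gradient method in which the search direction is a damped gradient and the step length comes from Armijo backtracking on the function value, as the abstract describes. The abstract does not specify the paper's adaptive multiple-damping rule, so the function name `damped_gradient_method`, the scalar parameter `mu`, its gradient-norm-based update, and the test problem are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def damped_gradient_method(f, grad, x0, tol=1e-6, max_iter=5000,
                           rho=0.5, c1=1e-4):
    """Damped-gradient iteration with Armijo backtracking.

    The search direction is the negative gradient scaled by a damping
    parameter mu; the step length is chosen by backtracking on the
    function value. The update rule for mu below is a hypothetical
    placeholder, not the paper's adaptive multiple-damping rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    mu = 1.0  # damping parameter (illustrative initial value)
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        d = -g / mu  # damped steepest-descent direction
        fx, gtd, alpha = f(x), float(g @ d), 1.0
        # Backtracking (Armijo) line search: reduce the function value.
        while alpha > 1e-12 and f(x + alpha * d) > fx + c1 * alpha * gtd:
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        # Placeholder damping update: mu tracks the change in the
        # gradient norm, so the iteration also acts on ||g||.
        mu = max(1e-8, mu * np.linalg.norm(g_new) / gnorm)
        g = g_new
    return x

# Ill-conditioned quadratic test: f(x) = 0.5 x^T A x with A = diag(1, 100).
A = np.diag([1.0, 100.0])
x_star = damped_gradient_method(lambda x: 0.5 * x @ A @ x,
                                lambda x: A @ x,
                                np.array([1.0, 1.0]))
print(x_star)  # close to the minimizer (0, 0)
```

Because the direction remains a (scaled) negative gradient, the backtracking line search guarantees descent regardless of how `mu` is updated; the paper's contribution lies in choosing the damping adaptively so that the gradient norm is controlled alongside the function value.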
Full text not available from this repository.
Official URL or Download Paper: https://link.springer.com/article/10.1007/s11590-018-1247-9
Additional Metadata
| Item Type: | Article |
| ---|--- |
| Divisions: | Institute for Mathematical Research |
| DOI Number: | https://doi.org/10.1007/s11590-018-1247-9 |
| Publisher: | Springer |
| Keywords: | Gradient method; Backtracking line search; Nonmonotone line search; Multiple damping; Large-scale optimization |
| Depositing User: | Nurul Ainie Mokhtar |
| Date Deposited: | 31 May 2023 02:00 |
| Last Modified: | 31 May 2023 02:00 |
| Altmetrics: | http://www.altmetric.com/details.php?domain=psasir.upm.edu.my&doi=10.1007/s11590-018-1247-9 |
| URI: | http://psasir.upm.edu.my/id/eprint/80005 |