Citation
Zulkifli, Munierah, Abd Rahmin, Nor Aliza and Wah, June Leong (2023). Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems. Menemui Matematik, 45(2), pp. 224-230. ISSN 2231-7023.
Abstract
Stochastic gradient descent (SGD) is one of the most common algorithms for solving large unconstrained optimization problems. It builds on the classical gradient descent method, modifying how the gradient is selected: instead of the full gradient, SGD computes the gradient from random samples or mini-batches of the data. It is an iterative algorithm with descent properties that reduces computational cost by using derivatives of randomly chosen data points. This paper proposes a new SGD algorithm with a modified stepsize that employs a function scaling strategy. In particular, the stepsize parameter is coupled with function scaling by storing the mean of the gradients in the denominator. The performance of the method is evaluated on its ability to reduce the function value after each iteration and to attain the lowest function value when applied to the well-known zebra-strip problem. Our results indicate that the proposed method performs favourably compared with the existing method.
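The abstract only outlines the update rule; the exact formula is given in the paper itself. The following is a minimal illustrative sketch in Python, assuming the stepsize is the base learning rate divided by the running mean of stochastic gradient norms. The function sgd_mean_grad and the least-squares usage below are hypothetical, not the authors' code.

    import numpy as np

    def sgd_mean_grad(grad, x0, data, base_lr=0.1, epochs=20, eps=1e-8):
        """SGD sketch with a stepsize scaled by the running mean of
        stochastic gradient norms (assumed form of the paper's
        mean-gradient adaptive stepsize)."""
        x = np.asarray(x0, dtype=float).copy()
        norm_sum, steps = 0.0, 0
        for _ in range(epochs):
            for i in np.random.permutation(len(data)):
                g = grad(x, data[i])          # gradient at one random sample
                steps += 1
                norm_sum += np.linalg.norm(g)
                mean_norm = norm_sum / steps  # running mean of gradient norms
                # stepsize shrinks when recent gradients have been large
                x -= (base_lr / (mean_norm + eps)) * g
        return x

    # Hypothetical usage: stochastic least squares, one (a_i, b_i) pair per sample.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 5))
    x_true = rng.normal(size=5)
    b = A @ x_true + 0.01 * rng.normal(size=200)
    samples = list(zip(A, b))
    grad_ls = lambda x, s: 2.0 * (s[0] @ x - s[1]) * s[0]  # gradient of (a.x - b)^2
    x_hat = sgd_mean_grad(grad_ls, np.zeros(5), samples)
    print(np.linalg.norm(x_hat - x_true))

Dividing by the mean gradient norm makes the effective step roughly scale-invariant in the objective, which is one plausible reading of the "function scaling" strategy described above.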
Additional Metadata
Item Type: Article
Divisions: Faculty of Science
Publisher: Persatuan Sains Matematik Malaysia
Keywords: Optimization; Large-scale optimization; Binary classification; Stochastic gradient method; Adaptive stepsize; Function scaling; Industry; Innovation and infrastructure
Depositing User: Mohamad Jefri Mohamed Fauzi
Date Deposited: 04 Sep 2024 03:28
Last Modified: 04 Sep 2024 03:28
URI: http://psasir.upm.edu.my/id/eprint/110372