Citation
Yeong, Lin Koay and Hong, Seng Sim and Yong, Kheng Goh and Sing, Yee Chua and Wah, June Leong
(2024)
Optimising neural network training efficiency through spectral parameter-based multiple adaptive learning rates.
In: The 7th International Conference on Computational Intelligence and Intelligent Systems 2024 (CIIS 2024), 22-24 Nov. 2024, Japan.
Abstract
The process of training neural networks relies heavily on solving optimization problems. Most optimization algorithms use a fixed learning rate or a simplified adaptive updating scheme in every iteration. In this paper, we propose a stochastic gradient descent method with multiple adaptive learning rates (MAdaGrad) and Adam with multiple adaptive learning rates (MAdaGrad Adam). The proposed algorithms update the learning rates in every iteration based on the approximated spectrum of the Hessian of the loss function. The methods are compared with existing optimization methods in machine learning, namely the stochastic gradient descent method (SGD) and Adam. Selected datasets are used to evaluate the performance of the proposed methods. The proposed algorithms are used to train neural networks with different hidden layer sizes and different numbers of neurons. The numerical results show that the proposed methods perform better than SGD and Adam.
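The abstract describes learning rates that are re-estimated each iteration from an approximation of the Hessian's spectrum. The sketch below is not the paper's MAdaGrad or MAdaGrad Adam update; it only illustrates the general idea of a spectral step size using a Barzilai-Borwein-type curvature estimate on a toy quadratic loss. The function names, the quadratic objective, and all constants are assumptions made for demonstration.

```python
import numpy as np

def loss_and_grad(x, A, b):
    """Toy quadratic loss f(x) = 0.5 * x^T A x - b^T x and its gradient."""
    return 0.5 * x @ A @ x - b @ x, A @ x - b

def spectral_step_descent(A, b, x0, n_iter=50, eps=1e-12):
    """Gradient descent whose step size alpha_k = (s^T s) / (s^T y) is the
    reciprocal of a Rayleigh-quotient estimate of the Hessian's curvature
    along the most recent displacement (a spectral parameter)."""
    x = x0.copy()
    _, g = loss_and_grad(x, A, b)
    alpha = 1e-3                        # initial guess before any curvature info
    for _ in range(n_iter):
        x_new = x - alpha * g
        _, g_new = loss_and_grad(x_new, A, b)
        s, y = x_new - x, g_new - g     # displacement and gradient change
        denom = s @ y
        if abs(denom) > eps:            # refresh the spectral step estimate
            alpha = (s @ s) / denom
        x, g = x_new, g_new
    return x

# Usage: minimise a small strongly convex quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)                 # symmetric positive definite Hessian
b = rng.standard_normal(5)
x_star = spectral_step_descent(A, b, np.zeros(5))
print("gradient norm at solution:", np.linalg.norm(A @ x_star - b))
```

In this hypothetical sketch a single global step size is adapted; the paper's "multiple adaptive learning rates" presumably maintain several such spectral estimates (e.g. per layer or parameter group), but those details are not given in the abstract.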
Download File
121559.pdf (Text, Accepted Version, 6MB). Restricted to Repository staff only.
Additional Metadata
| Item Type: | Conference or Workshop Item (Oral/Paper) |
| Divisions: | Faculty of Science |
| Publisher: | Association for Computing Machinery |
| Keywords: | Stochastic gradient descent algorithm; Variations; Adaptive learning rates; Neural networks |
| Depositing User: | Mr. Mohamad Syahrul Nizam Md Ishak |
| Date Deposited: | 06 Nov 2025 03:07 |
| Last Modified: | 06 Nov 2025 03:07 |
| URI: | http://psasir.upm.edu.my/id/eprint/121559 |