UPM Institutional Repository

Recent advances in meta-heuristic algorithms for training multilayer perceptron neural networks


Citation

Al-Asaady, Maher Talal and Mohd Aris, Teh Noranis and Mohd Sharef, Nurfadhlina and Hamdan, Hazlina (2025) Recent advances in meta-heuristic algorithms for training multilayer perceptron neural networks. International Journal on Informatics Visualization, 9 (2). pp. 658-673. ISSN 2549-9610; eISSN 2549-9904

Abstract

Artificial Neural Networks (ANNs) have demonstrated applicability and effectiveness in several domains, including classification tasks. Research has therefore focused on training techniques that identify appropriate weights and biases for ANNs. However, conventional techniques such as Gradient Descent (GD) and Backpropagation (BP) often suffer from premature convergence, dependence on initial parameters, and susceptibility to local optima, which limits their efficiency on complex, high-dimensional problems. Meta-heuristic algorithms (MHAs) offer a promising practical alternative for training ANNs, providing global search capability, robustness, and improved computational efficiency. Despite the growing use of MHAs, existing studies often focus on specific subsets of algorithms or narrow application domains, leaving a gap in understanding their overall potential and comparative performance across diverse classification tasks. This paper addresses this gap by presenting a systematic review of advances in training Multilayer Perceptron (MLP) neural networks using MHAs, analyzing 53 publications from 2014 to 2024 drawn from four widely used databases: ScienceDirect, Scopus, Springer, and IEEE Xplore. Key contributions include a comparative analysis of evolutionary, swarm intelligence, physics-based, and human-inspired algorithms, as well as hybrid approaches, benchmarked on classification datasets. The study also highlights bibliometric trends, identifies underexplored areas such as adaptive and hybrid algorithms, and emphasizes the practical application of MHAs in optimizing ANN performance. This work serves as a resource for researchers, facilitating the identification of effective optimization methodologies and bridging the gap between theoretical advances and real-world applications.
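To make the surveyed idea concrete, the sketch below shows the general recipe common to MHA-based MLP training: flatten the network's weights and biases into a single parameter vector, score each candidate vector by its classification error, and let a population-based search improve the candidates without using gradients. This is a generic illustration only, not code from the reviewed paper; the toy two-blob dataset, the one-hidden-layer network size, and the simple (mu + lambda) evolution strategy settings are all assumptions chosen for brevity.

# Generic illustration: training a one-hidden-layer MLP classifier with a
# simple (mu + lambda) evolution strategy instead of backpropagation.
# Dataset, network size, and ES settings are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class dataset: two Gaussian blobs in 2-D (assumed for illustration).
X = np.vstack([rng.normal(-1.0, 0.7, (100, 2)), rng.normal(1.0, 0.7, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

n_in, n_hidden, n_out = 2, 8, 2
n_params = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out  # weights + biases

def unpack(w):
    """Split a flat parameter vector into the MLP's weight matrices and bias vectors."""
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def fitness(w):
    """Fitness = training classification error (lower is better); no gradients used."""
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)      # forward pass only
    logits = hidden @ W2 + b2
    return np.mean(logits.argmax(axis=1) != y)

# (mu + lambda) evolution strategy: mutate parents with Gaussian noise,
# then keep the best mu candidates from parents and offspring combined.
mu, lam, sigma, generations = 5, 20, 0.3, 200
population = [rng.normal(0.0, 0.5, n_params) for _ in range(mu)]

for gen in range(generations):
    parent_idx = rng.integers(0, mu, size=lam)
    offspring = [population[i] + rng.normal(0.0, sigma, n_params) for i in parent_idx]
    candidates = population + offspring
    candidates.sort(key=fitness)       # derivative-free selection
    population = candidates[:mu]

print(f"final training error: {fitness(population[0]):.3f}")

Swapping the mutation-and-selection loop for particle swarm, differential evolution, or any other MHA changes only how new candidate vectors are proposed; the weight encoding and the fitness function stay the same, which is why the comparisons surveyed in the paper can share a common benchmarking setup.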


Download File

118626.pdf - Published Version (4MB)
Available under License Creative Commons Attribution Share Alike.
Official URL: http://joiv.org/index.php/joiv/article/view/3109

Additional Metadata

Item Type: Article
Divisions: Faculty of Computer Science and Information Technology
DOI Number: https://doi.org/10.62527/joiv.9.2.3109
Publisher: Politeknik Negeri Padang
Keywords: Artificial neural network training; Classification; Meta-heuristic algorithms; Multilayer perceptron; Optimization
Depositing User: Ms. Che Wa Zakaria
Date Deposited: 21 Jul 2025 00:17
Last Modified: 21 Jul 2025 00:17
Altmetrics: http://www.altmetric.com/details.php?domain=psasir.upm.edu.my&doi=10.62527/joiv.9.2.3109
URI: http://psasir.upm.edu.my/id/eprint/118626
