Publication:
Advances of metaheuristic algorithms in training neural networks for industrial applications

dc.citedby15
dc.contributor.authorChong H.Y.en_US
dc.contributor.authorYap H.J.en_US
dc.contributor.authorTan S.C.en_US
dc.contributor.authorYap K.S.en_US
dc.contributor.authorWong S.Y.en_US
dc.contributor.authorid55654589300en_US
dc.contributor.authorid35319362200en_US
dc.contributor.authorid7403366395en_US
dc.contributor.authorid24448864400en_US
dc.contributor.authorid55812054100en_US
dc.date.accessioned2023-05-29T09:06:33Z
dc.date.available2023-05-29T09:06:33Z
dc.date.issued2021
dc.descriptionBackpropagation; Gradient methods; Neural networks; Artificial neural network models; Complex applications; Exploration and exploitation; Gradient-based learning; Industry applications; Meta heuristic algorithm; Meta-heuristic search algorithms; Near-optimal solutions; Optimizationen_US
dc.description.abstractIn recent decades, research on optimizing the parameters of artificial neural network (ANN) models has attracted significant attention from researchers. Hybridization of superior algorithms helps improve optimization performance and makes it possible to solve complex applications. When trained with traditional gradient-based learning techniques such as gradient descent (GD) and the back-propagation (BP) algorithm, an ANN suffers from a slow learning rate and is easily trapped in local minima. The randomization and best- or near-optimal-solution selection characteristics of metaheuristic algorithms provide an effective and robust alternative; they have therefore long been used in ANN training to overcome these problems. New metaheuristic algorithms are proposed every year, so a review of the latest developments is essential. This article summarizes the metaheuristic algorithms proposed from 1975 to 2020 in various journals, conference proceedings, technical papers, and books. The popularity of the metaheuristic algorithms is compared across two time frames: algorithms proposed in the recent 20 years and those proposed earlier. Some of the popular metaheuristic algorithms and their working principles are then reviewed. This article further categorizes the latest metaheuristic search algorithms in the literature to indicate their efficiency in training ANNs for various industrial applications. More and more researchers tend to develop new hybrid optimization tools by combining two or more metaheuristic algorithms to optimize the training parameters of ANNs. Generally, to perform optimally, an algorithm must achieve a fine balance between its exploration and exploitation characteristics.
Hence, this article compares and summarizes the properties of various metaheuristic algorithms in terms of their convergence rate and their ability to avoid local minima. This information is useful for researchers working on algorithm hybridization, as it provides a good understanding of the convergence rate and the ability to find a global optimum. © 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.en_US
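The abstract contrasts gradient-based training, which can stall in local minima, with metaheuristic training, which balances exploration (randomized moves) and exploitation (keeping the best solution found). The following minimal sketch, not taken from the article, illustrates that idea with a hypothetical (1+1)-style metaheuristic that trains the weights of a tiny hand-coded network on XOR data by random perturbation instead of gradients; all names, the network shape, and the perturbation scheme are illustrative assumptions.

```python
import math
import random

# Hypothetical 2-2-1 network: tanh hidden units, sigmoid output.
# 'weights' holds 9 values: 2x(2 inputs + bias) hidden, (2 + bias) output.
def predict(weights, x1, x2):
    w = weights
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return 1.0 / (1.0 + math.exp(-(w[6] * h1 + w[7] * h2 + w[8])))

# Toy XOR dataset (illustrative, not from the article).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def loss(weights):
    # Sum of squared errors over the dataset.
    return sum((predict(weights, x1, x2) - y) ** 2 for (x1, x2), y in DATA)

def metaheuristic_train(iters=20000, step=0.5, seed=0):
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(9)]
    best_loss = loss(best)
    for _ in range(iters):
        # Exploration: propose a random Gaussian perturbation of the weights.
        cand = [w + rng.gauss(0, step) for w in best]
        cand_loss = loss(cand)
        # Exploitation: keep the candidate only if it improves the loss,
        # so the search never moves away from the best solution found.
        if cand_loss < best_loss:
            best, best_loss = cand, cand_loss
    return best, best_loss
```

Because the update rule needs only loss values, never gradients, the same loop works for non-differentiable or noisy objectives; the `step` parameter controls how aggressively the search explores versus exploits, mirroring the exploration/exploitation trade-off the abstract emphasizes.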
dc.description.natureFinalen_US
dc.identifier.doi10.1007/s00500-021-05886-z
dc.identifier.epage11233
dc.identifier.issue16
dc.identifier.scopus2-s2.0-85106739708
dc.identifier.spage11209
dc.identifier.urihttps://www.scopus.com/inward/record.uri?eid=2-s2.0-85106739708&doi=10.1007%2fs00500-021-05886-z&partnerID=40&md5=a2425a1c29c407743765bd946157b208
dc.identifier.urihttps://irepository.uniten.edu.my/handle/123456789/26073
dc.identifier.volume25
dc.publisherSpringer Science and Business Media Deutschland GmbHen_US
dc.sourceScopus
dc.sourcetitleSoft Computing
dc.titleAdvances of metaheuristic algorithms in training neural networks for industrial applicationsen_US
dc.typeArticleen_US
dspace.entity.typePublication