TY - JOUR
N1 - cited By 3
SP - 377
TI - Hyperparameter Optimization of Evolving Spiking Neural Network for Time-Series Classification
AV - none
EP - 397
PB - Springer Japan
SN - 02883635
N2 - Spiking neural networks (SNN) are the third generation of artificial neural networks, based on a brain-inspired computational model. A spiking neural network encodes and processes neural information through precisely timed spike trains. The evolving spiking neural network (eSNN) is an enhanced version of SNN, motivated by the principles of the Evolving Connectionist System (ECoS), and is a relatively new classifier in the neural information processing area. The performance of eSNN is highly influenced by the values of its significant hyperparameters: the modulation factor (mod), threshold factor (c), and similarity factor (sim). In contrast to manual tuning of hyperparameters, automated tuning is more reliable. Therefore, this research presents an optimizer-based eSNN architecture intended to solve the issue of selecting optimum hyperparameter values for eSNN. The proposed model is named eSNN-SSA, where SSA stands for the salp swarm algorithm, a metaheuristic optimization technique integrated with the eSNN architecture. For the integration of eSNN-SSA, Thorpe's standard model of eSNN is used with population rate encoding. To examine the performance of eSNN-SSA, various benchmarking data sets from the UCR/UEA time-series classification repository are utilized. From the experimental results, it is concluded that the salp swarm algorithm plays an effective role in improving the flexibility of the eSNN. The proposed eSNN-SSA offers solutions to overcome the disadvantages of eSNN in determining the best number of pre-synaptic neurons for time-series classification problems. The classification accuracies obtained by eSNN-SSA on the spoken Arabic digits, articulatory word recognition, character trajectories, wafer, and GunPoint datasets were 0.96, 0.97, 0.94, 1.0, and 0.94, respectively. The proposed approach also outperformed standard eSNN in terms of time complexity. © 2022, Ohmsha, Ltd. and Springer Japan KK, part of Springer Nature.
IS - 1
ID - scholars16924
KW - Benchmarking; Classification (of information); Encoding (symbols); Network architecture; Optimization; Swarm intelligence; Time series
KW - ESNN; Hyper-parameter; Hyper-parameter optimizations; Neural-networks; Optimisations; Performance; Salp swarms; Swarm algorithms; Third generation; Time series classifications
KW - Neural networks
A1 - Ibad, T.
A1 - Abdulkadir, S.J.
A1 - Aziz, N.
A1 - Ragab, M.G.
A1 - Al-Tashi, Q.
JF - New Generation Computing
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-85127539295&doi=10.1007%2fs00354-022-00165-3&partnerID=40&md5=1292a91ea1f1dcb8a29a68bbe48db037
VL - 40
Y1 - 2022///
ER -