Abstract
Nine population-based optimization algorithms, six sports-inspired and three classical nature-inspired, are applied in this work to neural network hyperparameter tuning. The sports-inspired algorithms, namely the Most Valuable Player Algorithm (MVPA), Soccer League Champions Algorithm (SLCA), Golden Ball Algorithm (GBA), Tiki-Taka Algorithm (TTA), Tug of War Optimization (TOW), and World Cup Optimization (WCO), are compared against classical metaheuristics, namely Particle Swarm Optimization (PSO), Differential Evolution (DE), and the Genetic Algorithm (GA). Each optimizer was evaluated under identical experimental conditions on a neural network classifier using the Iris dataset, optimizing four key hyperparameters: the number of hidden layers, neurons per layer, learning rate, and L2 regularization coefficient. Owing to its hierarchical competition structure, MVPA achieved the highest classification accuracy (99.33%), followed by SLCA (98.67%), though at a higher computational cost. GBA and PSO performed strongly with efficient convergence, while DE, TOW, and WCO reached somewhat lower accuracies of ~97–98% but converged quickly, making them well suited to time-critical applications. The results show that sports-inspired metaheuristics are robust and effective frameworks for neural network hyperparameter optimization: their competition-driven strategic mechanisms enable efficient search behavior that is on par with, and occasionally better than, that of conventional evolutionary algorithms. These results provide a solid basis for future hybrid, adaptive sports-inspired algorithms.
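The search procedure the abstract describes, a population of candidate hyperparameter settings refined through competition, can be sketched in pure Python. Note the assumptions: the fitness function below is a synthetic surrogate standing in for validation accuracy (the study itself trains an Iris classifier for each candidate), the search-space bounds are illustrative rather than the paper's exact ranges, and the winner/loser update is a generic tournament-style loop, not any one of the six named algorithms.

```python
import random

# Illustrative bounds for the four hyperparameters tuned in the study
# (assumed ranges, not the paper's exact configuration).
SPACE = {
    "hidden_layers": (1, 3),        # integer count of hidden layers
    "neurons":       (4, 64),       # integer neurons per layer
    "learning_rate": (1e-4, 1e-1),  # continuous
    "l2":            (1e-6, 1e-2),  # continuous regularization coefficient
}

def random_candidate():
    """Sample one hyperparameter configuration uniformly from SPACE."""
    return {
        "hidden_layers": random.randint(*SPACE["hidden_layers"]),
        "neurons":       random.randint(*SPACE["neurons"]),
        "learning_rate": random.uniform(*SPACE["learning_rate"]),
        "l2":            random.uniform(*SPACE["l2"]),
    }

def fitness(c):
    """Synthetic surrogate for validation accuracy: peaks near
    32 neurons and a 0.01 learning rate. In the study this would be
    the trained classifier's accuracy on held-out Iris data."""
    return -(abs(c["neurons"] - 32) / 64 + abs(c["learning_rate"] - 0.01))

def tournament_search(pop_size=20, generations=30, seed=0):
    """Generic competition-style loop: the fitter half survives and
    spawns perturbed children that replace the losing half."""
    random.seed(seed)
    pop = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        for parent in elite:
            child = dict(parent)
            # Perturb two of the four hyperparameters, clipped to bounds
            # (kept minimal here; a full method would perturb all four).
            child["neurons"] = min(64, max(4, child["neurons"] + random.randint(-4, 4)))
            child["learning_rate"] = min(0.1, max(1e-4,
                child["learning_rate"] * random.uniform(0.5, 1.5)))
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = tournament_search()
print(best)
```

Replacing the surrogate `fitness` with k-fold cross-validated accuracy of an actual network, as the evaluated optimizers do, changes only the objective; the competition loop itself is unaffected.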
References
Liao, Lizhi, Heng Li, Weiyi Shang, and Lei Ma. "An Empirical Study of the Impact of Hyperparameter Tuning and Model Optimization on the Performance Properties of Deep Neural Networks." ACM Transactions on Software Engineering and Methodology (TOSEM) 31, no. 3 (2022): 1-40.
Sarsa, Sami, Juho Leinonen, and Arto Hellas. "Empirical Evaluation of Deep Learning Models for Knowledge Tracing: Of Hyperparameters and Metrics on Performance and Replicability." arXiv preprint arXiv:2112.15072 (2021).
Ying, Hejie, Mengmeng Song, Yaohong Tang, Shungen Xiao, and Zimin Xiao. "Enhancing Deep Neural Network Training Efficiency and Performance Through Linear Prediction." Scientific Reports 14, no. 1 (2024): 15197.
Narayanan, Ramachandran, and Narayanan Ganesh. "A Comprehensive Review of Metaheuristics for Hyperparameter Optimization in Machine Learning." Metaheuristics for Machine Learning: Algorithms and Applications (2024): 37-72.
Bacanin, Nebojsa, Catalin Stoean, Miodrag Zivkovic, Miomir Rakic, Roma Strulak-Wójcikiewicz, and Ruxandra Stoean. "On the Benefits of Using Metaheuristics in the Hyperparameter Tuning of Deep Learning Models for Energy Load Forecasting." Energies 16, no. 3 (2023): 1434.
Tian, Zhirui, and Mei Gai. "Football Team Training Algorithm: A Novel Sport-Inspired Meta-Heuristic Optimization Algorithm for Global Optimization." Expert Systems with Applications 245 (2024): 123088.
Alhijawi, Bushra, and Arafat Awajan. "Genetic Algorithms: Theory, Genetic Operators, Solutions, and Applications." Evolutionary Intelligence 17, no. 3 (2024): 1245-1256.
Dhar, Sandipan, Anuvab Sen, Aritra Bandyopadhyay, Nanda Dulal Jana, Arjun Ghosh, and Zahra Sarayloo. "Differential Evolution Algorithm Based Hyper-Parameters Selection of Convolutional Neural Network for Speech Command Recognition." arXiv preprint arXiv:2310.08914 (2023).
Worawattawechai, Tanawat, Boonyarit Intiyot, Chawalit Jeenanunta, and William G. Ferrell Jr. "A Learning Enhanced Golden Ball Algorithm for the Vehicle Routing Problem with Backhauls and Time Windows." Computers & Industrial Engineering 168 (2022): 108044.
Bouchekara, H. R. E. H. "Most Valuable Player Algorithm: A Novel Optimization Algorithm Inspired from Sport." Operational Research 20, no. 1 (2020): 139-195.
Moosavian, Naser, and Babak Kasaee Roodsari. "Soccer League Competition Algorithm, A New Method for Solving Systems of Nonlinear Equations." International Journal of Intelligence Science 4, no. 1 (2013): 7.
Ab. Rashid, Mohd Fadzil Faisae. "Tiki-taka Algorithm: A Novel Metaheuristic Inspired by Football Playing Style." Engineering Computations 38, no. 1 (2021): 313-343.
Kaveh, Ali. "Tug of War Optimization." In Advances in Metaheuristic Algorithms for Optimal Design of Structures, 467-503. Cham: Springer International Publishing, 2021.
Razmjooy, Navid, Mohsen Khalilpour, and Mehdi Ramezani. "A New Meta-Heuristic Optimization Algorithm Inspired by FIFA World Cup Competitions: Theory and Its Application in PID Designing for AVR System." Journal of Control, Automation and Electrical Systems 27, no. 4 (2016): 419-440.
Dahl, George E., Frank Schneider, Zachary Nado, Naman Agarwal, Chandramouli Shama Sastry, Philipp Hennig, Sourabh Medapati et al. "Benchmarking Neural Network Training Algorithms." arXiv preprint arXiv:2306.07179 (2023).
Syaharuddin, Syaharuddin, Fatmawati Fatmawati, and Herry Suprajitno. "The Formula Study in Determining the Best Number of Neurons in Neural Network Backpropagation Architecture with Three Hidden Layers." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 6, no. 3 (2022): 397-402.
Sharkawy, Abdel-Nasser. "The Effect of Increasing Hidden Layers on the Performance of the Deep Neural Network: Modelling, Investigation, and Evaluation." (2024).
Sun, Ruo-Yu. "Optimization for Deep Learning: An Overview." Journal of the Operations Research Society of China 8, no. 2 (2020): 249-294.
Iiduka, Hideaki. "Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks." IEEE Transactions on Cybernetics 52, no. 12 (2021): 13250-13261.
Yang, Mei, Ming K. Lim, Yingchi Qu, Xingzhi Li, and Du Ni. "Deep Neural Networks with L1 and L2 Regularization for High Dimensional Corporate Credit Risk Prediction." Expert Systems with Applications 213 (2023): 118873.
Raiaan, Mohaimenul Azam Khan, Sadman Sakib, Nur Mohammad Fahad, Abdullah Al Mamun, Md Anisur Rahman, Swakkhar Shatabda, and Md Saddam Hossain Mukta. "A Systematic Review of Hyperparameter Optimization Techniques in Convolutional Neural Networks." Decision Analytics Journal 11 (2024): 100470.
El-Hassani, Fatima Zahrae, Meryem Amri, Nour-Eddine Joudar, and Khalid Haddouch. "A New Optimization Model for MLP hyperparameter Tuning: Modeling and Resolution by Real-Coded Genetic Algorithm." Neural Processing Letters 56, no. 2 (2024): 105.
Salem, Hend S., Mohamed A. Mead, and Ghada S. El-Taweel. "Particle Swarm Optimization-Based Hyperparameters Tuning of Machine Learning Models for Big COVID-19 Data Analysis." Journal of Computer and Communications 12, no. 3 (2024): 160-183.
Wong, Tzu-Tsung, and Po-Yang Yeh. "Reliable Accuracy Estimates From K-Fold Cross Validation." IEEE Transactions on Knowledge and Data Engineering 32, no. 8 (2019): 1586-1594.
