Volume - 7 | Issue - 4 | December 2025
Published: 12 November 2025
Despite the state-of-the-art results obtained with transformer-based models such as AraBERT, Arabic NER remains highly sensitive to hyperparameter settings. Tuning is usually performed manually, a process that is inefficient, time-consuming, and often yields suboptimal performance. This paper proposes an automatic hyperparameter optimization framework based on two metaheuristic algorithms: Particle Swarm Optimization (PSO) and Teaching–Learning-Based Optimization (TLBO). This is the first comparative study of the efficiency of metaheuristic algorithms for optimizing Arabic NER. Both algorithms were applied to the Wojood dataset with the aubmindlab/bert-base-arabertv2 model to optimize the main hyperparameters: learning rate, batch size, and number of epochs. The results show that PSO outperforms TLBO and a traditional approach on test Micro-F1, achieving a maximum of 0.8813 compared with 0.8755 for TLBO. PSO also achieved a higher mean performance (0.8708 versus 0.8561) and better convergence stability (standard deviation of 0.0164 versus 0.0286). Although TLBO converged slightly faster than PSO, PSO demonstrated more robust generalization and reliability across runs. Overall, the findings confirm that metaheuristic optimization can substantially improve Arabic NER, with PSO standing out as the most effective, stable, and efficient optimizer for transformer-based Arabic NLP models.
Keywords: Artificial Intelligence; Optimization; Metaheuristic; Particle Swarm Optimization; Teaching–Learning-Based Optimization; Natural Language Processing; Named Entity Recognition; Machine Learning; Deep Neural Network
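To illustrate the kind of search described in the abstract, the following is a minimal PSO sketch for tuning the three hyperparameters named there (learning rate, batch size, number of epochs). The search bounds and the evaluate function are illustrative assumptions, not the paper's actual setup: in the paper's setting, evaluate would fine-tune aubmindlab/bert-base-arabertv2 on Wojood and return the Micro-F1 of the candidate configuration, whereas here it is a placeholder objective so the sketch runs end to end.

```python
# Minimal PSO sketch for tuning (learning_rate, batch_size, num_epochs).
# Bounds and the objective below are hypothetical placeholders; in practice
# evaluate() would fine-tune the NER model and return its Micro-F1 score.
import random

BOUNDS = [(1e-5, 5e-5),   # learning rate (continuous)
          (8, 32),        # batch size (rounded to an integer)
          (2, 6)]         # number of epochs (rounded to an integer)

def evaluate(position):
    """Hypothetical objective: replace with fine-tune + eval returning Micro-F1."""
    lr, bs, ep = position[0], round(position[1]), round(position[2])
    # Placeholder score so the sketch runs end to end; higher is better.
    return -((lr - 3e-5) ** 2 * 1e9 + (bs - 16) ** 2 * 1e-3 + (ep - 4) ** 2 * 1e-2)

def pso(n_particles=6, n_iters=10, w=0.7, c1=1.5, c2=1.5):
    # Initialise particles uniformly inside the bounds with zero velocity.
    positions = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    velocities = [[0.0] * len(BOUNDS) for _ in range(n_particles)]
    pbest = [p[:] for p in positions]
    pbest_val = [evaluate(p) for p in positions]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(n_iters):
        for i in range(n_particles):
            for d, (lo, hi) in enumerate(BOUNDS):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social terms.
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][d] - positions[i][d])
                                    + c2 * r2 * (gbest[d] - positions[i][d]))
                # Move the particle and clamp it back into the search bounds.
                positions[i][d] = min(hi, max(lo, positions[i][d] + velocities[i][d]))
            val = evaluate(positions[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = positions[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = positions[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    best, score = pso()
    print("best lr=%.2e, batch=%d, epochs=%d" % (best[0], round(best[1]), round(best[2])))
```

The velocity update follows the standard PSO formulation with an inertia weight and cognitive/social coefficients; the discrete hyperparameters (batch size and number of epochs) are handled here simply by rounding the continuous particle coordinates when they are evaluated.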

