Design of Improved Version of Sigmoidal Function with Biases for Classification Task in ELM Domain
How to Cite

Mugunthan, S. R., and T. Vijayakumar. 2021. “Design of Improved Version of Sigmoidal Function With Biases for Classification Task in ELM Domain”. Journal of Soft Computing Paradigm 3 (2): 70-82. https://doi.org/10.36548/jscp.2021.2.002.

Keywords

Extreme Learning Machine
Published: 25-05-2021

Abstract

Extreme Learning Machine (ELM) is a recent learning algorithm that achieves a good recognition rate with little computation time, which makes it suitable for fast-response applications built on feed-forward neural networks. In this research article, an ELM is designed with a sigmoidal activation function and biases in the hidden nodes to perform the classification task. Classification remains challenging for existing learning algorithms because large datasets increase computation time, and the randomly generated (stochastic) hidden-layer matrix degrades the learning rate and the robustness of the solution. To address these issues, a modified version of ELM is developed to obtain better accuracy and to minimize classification error. The article includes a mathematical proof for the sigmoidal activation function with biases in the hidden nodes, in which the hidden-layer output matrix is kept at full column rank in order to speed up the computation of the output weights (β). The proposed improved ELM delivers better accuracy and efficiency on both classification and regression problems, and the inclusion of the column-rank condition in the proof mitigates the slow training speed and over-fitting problems of existing learning approaches.
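For orientation, the baseline ELM pipeline the abstract builds on (random input weights and hidden-node biases, a sigmoidal hidden layer, and output weights β solved in closed form via the Moore-Penrose pseudoinverse) can be sketched as follows. This is a minimal illustration of the standard ELM, not the paper's modified variant; the function names and the NumPy implementation details are the author's own assumptions.

```python
import numpy as np

def sigmoid(z):
    # sigmoidal activation applied element-wise to the hidden layer
    return 1.0 / (1.0 + np.exp(-z))

def elm_train(X, T, n_hidden, seed=None):
    """Basic ELM training (illustrative sketch, not the paper's modified method).

    X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    Input weights W and hidden-node biases b are drawn at random and never
    tuned; only the output weights beta are solved, by least squares.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden-node biases
    H = sigmoid(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights (β)
    return W, b, beta

def elm_predict(X, W, b, beta):
    # forward pass: hidden-layer response times learned output weights
    return sigmoid(X @ W + b) @ beta
```

Because β is obtained in a single pseudoinverse step rather than by iterative gradient descent, training is fast; the paper's concern with the column rank of H reflects the fact that the quality and speed of this solve depend on H being well-conditioned.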

References

  1. G. B. Huang, L. Chen, “Enhanced random search based incremental extreme learning machine,” Neurocomputing, vol. 71, pp. 3460–3468, 2008.
  2. G. B. Huang, L. Chen, C. K. Siew, “Universal approximation using incremental constructive feedforward networks with random hidden nodes,” IEEE Transactions on Neural Networks, vol. 17, pp. 879–892, 2006.
  3. Hou, H.R.; Meng, Q.H.; Zhang, X.N. “A voting-near-extreme-learning-machine classification algorithm” In Proceedings of the 2018 24th International Conference on Pattern Recognition (ICPR), Beijing, China, 20–24 August 2018; pp. 237–241.
  4. J. Zhang, W. Xiao, and Y. Li, ``Residual compensation extreme learning machine for regression,'' Neurocomputing, vol. 311, pp. 126_136, Oct. 2018.
  5. Z. L. Sun, T. M. Choi, and K. F. Au, ``Sales forecasting using extreme learning machine with applications in fashion retailing,'' Decis. Support Syst., vol. 46, no. 1, pp. 411_419, 2009.
  6. Y. Lan, Y. C. Soh, and G.-B. Huang, ``Ensemble of online sequential extreme learning machine,'' Neurocomputing, vol. 72, nos. 13_15, pp. 3391_3395, Aug. 2009.
  7. J. Cao, Z. Lin, and G. B. Huang, ``Voting based extreme learning machine,'' Inf. Sci., vol. 185, no. 1, pp. 66_77, 2012.
  8. M. Heeswijk, Y. Miche, and T. Lindh-Knuutila, ``Adaptive ensemble models of extreme learning machines for time series prediction,'' in Proc. Int. Conf. Artif. Neural Netw., Berlin, Germany: Springer-Verlag, 2009, pp. 305_314.
  9. S. M. Yang, Y. L. Wang, and B. Sun, ``ELM weighted hybrid modelling and its online modi_cation,'' in Proc. Control Decis. Conf., 2016, pp. 3443_3448.
  10. J. Cao, S. Kwong, and R. Wang, ``Class-speci_c soft voting based multiple extreme learning machines ensemble,'' Neurocomputing, vol. 149, pp. 275_284, Feb. 2015.
  11. M. Rahhal, Y. Bazi, and N. Alajlan, ``Classi_cation of AAMI heartbeat classes with an interactive ELM ensemble learning approach,'' Biomed. Signal Process., Control, vol. 19, pp. 56_67, May 2015.
  12. Xu, X.; Li, S.; Liang, T.; Sun, T. Sample selection-based hierarchical extreme learning machine. Neurocomputing 2020, 377, 95–102.
  13. G. Cybenko, “Approximation by superposition of sigmoidal function,” Mathematics of Control, Signals and Systems, vol. 2, pp. 303–314, 1989.
  14. K. I. Funahashi, “On the approximate realization of continuous mappings by neural networks,” Neural Networks, vol. 2, pp. 183–192, 1989.
  15. K. Hornik, “Approximation capabilities of multilayer feedforward networks,” Neural Networks, vol. 4, pp. 251–257, 1991.
  16. Y. Lan, Y. C. Soh, G. B. Huang, “Random search enhancement of error minimized extreme learning machine,” European Symposium on Artificial Neural Networks, pp. 327–332, April 2010.
  17. F. L. Cao, S. B. Lin, Z. B. Xu, “Approximation capabilities of interpolation neural networks,” Neurocomputing, vol. 74, pp. 457–460, 2010.
  18. G. B. Huang, H. A. Babri, “Upper bounds on the number of hidden neurons in ffeedforward networks with arbitrary bounded nonlinear activation functions,” IEEE Transactions on Neural Networks, vol. 9, pp. 224–229, 1998.
  19. G. B. Huang, Q. Y. Zhu, C. K. Siew, “Extreme learning machine: theory and applications,” Neurocomputing, vol. 70, pp. 489–501, 2006.
  20. G. B. Huang, Q. Y. Zhu, C. K. Siew, “Extreme learning machine: a new learning scheme of feedforward neural networks,” Proceedings of 2004 IEEE International Joint Conference on Neural Networks, vol. 2, pp. 985–990, July 2004.
  21. P. L. Bartlett, “The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network,” IEEE Transactions on Information Theory, vol. 44, pp. 525–536, 1998.
  22. Cheng, C.; Tay, W.P.; Huang, G.B. Extreme learning machines for intrusion detection. In Proceedings of the 2012 International Joint Conference on Neural Networks (IJCNN), Brisbane, QLD, Australia, 10–15 June 2012; pp. 1–8.
  23. Singh, D.; Bedi, S.S. Multiclass ELM based smart trustworthy IDS for MANETs. Arab. J. Sci. Eng. 2016, 41, 3127–3137.
  24. Singh, R.; Kumar, H.; Singla, R.K. Performance analysis of an intrusion detection system using Panjab university intrusion dataSet. In Proceedings of the 2015 2nd International Conference on Recent Advances in Engineering & Computational Sciences (RAECS), Chandigarh, India, 21–22 December 2015; pp. 1–6.
  25. Manzoor, I.; Kumar, N. A feature reduced intrusion detection system using ANN classifier. Expert Syst. Appl. 2017, 88, 249–257.
  26. Zhang, M.; Guo, J.; Xu, B.; Gong, J. Detecting network intrusion using probabilistic neural network. In Proceedings of the 2015 11th International Conference on Natural Computation (ICNC), Zhangjiajie, China, 15–17 August 2015; pp. 1151–1158.
  27. Brown, J.; Anwar, M.; Dozier, G. “An evolutionary general regression neural network classifier for intrusion detection”. In Proceedings of the 2016 25th International Conference on Computer Communication and Networks (ICCCN),Waikoloa, HI, USA, 1–4 August 2016; pp. 1–6.
  28. M. Eshtay, H. Faris, and N. Obeid, “Improving Extreme Learning Machine by Competitive Swarm Optimization and its application for medical diagnosis problems,” Expert Syst. Appl., vol. 104, pp. 134–152, 2018.
  29. P. P. Das, R. Bisoi, and P. K. Dash, “Data decomposition based fast reduced kernel extreme learning machine for currency exchange rate forecasting and trend analysis,” Expert Syst. Appl., vol. 96, pp. 427– 449, 2018.
  30. P. Yuan, D. Chen, T. Wang, S. Cao, Y. Cai, and L. Xue, “A compensation method based on extreme learning machine to enhance absolute position accuracy for aviation drilling robot,” Adv. Mech. Eng., vol. 10, no. 3, p. 1687814018763411, 2018.
  31. G.-B. Huang, H. Zhou, X. Ding, and R. Zhang,“Extreme learning machine for regression and multiclass classification” IEEE Trans. Syst. Man. Cybern. B. Cybern., 2012.