A Novel Non-Dominated Sorting Dragonfly Optimization With Evolutionary Population Dynamics Based Multi-Objective Approach For Feature Selection Problems

Authors

  • Anitha G, Rosiline Jeetha B

Keywords

Feature selection; Dragonfly optimization; Multi-objective Optimization; Evolutionary Population Dynamics; Non-dominated Sorting

Abstract

Feature selection is a multi-objective problem with two conflicting objectives. It is an effective preprocessing step in classification that eliminates noisy, irrelevant, and redundant features to maximize classification accuracy while reducing the number of selected features. In this study, a multi-objective meta-heuristic approach to the feature selection problem is explored by combining a non-dominated sorting dragonfly algorithm with an evolutionary population dynamics strategy. First, to improve the quality of the non-dominated solutions, the evolutionary population dynamics strategy is integrated with heuristic natural selection operators. Second, a step-size updating strategy is designed to escape local optima, enrich population diversity, and maintain the balance between exploration and exploitation. Finally, Pareto-optimal solutions are obtained through the non-dominated sorting strategy, which makes the algorithm suitable for handling multi-objective feature selection problems. Simulations are performed on 18 datasets from the UCI repository, and the proposed NDSDA, NDSDA_EPD, and NDSDA_EPD_CM approaches are compared with existing dragonfly algorithms. The proposed algorithms outperform the other techniques by improving classification accuracy and reducing the number of selected features.
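To make the two-objective formulation concrete, the following minimal sketch (an illustration only, not the authors' NDSDA implementation) shows the Pareto-dominance test and non-dominated filtering that underlie the sorting step, with candidate feature subsets scored by two objectives to be minimized: classification error rate and number of selected features. The candidate values are hypothetical.

# Minimal sketch of Pareto dominance and non-dominated filtering for
# two-objective feature selection (both objectives are minimized:
# classification error rate and number of selected features).
# Illustration only; not the authors' NDSDA implementation.

def dominates(a, b):
    """Return True if solution a = (error, n_features) dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Return the Pareto front of a list of (error, n_features) tuples."""
    return [s for i, s in enumerate(solutions)
            if not any(dominates(o, s) for j, o in enumerate(solutions) if j != i)]

# Hypothetical candidate subsets evaluated as (error rate, number of features).
candidates = [(0.12, 14), (0.10, 20), (0.15, 8), (0.10, 25), (0.12, 10)]
print(non_dominated(candidates))  # -> [(0.10, 20), (0.15, 8), (0.12, 10)]

In the proposed NDSDA variants, this kind of dominance comparison drives the non-dominated sorting of the dragonfly population, while the evolutionary population dynamics step discards the worst-ranked solutions and repositions them around non-dominated leaders.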


References

Jiao, R., Nguyen, B. H., Xue, B., & Zhang, M. (2023). A survey on evolutionary multiobjective feature selection in classification: Approaches, applications, and challenges. IEEE Transactions on Evolutionary Computation.

Wang, J., Zhang, Y., Hong, M., He, H., & Huang, S. (2022). A self-adaptive level-based learning artificial bee colony algorithm for feature selection on high-dimensional classification. Soft Computing, 26(18), 9665-9687.

Yang, S., Gu, L., Li, X., Jiang, T., & Ren, R. (2020). Crop classification method based on optimal feature selection and hybrid CNN-RF networks for multi-temporal remote sensing imagery. Remote Sensing, 12(19), 3119.

Kumar, V., & Minz, S. (2014). Feature selection: a literature review. SmartCR, 4(3), 211-229.

Hennessy, A., Clarke, K., & Lewis, M. (2020). Hyperspectral classification of plants: A review of waveband selection generalisability. Remote Sensing, 12(1), 113.

Guo, A., Huang, W., Ye, H., Dong, Y., Ma, H., Ren, Y., & Ruan, C. (2020). Identification of wheat yellow rust using spectral and texture features of hyperspectral images. Remote Sensing, 12(9), 1419.

Bommert, A., Sun, X., Bischl, B., Rahnenfuhrer, J., & Lang, M. (2020). Benchmark for filter methods for feature selection in high-dimensional classification data. Computational Statistics & Data Analysis, 143, 106839.

Gonzalez, J., Ortega, J., Damas, M., Martín-Smith, P., & Gan, J. Q. (2019). A new multi-objective wrapper method for feature selection–Accuracy and stability analysis for BCI. Neurocomputing, 333, 407-418.

Cheng, F., Cui, J., Wang, Q., & Zhang, L. (2022). A Variable Granularity Search-Based Multiobjective Feature Selection Algorithm for High-Dimensional Data Classification. IEEE Transactions on Evolutionary Computation, 27(2), 266-280.

Alickovic, E., & Subasi, A. (2017). Breast cancer diagnosis using GA feature selection and Rotation Forest. Neural Computing and Applications, 28, 753-763.

Huda, R. K., & Banka, H. (2019). Efficient feature selection and classification algorithm based on PSO and rough sets. Neural Computing and Applications, 31, 4287-4303.

Kashef, S., & Nezamabadi-pour, H. (2015). An advanced ACO algorithm for feature subset selection. Neurocomputing, 147, 271-279.

Nakamura, R. Y. M., Pereira, L. A. M., Rodrigues, D., Costa, K. A. P., Papa, J. P., & Yang, X. S. (2013). Binary bat algorithm for feature selection. In Swarm intelligence and bio-inspired computation (pp. 225-237). Elsevier.

Emary, E., Zawbaa, H. M., & Hassanien, A. E. (2016). Binary grey wolf optimization approaches for feature selection. Neurocomputing, 172, 371-381.

Mirjalili, S. (2016). Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Computing and Applications, 27, 1053-1073.

KS, S. R., & Murugan, S. (2017). Memory based hybrid dragonfly algorithm for numerical optimization problems. Expert Systems with Applications, 83, 63-78.

Tubishat, M., Idris, N., Shuib, L., Abushariah, M. A., & Mirjalili, S. (2020). Improved Salp Swarm Algorithm based on opposition based learning and novel local search algorithm for feature selection. Expert Systems with Applications, 145, 113122.

Hussain, K., Neggaz, N., Zhu, W., & Houssein, E. H. (2021). An efficient hybrid sine-cosine Harris hawks optimization for low and high-dimensional feature selection. Expert Systems with Applications, 176, 114778.

Arora, S., Singh, H., Sharma, M., Sharma, S., & Anand, P. (2019). A new hybrid algorithm based on grey wolf optimization and crow search algorithm for unconstrained function optimization and feature selection. IEEE Access, 7, 26343-26361.

Ibrahim, R. A., Abd Elaziz, M., & Lu, S. (2018). Chaotic opposition-based grey-wolf optimization algorithm based on differential evolution and disruption operator for global optimization. Expert Systems with Applications, 108, 1-27.

Varghese, N. V., Singh, A., Suresh, A., & Rahnamayan, S. (2020, October). Binary hybrid differential evolution algorithm for multi-label feature selection. In 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 4386-4391). IEEE.

Zhang, Y., Gong, D. W., & Rong, M. (2015). Multi-objective differential evolution algorithm for multi-label feature selection in classification. In Advances in Swarm and Computational Intelligence: 6th International Conference, ICSI 2015, held in conjunction with the Second BRICS Congress, CCI 2015, Beijing, China, June 25-28, 2015, Proceedings, Part I 6 (pp. 339-345). Springer International Publishing.

Mafarja, M. M., Eleyan, D., Jaber, I., Hammouri, A., & Mirjalili, S. (2017, October). Binary dragonfly algorithm for feature selection. In 2017 International conference on new trends in computing sciences (ICTCS) (pp. 12-17). IEEE.

Aghdam, M. H., & Kabiri, P. (2016). Feature selection for intrusion detection system using ant colony optimization. Int. J. Netw. Secur., 18(3), 420-432.

Raman, M. G., Somu, N., Kirthivasan, K., Liscano, R., & Sriram, V. S. (2017). An efficient intrusion detection system based on hypergraph-Genetic algorithm for parameter optimization and feature selection in support vector machine. Knowledge-Based Systems, 134, 1-12.

Acharya, N., & Singh, S. (2018). An IWD-based feature selection method for intrusion detection system. Soft Computing, 22, 4407-4416.

Selvakumar, B., & Muneeswaran, K. (2019). Firefly algorithm based feature selection for network intrusion detection. Computers & Security, 81, 148-155.

Mafarja, M. M., & Mirjalili, S. (2017). Hybrid whale optimization algorithm with simulated annealing for feature selection. Neurocomputing, 260, 302-312.

Emary, E., & Zawbaa, H. M. (2019). Feature selection via Lèvy Antlion optimization. Pattern Analysis and Applications, 22, 857-876.

Alamiedy, T. A., Anbar, M., Alqattan, Z. N., & Alzubi, Q. M. (2020). Anomaly-based intrusion detection system using multi-objective grey wolf optimisation algorithm. Journal of Ambient Intelligence and Humanized Computing, 11, 3735-3756.

Soleimanian Gharehchopogh, F., & Mousavi, S. K. (2020). A new feature selection in email spam detection by particle swarm optimization and fruit fly optimization algorithms. Computer and Knowledge Engineering, 2(2), 49-62.

Zhang, Y., Gong, D. W., Gao, X. Z., Tian, T., & Sun, X. Y. (2020). Binary differential evolution with self-learning for multi-objective feature selection. Information Sciences, 507, 67-85.

Sohrabi, M. K., & Tajik, A. (2017). Multi-objective feature selection for warfarin dose prediction. Computational Biology and Chemistry, 69, 126-133.

Wan, Y., Ma, A., Zhong, Y., Hu, X., & Zhang, L. (2020). Multiobjective hyperspectral feature selection based on discrete sine cosine algorithm. IEEE Transactions on Geoscience and Remote Sensing, 58(5), 3601-3618.

Wang, L., & Zheng, X. L. (2018). A knowledge-guided multi-objective fruit fly optimization algorithm for the multi-skill resource constrained project scheduling problem. Swarm and Evolutionary Computation, 38, 54-63.

Wu, L., Wang, H. Y., Zuo, C., & Wei, H. L. (2018, July). Multi-objective Fruit Fly Optimization Based on Cloud Model. In 2018 13th World Congress on Intelligent Control and Automation (WCICA) (pp. 335-340). IEEE.

Ma, Q., He, Y., & Zhou, F. (2016, October). Multi-objective fruit fly optimization algorithm for test point selection. In 2016 IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC) (pp. 272-276). IEEE.

Du, P., Wang, J., Hao, Y., Niu, T., & Yang, W. (2020). A novel hybrid model based on multi-objective Harris hawks optimization algorithm for daily PM2.5 and PM10 forecasting. Applied Soft Computing, 96, 106620.

Amoozegar, M., & Minaei-Bidgoli, B. (2018). Optimizing multi-objective PSO based feature selection method using a feature elitism mechanism. Expert Systems with Applications, 113, 499-514.

Rodrigues, D., de Albuquerque, V. H. C., & Papa, J. P. (2020). A multi-objective artificial butterfly optimization approach for feature selection. Applied Soft Computing, 94, 106442.

Al-Tashi, Q., Abdulkadir, S. J., Rais, H. M., Mirjalili, S., Alhussian, H., Ragab, M. G., & Alqushaibi, A. (2020). Binary multi-objective grey wolf optimizer for feature selection in classification. IEEE Access, 8, 106247-106263.

Zhang, Y., Cheng, S., Shi, Y., Gong, D. W., & Zhao, X. (2019). Cost-sensitive feature selection using two-archive multi-objective artificial bee colony algorithm. Expert Systems with Applications, 137, 46-58.

Wang, X. H., Zhang, Y., Sun, X. Y., Wang, Y. L., & Du, C. H. (2020). Multi-objective feature selection based on artificial bee colony: An acceleration approach with variable sample size. Applied Soft Computing, 88, 106041.

He, C. L., Zhang, Y., Gong, D. W., & Wu, B. (2020). Multi-objective feature selection based on artificial bee colony for hyperspectral images. In Bio-inspired Computing: Theories and Applications: 14th International Conference, BIC-TA 2019, Zhengzhou, China, November 22–25, 2019, Revised Selected Papers, Part I 14 (pp. 611-621). Springer Singapore.

Xue, B., Zhang, M., & Browne, W. N. (2012). Particle swarm optimization for feature selection in classification: A multi-objective approach. IEEE Transactions on Cybernetics, 43(6), 1656-1671.

Bouraoui, A., Jamoussi, S., & Ben Ayed, Y. (2018). A multi-objective genetic algorithm for simultaneous model and feature selection for support vector machines. Artificial Intelligence Review, 50, 261-281.

Ghosh, M., Guha, R., Mondal, R., Singh, P. K., Sarkar, R., & Nasipuri, M. (2018). Feature selection using histogram-based multi-objective GA for handwritten Devanagari numeral recognition. In Intelligent Engineering Informatics: Proceedings of the 6th International Conference on FICTA (pp. 471-479). Springer Singapore.

Hammouri, A. I., Mafarja, M., Al-Betar, M. A., Awadallah, M. A., & Abu-Doush, I. (2020). An improved dragonfly algorithm for feature selection. Knowledge-Based Systems, 203, 106131.

Altman, N. S. (1992). An introduction to kernel and nearest-neighbor nonparametric regression. The American Statistician, 46(3), 175-185.

Mirjalili, S., & Lewis, A. (2015). Novel performance metrics for robust multi-objective optimization algorithms. Swarm and Evolutionary Computation, 21, 1-23.

Coello Coello, C. A. (2009). Evolutionary multi-objective optimization: some current research trends and topics that remain to be explored. Frontiers of Computer Science in China, 3, 18-30.

Ngatchou, P., Zarei, A., & El-Sharkawi, A. (2005, November). Pareto multi objective optimization. In Proceedings of the 13th International Conference on Intelligent Systems Application to Power Systems (pp. 84-91). IEEE.

Bak, P., Tang, C., & Wiesenfeld, K. (1987). Self-organized criticality: An explanation of the 1/f noise. Physical Review Letters, 59(4), 381.

Lewis, A., Mostaghim, S., & Randall, M. (2008). Evolutionary population dynamics and multi-objective optimisation problems. In Multi-objective optimization in computational intelligence: theory and practice (pp. 185-206). IGI Global.

Blake, C. L., & Merz, C. J. (1998, June 1). UCI Repository of machine learning databases. Available: http://www.ics.uci.edu/~mlearn/

Chen, Y., Gao, B., Lu, T., Li, H., Wu, Y., Zhang, D., & Liao, X. (2023). A Hybrid Binary Dragonfly Algorithm with an Adaptive Directed Differential Operator for Feature Selection. Remote Sensing, 15(16), 3980.

Li, J., Kang, H., Sun, G., Feng, T., Li, W., Zhang, W., & Ji, B. (2020). IBDA: improved binary dragonfly algorithm with evolutionary population dynamics and adaptive crossover for feature selection. IEEE Access, 8, 108032-108051.


Published

26.03.2024

How to Cite

Anitha G. (2024). A Novel Non-Dominated Sorting Dragonfly Optimization With Evolutionary Population Dynamics Based Multi-Objective Approach For Feature Selection Problems. International Journal of Intelligent Systems and Applications in Engineering, 12(21s), 2052–2063. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/5774

Issue

Section

Research Article