Enhancing Multi-Class KNN in Ball-Trees using Adaptive Pruning

Authors

  • Dayaker P., Research Scholar, Department of CSE, Sri Satya Sai University of Technology & Medical Sciences, Sehore, Bhopal, M.P., INDIA
  • Harsh Lohiya, Associate Professor, Department of CSE, Sri Satya Sai University of Technology & Medical Sciences, Sehore, Bhopal, M.P., INDIA

Keywords

KNN, Adaptive Ball Tree Pruning, Classification, Ball Trees, Adaptive Pruning

Abstract

The K-Nearest Neighbors (KNN) algorithm is a prominent tool for classification tasks, assigning labels based on the proximity of neighboring data points. In multi-class scenarios, however, traditional KNN faces challenges related to expansive search spaces within Ball Trees, suboptimal choices of the k value, and imbalanced class distributions. To overcome these hurdles, an adaptive pruning algorithm tailored to Ball Trees is introduced, which dynamically modifies the tree structure while retaining classification accuracy. Results reveal notable gains in both the efficiency and the accuracy of the multi-class KNN algorithm when augmented with adaptive ball tree pruning. The proposed method effectively reduces the search space while maintaining, or even improving, classification accuracy across diverse datasets. Comparative analyses demonstrate the superiority of the proposed approach in handling multi-class complexities and dynamic data distributions. Datasets exhibiting high dimensionality, imbalanced class distributions, and dynamic data shifts are employed to assess the algorithm's adaptability and performance. The conclusion is that adaptive ball tree pruning serves as a pivotal mechanism for mitigating the limitations of traditional KNN in multi-class scenarios, offering a promising avenue for refining nearest neighbor classifiers in real-world applications.
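
To make the pruning idea concrete, the following is a minimal Python sketch of k-nearest-neighbor search over a ball tree with a tunable pruning margin. It illustrates only the general mechanism; the names (BallNode, adaptive_knn_search, margin) and the specific pruning rule are illustrative assumptions and do not reproduce the authors' exact adaptive criterion.

import heapq
import math
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class BallNode:
    center: Tuple[float, ...]                                     # centroid of the points in this ball
    radius: float                                                 # max distance from center to any contained point
    points: List[Tuple[Tuple[float, ...], int]] = field(default_factory=list)  # (point, label) pairs at leaves
    left: Optional["BallNode"] = None
    right: Optional["BallNode"] = None

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def adaptive_knn_search(node, query, k, heap=None, margin=0.0):
    # heap is a max-heap of (-distance, label) holding the best k candidates so far
    if heap is None:
        heap = []
    if node is None:
        return heap

    # Lower bound on the distance from the query to any point inside this ball.
    lower_bound = max(0.0, euclidean(query, node.center) - node.radius)
    kth = -heap[0][0] if len(heap) == k else math.inf

    # Pruning rule: with margin = 0 this is the exact ball-tree bound; a positive
    # margin prunes more aggressively (approximate search), which is one way to
    # shrink the search space at a small, controlled cost in accuracy.
    if lower_bound * (1.0 + margin) >= kth:
        return heap

    if node.left is None and node.right is None:
        # Leaf: scan the stored (point, label) pairs.
        for point, label in node.points:
            d = euclidean(query, point)
            if len(heap) < k:
                heapq.heappush(heap, (-d, label))
            elif d < -heap[0][0]:
                heapq.heapreplace(heap, (-d, label))
        return heap

    # Internal node: visit the child whose center is closer to the query first,
    # so the k-th distance tightens early and the sibling is more likely to be pruned.
    children = [c for c in (node.left, node.right) if c is not None]
    children.sort(key=lambda c: euclidean(query, c.center))
    for child in children:
        adaptive_knn_search(child, query, k, heap, margin)
    return heap

A majority vote over the labels left in the heap then yields the predicted class. Choosing the margin per region of the tree, for example as a function of local density or class balance, is the kind of adaptation the paper targets; the fixed global margin above is only a placeholder.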

Published

29.01.2024

How to Cite

P., D. ., & Lohiya, H. . (2024). Enhancing Multi-Class KNN in Ball-Trees using Adaptive Pruning. International Journal of Intelligent Systems and Applications in Engineering, 12(13s), 265–277. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/4594

Section

Research Article