Optimizing Machine Learning Models: An Adaptive Hyperparameter Tuning Approach

Authors

  • Pavitha N., PhD Research Scholar at Dr. Vishwanath Karad MIT World Peace University, Pune, India, and Assistant Professor at Vishwakarma University, Pune, India.
  • Shounak Sugave, Dr. Vishwanath Karad MIT World Peace University, Pune, India.

Keywords:

Hyperparameter optimization, Machine learning, Resource allocation, Acquisition function, Performance enhancement

Abstract

Hyperparameter optimization is a critical task in enhancing machine learning model performance. This paper introduces a novel approach to hyperparameter tuning that makes no assumptions about the underlying hyperparameter distribution or convergence behavior. The approach treats hyperparameter configurations as indices and focuses solely on their associated loss sequences, with the objective of efficiently identifying the configuration that minimizes validation error. The algorithm employs an acquisition function to determine the next configuration to evaluate and leverages a classification model to guide the search process. Because the methodology is agnostic to the structure of hyperparameter relationships, it can balance resource usage against performance improvement. Experimental results demonstrate the proposed approach's effectiveness in identifying optimal configurations while adapting to various domains and data types.
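The abstract describes the method only at a high level. As an illustration of the core idea (configurations treated as indices, a classifier labeling evaluated configurations good or bad by their observed validation error, and an acquisition function choosing the next index to evaluate), the following Python sketch implements one plausible reading of that loop. It is not the authors' implementation: the synthetic `validation_error` objective, the grid of index pairs, and the nearest-neighbor good/bad classifier are all assumptions made for the example.

```python
import random

def validation_error(cfg):
    # Hypothetical black-box objective: pretend validation error depends
    # on two hyperparameter indices, with a minimum at (3, 5).
    lr_idx, depth_idx = cfg
    return (lr_idx - 3) ** 2 + (depth_idx - 5) ** 2

def classifier_guided_search(grid, budget, seed=0):
    """Sketch of a classifier-guided search: evaluated configurations are
    labeled good/bad relative to the median observed error, and a simple
    1-NN rule serves as the acquisition function for the next index."""
    rng = random.Random(seed)
    evaluated = {}  # configuration index -> observed validation error
    # Warm start with a few random configurations.
    for cfg in rng.sample(grid, 3):
        evaluated[cfg] = validation_error(cfg)
    while len(evaluated) < budget:
        errs = sorted(evaluated.values())
        median = errs[len(errs) // 2]
        good = [c for c, e in evaluated.items() if e <= median]
        bad = [c for c, e in evaluated.items() if e > median]

        def acquisition(cfg):
            # Higher score = closer to known-good, farther from known-bad,
            # i.e. the 1-NN classifier leans toward labeling cfg "good".
            d_good = min(abs(cfg[0] - g[0]) + abs(cfg[1] - g[1]) for g in good)
            d_bad = (min(abs(cfg[0] - b[0]) + abs(cfg[1] - b[1]) for b in bad)
                     if bad else 0)
            return d_bad - d_good

        candidates = [c for c in grid if c not in evaluated]
        nxt = max(candidates, key=acquisition)
        evaluated[nxt] = validation_error(nxt)
    # Return the configuration with the lowest observed validation error.
    return min(evaluated, key=evaluated.get)

grid = [(i, j) for i in range(8) for j in range(10)]
best = classifier_guided_search(grid, budget=25)
```

Note that nothing in the loop depends on what the indices mean, which mirrors the paper's claim of being agnostic to the structure of hyperparameter relationships; only the sequence of observed losses drives the search.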




Published

21.09.2023

How to Cite

N., P., & Sugave, S. (2023). Optimizing Machine Learning Models: An Adaptive Hyperparameter Tuning Approach. International Journal of Intelligent Systems and Applications in Engineering, 11(4), 344–354. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/3532

Section

Research Article