An Innovative Human-Computer Interaction (HCI) for Surface Electromyography (EMG) Gesture Recognition
Keywords:
hand gestures (HG), human-computer interaction (HCI), principal component analysis (PCA), surface electromyogram (SEMG), augmented particle swarm optimization, modified k-nearest neighbor (APSO-MKNN)
Abstract
Human-computer interaction (HCI) is the interface between citizens and smart cities, and it plays a crucial role in bridging the application gap for information technology in contemporary cities. Hand gestures (HG) are widely recognized as a promising HCI technique, and the use of Surface Electromyograms (SEMG) to recognize Human Hand Gestures (HHG) is a significant area of study. However, modern signal-processing techniques are still not robust in feature extraction and feature re-extraction from SEMG signals using Principal Component Analysis (PCA); several technical issues remain to be resolved. For instance, how can myoelectric control remain usable under intermittent use, given that time variability severely degrades pattern-recognition quality yet is unavoidable in everyday use? Building a solid HCI also requires ensuring the effectiveness and reliability of the myoelectric control system. In this study, Augmented Particle Swarm Optimization combined with a Modified K-Nearest Neighbor classifier (APSO-MKNN) is applied in a hand-gesture-recognition (HGR) system that removes redundant information from SEMG signals and increases the effectiveness and precision of recognition. The experimental findings help reduce the time differences in SEMG-based Gesture Recognition (GR). This study focuses on minimizing the time differences in SEMG pattern recognition. The identification approach proposed in this study has the potential to improve the long-term generalization accuracy of an HCI system. Additionally, the proposed framework can simplify the process of data collection before a device is ready for use.
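The APSO-MKNN details are not given in this abstract, so the following is only a minimal illustrative sketch of the generic pipeline it describes: time-domain feature extraction from SEMG windows followed by k-nearest-neighbor classification. The feature set (MAV, waveform length, zero crossings), window size, and the two synthetic classes are assumptions for the demo, not the authors' method.

```python
# Illustrative baseline only: time-domain SEMG features + plain KNN,
# standing in for the paper's (unspecified) APSO-MKNN pipeline.
import math
import random

def features(window):
    """Common SEMG time-domain features: mean absolute value (MAV),
    waveform length (WL), and zero-crossing count (ZC)."""
    mav = sum(abs(x) for x in window) / len(window)
    wl = sum(abs(b - a) for a, b in zip(window, window[1:]))
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return (mav, wl, zc)

def knn_predict(train, query, k=3):
    """Majority vote over the k training feature vectors closest to `query`."""
    dists = sorted((math.dist(f, query), label) for f, label in train)
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Synthetic two-class demo: "rest" (low amplitude) vs "grip" (high amplitude).
random.seed(0)
rest = [[random.gauss(0, 0.1) for _ in range(64)] for _ in range(20)]
grip = [[random.gauss(0, 1.0) for _ in range(64)] for _ in range(20)]
train = [(features(w), "rest") for w in rest] + [(features(w), "grip") for w in grip]

query = features([random.gauss(0, 1.0) for _ in range(64)])
print(knn_predict(train, query))  # high-amplitude window -> classified as "grip"
```

In a real system the raw feature vectors would be reduced with PCA before classification, and the neighbor count and feature weights would be tuned by the optimizer; here plain Euclidean KNN is used purely to make the pipeline concrete.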
License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IJISAE open-access articles are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. This license lets readers share and adapt the material provided they give appropriate credit, include a link to the license, and indicate if changes were made; if they remix, transform, or build upon the material, they must distribute their contributions under the same license as the original.