Emotional Recognition Edge Computing Model for Emotion Classification with Immersive Theatre Experience

Authors

  • Xiaoxuan Cui, International College, Krirk University, Bangkok 10220, Thailand

Keywords

Emotion Recognition, Optimal Model, Edge Computing, Deep Learning, Probability Model

Abstract

Emotion recognition is the process of identifying and analyzing human emotions from cues such as facial expressions, vocal intonation, gestures, and physiological responses. Machine learning and artificial intelligence techniques interpret these cues and classify the emotions being expressed, which can include happiness, sadness, anger, fear, surprise, and more nuanced emotional states. An immersive theatre interactive experience is a dynamic form of entertainment that moves beyond traditional passive spectatorship by engaging the audience through sensory stimulation, participation, and interaction. This paper presents the Optimal Edge Computing Probability (OECP) model, a framework designed to assess and enhance the immersive theatre interactive experience by integrating an emotion recognition algorithm. Leveraging edge computing, the OECP model dynamically allocates computational tasks between edge devices and centralized cloud resources. Through a comprehensive evaluation in the context of immersive theatre, the model determines when to process emotion recognition data locally on edge devices and when to offload it to the cloud. Simulation results demonstrate that the model classifies a range of user emotions accurately, yielding high accuracy, precision, recall, and F1-score values. The model not only ensures real-time responsiveness and efficient resource utilization but also opens new possibilities at the intersection of emotion recognition, immersive experiences, and edge computing. As the technological landscape continues to evolve, the OECP model offers a robust foundation for refining user engagement and enriching the convergence of emotion analysis and interactive technologies.
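To make the edge-versus-cloud offloading idea concrete, the sketch below shows one simple way such a decision could work. It is a minimal illustration, not the paper's actual OECP implementation: the confidence threshold, latency figures, and classifier stand-ins (OFFLOAD_THRESHOLD, edge_classify, cloud_classify) are hypothetical assumptions chosen only for demonstration.

```python
# Illustrative sketch only: a confidence-threshold offloading rule for
# edge-vs-cloud emotion classification. This is NOT the paper's OECP model;
# all names, thresholds, and latency figures are hypothetical assumptions.

import random
from collections import Counter

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "neutral"]
OFFLOAD_THRESHOLD = 0.75   # assumed confidence below which we offload to the cloud
EDGE_LATENCY_MS = 15       # assumed on-device inference latency
CLOUD_LATENCY_MS = 120     # assumed round-trip latency for cloud inference


def edge_classify(frame):
    """Stand-in for a lightweight on-device classifier: returns (label, confidence)."""
    return random.choice(EMOTIONS), random.uniform(0.4, 1.0)


def cloud_classify(frame):
    """Stand-in for a larger cloud-side model (assumed more accurate but slower)."""
    return random.choice(EMOTIONS), CLOUD_LATENCY_MS


def classify_with_offloading(frame):
    """Process locally when the edge model is confident enough; otherwise offload."""
    label, confidence = edge_classify(frame)
    if confidence >= OFFLOAD_THRESHOLD:
        return label, EDGE_LATENCY_MS, "edge"
    cloud_label, latency = cloud_classify(frame)
    return cloud_label, latency, "cloud"


if __name__ == "__main__":
    decisions = Counter()
    for frame_id in range(1000):              # simulated audience frames
        _, _, where = classify_with_offloading(frame_id)
        decisions[where] += 1
    print(dict(decisions))                    # rough split between local and offloaded frames
```

In a real deployment, the stand-in classifiers would be replaced by the actual on-device and cloud-side emotion recognition models, and the threshold would be tuned against the latency and accuracy targets of the interactive experience.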




Published

30.11.2023

How to Cite

Cui, X. (2023). Emotional Recognition Edge Computing Model for Emotion Classification with Immersive Theatre Experience. International Journal of Intelligent Systems and Applications in Engineering, 12(6s), 355–368. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/3982

Issue

Vol. 12 No. 6s

Section

Research Article