Driver Emotion Recognition Using a Deep Learning Model with Cognitive Intelligence
Keywords:
Emotion Recognition, Deep Learning, Cognitive Intelligence, Smart Cockpit, Driver Safety

Abstract
Research & Development of Intelligent/Smart systems is in progress now-a-days. One of the most interesting areas among this is smart cockpits. Smart Cockpit can identify emotions & help drivers to be more productive. This ecosystem includes self-learning of correct emotion recognition, interpretation & decision making. This Research will cover behavioral analysis/psychology, machine learning, artificial intelligence, signal processing, computer vision and human & computer interaction. Emotions are directly linked with productivity, performance & efficiency of any human. People with happy emotions can focus on activity which leads to more efficient results. Emotion recognition is one of complex & difficult to identify, hence many researches are going on in the same field. Majority of research is based on a single input that is facial expressions and images. Since Facial muscles around the nose & eyes contribute a lot in emotions expressions, it is most used. Using a single mode of input signals (facial expressions recognition) has challenges of manipulation of expressions & thus results are not so accurate. Along with facial expressions additional input signals such as EEG, ECG, Skin Conductivity, respiration, eye movement signal helps to measure emotions more accurately & on that basis further actions can be taken. A smart cockpit having capability of Human-Machine interaction (identifies correct emotions & take needful action) will help to improvise safety, comfort, and driver’s acceptance. Emotions such as Anger, Sadness, Fear, and Disgust are having a negative impact on safety driving where Happiness & Neutral emotions help in improving driving safety and Surprise shows alertness of individuals while driving. Music plays a vital role in controlling emotions, controlling, or converting negative impacting emotions to positive impacting emotions. Thus, this module can be used for increasing safety & comfort of driving. Multiple Machine Learning algorithms such as Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Long Short-Term Memory Networks (LSTM), Deep Canonical Correlation Analysis (DCCA), Bimodal Deep Auto Encoder (BDAE) helps in identify Emotions.