An Innovative Approach for Revolutionizing Pediatric Health Monitoring in Real-Time Activity Recognition Utilizing CNN-LSTM-ELM
Keywords:
Pediatric Activity Recognition, Convolutional Neural Networks, Long Short-Term Memory Networks, Extreme Learning Machine, Real-Time Recognition, Healthcare Monitoring.

Abstract
Pediatric activity recognition is an essential component of many healthcare and childcare applications, allowing children's physical development to be monitored and evaluated. In this study, a novel real-time pediatric activity recognition system is proposed that combines the strengths of Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks for feature extraction with an Extreme Learning Machine (ELM) classifier for accurate activity categorization. An extensive dataset of video footage of children's activities, carefully labelled with activity categories, was first assembled. A two-step procedure is then employed: a CNN model pre-trained on a large dataset extracts discriminative spatial features from individual video frames, yielding a rich representation of the visual cues present in pediatric activities. To capture temporal relationships within the resulting sequence of feature vectors, an LSTM network follows the feature-extraction stage. This LSTM-based sequence modelling identifies subtle activity patterns and transitions over time, further improving recognition accuracy. The key contribution of this work is the addition of an ELM classifier after the LSTM layer. The ELM, renowned for its fast and effective training, uses the temporal context encoded by the LSTM to perform real-time activity classification with remarkable speed and accuracy, so pediatric actions are recognized effectively and robustly. For real-time recognition, the CNN-LSTM-ELM model analyzes incoming frames as they arrive, equipping the system for real-time decision-making in healthcare and childcare scenarios.
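The ELM stage described above trains by fixing random hidden-layer weights and solving only the output weights in closed form, which is what makes it so fast. The following minimal NumPy sketch is a hypothetical illustration of that idea, not the authors' implementation; the class name, hidden-layer size, and activation choice are assumptions. It would operate on fixed-length feature vectors such as those produced by the LSTM stage:

```python
import numpy as np

class ELMClassifier:
    """Minimal Extreme Learning Machine sketch (illustrative assumption,
    not the paper's code). Hidden weights are random and never trained;
    output weights are solved in closed form with a pseudoinverse."""

    def __init__(self, n_hidden=128, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        # Random, untrained hidden-layer weights and biases.
        self.W = self.rng.standard_normal((n_features, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)   # hidden-layer activations
        T = np.eye(n_classes)[y]           # one-hot class targets
        # Closed-form output weights via the Moore-Penrose pseudoinverse.
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)
```

Because `fit` reduces to one matrix factorization instead of iterative gradient descent, retraining or updating the classifier is cheap, which is the property the abstract leans on for real-time operation.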
The findings show that the proposed CNN-LSTM-ELM architecture achieves an accuracy of 90.5% and high efficiency in identifying a wide spectrum of pediatric activities, thereby enhancing the capabilities of child-focused healthcare and wellbeing applications.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IJISAE open access articles are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. This license lets readers share and adapt the material provided they give appropriate credit, link to the license, indicate if changes were made, and distribute any remixed, transformed, or built-upon material under the same license as the original.