Deep Learning Wildfire Detection to Increase Fire Safety with YOLOv8
Keywords: computer vision, deep learning, forest fire, wildfire detection, YOLO

Abstract
Object detection uses computer vision algorithms to identify and locate objects in an image or video. In the context of wildfire detection, it can identify features such as flames, smoke, and heat sources in satellite or drone imagery and alert authorities to the presence of a wildfire. YOLOv8 (version 8 of the You Only Look Once family) is a popular object detection algorithm applied across many domains, including wildfire detection, and could help reduce the impact of wildfires on communities and the environment. After extensive preprocessing and a structured 25-epoch training phase, the model achieved a mean Average Precision (mAP) of 0.6, a precision of 0.7, and a recall of 0.57. This study advances wildfire detection methods and reports quantitative results that can serve as a reference for future work.
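For readers unfamiliar with detection metrics, the reported precision and recall can be made concrete with a minimal sketch: a predicted box counts as a true positive when its intersection over union (IoU) with a ground-truth box exceeds a threshold (0.5 is the conventional default), precision is TP/(TP+FP), and recall is TP/(TP+FN). The boxes and counts below are illustrative assumptions, not the study's actual data.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def precision_recall(detections, ground_truths, iou_thresh=0.5):
    """Greedily match each detection to at most one unmatched ground truth."""
    matched, tp = set(), 0
    for det in detections:
        for i, gt in enumerate(ground_truths):
            if i not in matched and iou(det, gt) >= iou_thresh:
                matched.add(i)
                tp += 1
                break
    fp = len(detections) - tp
    fn = len(ground_truths) - tp
    precision = tp / (tp + fp) if detections else 0.0
    recall = tp / (tp + fn) if ground_truths else 0.0
    return precision, recall

# Illustrative boxes (not from the study): two ground truths, two detections,
# only one of which overlaps a ground truth well enough to count.
gts = [(10, 10, 50, 50), (60, 60, 90, 90)]
dets = [(12, 12, 48, 52), (0, 0, 5, 5)]
p, r = precision_recall(dets, gts)
print(p, r)  # 0.5 0.5
```

The full mAP reported in the paper additionally averages precision over recall levels (and, depending on convention, over IoU thresholds), but it is built from exactly these per-threshold counts.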
![Creative Commons License](http://i.creativecommons.org/l/by-sa/4.0/88x31.png)
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.