YOLO-Based Framework for Predicting Crop Diseases in Agricultural Systems
Keywords:
Plant Detection, Plant Counting, Precision Agriculture, YOLO Algorithm, Agricultural Plots, Image Analysis, Decision-Making, Resource Allocation, Crop Management, Fertilization, Sustainable Farming Practices, Agricultural Productivity

Abstract
In agricultural applications, detecting and counting plants in plot images is essential for yield estimation, crop monitoring, and resource optimization. This research applies the YOLO (You Only Look Once) algorithm to identify and count plants in plot images. The system was trained with a supervised learning approach on the Roboflow platform, providing an automated, machine-learning-based solution for agricultural plant analysis. The workflow begins with assembling a comprehensive collection of plot images containing plants, each carefully labeled with accurate bounding boxes. Roboflow is used for data management and annotation, while YOLO, recognized for its real-time object detection capability, performs the plant detection itself. YOLO achieves high detection speed while maintaining accuracy by dividing the input image into a grid and predicting bounding boxes and class probabilities for each grid cell. The proposed method accurately identifies and quantifies plants in plot images, providing farmers, agronomists, and researchers with essential information for crop management and decision-making. The system has potential for future improvement and promises wider applications, accommodating various plant species and climatic conditions in agricultural activities.
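The grid-based decoding described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the grid size, image size, confidence threshold, and per-cell prediction layout (center offsets, box dimensions, confidence) are hypothetical stand-ins for what a trained YOLO head would output.

```python
# Hypothetical sketch of decoding a YOLO-style grid head into plant
# detections and a plant count. All values below are illustrative.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # box center, pixels
    y: float
    w: float      # box size, pixels
    h: float
    conf: float   # objectness/confidence score

def decode_grid(preds, grid_size, img_size, conf_thresh=0.5):
    """Convert per-cell predictions (tx, ty, w, h, conf) into absolute
    boxes, keeping only cells above the confidence threshold."""
    cell = img_size / grid_size
    dets = []
    for row in range(grid_size):
        for col in range(grid_size):
            tx, ty, w, h, conf = preds[row][col]
            if conf < conf_thresh:
                continue
            # Center offsets (tx, ty) are relative to the cell's corner;
            # box sizes are relative to the whole image.
            dets.append(Detection((col + tx) * cell, (row + ty) * cell,
                                  w * img_size, h * img_size, conf))
    return dets

# Toy 3x3 grid: two cells confidently contain a plant center.
preds = [[(0.0, 0.0, 0.0, 0.0, 0.0)] * 3 for _ in range(3)]
preds[0][1] = (0.5, 0.5, 0.2, 0.3, 0.9)
preds[2][2] = (0.4, 0.6, 0.1, 0.1, 0.8)

dets = decode_grid(preds, grid_size=3, img_size=416)
print(f"plant count: {len(dets)}")  # plant count: 2
```

In a full pipeline, non-maximum suppression would follow this step to merge boxes from adjacent cells that fire on the same plant.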
License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.