Performance Evaluation of YOLOv8 and Segment Anything Model for Auto Annotation of Crop and Weed Images in Pigeon Pea Production System
Keywords:
Auto Annotation, Crop and Weed Detection, YOLOv8 model, Segment Anything Model (SAM)

Abstract
India's agricultural industry generates more than $375 billion annually, and the country ranks second worldwide in agricultural output. Precision Agriculture is employed to improve this output, but one of its most important challenges is crop and weed detection. Robotic weeding techniques are therefore used to control weeds. In robotic weeding, accurate detection and localization of crops and weeds in unstructured fields remains a substantial challenge, necessitating supervised models trained on annotated data. Creating annotated data is time-consuming, and annotated datasets do not exist for every crop and weed species.
In this work, a real-time Pigeon Pea dataset of 1,727 images was collected using a mobile camera and a drone. Initially, 137 images were manually annotated using roboflow.com and used to train all YOLOv8 variants for epochs ranging from 10 to 1,000. The YOLOv8n variant, which has the shortest inference time at 2.8 ms, was selected to generate predictions for the remaining 1,590 unannotated images. Its predicted bounding boxes were then given as input prompts to the Segment Anything Model, which generated the annotations.
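The paper does not include code, but the auto-annotation step above ultimately produces label files from predicted bounding boxes. As an illustrative sketch only (the helper name and box format are assumptions, not the authors' implementation), converting a pixel-space detection into a YOLO-format label line looks like this:

```python
def box_to_yolo_label(box, img_w, img_h, class_id):
    """Convert a pixel-space (x_min, y_min, x_max, y_max) box into a
    YOLO-format label line: 'class cx cy w h', with all coordinates
    normalized to [0, 1] by the image dimensions."""
    x_min, y_min, x_max, y_max = box
    cx = (x_min + x_max) / 2 / img_w   # normalized box-center x
    cy = (y_min + y_max) / 2 / img_h   # normalized box-center y
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{class_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

# Hypothetical example: a 100x200-pixel box in a 640x640 image, class 0
print(box_to_yolo_label((100, 100, 200, 300), 640, 640, 0))
# → 0 0.234375 0.312500 0.156250 0.312500
```

In the full pipeline described above, such boxes would additionally be passed as prompts to SAM to obtain segmentation masks; only the box-to-label conversion is shown here because it is self-contained.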
The manually and automatically annotated images were merged to create a new dataset, on which all YOLOv8 variants were again trained and tested for epochs ranging from 10 to 1,000. After including the automated annotations, accuracy, recall, mean average precision@50, and mean average precision@50-95 rose by 9.79%, 38.63%, 13.99%, and 18.43%, respectively. YOLOv8n again provided the shortest inference time, 3.8 ms, on the new dataset. Moreover, automatic annotation saved approximately 132.5 hours that would otherwise have been spent manually annotating the unlabelled images. This work will contribute to crop and weed detection studies in the pigeon pea production system, including disease prediction, yield prediction, and automated weed removal.
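The reported time savings can be sanity-checked with a quick calculation (the per-image annotation time is an inference from the paper's figures, not a stated value): 132.5 hours spread over the 1,590 automatically annotated images corresponds to about five minutes of manual annotation per image.

```python
images = 1590        # unannotated images labeled automatically
hours_saved = 132.5  # reported annotation time saved

minutes_per_image = hours_saved * 60 / images
print(minutes_per_image)  # → 5.0 minutes of manual work per image
```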