Drone View Segmentation: Deep Learning and Transfer Insights

Authors

  • Vivek Gurve, Dept. of Artificial Intelligence and Machine Learning, Symbiosis Institute of Technology, Pune, India
  • Gunjan Singh, Dept. of Artificial Intelligence and Machine Learning, Symbiosis Institute of Technology, Pune, India
  • Shilpa Gite, SCAAI, Symbiosis International University, Pune, India
  • K. Nandhini, Dept. of Artificial Intelligence and Machine Learning, Symbiosis Institute of Technology, Pune, India
  • Santosh Borde, JSPM’s Rajarshi Shahu College of Engineering, Pune-411033, Maharashtra, India

Keywords:

Computer Vision, Image Segmentation, Unmanned Aerial Vehicles (UAVs), Transfer Learning, Simulated Environment

Abstract

This project focuses on the development and application of deep learning techniques for aerial image segmentation and transfer learning. Leveraging a UNet-based deep learning model with pretrained weights, the main goal is to obtain high-quality segmentation of aerial images, especially drone-captured photos, for a range of applications including infrastructure evaluation and environmental monitoring. The Semantic Drone Dataset, a carefully chosen source dataset of high-quality aerial drone images and matching masks, is used to train the model. Transfer learning is employed on the target dataset to adapt the model for segmentation tasks in a simulated environment built with QGroundControl, PX4 Autopilot, and the Gazebo simulator. This simulation-based approach enables evaluation of the model’s performance in varied scenarios, enhancing its robustness and generalization capabilities. Additionally, a self-captured dataset generated through the simulation environment demonstrates the integration of synthetic data into the pipeline. The outcome of this project not only contributes to advancing image segmentation in drone-based applications but also explores the effectiveness of transfer learning in adapting models to novel environments, fostering advancements in the broader field of computer vision for unmanned aerial systems.
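Segmentation quality of the kind the abstract describes is conventionally scored with mean intersection-over-union (mIoU) between predicted and ground-truth masks. A minimal NumPy sketch of that metric follows; the function name and toy masks are illustrative only, not taken from the paper's code.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union over classes present in pred or target."""
    ious = []
    for c in range(num_classes):
        p = pred == c
        t = target == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue  # class absent from both masks; skip it
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x2 label masks with two classes.
pred = np.array([[0, 1], [1, 1]])
target = np.array([[0, 1], [0, 1]])
print(mean_iou(pred, target, 2))  # ≈ 0.5833
```

For the toy masks, class 0 has IoU 1/2 and class 1 has IoU 2/3, so the mean is about 0.583; classes missing from both masks are skipped rather than counted as perfect matches, a common convention when evaluating on scenes that contain only a subset of the label set.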


References

Köse, T., 2021. Autonomous Fruit Picking With a Team of Aerial Manipulators (Master’s thesis, Middle East Technical University).

Upadhyay, R., Phlypo, R., Saini, R. and Liwicki, M., 2021. Sharing to learn and learning to share – Fitting together Meta-Learning, Multi-Task Learning, and Transfer Learning: A meta review. arXiv preprint arXiv:2111.12146.

Duffy, J.P., Anderson, K., Fawcett, D., Curtis, R.J. and Maclean, I.M., 2021. Drones provide spatial and volumetric data to deliver new insights into microclimate modelling. Landscape Ecology, 36, pp.685-702.

Karimi, D., Dou, H., Warfield, S.K. and Gholipour, A., 2020. Deep learning with noisy labels: Exploring techniques and remedies in medical image analysis. Medical Image Analysis, 65, p.101759.

Li, X., Zhou, Y., Pan, Z. and Feng, J., 2019. Partial order pruning: for best speed/accuracy trade-off in neural architecture search. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 9145-9153).

Alzubaidi, L., Bai, J., Al-Sabaawi, A., Santamaría, J., Albahri, A.S., Al-dabbagh, B.S.N., Fadhel, M.A., Manoufali, M., Zhang, J., Al-Timemy, A.H. and Duan, Y., 2023. A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications. Journal of Big Data, 10(1), p.46.

Azmat, U., Alotaibi, S.S., Abdelhaq, M., Alsufyani, N., Shorfuzzaman, M., Jalal, A. and Park, J., 2023. Aerial Insights: Deep Learning-based Human Action Recognition in Drone Imagery. IEEE Access.

Kentsch, S., Lopez Caceres, M.L., Serrano, D., Roure, F. and Diez, Y., 2020. Computer vision and deep learning techniques for the analysis of drone-acquired forest images, a transfer learning study. Remote Sensing, 12(8), p.1287.

Bhatnagar, S., Gill, L. and Ghosh, B., 2020. Drone image segmentation using machine and deep learning for mapping raised bog vegetation communities. Remote Sensing, 12(16), p.2602.

Kim, Y.H. and Park, K.R., 2022. MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds. Computers and Electronics in Agriculture, 199, p.107146.

Anderegg, J., Tschurr, F., Kirchgessner, N., Treier, S., Schmucki, M., Streit, B. and Walter, A., 2023. On-farm evaluation of UAV-based aerial imagery for season-long weed monitoring under contrasting management and pedoclimatic conditions in wheat. Computers and Electronics in Agriculture, 204, p.107558.

Karimi, D., Warfield, S.K. and Gholipour, A., 2021. Transfer learning in medical image segmentation: New insights from analysis of the dynamics of model parameters and learned representations. Artificial Intelligence in Medicine, 116, p.102078.

Bodin, T.B., 2018. Behavior flexibility for autonomous unmanned aerial systems.

Semantic Drone Dataset. http://dronedataset.icg.tugraz.at/

A. Abdollahi, B. Pradhan, S. Gite and A. Alamri, "Building Footprint Extraction from High Resolution Aerial Images Using Generative Adversarial Network (GAN) Architecture," in IEEE Access, vol. 8, pp. 209517-209527, 2020, doi: 10.1109/ACCESS.2020.3038225.

Joshi, A.; Pradhan, B.; Gite, S.; Chakraborty, S. Remote-Sensing Data and Deep-Learning Techniques in Crop Mapping and Yield Prediction: A Systematic Review. Remote Sens. 2023, 15, 2014. https://doi.org/10.3390/rs15082014

Khade, S.; Gite, S.; Pradhan, B. Iris Liveness Detection Using Multiple Deep Convolution Networks. Big Data Cogn. Comput. 2022, 6, 67. https://doi.org/10.3390/bdcc6020067

S. Gite, B. Pradhan, A. Alamri and K. Kotecha, "ADMT: Advanced Driver’s Movement Tracking System Using Spatio-Temporal Interest Points and Maneuver Anticipation Using Deep Neural Networks," in IEEE Access, vol. 9, pp. 99312-99326, 2021, doi: 10.1109/ACCESS.2021.3096032.

Gite, S., Mishra, A. & Kotecha, K. Enhanced lung image segmentation using deep learning. Neural Comput & Applic 35, 22839–22853 (2023). https://doi.org/10.1007/s00521-021-06719-8

Gite, Shilpa and Himanshu Agrawal. “Early Prediction of Driver's Action Using Deep Neural Networks.” Int. J. Inf. Retr. Res. 9 (2019): 11-27.


Published

07.01.2024

How to Cite

Gurve, V., Singh, G., Gite, S., Nandhini, K., & Borde, S. (2024). Drone View Segmentation: Deep Learning and Transfer Insights. International Journal of Intelligent Systems and Applications in Engineering, 12(10s), 448–455. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/4393

Section

Research Article
