Automatic Camera Calibration Using a Single Image to Extract Intrinsic and Extrinsic Parameters
Keywords: automatic detection, camera calibration, extrinsic parameters, intrinsic parameters, vanishing points

Abstract
This article presents a methodology for accurately locating vanishing points in undistorted images, enabling the determination of a camera's intrinsic and extrinsic parameters and facilitating measurements within the image. A vanishing point filtering algorithm is also introduced. The algorithm's effectiveness is validated by extracting real-world coordinates using only three points and their corresponding distances. Finally, the obtained vanishing points are compared with extrinsic parameters derived from multiple objects and with intrinsic parameters obtained from various shapes and images sourced from different test sites. Results show that the intrinsic parameters are extracted accurately from a single image. Moreover, using three points to determine the extrinsic parameters is an excellent alternative to the checkerboard, making the method more practical since it does not require manually positioning a checkerboard to perform the camera calibration.
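To illustrate the kind of computation the abstract describes, the sketch below recovers intrinsic parameters from three mutually orthogonal vanishing points under the classical assumptions (undistorted pinhole image, zero skew, square pixels): the principal point is the orthocenter of the triangle formed by the vanishing points, and the focal length f satisfies (vi − p)·(vj − p) + f² = 0 for each pair. This is a minimal, standard construction for illustration only, not the paper's exact algorithm; the function name and inputs are hypothetical.

```python
import numpy as np

def intrinsics_from_vanishing_points(v1, v2, v3):
    """Estimate the principal point p and focal length f from three
    mutually orthogonal vanishing points (2-D pixel coordinates).

    Assumptions (illustrative sketch, not the paper's method):
    pinhole camera, zero skew, square pixels, undistorted image.
    Then p is the orthocenter of triangle (v1, v2, v3) and
    (vi - p) . (vj - p) = -f**2 for every pair i != j.
    """
    v1, v2, v3 = (np.asarray(v, dtype=float) for v in (v1, v2, v3))
    # Orthocenter: p lies on the altitude from v3 (perpendicular to
    # side v1-v2) and on the altitude from v1 (perpendicular to v2-v3):
    #   (p - v3) . (v1 - v2) = 0
    #   (p - v1) . (v2 - v3) = 0
    A = np.stack([v1 - v2, v2 - v3])
    b = np.array([np.dot(v3, v1 - v2), np.dot(v1, v2 - v3)])
    p = np.linalg.solve(A, b)
    f = np.sqrt(-np.dot(v1 - p, v2 - p))
    return p, f
```

With synthetic vanishing points generated from a known camera, the function recovers the original principal point and focal length, which is the kind of consistency check the article's validation against multiple shapes and test sites performs at scale.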
License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IJISAE open access articles are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. This license lets users share and adapt the material provided they give appropriate credit, provide a link to the license, and indicate if changes were made; if they remix, transform, or build upon the material, they must distribute their contributions under the same license as the original.