
Vision-based Autonomous Landing System of an Unmanned Aerial Vehicle on a Moving Vehicle

  • Received : 2016.09.04
  • Accepted : 2016.11.11
  • Published : 2016.11.30

Abstract

Flight of an autonomous unmanned aerial vehicle (UAV) generally consists of four steps: take-off, ascent, descent, and finally landing. Among them, autonomous landing is a challenging task because of its high risk and demand for reliability. When the site where the UAV is supposed to land is moving or oscillating, the situation becomes even more unpredictable, and landing is far more difficult than on a stationary site. For these reasons, accurate and precise control is required for a UAV to land autonomously on top of a vehicle that rolls or oscillates while moving. In this paper, a vision-only landing algorithm using dynamic gimbal control is proposed. The camera systems used in previous studies are fixed in a downward- or forward-facing orientation, and their main disadvantage is a narrow field of view (FOV). By controlling the gimbal to track the target dynamically, this problem is alleviated, and the UAV can follow the target faster than with a fixed camera. Using an artificial tag on the landing pad, the relative position and orientation of the UAV are estimated, and these pose estimates are used for both gimbal control and UAV control to achieve safe and stable landing on the moving vehicle. Outdoor experimental results show that this vision-based algorithm performs fairly well and can be applied to real situations.
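As a concrete illustration of the tag-based pose estimation and gimbal pointing described in the abstract, the sketch below detects an AprilTag in a camera frame, recovers the relative pose of the tag with respect to the camera, and converts that pose into pan/tilt angles that keep the tag centered. This is only a minimal sketch of the general technique, not the authors' implementation; it assumes the third-party pupil-apriltags Python bindings, placeholder camera intrinsics (FX, FY, CX, CY), and a placeholder tag size.

```python
# Minimal sketch (not the authors' code) of AprilTag-based relative pose
# estimation and gimbal pointing. Assumes the pupil-apriltags package;
# camera intrinsics and tag size below are placeholder values.
import numpy as np
from pupil_apriltags import Detector

FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0   # assumed pinhole intrinsics (px)
TAG_SIZE = 0.30                               # assumed tag edge length (m)

detector = Detector(families="tag36h11")      # a common AprilTag family

def estimate_relative_pose(gray):
    """Return (R, t) of the tag in the camera frame, or None if no tag is seen."""
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=(FX, FY, CX, CY),
        tag_size=TAG_SIZE,
    )
    if not detections:
        return None
    d = detections[0]                          # assume a single tag on the pad
    return d.pose_R, d.pose_t.reshape(3)       # t = [x, y, z] in metres

def gimbal_setpoint(t):
    """Pan/tilt angles (rad) that point the optical axis at the tag.

    Camera frame convention: x right, y down, z forward (optical axis).
    """
    x, y, z = t
    pan = np.arctan2(x, z)                     # yaw the gimbal toward the tag
    tilt = np.arctan2(y, z)                    # pitch the gimbal toward the tag
    return pan, tilt
```

In the actual system the tag pose would additionally be rotated through the current gimbal angles into the UAV body frame before being used by the position controller; that transform and the landing logic are omitted here.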

Keywords

References

  1. S. D. Manning, C. E. Rash, P. A. LeDuc, R. K. Noback, and J. McKeon, "The role of human causal factors in US Army unmanned aerial vehicle accidents," DTIC Document, Tech. Rep., 2004.
  2. V. Kumar and N. Michael, "Opportunities and challenges with autonomous micro aerial vehicles," The International Journal of Robotics Research, vol.31, no.11, pp.1279-1291, 2012. https://doi.org/10.1177/0278364912455954
  3. C.S. Sharp, O. Shakernia, and S.S. Sastry, "A vision system for landing an unmanned aerial vehicle," in Proc. IEEE International Conference on Robotics and Automation (ICRA), vol.2, pp.1720-1727, 2001.
  4. O. Shakernia, R. Vidal, C.S. Sharp, Y. Ma, and S. Sastry, "Multiple view motion estimation and control for landing an unmanned aerial vehicle," in Proc. IEEE International Conference on Robotics and Automation (ICRA), vol.3, pp.2793-2798, 2002.
  5. S. Saripalli, J.E. Montgomery, and G.S. Sukhatme, "Vision-based autonomous landing of an unmanned aerial vehicle," in Proc. IEEE International Conference on Robotics and Automation (ICRA), vol.3, pp.2799-2804, 2002.
  6. J. Kim, Y. Jung, D. Lee, and D.H. Shim, "Outdoor autonomous landing on a moving platform for quadrotors using an omnidirectional camera," in Proc. IEEE International Conference on Unmanned Aircraft Systems (ICUAS), pp.1243-1252, 2014.
  7. K. Ling, D. Chow, A. Das, and S.L. Waslander, "Autonomous maritime landing for low-cost VTOL aerial vehicles," in Proc. IEEE Canadian Conference on Computer and Robot Vision (CRV), pp.32-39, 2014.
  8. E. Olson, "AprilTag: A robust and flexible visual fiducial system," in Proc. IEEE International Conference on Robotics and Automation (ICRA), pp.3400-3407, 2011.
  9. D. Kim, J. Shin, H. Kim, H. Kim, D. Lee, S.-M. Lee, and H. Myung, "Design and Implementation of Unmanned Surface Vehicle JEROS for Jellyfish Removal," Journal of KROS (Korea Robotics Society) (in Korean), vol.8, no.1, pp.51-57, March, 2013.
  10. AprilTags_ros, 2014. [Online]. Available: http://wiki.ros.org/apriltags_ros, Accessed on: Nov. 1, 2016.

Cited by

  1. Adaptive planar vision marker composed of LED arrays for sensing under low visibility, vol.2, no.2, 2018, https://doi.org/10.12989/arr.2018.2.2.141
  2. Gyro effect control of a quadrotor using feedback linearization (in Korean), vol.15, no.3, 2020, https://doi.org/10.7746/jkros.2020.15.3.248