Design of Deep Learning-Based Automatic Drone Landing Technique Using Google Maps API

  • Lee, Ji-Eun (IT Convergence and Electronics Engineering, Hansei University) ;
  • Mun, Hyung-Jin (Department of Information and Communication Engineering, SungKyul University)
  • Received : 2019.01.28
  • Accepted : 2019.02.20
  • Published : 2020.02.29

Abstract

Recently, the RPAS (Remotely Piloted Aircraft System), capable of both remote control and autonomous navigation, has attracted growing interest and use across industries and public organizations, in applications such as delivery drones, firefighting drones, ambulance drones, agricultural drones, and others. The stability of autonomously controlled unmanned drones is also the biggest challenge to be solved as the drone industry develops. A drone should fly along the path set by its autonomous flight control system and perform an accurate, automatic landing at the destination. This study proposes a technique that confirms arrival from images of the landing point and controls the landing at the correct position, compensating for errors in the location data from the drone's sensors and GPS. The server receives imagery of the destination from the Google Maps API and trains on it with deep learning; a drone equipped with a NAVIO2, a Raspberry Pi, and a camera photographs the landing point and sends the images to the server; the drone's position is then adjusted until the match against the trained result meets a threshold, so that the drone can land automatically at the landing point.
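The capture-compare-adjust cycle described above can be sketched as a simple control loop. This is an illustrative sketch only: the function names (`match_score`, `landing_loop`), the toy similarity metric, and the mock drone are assumptions for demonstration, not the authors' implementation, which compares deep-learning results trained on Google Maps imagery against frames from the drone's camera.

```python
# Sketch of the image-guided landing loop from the abstract.
# All names and the similarity metric here are illustrative assumptions.

def match_score(reference, captured):
    """Toy similarity in [0, 1]: 1 minus the mean absolute difference
    between two equal-length feature vectors. A real system would score
    the camera frame against a model trained on the destination image."""
    diff = sum(abs(r - c) for r, c in zip(reference, captured)) / len(reference)
    return 1.0 - diff

def landing_loop(reference, capture, adjust, threshold=0.9, max_steps=50):
    """Photograph the landing point, score the frame against the learned
    reference, and adjust the drone's position until the score meets the
    threshold; then command the landing."""
    for _ in range(max_steps):
        frame = capture()
        if match_score(reference, frame) >= threshold:
            return "LAND"          # aligned with the landing point
        adjust(reference, frame)   # nudge position toward a better match
    return "ABORT"                 # failsafe: alignment never converged

# Mock drone for demonstration: its "frame" drifts toward the reference
# a little on every adjustment, standing in for real position corrections.
if __name__ == "__main__":
    state = [0.0, 0.0]
    reference = [0.5, 0.5]

    def capture():
        return list(state)

    def adjust(ref, frame):
        for i in range(len(state)):
            state[i] += 0.1 * (ref[i] - state[i])

    print(landing_loop(reference, capture, adjust))  # → LAND
```

The threshold plays the role described in the abstract: the drone keeps repositioning until the match score clears it, at which point the landing is triggered; the step cap acts as a failsafe if alignment never converges.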
