Study on object detection and distance measurement functions with Kinect for Windows version 2

  • Received : 2017.05.26
  • Accepted : 2017.06.03
  • Published : 2017.06.30

Abstract

Computer vision is becoming more interesting as new imaging sensors provide capabilities that, combined with artificial intelligence techniques imitating the human visual system, allow a machine to understand its surrounding environment more accurately. In this paper, we conducted experiments with the Kinect camera, a recent depth sensor, on object detection and distance measurement, which are among the most essential functions in computer vision for unmanned or manned vehicles, robots, drones, and similar systems. The Kinect camera is used to estimate the position or location of objects in its field of view and to measure the distance from them to its depth sensor accurately; it also checks whether a detected object is a real object, so that pixels which are not part of a real object can be ignored to reduce processing time. Tests showed promising results with this low-cost range sensor, indicating that the Kinect camera can be used for object detection and distance measurement, two fundamental functions of computer vision applications, as a basis for further processing.
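To make the processing described in the abstract concrete, the sketch below shows one way such a pipeline could work on a Kinect v2 depth frame. It is not code from the paper: the function name, the minimum-region threshold, and the synthetic frame are illustrative assumptions. It only relies on documented Kinect v2 characteristics (a 512x424 depth frame with values in millimetres, 0 for invalid pixels, and a reliable range of roughly 0.5-4.5 m) and assumes the frame has already been read into a NumPy array.

```python
# Minimal sketch (not the authors' implementation) of distance measurement on a
# Kinect v2 depth frame, assumed to be a 424x512 NumPy array of depth values in
# millimetres where 0 marks an invalid pixel.
import numpy as np
from scipy import ndimage

KINECT_V2_MIN_MM = 500    # approximate reliable range of the Kinect v2 depth sensor
KINECT_V2_MAX_MM = 4500

def detect_object_distance(depth_mm, min_pixels=200):
    """Return (distance_mm, pixel_count) for the largest plausible object,
    or (None, 0) if no region is large enough to be a real object."""
    # Keep only pixels with a valid depth reading inside the sensor's range.
    valid = (depth_mm >= KINECT_V2_MIN_MM) & (depth_mm <= KINECT_V2_MAX_MM)

    # Group valid pixels into connected regions (candidate objects).
    labels, n = ndimage.label(valid)
    if n == 0:
        return None, 0

    # Pick the largest region; small regions are treated as noise rather than
    # real objects, so their pixels are ignored and not processed further.
    sizes = ndimage.sum(valid, labels, index=np.arange(1, n + 1))
    biggest = int(np.argmax(sizes)) + 1
    if sizes[biggest - 1] < min_pixels:
        return None, 0

    # Distance to the object: median depth of its pixels (robust to outliers).
    object_depth = depth_mm[labels == biggest]
    return float(np.median(object_depth)), int(sizes[biggest - 1])

# Example with a synthetic frame standing in for a real Kinect v2 depth frame.
frame = np.zeros((424, 512), dtype=np.uint16)   # 0 = invalid everywhere
frame[150:300, 200:350] = 1200                  # a flat "object" at about 1.2 m
distance, count = detect_object_distance(frame)
print(distance, count)                          # -> 1200.0 22500
```

The minimum region size and the use of a median depth are design choices for the sketch: the size check discards isolated noisy pixels before any further processing, and the median keeps a single stray depth reading from skewing the measured distance.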
