Hand Gesture Recognition from Kinect Sensor Data

  • Cho, Sun-Young (Dept. of Computer Science, Yonsei University)
  • Byun, Hye-Ran (Dept. of Computer Science, Yonsei University)
  • Lee, Hee-Kyung (Broadcasting & Telecommunications Convergence Media Research Dept., ETRI)
  • Cha, Ji-Hun (Broadcasting & Telecommunications Convergence Media Research Dept., ETRI)
  • Received : 2012.03.15
  • Accepted : 2012.04.23
  • Published : 2012.05.30

Abstract

We present a method for recognizing hand gestures from the skeletal joint data provided by Microsoft's Kinect sensor. To represent an observed sequence of skeletons, we propose a combined multi-angle histogram feature that is robust to orientation variations. By combining several angle histograms with different angular quantization levels, the proposed feature efficiently captures the orientation variations of gestures that can occur across people and environments. Coupling this multi-angle histogram representation with a random decision forest classifier further improves recognition performance. We evaluate the method on static and dynamic hand gesture datasets captured with a Kinect sensor and show that it outperforms methods based on other gesture features and classifiers.
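
The abstract describes a concrete pipeline: quantize frame-to-frame motion angles into histograms at several bin resolutions, concatenate the histograms into one feature vector, and classify it with a random decision forest. The Python sketch below illustrates that idea; it is not the authors' implementation. The quantization levels (4, 8, and 16 bins), the use of 2D motion directions of a single hand joint, and the scikit-learn classifier settings are all assumptions made for demonstration.

```python
# Minimal sketch of a combined multi-angle histogram feature with a random
# forest classifier. Assumptions (not from the paper): 2D trajectories of one
# hand joint, quantization levels of 4/8/16 bins, scikit-learn defaults.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def multi_angle_histogram(trajectory, levels=(4, 8, 16)):
    """Concatenate angle histograms at several quantization levels.

    trajectory: (T, 2) array of hand-joint positions over T frames.
    """
    deltas = np.diff(trajectory, axis=0)          # frame-to-frame motion
    angles = np.arctan2(deltas[:, 1], deltas[:, 0]) % (2 * np.pi)
    parts = []
    for bins in levels:                           # one histogram per level
        hist, _ = np.histogram(angles, bins=bins, range=(0.0, 2 * np.pi))
        parts.append(hist / max(hist.sum(), 1))   # normalize each histogram
    return np.concatenate(parts)                  # combined feature vector

# Toy demonstration on synthetic trajectories: class 0 drifts rightward,
# class 1 drifts upward; the combined feature separates them easily.
rng = np.random.default_rng(0)
labels = np.array([0, 1] * 20)
trajs = [np.cumsum(rng.normal([1, 0] if y == 0 else [0, 1], 0.5, (30, 2)), axis=0)
         for y in labels]
X = np.stack([multi_angle_histogram(t) for t in trajs])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:30], labels[:30])
print("toy accuracy:", clf.score(X[30:], labels[30:]))
```

The intuition behind combining levels is that coarse histograms change little under small rotations while fine histograms preserve directional detail, which matches the robustness-to-orientation-variation argument in the abstract.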
