Telepresence Robotic Technology for Individuals with Visual Impairments Through Real-time Haptic Rendering

실시간 햅틱 렌더링 기술을 통한 시각 장애인을 위한 원격현장감(Telepresence) 로봇 기술

  • Received : 2013.05.30
  • Accepted : 2013.07.25
  • Published : 2013.08.31

Abstract

This paper presents a robotic system that provides telepresence to the visually impaired by combining real-time haptic rendering with multi-modal interaction. A virtual-proxy-based haptic rendering process using an RGB-D sensor is developed and integrated into a unified control and feedback framework for the telepresence robot. We discuss the challenging problem of conveying environmental perception to a user with visual impairments and our multi-modal interaction solution. We also describe the experimental design and protocols, and report results with human subjects with and without visual impairments. A discussion of the system's performance and our future goals is presented toward the end.
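
The virtual-proxy rendering mentioned in the abstract (see also references 9 and 11 below) pairs the haptic interface point (HIP) with a proxy point: the proxy follows the device freely in empty space, is constrained to remain on the point-cloud surface during contact, and the rendered force is a spring pulling the device toward the proxy. The Python sketch below is an illustration of that general idea only, not the authors' implementation; the stiffness and contact-radius values, the PCA-based local-plane surface approximation, and all function names are assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation) of virtual-proxy haptic
# rendering over an RGB-D point cloud. Parameters and names are illustrative.
import numpy as np
from scipy.spatial import cKDTree

STIFFNESS = 400.0       # N/m, virtual spring between proxy and device point
CONTACT_RADIUS = 0.02   # m, neighborhood used to approximate the local surface


def estimate_plane(neighbors):
    """Fit a local plane (centroid + unit normal) to nearby cloud points via PCA."""
    centroid = neighbors.mean(axis=0)
    # The right-singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(neighbors - centroid)
    return centroid, vt[-1]


def render_force(tree, points, hip, proxy):
    """One haptic servo tick: update the proxy and return the feedback force."""
    idx = tree.query_ball_point(hip, CONTACT_RADIUS)
    if not idx:
        # Free motion: the proxy simply follows the device, no force.
        return hip.copy(), np.zeros(3)

    centroid, normal = estimate_plane(points[idx])
    # Orient the normal so it points from the surface toward the (outside) proxy.
    if np.dot(normal, proxy - centroid) < 0:
        normal = -normal

    penetration = np.dot(hip - centroid, normal)
    if penetration >= 0:
        # Device is near the surface but not inside it.
        return hip.copy(), np.zeros(3)

    # Constrain the proxy to the surface (project the device point onto the
    # local plane) and render a spring force pushing the device back out.
    new_proxy = hip - penetration * normal
    force = STIFFNESS * (new_proxy - hip)
    return new_proxy, force


if __name__ == "__main__":
    # Toy cloud: a flat tabletop at z = 0, standing in for one RGB-D frame.
    xs, ys = np.meshgrid(np.linspace(-0.1, 0.1, 50), np.linspace(-0.1, 0.1, 50))
    points = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
    tree = cKDTree(points)

    proxy = np.array([0.0, 0.0, 0.05])
    for z in np.linspace(0.05, -0.01, 7):    # device pushes down into the table
        hip = np.array([0.0, 0.0, z])
        proxy, force = render_force(tree, points, hip, proxy)
        print(f"hip z={z:+.3f}  proxy z={proxy[2]:+.3f}  force z={force[2]:.2f} N")
```

In a full system the point cloud would be refreshed from the RGB-D stream each frame and the servo loop would run at haptic rates (commonly on the order of 1 kHz), which is why fast neighbor queries such as the k-d tree lookup above matter.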

References

  1. C. P. Gharpure and V. Kulyukin. Robot-assisted shopping for the blind: issues in spatial cognition and product selection. Intelligent Service Robotics, 1(3):237-251, 2008. https://doi.org/10.1007/s11370-008-0020-9
  2. D. Hong, S. Kimmel, R. Boehling, N. Camoriano, W. Cardwell, G. Jannaman, A. Purcell, D. Ross, and E. Russel. Development of a semi-autonomous vehicle operable by the visually-impaired. In IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, pages 539-544. IEEE, 2008.
  3. V. Kulyukin, C. P. Gharpure, J. Nicholson, and S. Pavithran. RFID in robot-assisted indoor navigation for the visually impaired. In Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), volume 2, pages 1979-1984. IEEE, 2004.
  4. K. Lundin, B. Gudmundsson, and A. Ynnerman. General proxy-based haptics for volume visualization. In World Haptics, First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages 557-560. IEEE, 2005.
  5. C. H. Park and A. M. Howard. Towards real-time haptic exploration using a mobile robot as mediator. In Proceedings of IEEE Haptics Symposium (HAPTICS), pages 289-292. IEEE, 2010.
  6. C. H. Park and A. M. Howard. Real world haptic exploration for telepresence of the visually impaired. In Proceedings of ACM/IEEE International Conference on Human Robot Interaction (HRI). IEEE, 2012.
  7. D. Pascolini and S. P. Mariotti. Global estimates of visual impairment: 2010. British Journal of Ophthalmology (online), 2011.
  8. F. Rydén, H. J. Chizeck, S. N. Kosari, H. King, and B. Hannaford. Using Kinect and a haptic interface for implementation of real-time virtual fixtures. In Proceedings of the 2nd Workshop on RGB-D: Advanced Reasoning with Depth Cameras (in conjunction with RSS'11), 2011.
  9. F. Rydén, S. N. Kosari, and H. J. Chizeck. Proxy method for fast haptic rendering from time varying point clouds. In Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 2614-2619, September 2011.
  10. K. Salisbury, F. Conti, and F. Barbagli. Haptic rendering: introductory concepts. IEEE Computer Graphics and Applications, 24(2):24-32, March-April 2004.
  11. K. G. Sreeni and S. Chaudhuri. Haptic rendering of dense 3D point cloud data. In Proceedings of IEEE Haptics Symposium (HAPTICS), pages 333-339, March 2012.
  12. I. Ulrich and J. Borenstein. The GuideCane-applying mobile robot technologies to assist the visually impaired. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 31(2):131-136, 2001.
  13. M. Yang, J. Lu, Z. Zhou, A. Safonova, and K. Kuchenbecker. A GPU-based approach for real-time haptic rendering of 3D fluids. In Proceedings of ACM SIGGRAPH Asia Sketches, 2009.

Cited by

  1. An Accuracy Improvement Method for Object Recognition and Pose Estimation Using a Kinect Sensor (in Korean), vol.10, pp.1, 2015, https://doi.org/10.7746/jkros.2015.10.1.016