Realization of sign language motion using a dual-arm/hand humanoid robot
Abstract
The recent increase in technological maturity has empowered robots to assist humans and provide daily services. Voice commands are a popular human–machine interface for communication; unfortunately, deaf people cannot exchange information with robots through vocal modalities. To interact with deaf people effectively and intuitively, it is desirable for robots, especially humanoids, to have manual communication skills, such as performing sign language. Instead of ad hoc programming to generate each particular sign-language motion, we present an imitation system that teaches a humanoid robot to perform sign language by directly replicating observed demonstrations. The system symbolically encodes human hand–arm motion captured by low-cost depth sensors as a skeleton-motion time series, which serves to generate the initial robot movement by means of perception-to-action mapping. To tackle the body-correspondence problem, a virtual impedance control approach is adopted to follow this initial movement smoothly while avoiding risks arising from the physical differences between the human and the robot, such as joint limits and self-collision. In addition, an integrated leg-joint stabilizer improves the balance of the whole robot. Finally, our humanoid robot, NINO, successfully learned by imitation from human demonstration to introduce itself using Taiwanese Sign Language.
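The following is a minimal sketch of the kind of per-joint virtual impedance tracking described above, not the authors' NINO controller: the gains, joint limits, control period, and sinusoidal reference trajectory are hypothetical. It treats a single joint as a spring-damper system pulled toward the retargeted skeleton reference and clamps the result to joint limits.

# Minimal sketch (assumed, not the paper's implementation): one joint tracks a
# retargeted reference trajectory under virtual impedance while respecting limits.
import numpy as np

K = 40.0                    # virtual stiffness toward the reference pose
D = 12.0                    # virtual damping on joint velocity
DT = 0.01                   # control period [s]
Q_MIN, Q_MAX = -1.5, 1.5    # hypothetical joint limits [rad]

def impedance_step(q, dq, q_ref):
    """Advance one joint by one control step under virtual impedance."""
    ddq = K * (q_ref - q) - D * dq      # spring-damper acceleration
    dq = dq + ddq * DT
    q = q + dq * DT
    if q < Q_MIN or q > Q_MAX:          # simple hard joint-limit handling
        q = float(np.clip(q, Q_MIN, Q_MAX))
        dq = 0.0
    return q, dq

# Track a slow sinusoidal "skeleton" reference for one elbow-like joint.
q, dq = 0.0, 0.0
for k in range(500):
    q_ref = 1.2 * np.sin(2 * np.pi * 0.2 * k * DT)
    q, dq = impedance_step(q, dq, q_ref)

In this simplified form, the spring-damper terms smooth out noisy or infeasible portions of the demonstrated trajectory, which is the role the virtual impedance layer plays between perception-to-action mapping and the robot's actual joints.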
