Conference Topic

Integrating Human Visual Motion with Hand Gestures for Wheelchair Driving Control

In this study, a human-robot collaborative control architecture is presented for a robotic wheelchair (RW), adapted for users with cognitive disabilities and mobility impairments. The developed system integrates the visual motion patterns and hand gestures of robotic wheelchair users through a minimally invasive, noncontact, and wireless assistive interface. By detecting and classifying the human inputs of visual and hand motions, user-defined commands can be translated to the robotic wheelchair so that the users' intentions can be inferred to enhance wheeled mobility while ensuring driving safety. With the aid of the collaborative control system, users require less attention and a lower workload for wheelchair driving, and thus can concentrate on higher-level cognitive tasks such as destination planning. The experimental results demonstrate the effectiveness of the developed control system for wheeled mobility.
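The abstract describes translating classified visual-motion and hand-gesture inputs into wheelchair commands with a safety layer. A minimal sketch of such a command-fusion step is given below; the function name, input labels, and thresholds are hypothetical illustrations, not the authors' implementation.

```python
def fuse_inputs(gaze_direction, gesture, obstacle_distance_m,
                max_speed=0.8, min_clearance_m=0.5):
    """Map classified user inputs to a (linear, angular) velocity command.

    gaze_direction: 'left' | 'right' | 'forward' (from a visual-motion classifier)
    gesture: 'go' | 'stop' (from a hand-gesture classifier)
    obstacle_distance_m: nearest obstacle range from a proximity sensor
    Returns (v, w): linear velocity in m/s and angular velocity in rad/s.
    """
    # Safety layer: override user intent on a stop gesture or low clearance.
    if gesture == 'stop' or obstacle_distance_m < min_clearance_m:
        return (0.0, 0.0)
    # Translate the classified gaze direction into a steering command.
    turn = {'left': 0.5, 'right': -0.5, 'forward': 0.0}[gaze_direction]
    # Reduce linear speed while turning so the user can correct course.
    v = max_speed if turn == 0.0 else 0.5 * max_speed
    return (v, turn)
```

The key design point, consistent with the abstract's emphasis on driving safety, is that the safety check precedes intent translation, so no classified command can override the clearance constraint.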

Fang-Yi Lay, Hsin-Han Chiang, Yen-Lin Chen, Tsu-Zen Hong

Department of Electrical Engineering, Fu Jen Catholic University, New Taipei City 24205, Taiwan
Department of Computer Science and Information Engineering, National Taipei University of Technology,

International Conference

The 2014 ICME International Conference on Complex Medical Engineering (CME2014)

Taipei

English

309-314, 150

2014-06-26 (date the record first appeared on the Wanfang platform; this does not represent the paper's publication date)