Visual Estimation of Lower Limb Motion Using Physical and Virtual Sensors
An approach based on virtual-sensor difference and physical-sensor difference algorithms was proposed to visually estimate lower limb posture, and a wearable sensor system was developed. To describe the lower limb posture, the flexion/extension (FE) and abduction/adduction (AA) angles of the hip joint and the FE angle of the knee joint were estimated to obtain the orientations of the lower limb segments, and the knee and ankle joint trajectories were then computed from the segment orientations and lengths to obtain the positions of the lower limb joints. In the wearable sensor system, an accelerometer at the hip joint and two MAG3 inertial sensor modules on the thigh formed one group, which measured the data for the thigh orientation and knee joint position using the double-sensor difference based algorithm; two MAG3s mounted on the thigh and shank near the knee joint formed a second group, which measured the data for the shank orientation and ankle joint position using the virtual-sensor based algorithm. Compared with a camera motion capture system, the correlation coefficients over three trials were above 0.89 for hip FE, above 0.9 for hip AA, and above 0.88 for knee FE. The method requires no integration of angular acceleration or angular velocity to obtain the joint rotations and positions. The developed wearable sensor system can visually and quantitatively estimate the complete lower limb posture with fewer sensors and a high degree of accuracy, and it can also be applied in other conditions, such as estimating the posture of a mechanical arm or the upper limb.
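The abstract notes that, once the segment orientations are estimated from the sensor groups, the knee and ankle positions follow from the segment orientations and lengths alone, with no integration of angular velocity or acceleration. The minimal sketch below illustrates only that forward-kinematics step; the coordinate frame, rotation order, sign conventions, and function names are assumptions made for illustration and are not taken from the paper, and the sensor-difference algorithms themselves are not reproduced here.

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x-axis (assumed axis for hip ab/adduction)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation matrix about the y-axis (assumed axis for flexion/extension)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def lower_limb_positions(hip_fe, hip_aa, knee_fe, l_thigh, l_shank):
    """Knee and ankle positions relative to the hip joint (illustrative only).

    hip_fe, hip_aa, knee_fe : joint angles in radians
    l_thigh, l_shank        : segment lengths in metres
    Assumed frame: x forward, y to the left, z up; both segments point
    along -z in the neutral standing pose.
    """
    down = np.array([0.0, 0.0, -1.0])           # segment direction when standing
    R_thigh = rot_y(hip_fe) @ rot_x(hip_aa)     # thigh orientation from hip FE/AA
    R_shank = R_thigh @ rot_y(knee_fe)          # shank orientation adds knee FE
    p_knee = R_thigh @ (l_thigh * down)         # knee position = thigh vector
    p_ankle = p_knee + R_shank @ (l_shank * down)
    return p_knee, p_ankle

# Example: 30 deg hip flexion, 5 deg abduction, 20 deg knee flexion
knee, ankle = lower_limb_positions(np.radians(30), np.radians(5),
                                   np.radians(20), 0.42, 0.40)
```

In this sketch the joint trajectories are obtained purely geometrically from angles and segment lengths, which is why no integration of angular velocity or acceleration is needed once the orientations are available.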
Lower limb motion; Wearable sensor system; Visual estimation; Virtual sensor
Kun Liu, Tao Liu, Kyoko Shibata, Yoshio Inoue
Department of Intelligent Mechanical Systems Engineering, Kochi University of Technology, 185 Miyanoku
International conference
2010 IEEE International Conference on Information and Automation (ICIA 2010)
Harbin
English
1-6
2010-06-20 (date the record first went online on the Wanfang platform; not the publication date of the paper)