Sensor Fusion for Vision-based Indoor Head Pose Tracking
Accurate head pose tracking is a key requirement for indoor augmented reality systems. This paper proposes a novel approach to tracking the head pose of indoor users using sensor fusion. The proposed approach utilizes a track-to-track fusion framework composed of extended Kalman filters and a fusion filter to fuse the poses from two complementary tracking modes: inside-out tracking (IOT) and outside-in tracking (OIT). A vision-based head tracker is constructed to verify our approach. Preliminary experimental results show that the tracker is capable of achieving more accurate and stable pose estimates than the single tracking mode of IOT or OIT, which validates the usefulness of the proposed sensor fusion approach.
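Since the abstract only sketches the track-to-track fusion framework, the following is a minimal illustration of a standard information-weighted track-to-track fusion step, assuming uncorrelated track errors between the two filters; it is not necessarily the exact formulation used in the paper. Here \hat{x}_1, \hat{x}_2 denote the IOT and OIT pose estimates produced by the two extended Kalman filters, with covariances P_1, P_2:

\[
P_f = \left(P_1^{-1} + P_2^{-1}\right)^{-1}, \qquad
\hat{x}_f = P_f \left(P_1^{-1}\hat{x}_1 + P_2^{-1}\hat{x}_2\right)
\]

The fused estimate \hat{x}_f weights each track by the inverse of its covariance, so the more reliable tracking mode dominates the combined pose.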
augmented reality; sensor fusion; extended Kalman filter; head pose tracking
Bin Luo, Yongtian Wang, Yue Liu
School of Optics and Electronics, Beijing Institute of Technology, Beijing, China; Institute of Compute; School of Optics and Electronics, Beijing Institute of Technology, Beijing, China
International conference
The Fifth International Conference on Image and Graphics (ICIG 2009)
Xi'an
English
677-682
2009-09-20 (date the paper was first posted on the Wanfang platform, not the date of publication)