Gaussian Mixture Background Modeling Based on CUDA
Pixel-level background modeling is a common computer vision task, and the Gaussian mixture modeling (GMM) algorithm is one of the most widely used methods for it. Compute Unified Device Architecture (CUDA) is a general-purpose GPU computing technology that enables users to develop GPU programs and achieve high-speed parallel computation. In this paper, we improve the GMM algorithm by exploiting the features of CUDA through parallel processing and memory optimization, and implement the improved algorithm on the GPU. Experimental results show that the proposed method runs significantly faster than the traditional CPU-based approach.
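The abstract only outlines the approach, so the following is a minimal sketch of how per-pixel GMM updates can be mapped onto CUDA, assuming one thread per pixel and a structure-of-arrays layout of the model parameters for coalesced global-memory access. The kernel name, the constants K, ALPHA, VAR_INIT, and MATCH_T, and the simplified foreground test are illustrative assumptions, not the authors' implementation.

// Illustrative sketch (not the paper's code): one CUDA thread per pixel,
// each maintaining K Gaussian components updated in the Stauffer-Grimson style.
// Parameters are stored as structure-of-arrays so neighboring threads access
// contiguous memory.

#include <cuda_runtime.h>

#define K        3      // Gaussians per pixel (assumed)
#define ALPHA    0.01f  // learning rate (assumed)
#define VAR_INIT 36.0f  // variance assigned to a newly created component (assumed)
#define MATCH_T  2.5f   // match threshold in standard deviations (assumed)

__global__ void gmmUpdateKernel(const unsigned char* frame, // grayscale input frame
                                float* weight,              // [K * numPixels]
                                float* mean,                // [K * numPixels]
                                float* var,                 // [K * numPixels]
                                unsigned char* fgMask,      // output foreground mask
                                int numPixels)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p >= numPixels) return;

    float x = (float)frame[p];
    int matched = -1;

    // Find the first component that the current pixel value matches.
    for (int k = 0; k < K; ++k) {
        int idx = k * numPixels + p;            // SoA index: coalesced across threads
        float d = x - mean[idx];
        if (d * d < MATCH_T * MATCH_T * var[idx]) { matched = k; break; }
    }

    float wsum = 0.0f;
    for (int k = 0; k < K; ++k) {
        int idx = k * numPixels + p;
        if (k == matched) {
            // Pull the matched component toward the new observation.
            float d = x - mean[idx];
            weight[idx] += ALPHA * (1.0f - weight[idx]);
            mean[idx]   += ALPHA * d;
            var[idx]    += ALPHA * (d * d - var[idx]);
        } else {
            weight[idx] *= (1.0f - ALPHA);      // decay unmatched components
        }
        wsum += weight[idx];
    }

    if (matched < 0) {
        // No match: replace the last component with a new one centered at x.
        int idx = (K - 1) * numPixels + p;
        wsum -= weight[idx];
        weight[idx] = ALPHA;
        mean[idx]   = x;
        var[idx]    = VAR_INIT;
        wsum += weight[idx];
    }

    // Normalize weights; as a simplification, treat a match against the first
    // component as background (0) and everything else as foreground (255).
    for (int k = 0; k < K; ++k) weight[k * numPixels + p] /= wsum;
    fgMask[p] = (matched == 0) ? 0 : 255;
}

Because each pixel's mixture is updated independently, the one-thread-per-pixel mapping needs no synchronization; per frame the kernel would be launched with on the order of numPixels/256 blocks of 256 threads.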
Background Subtraction; Gaussian Mixture Modeling; GPU; CUDA
REN Hao, LU Xiao-feng, WANG Jia, LU Heng-li, FAN Tian-xiang
School of Communication and Information Engineering, Shanghai University, Shanghai, China
International Conference
2010 International Conference on Circuit and Signal Processing (ICCSP 2010)
Shanghai
English
307-310
2010-12-25 (date the paper first appeared on the Wanfang platform, not necessarily the date of publication)