Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization
Gaussian processes are a popular and effective Bayesian method for classification and regression. Generating sparse Gaussian processes is an active research topic, since Gaussian processes suffer from cubic time complexity with respect to the size of the training set. Inspired by the idea of multi-task learning, we believe that simultaneously selecting subsets for multiple Gaussian processes is more suitable than selecting them separately. In this paper, we propose an improved multi-task sparsity regularizer which can effectively regularize the subset selection of multiple tasks for multi-task sparse Gaussian processes. In particular, based on the multi-task sparsity regularizer proposed in [12], we make two improvements: 1) replacing a subset of points with a rough global structure when measuring the global consistency of one point; 2) normalizing each dimension of every data set before sparsification. We combine the regularizer with two methods to demonstrate its effectiveness. Experimental results on four real data sets show its superiority.
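The second improvement, per-dimension normalization before sparsification, pairs naturally with any subset-selection scheme for sparse Gaussian processes. Below is a minimal sketch, not the authors' implementation: it standardizes each feature of a data set and then picks an inducing subset for a single task by greedy variance reduction (pivoted Cholesky), used here only as an illustrative stand-in for the regularized multi-task selection described in the abstract. All function names (normalize_per_dimension, rbf_kernel, greedy_variance_subset) are assumptions for illustration.

# A minimal sketch (not the authors' method) of improvement (2):
# standardize each dimension of a data set before selecting the sparse subset.
# The selector below is plain greedy variance reduction (pivoted Cholesky)
# for a single task, an illustrative stand-in for the paper's regularized
# multi-task selection; every name here is assumed for illustration.
import numpy as np


def normalize_per_dimension(X):
    """Standardize every feature (column) of X to zero mean and unit variance."""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0.0] = 1.0  # guard against constant dimensions
    return (X - mean) / std


def rbf_kernel(X, Z, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of X and Z."""
    sq = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Z**2, axis=1)[None, :]
        - 2.0 * X @ Z.T
    )
    return np.exp(-0.5 * sq / lengthscale**2)


def greedy_variance_subset(X, m, lengthscale=1.0):
    """Pick m points by pivoted Cholesky on the kernel matrix, i.e. greedily
    choosing the point with the largest remaining variance."""
    n = X.shape[0]
    d = np.ones(n)                # RBF prior variance is 1 at every point
    L = np.zeros((n, m))          # low-rank factor built one column at a time
    selected = []
    for j in range(m):
        i = int(np.argmax(d))
        if d[i] < 1e-12:          # kernel matrix is numerically low rank
            break
        selected.append(i)
        k_i = rbf_kernel(X, X[i:i + 1], lengthscale).ravel()
        L[:, j] = (k_i - L[:, :j] @ L[i, :j]) / np.sqrt(d[i])
        d = np.maximum(d - L[:, j] ** 2, 0.0)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A toy task whose two features live on very different scales; without
    # normalization the second dimension would dominate the kernel distances.
    X = rng.normal(0.0, [1.0, 100.0], size=(200, 2))
    Xn = normalize_per_dimension(X)
    print("selected subset:", greedy_variance_subset(Xn, m=10))

In the multi-task setting of the paper, the subsets for all tasks would instead be chosen jointly under the proposed sparsity regularizer rather than independently per task as in this single-task sketch.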
Gaussian processes; multi-task learning; sparse representation; regularization
Jiang Zhu, Shiliang Sun
Department of Computer Science and Technology, East China Normal University, 500 Dongchuan Road, Shanghai 200241, China
International conference
Chinese Conference on Pattern Recognition, CCPR (2014 National Conference on Pattern Recognition)
Changsha
English
54-62
2014-11-01 (date first posted online on the Wanfang platform; does not necessarily reflect the paper's publication date)