Jointly Sparse Reconstructed Regression Learning
Least squares regression and ridge regression are simple and effective methods for feature selection and classification, and many methods have been proposed based on them. However, most of these methods suffer from the small-class problem, which means that the number of projections they can learn is limited by the number of classes. In this paper, we propose a jointly sparse reconstructed regression (JSRR) to solve this problem. Moreover, JSRR uses the L2,1-norm as the basic measurement so that it can enhance robustness to outliers and guarantee joint sparsity for discriminant feature selection. In addition, by integrating the properties of robust feature selection (RFS) and principal component analysis (PCA), JSRR is able to obtain projections that have minimum reconstruction error and strong discriminability for recognition tasks. We also propose an iterative algorithm to solve the optimization problem. A series of experiments is conducted to evaluate the performance of JSRR. Experimental results indicate that JSRR outperforms classical ridge regression and some state-of-the-art regression methods.
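The joint sparsity mentioned above comes from measuring the projection matrix with the L2,1-norm, i.e., the sum of the L2 norms of its rows, so that minimizing it drives whole rows to zero and discards the corresponding features. The following is a minimal illustrative sketch of that measurement on a hypothetical projection matrix W; it is not the paper's implementation of JSRR or its iterative solver.

```python
# Minimal sketch (not the authors' code): the L2,1-norm used as the basic
# measurement in JSRR, computed on a hypothetical projection matrix W.
import numpy as np

def l21_norm(W: np.ndarray) -> float:
    """Return ||W||_{2,1} = sum_i ||W[i, :]||_2 (sum of row-wise L2 norms).

    Penalizing this quantity encourages entire rows of W to become zero,
    which yields jointly sparse projections for feature selection.
    """
    return float(np.sqrt((W ** 2).sum(axis=1)).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((5, 3))   # hypothetical 5-feature, 3-projection matrix
    W[2, :] = 0.0                     # a zero row adds nothing to the norm
    print(l21_norm(W))                # sum of the L2 norms of the remaining rows
```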
Regression; Feature selection; Joint sparsity; Classification; Robustness
Dongmei Mo, Zhihui Lai, Heng Kong
College of Computer Science and Software Engineering, Shenzhen University, Shenzhen 518060, China; School of Medicine, Shenzhen University, Shenzhen 518060, China
International conference
Guangzhou
English
597-609
2018-11-23 (date first posted online on the Wanfang platform; does not represent the paper's publication date)