Conference Paper

LEAST SQUARES SUPPORT TENSOR MACHINE

  Least squares support vector machine (LS-SVM), a variant of the standard support vector machine (SVM), operates directly on patterns represented as vectors and obtains an analytical solution by solving a set of linear equations instead of a quadratic programming (QP) problem. Tensor representation helps reduce overfitting in vector-based learning, and tensor-based algorithms require a smaller set of decision variables than vector-based approaches. These properties make tensor learning especially well suited to small-sample-size (S3) problems. In this paper, we generalize the vector-based learning algorithm least squares support vector machine to a tensor-based method, the least squares support tensor machine (LS-STM), which accepts tensors as input. As in LS-SVM, the classifier is obtained by solving a system of linear equations rather than a QP. Because LS-STM operates in tensor space with a tensor representation, it estimates fewer parameters than LS-SVM and avoids discarding a great deal of useful structural information. Experimental results on several benchmark datasets indicate that LS-STM is competitive with LS-SVM in classification performance.
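The key computational point above is that LS-SVM replaces the SVM's QP with a single linear system: with labels y, kernel matrix K, and regularization parameter gamma, the dual variables alpha and bias b satisfy [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1], where Omega_ij = y_i y_j K(x_i, x_j). A minimal sketch of this vector-based LS-SVM (not the paper's LS-STM; function names and the RBF kernel choice are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    sq_a = np.sum(A ** 2, axis=1)
    sq_b = np.sum(B ** 2, axis=1)
    d2 = sq_a[:, None] + sq_b[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Fit LS-SVM by solving one (n+1)x(n+1) linear system -- no QP solver."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y            # top row:  [0, y^T]
    A[1:, 0] = y            # left col: y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def lssvm_predict(X_train, y, alpha, b, X_test, sigma=1.0):
    """Sign of the LS-SVM decision function on test points."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y) + b)
```

LS-STM follows the same template but learns a weight tensor via alternating projection, solving an LS-SVM-style linear system along each tensor mode in turn.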

Keywords: Tensor representation; Alternating projection; Least squares support vector machine; Support tensor machine; Least squares support tensor machine

Meng Lv, Xinbin Zhao, Lujia Song, Haifa Shi, Ling Jing

Department of Applied Mathematics, College of Science, CAU, Beijing 100083, China

International Conference

11th International Symposium on Operations Research and its Applications (ISORA 2013)

Huangshan, Anhui, China

English

153-158

2013-08-23 (date the paper first appeared on the Wanfang platform, not necessarily its publication date)