
A New Method of Parameters Optimization Based on Self-Calling SVR

Parameter optimization is a key issue in Support Vector Regression (SVR). Exhaustive search is time-consuming, especially when large-scale sample sets must be trained. A new method based on Parameters Subsection Selection and Self-Calling (PSS-SC) SVR is proposed. The optimization involves the penalty coefficient c, the kernel parameter g, and the insensitive coefficient p, and the combination (c, g, p) has a strong effect on the prediction accuracy of SVR; the proposed method selects the optimal combination in less time while preserving SVR performance. First, the span of each parameter is trisected, yielding three medians per parameter as test points and thus 27 parameter combinations (c, g, p), for each of which the MSE of the corresponding SVR is computed. A mapping from the 27 combinations (c, g, p) to their MSEs is then established, and the MSEs of the remaining parameter combinations are estimated from this mapping. The N combinations with the N smallest estimated MSEs are selected as the TOP-N candidates. Finally, each TOP-N combination (c, g, p) is applied to SVR to obtain its actual MSE, and the combination with the minimum MSE is chosen as the best. Experiments on 5 benchmark datasets show that the new method preserves prediction precision while greatly reducing training time.
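The procedure described above can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: it assumes scikit-learn's SVR for both the base regressors and the parameter-to-MSE "mapping", and the function names, ranges, and grid sizes are hypothetical choices made for the example.

```python
# Hedged sketch of the PSS-SC idea from the abstract: trisect each parameter
# range, train SVRs on the 27 resulting (c, g, p) seed combinations, fit a
# mapping from parameters to MSE, rank the remaining grid with that mapping,
# then verify only the TOP-N candidates exactly. All concrete choices here
# (meta-model, grid density, synthetic data) are illustrative assumptions.
import itertools
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

def trisect(lo, hi):
    """Return the medians of the three equal subintervals of [lo, hi]."""
    step = (hi - lo) / 3.0
    return [lo + step / 2.0, lo + 3.0 * step / 2.0, lo + 5.0 * step / 2.0]

def svr_mse(params, X_tr, y_tr, X_te, y_te):
    """Train an SVR with (c, g, p) and return its test-set MSE."""
    c, g, p = params
    model = SVR(C=c, gamma=g, epsilon=p).fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

def pss_sc_search(X, y, c_range, g_range, p_range, fine=6, top_n=5, seed=0):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=seed)
    # Step 1: three medians per parameter -> 27 seed combinations,
    # each evaluated exactly by training an SVR.
    seeds = list(itertools.product(
        trisect(*c_range), trisect(*g_range), trisect(*p_range)))
    seed_mses = [svr_mse(s, X_tr, y_tr, X_te, y_te) for s in seeds]
    # Step 2: learn the (c, g, p) -> MSE mapping (here, self-calling SVR).
    mapper = SVR(C=10.0, gamma="scale").fit(np.array(seeds),
                                            np.array(seed_mses))
    # Step 3: estimate MSEs over a finer grid with the mapping,
    # instead of training an SVR for every remaining combination.
    grid = list(itertools.product(np.linspace(*c_range, fine),
                                  np.linspace(*g_range, fine),
                                  np.linspace(*p_range, fine)))
    pred = mapper.predict(np.array(grid))
    # Step 4: verify only the TOP-N predicted-best combinations exactly
    # and keep the one with the minimum actual MSE.
    top = [grid[i] for i in np.argsort(pred)[:top_n]]
    return min(top, key=lambda s: svr_mse(s, X_tr, y_tr, X_te, y_te))

# Synthetic demo data (not one of the paper's benchmark datasets).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))
y = np.sin(X[:, 0]) + 0.3 * X[:, 1] + rng.normal(0, 0.1, 200)
best = pss_sc_search(X, y, (1.0, 100.0), (0.01, 1.0), (0.01, 0.5))
print(best)  # the selected (c, g, p) combination
```

With a 6x6x6 fine grid, only 27 + top_n SVRs are actually trained instead of 216, which is the source of the claimed reduction in training time.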

Keywords: subsection; Support Vector Regression (SVR); large-scale samples; parameter selection

Qiuye Wang Fan Ning Yong Liu

Beijing University of Posts and Telecommunications Beijing, 100876, China

International Conference

2011 International Conference on Advanced Intelligence and Awareness Internet (the 2nd International Conference on Advanced Intelligence and Awareness Internet, AIAI 2011)

Shenzhen

English

179-183

2011-10-28 (date first posted on the Wanfang platform; not necessarily the paper's publication date)