AALRSMF: An Adaptive Learning Rate Schedule for Matrix Factorization
Stochastic gradient descent (SGD) is an effective algorithm for solving the matrix factorization problem. However, the performance of SGD depends critically on how learning rates are tuned over time. In this paper, we propose a novel per-dimension learning rate schedule called AALRSMF. This schedule relies on local gradients, requires no manual tuning of a global learning rate, and is robust to the selection of hyper-parameters. Extensive experiments demonstrate that the proposed schedule shows promising results compared to existing schedules on matrix factorization.
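The abstract does not spell out the AALRSMF update rule, so the following is only a minimal sketch of the general idea it describes: SGD matrix factorization in which each factor dimension gets its own learning rate driven by accumulated local gradients (here an Adagrad-style accumulator is assumed as a stand-in; names such as rank, reg, base_lr, and eps are illustrative, not from the paper).

import numpy as np

def factorize(ratings, n_users, n_items, rank=10, reg=0.05,
              base_lr=0.1, eps=1e-8, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, rank))   # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, rank))   # item latent factors
    # Per-dimension accumulators of squared gradients (assumed Adagrad-style;
    # the paper's actual AALRSMF accumulator may differ).
    GP = np.full_like(P, eps)
    GQ = np.full_like(Q, eps)
    for _ in range(epochs):
        for u, i, r in ratings:                      # (user, item, rating) triples
            err = r - P[u] @ Q[i]
            gp = -err * Q[i] + reg * P[u]            # local gradient w.r.t. P[u]
            gq = -err * P[u] + reg * Q[i]            # local gradient w.r.t. Q[i]
            GP[u] += gp ** 2
            GQ[i] += gq ** 2
            # Per-dimension step size: base_lr scaled by the accumulated gradient,
            # so no single global learning rate has to be hand-tuned over time.
            P[u] -= base_lr / np.sqrt(GP[u]) * gp
            Q[i] -= base_lr / np.sqrt(GQ[i]) * gq
    return P, Q

# Toy usage: three observed entries of a 2x2 rating matrix.
P, Q = factorize([(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0)],
                 n_users=2, n_items=2, rank=4)
print(P @ Q.T)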
Adaptive learning rate; SGD; Matrix factorization
Feng Wei, Hao Guo, Shaoyin Cheng, Fan Jiang
School of Computer Science and Technology, University of Science and Technology of China, Hefei 230027, China
International conference
International Asia-Pacific Web Conference (the 18th International Asia-Pacific Web Conference)
Suzhou
English
410-413
2016-09-23 (date the paper first went online on the Wanfang platform; not the paper's publication date)