Dropout Non-Negative Matrix Factorization for Independent Feature Learning
Non-negative Matrix Factorization (NMF) can learn interpretable parts-based representations of natural data and is widely applied in data mining and machine learning. However, NMF does not always achieve good performance, because the non-negativity constraint causes the learned features to be non-orthogonal and to overlap in semantics. How to improve the semantic independence of latent features without sacrificing the interpretability of NMF remains an open research problem. In this paper, we propose dropout NMF and its extension, sequential NMF, to enhance the semantic independence of NMF. Dropout NMF prevents the co-adaptation of latent features to reduce ambiguity, while sequential NMF further promotes the independence of individual latent features. The proposed algorithms differ from traditional regularized and weighted methods in that they require no prior knowledge and introduce no extra constraints or transformations. Extensive experiments on document clustering show that our algorithms outperform baseline methods and can be seamlessly applied to NMF-based models.
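The abstract's core idea can be illustrated with a minimal sketch: combine standard NMF multiplicative updates with dropout over latent features, i.e. at each iteration randomly freeze a subset of features and update only the retained ones, so features cannot co-adapt. This is an illustrative reading of the abstract, not the authors' reference implementation; the function name `dropout_nmf`, the drop rate `p_drop`, and the mask-per-iteration scheme are assumptions.

```python
import numpy as np

def dropout_nmf(V, k, p_drop=0.3, n_iter=200, eps=1e-9, seed=0):
    """Illustrative dropout NMF (assumption, not the paper's exact algorithm).

    V : (m, n) non-negative data matrix
    k : number of latent features
    Each iteration draws a random keep-mask over the k features and applies
    the standard Lee-Seung multiplicative updates only to the kept features,
    preventing the features from co-adapting.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        keep = rng.random(k) >= p_drop        # random mask over latent features
        if not keep.any():                    # always retain at least one feature
            keep[rng.integers(k)] = True
        Wk, Hk = W[:, keep], H[keep, :]       # thinned factor matrices (copies)
        # standard multiplicative updates restricted to retained features
        Hk *= (Wk.T @ V) / (Wk.T @ Wk @ Hk + eps)
        Wk *= (V @ Hk.T) / (Wk @ Hk @ Hk.T + eps)
        W[:, keep], H[keep, :] = Wk, Hk       # write updates back
    return W, H
```

The multiplicative form keeps both factors non-negative throughout, so the sketch preserves NMF's interpretability while the random masking discourages overlapping (co-adapted) features.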
Non-negative Matrix Factorization; dropout NMF; sequential NMF; independent feature learning
Zhicheng He, Jie Liu, Caihua Liu, Yuan Wang, Airu Yin, Yalou Huang
College of Computer and Control Engineering, Nankai University, Tianjin, China; College of Software, Nankai University, Tianjin, China
International conference
The Fifth Conference on Natural Language Processing and Chinese Computing (NLPCC-ICCPOL 2016)
Kunming
English
1-12
2016-12-02 (date the paper first appeared on the Wanfang platform; not necessarily the publication date)