A Hierarchical LSTM Model for Joint Tasks
Previous work has shown that jointly modeling two Natural Language Processing (NLP) tasks is effective for improving the performance of both. Many task-specific joint models have been proposed. This paper proposes a Hierarchical Long Short-Term Memory (HLSTM) model and several variants for modeling two tasks jointly. The models are flexible enough to handle different types of task combinations and avoid task-specific feature engineering. Besides exploiting correlation information between tasks, our models take the hierarchical relations between the two tasks into consideration, which has not been discussed in previous work. Experimental results show that our models outperform strong baselines on three different types of task combinations. While both correlation information and hierarchical relations between the two tasks help improve performance on both tasks, the models especially boost the performance of the task at the top of the hierarchical structure.
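The core idea above, stacking one LSTM layer per task so the upper task's layer consumes the lower task's hidden states, can be sketched minimally as follows. This is an assumption-based illustration in NumPy, not the authors' exact architecture: the layer sizes, gate layout, and the choice of feeding the raw input alongside the lower hidden state into the upper layer are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """A minimal LSTM cell with one fused weight matrix for all four gates."""
    def __init__(self, input_dim, hidden_dim):
        self.W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
        g = np.tanh(z[3*H:])
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

class HierarchicalLSTM:
    """Two stacked LSTM layers with one prediction head per task.

    The lower layer reads the input and predicts the low-level task;
    the upper layer reads the input concatenated with the lower layer's
    hidden state, so the high-level task sees the lower task's representation.
    """
    def __init__(self, input_dim, hidden_dim, n_low_labels, n_high_labels):
        self.low = LSTMCell(input_dim, hidden_dim)
        self.high = LSTMCell(input_dim + hidden_dim, hidden_dim)
        self.W_low = rng.standard_normal((n_low_labels, hidden_dim)) * 0.1
        self.W_high = rng.standard_normal((n_high_labels, hidden_dim)) * 0.1
        self.hidden_dim = hidden_dim

    def forward(self, xs):
        H = self.hidden_dim
        h1 = c1 = h2 = c2 = np.zeros(H)
        low_logits, high_logits = [], []
        for x in xs:
            h1, c1 = self.low.step(x, h1, c1)                          # lower-task layer
            h2, c2 = self.high.step(np.concatenate([x, h1]), h2, c2)   # upper-task layer
            low_logits.append(self.W_low @ h1)
            high_logits.append(self.W_high @ h2)
        return np.array(low_logits), np.array(high_logits)

# usage: a sequence of 5 tokens with 8-dim embeddings,
# 4 labels for the lower task and 3 for the upper task (hypothetical sizes)
model = HierarchicalLSTM(input_dim=8, hidden_dim=16, n_low_labels=4, n_high_labels=3)
xs = rng.standard_normal((5, 8))
low_out, high_out = model.forward(xs)
print(low_out.shape, high_out.shape)  # (5, 4) (5, 3)
```

Feeding the lower layer's hidden state upward is what lets correlation and hierarchy information flow between the two tasks during joint training.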
Keywords: Hierarchical LSTM, Joint modeling
Qianrong Zhou Liyun Wen Xiaojie Wang Long Ma Yue Wang
School of Computer, Beijing University of Posts and Telecommunications, Beijing, China
Conference type: Domestic conference
Conference: The 15th China National Conference on Computational Linguistics (CCL 2016) and the 4th International Symposium on Natural Language Processing based on Naturally Annotated Big Data (NLP-NABD 2016)
Location: Yantai, China
Language: English
Pages: 1-13
2016-10-14 (date first posted on the Wanfang platform; does not necessarily reflect the publication date)