Recurrent Neural CRF for Aspect Term Extraction with Dependency Transmission
This paper presents a novel neural architecture for aspect term extraction in fine-grained sentiment analysis. In addition to amalgamating sequential features (character embeddings, word embeddings, and POS tagging information), we train end-to-end Recurrent Neural Networks (RNNs) with meticulously designed dependency transmission between recurrent units, making it possible to learn structural syntactic phenomena. The experimental results show that incorporating these shallow semantic features improves aspect term extraction performance compared to a system that uses no linguistic information, demonstrating the utility of morphological information and syntactic structures for capturing the affinity between aspect words and their contexts.
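As a rough illustration of the feature setup described in the abstract (not the authors' implementation), the sketch below concatenates character-level, word-level, and POS embeddings per token and feeds them to a bidirectional recurrent encoder whose per-token emission scores would normally be decoded by a linear-chain CRF over BIO aspect tags; the dependency-transmission links between recurrent units are omitted. All module names, dimensions, and vocabulary sizes are illustrative assumptions, not values from the paper.

# Minimal PyTorch sketch of character + word + POS features feeding a
# bidirectional recurrent encoder that produces per-token tag emissions.
import torch
import torch.nn as nn

class AspectTaggerSketch(nn.Module):
    def __init__(self, word_vocab=10000, char_vocab=100, pos_vocab=50,
                 word_dim=100, char_dim=30, pos_dim=25, hidden=128, num_tags=3):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Character-level BiLSTM, pooled into one vector per word.
        self.char_rnn = nn.LSTM(char_dim, char_dim, batch_first=True,
                                bidirectional=True)
        self.pos_emb = nn.Embedding(pos_vocab, pos_dim)
        # Sentence-level BiLSTM over the concatenated token features.
        self.encoder = nn.LSTM(word_dim + 2 * char_dim + pos_dim, hidden,
                               batch_first=True, bidirectional=True)
        # Emission scores for BIO tags; a linear-chain CRF would score
        # whole tag sequences on top of these emissions.
        self.emissions = nn.Linear(2 * hidden, num_tags)

    def forward(self, words, chars, pos):
        # words: (batch, seq_len); chars: (batch, seq_len, word_len); pos: (batch, seq_len)
        b, t, w = chars.shape
        char_out, _ = self.char_rnn(self.char_emb(chars.view(b * t, w)))
        # Use the encoder output at the last character position as a
        # word-level character feature (a simplification).
        char_feat = char_out[:, -1, :].view(b, t, -1)
        feats = torch.cat([self.word_emb(words), char_feat, self.pos_emb(pos)], dim=-1)
        enc, _ = self.encoder(feats)
        return self.emissions(enc)  # (batch, seq_len, num_tags)

# Usage with toy zero-valued index tensors.
model = AspectTaggerSketch()
scores = model(torch.zeros(2, 7, dtype=torch.long),
               torch.zeros(2, 7, 5, dtype=torch.long),
               torch.zeros(2, 7, dtype=torch.long))
print(scores.shape)  # torch.Size([2, 7, 3])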
Aspect term extraction · Dependency transmission · Recurrent neural networks · CRF
Lindong Guo, Shengyi Jiang, Wenjing Du, Suifu Gan
School of Information Science and Technology, Guangdong University of Foreign Studies, Guangzhou, China
International conference
The 2018 International Conference on Natural Language Processing and Chinese Computing (NLPCC 2018)
Hohhot
English
378-390
2018-08-26 (date the paper was first posted on the Wanfang platform, not its publication date)