Syntax-Aware Representation for Aspect Term Extraction
Aspect Term Extraction (ATE) plays an important role in aspect-based sentiment analysis. Syntax-based neural models that learn rich linguistic knowledge have proven effective on ATE. However, previous approaches mainly focus on modeling syntactic structure while neglecting the rich interactions along dependency arcs. Moreover, these methods rely heavily on the results of dependency parsing and are sensitive to parsing noise. In this work, we introduce a syntax-directed attention network and a contextual gating mechanism to tackle these issues. Specifically, a graph neural network is used to model interactions along dependency arcs. With the help of syntax-directed self-attention, it can operate directly on the syntactic graph and capture structural information. We further introduce a gating mechanism that synthesizes syntactic information with structure-free features, which reduces the effect of parsing noise. Experimental results demonstrate that the proposed method achieves state-of-the-art performance on three widely used benchmark datasets.
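The two components described above — self-attention restricted to dependency arcs and a gate that blends syntactic features with structure-free features — can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the sigmoid parameterization of the gate, and the toy dependency mask are all assumptions made for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntax_directed_attention(H, adj):
    """Self-attention that only attends along dependency arcs.

    H:   (n, d) token representations
    adj: (n, n) mask, nonzero where a dependency arc (or self-loop) exists
    """
    scores = H @ H.T / np.sqrt(H.shape[1])
    # Block attention between tokens not connected by a dependency arc.
    scores = np.where(adj > 0, scores, -1e9)
    return softmax(scores, axis=-1) @ H

def contextual_gate(h_syn, h_ctx, W, b):
    """Per-dimension sigmoid gate mixing syntactic and structure-free features.

    When parsing is noisy, the gate can down-weight h_syn in favor of h_ctx.
    """
    g = 1.0 / (1.0 + np.exp(-(np.concatenate([h_syn, h_ctx], axis=-1) @ W + b)))
    return g * h_syn + (1.0 - g) * h_ctx
```

Because the gated output is an elementwise convex combination, it always stays between the syntactic and structure-free features, which is what lets the model fall back on context when the parse is unreliable.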
Aspect term extraction; Syntactic information; Gating mechanism
Jingyuan Zhang Guangluan Xu Xinyi Wang Xian Sun Tinglei Huang
Key Laboratory of Network Information System Technology (NIST), Institute of Electronics, Chinese Acade
International conference
Macau
English
123-134
2019-04-14 (date the paper first went online on the Wanfang platform; this does not represent its publication date)