Densely Connected Bidirectional LSTM with Applications to Sentence Classification
Deep neural networks have recently been shown to achieve highly competitive performance in many computer vision tasks, owing to their ability to explore a much larger hypothesis space. However, since most deep architectures such as stacked RNNs tend to suffer from the vanishing-gradient and overfitting problems, their effects are still understudied in many NLP tasks. Motivated by this, we propose a novel multilayer RNN model called densely connected bidirectional long short-term memory (DC-Bi-LSTM), which essentially represents each layer by the concatenation of its hidden state and the hidden states of all preceding layers, and then recursively passes each layer's representation to all subsequent layers. We evaluate the proposed model on five benchmark datasets for sentence classification. DC-Bi-LSTM with depth up to 20 can be successfully trained and obtains significant improvements over the traditional Bi-LSTM with the same or even fewer parameters. Moreover, our model achieves promising performance compared with state-of-the-art approaches.
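The dense connectivity described in the abstract (each layer consumes the concatenation of the original input and the hidden states of all preceding layers) can be sketched as follows. This is a minimal NumPy illustration of the wiring only: the Bi-LSTM layer is replaced by a stand-in projection with random weights, and all names and dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bi_lstm_stub(x, hidden_size, rng):
    """Stand-in for one bidirectional LSTM layer: maps each time step
    to a 2*hidden_size vector (forward + backward halves).
    A real implementation would use recurrent cells instead."""
    w = rng.standard_normal((x.shape[-1], 2 * hidden_size)) * 0.1
    return np.tanh(x @ w)  # shape: (seq_len, 2*hidden_size)

def dc_bi_lstm(x, num_layers, hidden_size, seed=0):
    """Densely connected stack: the input to layer l is the
    concatenation of the original input and the hidden states of
    ALL preceding layers, not just the output of layer l-1."""
    rng = np.random.default_rng(seed)
    states = [x]  # running list of representations to concatenate
    for _ in range(num_layers):
        layer_in = np.concatenate(states, axis=-1)
        h = bi_lstm_stub(layer_in, hidden_size, rng)
        states.append(h)
    # final representation: input plus every layer's hidden states
    return np.concatenate(states, axis=-1)

seq_len, emb_dim, hidden, layers = 7, 16, 8, 3
x = np.random.default_rng(1).standard_normal((seq_len, emb_dim))
out = dc_bi_lstm(x, layers, hidden)
# output width grows as emb_dim + layers * 2*hidden = 16 + 3*16 = 64
print(out.shape)  # (7, 64)
```

Note how the input width of each layer grows with depth; this growing concatenation is what lets gradients and features reach deep layers directly, which is the property the abstract credits for training stacks up to 20 layers deep.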
Sentence classification; Densely connected; Stacked RNNs
Zixiang Ding, Rui Xia, Jianfei Yu, Xiang Li, Jian Yang
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, China; School of Information Systems, Singapore Management University, Singapore
International conference
2018 International Conference on Natural Language Processing and Chinese Computing (NLPCC 2018)
Hohhot
English
278-287
2018-08-26 (date first posted on the Wanfang platform; does not necessarily reflect the paper's publication date)