Conference Proceedings

Learning Dialogue History for Spoken Language Understanding

  In task-oriented dialogue systems, spoken language understanding (SLU) aims to convert users' queries expressed in natural language into structured representations. SLU usually consists of two parts, namely intent identification and slot filling. Although many methods have been proposed for SLU, they generally process each utterance individually, which loses context information in dialogues. In this paper, we propose a hierarchical LSTM-based model for SLU. The dialogue history is memorized by a turn-level LSTM and used to assist the prediction of intents and slot tags. Consequently, the understanding of the current turn depends on the preceding turns. We conduct experiments on the NLPCC 2018 Shared Task 4 dataset. The results demonstrate that dialogue history is effective for SLU and that our model outperforms all baselines.
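The hierarchical design the abstract describes — a token-level LSTM encoding each utterance and a turn-level LSTM memorizing dialogue history that conditions the next turn's understanding — can be sketched roughly as below. This is a minimal illustration, not the paper's model: all dimensions, the random parameter initialization, the linear intent/slot classifiers, and the choice to initialize the utterance LSTM from the turn-level state are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step; gates (i, f, o) and candidate g share one projection."""
    z = W @ x + U @ h + b
    H = h.size
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c = f * c + i * g
    return o * np.tanh(c), c

def make_params(in_dim, hid):
    return (rng.normal(0, 0.1, (4 * hid, in_dim)),
            rng.normal(0, 0.1, (4 * hid, hid)),
            np.zeros(4 * hid))

EMB, HID, N_INTENTS, N_SLOTS = 8, 16, 4, 6     # toy sizes (assumptions)
utt_params = make_params(EMB, HID)    # token-level (utterance) LSTM
turn_params = make_params(HID, HID)   # turn-level (dialogue-history) LSTM
W_intent = rng.normal(0, 0.1, (N_INTENTS, HID))
W_slot = rng.normal(0, 0.1, (N_SLOTS, HID))

def encode_utterance(token_embs, history_h):
    # The utterance LSTM starts from the turn-level history state, so the
    # understanding of the current turn depends on the preceding turns.
    h, c = history_h.copy(), np.zeros(HID)
    states = []
    for x in token_embs:
        h, c = lstm_cell(x, h, c, *utt_params)
        states.append(h)
    return h, np.stack(states)

# Simulate a 3-turn dialogue with random 5-token utterances.
turn_h, turn_c = np.zeros(HID), np.zeros(HID)
for turn in range(3):
    tokens = rng.normal(size=(5, EMB))           # stand-in token embeddings
    utt_h, token_states = encode_utterance(tokens, turn_h)
    # The turn-level LSTM consumes the utterance summary, memorizing history.
    turn_h, turn_c = lstm_cell(utt_h, turn_h, turn_c, *turn_params)
    intent = int((W_intent @ utt_h).argmax())             # one intent per turn
    slot_tags = (token_states @ W_slot.T).argmax(axis=1)  # one tag per token
    print(f"turn {turn}: intent={intent}, slot_tags={slot_tags.tolist()}")
```

The key point of the hierarchy is the two timescales: the inner LSTM steps over tokens within a turn, while the outer LSTM steps once per turn, carrying a compressed dialogue history forward instead of treating each utterance in isolation.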

Spoken language understanding; Dialogue history; Hierarchical LSTM

Xiaodong Zhang, Dehong Ma, Houfeng Wang

Institute of Computational Linguistics, Peking University, Beijing 100871, China

International Conference

The 2018 International Conference on Natural Language Processing and Chinese Computing (NLPCC 2018)

Hohhot

English

120-132

2018-08-26 (date the paper first went online on the Wanfang platform; not necessarily its publication date)