Conference Paper

A Submodular Optimization-Based VAE-Transformer Framework for Paraphrase Generation

  Paraphrase plays an important role in various Natural Language Processing (NLP) problems, such as question answering, information retrieval, and conversation systems. Previous approaches mainly concentrate on producing paraphrases with similar semantics, namely fidelity, while recent ones have begun to focus on the diversity of generated paraphrases. However, most existing models fail to explicitly address both metrics. To fill this gap, we propose a submodular optimization-based VAE-Transformer model to generate more consistent and diverse paraphrases. Through extensive experiments on datasets such as Quora and Twitter, we demonstrate that our proposed model outperforms state-of-the-art baselines on BLEU, METEOR, TERp, and distinct n-grams. Furthermore, our ablation study suggests that incorporating the VAE and submodular functions effectively promotes fidelity and diversity, respectively.
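To make the diversity-promoting component concrete, below is a minimal sketch of greedy submodular selection over a pool of candidate paraphrases, using a facility-location objective with Jaccard token overlap as the similarity measure. The function names and the similarity choice are illustrative assumptions for this sketch, not the paper's actual implementation or scoring function.

```python
# Hypothetical sketch: greedy maximization of a facility-location submodular
# function to pick a subset of candidate paraphrases that is both relevant
# and diverse. The similarity measure (Jaccard over word types) is a
# placeholder assumption, not the paper's implementation.
from typing import List, Set


def token_set(sentence: str) -> Set[str]:
    """Lowercased set of word types, used as a cheap similarity proxy."""
    return set(sentence.lower().split())


def jaccard(a: Set[str], b: Set[str]) -> float:
    """Jaccard similarity between two token sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


def facility_location(selected: List[Set[str]], pool: List[Set[str]]) -> float:
    """F(S) = sum over all candidates of their best similarity to some element
    of S. This objective is monotone and submodular, so greedy selection
    carries the standard (1 - 1/e) approximation guarantee."""
    if not selected:
        return 0.0
    return sum(max(jaccard(c, s) for s in selected) for c in pool)


def greedy_select(candidates: List[str], k: int) -> List[str]:
    """Greedily pick k candidates that maximize the facility-location score."""
    pools = [token_set(c) for c in candidates]
    chosen_idx: List[int] = []
    for _ in range(min(k, len(candidates))):
        current = [pools[i] for i in chosen_idx]
        base = facility_location(current, pools)
        best_gain, best_i = -1.0, -1
        for i in range(len(candidates)):
            if i in chosen_idx:
                continue
            gain = facility_location(current + [pools[i]], pools) - base
            if gain > best_gain:
                best_gain, best_i = gain, i
        chosen_idx.append(best_i)
    return [candidates[i] for i in chosen_idx]


if __name__ == "__main__":
    # Example: candidate paraphrases, e.g. from beam search over a decoder.
    beams = [
        "how can i learn a new language quickly",
        "how can i learn a new language fast",
        "what is the fastest way to pick up a new language",
        "tips for learning a foreign language rapidly",
    ]
    print(greedy_select(beams, k=2))
```

In this toy run, the greedy step tends to pick candidates that cover different wordings rather than near-duplicates, which is the intuition behind using a submodular objective to promote diversity among generated paraphrases.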

Paraphrase; Transformer; VAE; Submodular function

Xiaoning Fan, Danyang Liu, Xuejian Wang, Yiding Liu, Gongshen Liu, Bo Su

1 School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai
2 Heinz College, Carnegie Mellon University, Pittsburgh 15213, USA

International conference

9th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2020)

Zhengzhou

English

494-505

2020-10-14 (date first posted on the Wanfang platform; not the paper's publication date)