Multi-turn Response Selection Using Dialogue Dependency Relations

Qi Jia, Yizhu Liu, Siyu Ren, Kenny Q. Zhu, Haifeng Tang · Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) · 2020

Multi-turn response selection is a task designed for developing dialogue agents. Performance on this task has improved remarkably with pre-trained language models. However, these models simply concatenate the turns in the dialogue history as the input and largely ignore the dependencies between the turns. In this paper, we propose a dialogue extraction algorithm that transforms a dialogue history into threads based on their dependency relations. Each thread can be regarded as a self-contained sub-dialogue. We also propose the Thread-Encoder model, which encodes threads and candidates into compact representations with pre-trained Transformers and computes the matching score through an attention layer. The experiments show that dependency relations are helpful for dialogue context understanding, and our model outperforms the state-of-the-art baselines on both DSTC7 and DSTC8, with competitive results on UbuntuV2.
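The core idea of the extraction step, as the abstract describes it, is to split a dialogue history into self-contained threads using dependency relations between turns. A minimal sketch of that idea, assuming each turn carries a link to the earlier turn it replies to (the data structure and function names are illustrative assumptions, not the authors' actual algorithm):

```python
def extract_threads(turns, reply_to):
    """Split a dialogue history into dependency threads.

    turns: list of utterance strings, indexed 0..n-1 in chronological order.
    reply_to: dict mapping a turn index to the earlier turn index it depends
              on (a missing key means the turn starts a new thread).
    Returns a list of threads, each a chronological list of utterances.

    This is a hypothetical sketch of the extraction idea, not the
    paper's implementation.
    """
    # A turn is a thread "tail" if no later turn depends on it.
    has_child = set(reply_to.values())
    threads = []
    for i in range(len(turns)):
        if i in has_child:
            continue
        # Walk the dependency chain back from the tail to the root.
        chain, cur = [], i
        while cur is not None:
            chain.append(turns[cur])
            cur = reply_to.get(cur)
        threads.append(list(reversed(chain)))
    return threads


history = ["hi", "how do I install X?", "hello!", "use the package manager", "thanks"]
links = {2: 0, 3: 1, 4: 3}  # e.g. turn 2 replies to turn 0, turn 3 to turn 1, ...
print(extract_threads(history, links))
```

Each recovered thread can then be encoded independently (the paper's Thread-Encoder would embed each thread and candidate with a pre-trained Transformer before scoring them with attention).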
