Modeling Concentrated Cross-attention For Neural Machine Translation With Gaussian Mixture Model

Shaolei Zhang, Yang Feng · Findings of the Association for Computational Linguistics: EMNLP 2021 · 2021

Cross-attention is an important component of neural machine translation (NMT) and has in previous methods always been realized by dot-product attention. However, dot-product attention considers only pairwise correlations between words, which leads to dispersed attention on long sentences and neglects neighboring relationships in the source. Inspired by linguistics, we argue that these issues stem from ignoring a type of cross-attention, called concentrated attention, which focuses on several central words and then spreads around them. In this work, we apply a Gaussian Mixture Model (GMM) to model concentrated attention within cross-attention. Experiments and analyses on three datasets show that the proposed method outperforms the baseline and yields significant improvements in alignment quality, N-gram accuracy, and long-sentence translation.
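The core idea can be illustrated with a minimal sketch: attention over source positions is modeled as a K-component Gaussian mixture, so weight concentrates on a few central positions and decays smoothly around them. The parameter values below (means, standard deviations, mixture weights) are hypothetical stand-ins; in the paper these would be predicted from decoder states and combined with dot-product attention.

```python
import numpy as np

def gmm_attention(means, stds, weights, src_len):
    """Attention distribution over source positions as a Gaussian mixture.

    means, stds, weights: shape-(K,) arrays of hypothetical predicted
    parameters for K mixture components. Returns a normalized
    distribution over src_len source positions."""
    pos = np.arange(src_len)[None, :]            # (1, S) source positions
    mu = means[:, None]                          # (K, 1) component centers
    sigma = stds[:, None]                        # (K, 1) component spreads
    comp = np.exp(-0.5 * ((pos - mu) / sigma) ** 2)  # unnormalized Gaussians
    mix = (weights[:, None] * comp).sum(axis=0)      # weighted sum over components
    return mix / mix.sum()                           # normalize to a distribution

# Two components centered at positions 2 and 7 of a 10-word source sentence.
attn = gmm_attention(np.array([2.0, 7.0]), np.array([1.0, 1.5]),
                     np.array([0.6, 0.4]), src_len=10)
```

Here `attn` peaks at the component centers and falls off with distance, capturing the "focus on central words, then spread around them" behavior that pure dot-product attention lacks.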
