Discrete And Soft Prompting For Multilingual Models

Mengjie Zhao, Hinrich Schütze · Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing · 2021

It has been shown for English that discrete and soft prompting perform strongly in few-shot learning with pretrained language models (PLMs). In this paper, we show that discrete and soft prompting outperform finetuning in two multilingual settings: crosslingual transfer and in-language training for multilingual natural language inference. For example, with 48 English training examples, finetuning obtains 33.74% accuracy in crosslingual transfer, barely surpassing the majority baseline (33.33%). In contrast, discrete and soft prompting outperform finetuning, achieving 36.43% and 38.79% accuracy, respectively. We also show that prompting performs well when training data are available in multiple languages other than English.
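The soft-prompting approach the abstract compares against finetuning can be sketched as follows: freeze the PLM and train only a small set of continuous prompt vectors prepended to the input embeddings. The sketch below is illustrative only, assuming an xlm-roberta-base backbone, 20 prompt tokens, and a plain linear classifier head in place of the paper's own prompting setup; none of these specifics come from the paper.

```python
# Minimal soft-prompting sketch for 3-way NLI with a frozen multilingual PLM.
# Assumptions (not from the paper): xlm-roberta-base, 20 prompt tokens,
# a linear classifier head, and mean-pooling over the prompt positions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SoftPromptNLI(nn.Module):
    def __init__(self, model_name="xlm-roberta-base", n_prompt=20, n_labels=3):
        super().__init__()
        self.backbone = AutoModel.from_pretrained(model_name)
        for p in self.backbone.parameters():  # freeze the PLM; only prompts train
            p.requires_grad = False
        hidden = self.backbone.config.hidden_size
        # Trainable "soft prompt" vectors prepended to every input.
        self.prompt = nn.Parameter(torch.randn(n_prompt, hidden) * 0.02)
        self.classifier = nn.Linear(hidden, n_labels)

    def forward(self, input_ids, attention_mask):
        embeds = self.backbone.get_input_embeddings()(input_ids)
        bsz = embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(bsz, -1, -1)
        inputs_embeds = torch.cat([prompt, embeds], dim=1)
        prompt_mask = torch.ones(bsz, self.prompt.size(0),
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.backbone(inputs_embeds=inputs_embeds, attention_mask=mask)
        # Mean-pool the hidden states at the prompt positions as a
        # simple sentence representation for classification.
        pooled = out.last_hidden_state[:, : self.prompt.size(0)].mean(dim=1)
        return self.classifier(pooled)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = SoftPromptNLI()
batch = tokenizer(["A man is eating."], ["Someone is eating food."],
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])  # shape: (1, 3)
```

Because only the prompt vectors (and here the small head) receive gradients, the number of trainable parameters is tiny compared with full finetuning, which is one intuition for why soft prompting can be more robust in the few-shot regimes the abstract reports.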
