Most rumour detection models for social media are designed for a single
language (typically English). Twitter, however, hosts content in over 40
languages, and most of them lack the annotated resources needed to build
rumour detection models. In this paper, we propose a zero-shot cross-lingual
transfer learning framework that can
adapt a rumour detection model trained for a source language to another target
language. Our framework utilises pretrained multilingual language models (e.g.
multilingual BERT) and a self-training loop that iteratively bootstraps the
creation of "silver labels" in the target language, which are used to adapt
the model from the source language to the target language. We evaluate our
methodology on
English and Chinese rumour datasets and demonstrate that our model
substantially outperforms competitive benchmarks on both source- and
target-language rumour detection.
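
For illustration, the self-training loop described above can be sketched roughly as follows. This is a minimal sketch assuming a HuggingFace-style multilingual BERT classifier; the model name, toy data, confidence threshold and number of iterations are illustrative assumptions, not the paper's actual configuration or hyper-parameters.

```python
# Minimal sketch of the zero-shot cross-lingual self-training loop, assuming a
# HuggingFace-style multilingual BERT classifier.  The model name, toy data,
# confidence threshold and iteration count are illustrative assumptions,
# not the paper's actual configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"   # pretrained multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy gold-labelled source-language (English) data and unlabelled
# target-language (Chinese) data; real datasets would be far larger.
source_texts = ["BREAKING: unverified reports of an explosion downtown",
                "Official statement confirms the road closure"]
source_labels = [1, 0]                        # 1 = rumour, 0 = non-rumour
target_texts = ["网传市中心发生爆炸，尚未证实", "官方已确认道路封闭的消息"]

def fine_tune(texts, labels, epochs=1):
    """Plain full-batch fine-tuning; a real setup would use mini-batches."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    model.train()
    for _ in range(epochs):
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        batch["labels"] = torch.tensor(labels)
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
    model.eval()

def predict(texts):
    """Return predicted labels and confidences for a batch of texts."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        probs = torch.softmax(model(**batch).logits, dim=-1)
    conf, labels = probs.max(dim=-1)
    return labels.tolist(), conf.tolist()

# 1. Train on gold-labelled source-language rumour data.
fine_tune(source_texts, source_labels)

# 2. Self-training: iteratively pseudo-label target-language tweets and keep
#    confident predictions as "silver labels" for the next round of training.
CONFIDENCE_THRESHOLD = 0.9                    # assumed value
for _ in range(3):                            # assumed number of iterations
    labels, confs = predict(target_texts)
    silver = [(t, y) for t, y, c in zip(target_texts, labels, confs)
              if c >= CONFIDENCE_THRESHOLD]
    if not silver:
        break
    silver_texts, silver_labels = map(list, zip(*silver))
    fine_tune(source_texts + silver_texts, source_labels + silver_labels)
```

In this sketch, each iteration re-trains the classifier on the gold source-language data plus the confidently pseudo-labelled target-language examples, so the model is gradually adapted toward the target language without any gold target-language annotations.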