Multi-lingual model proposed by Conneau et al. (2019).
In brief, it is a RoBERTa model trained on multiple languages with Masked Language Modelling (MLM), where all the languages share a single vocabulary. This objective is apparently called Multilingual MLM (MMLM).
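The two key ingredients above — one vocabulary shared across languages, and standard MLM masking applied to text from any of them — can be sketched as follows. This is a toy word-level illustration, not the actual implementation (the real model uses a large shared SentencePiece subword vocabulary); the corpus, vocabulary builder, and `mask_tokens` helper are all made up for this example. The 80/10/10 replacement rule is the standard one from BERT/RoBERTa-style MLM.

```python
import random

# Toy corpus in two languages; in the real model the (subword)
# vocabulary is shared across all languages.
corpus = [
    "the cat sat on the mat",      # English
    "le chat est sur le tapis",    # French
]

# Build ONE vocabulary covering both languages (word-level here for
# simplicity; the real model uses shared subwords).
vocab = {"<mask>": 0}
for sent in corpus:
    for tok in sent.split():
        vocab.setdefault(tok, len(vocab))

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """MLM masking: each selected position becomes a prediction target;
    of those, 80% are replaced by <mask>, 10% by a random vocabulary
    token, and 10% are left unchanged."""
    rng = rng or random.Random(0)
    inputs, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok            # the model must predict this token
            r = rng.random()
            if r < 0.8:
                inputs[i] = "<mask>"
            elif r < 0.9:
                inputs[i] = rng.choice(list(vocab))
            # else: keep the original token unchanged
    return inputs, labels

# The same masking procedure is applied to sentences in any language,
# which is all that "multilingual MLM" adds over monolingual MLM.
for sent in corpus:
    masked, targets = mask_tokens(sent.split())
    print(masked, targets)
```

Because the vocabulary is shared, identical or similar subwords across languages map to the same embeddings, which is part of what lets the representations align across languages.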