---
tags:
  - transformers
  - ml
---

# XLM-R (XLM-RoBERTa)

Multi-lingual model proposed by Conneau et al. (2019).

In brief, it is RoBERTa trained on text from many languages with the Masked Language Modelling (MLM) objective, where all languages share a single vocabulary. This is called Multilingual MLM (MMLM).
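The MMLM data preparation can be sketched as follows. This is a toy illustration with a hypothetical word-level vocabulary (the real XLM-R uses a ~250k-token SentencePiece vocabulary); the point is that sentences from different languages pass through the same vocabulary and the same masking procedure.

```python
import random

# Toy shared vocabulary built from sentences in two languages.
# (Hypothetical tokens; XLM-R actually uses subword units.)
sentences = ["the cat sleeps", "le chat dort"]
tokens = [s.split() for s in sentences]
vocab = {"[MASK]": 0}
for toks in tokens:
    for t in toks:
        vocab.setdefault(t, len(vocab))

MASK_ID = vocab["[MASK]"]

def mask_tokens(ids, mask_prob=0.15, rng=None):
    """Replace each token id with MASK_ID with probability mask_prob.

    Returns (masked_ids, labels), where labels holds the original id at
    masked positions and -100 (ignored by the loss) elsewhere.
    """
    rng = rng or random.Random(0)
    masked, labels = [], []
    for tid in ids:
        if rng.random() < mask_prob:
            masked.append(MASK_ID)
            labels.append(tid)
        else:
            masked.append(tid)
            labels.append(-100)
    return masked, labels

# Both languages use the same vocabulary and the same masking.
for toks in tokens:
    ids = [vocab[t] for t in toks]
    masked, labels = mask_tokens(ids, mask_prob=0.5)
    print(toks, "->", masked, labels)
```

During training the model sees the masked sequence and is optimized to predict the original ids at the masked positions; because the vocabulary is shared, representations for all languages live in the same embedding space.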