Cross-lingual Language Model Pretraining

Status: Pending

Author: Guillaume Lample, Alexis Conneau

Topic: NMT, Text, Transformers

Category: Unsupervised

Conference: arXiv

Year: 2019

Link: https://arxiv.org/abs/1901.07291

Questions

What did the authors try to accomplish?

What were the key elements of the approach?

What can you use yourself from this paper?

What other references are worth following?