From 09363f2a8b9f0020964a056be621de9012094ed0 Mon Sep 17 00:00:00 2001
From: LysandreJik
Date: Fri, 30 Aug 2019 19:48:32 -0400
Subject: [PATCH] Fix documentation index

---
 docs/source/index.rst | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/source/index.rst b/docs/source/index.rst
index d349e146c9..5b451707d6 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -11,7 +11,8 @@ The library currently contains PyTorch implementations, pre-trained model weight
 4. `Transformer-XL `_ (from Google/CMU) released with the paper `Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context `_ by Zihang Dai*, Zhilin Yang*, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov.
 5. `XLNet `_ (from Google/CMU) released with the paper `XLNet: Generalized Autoregressive Pretraining for Language Understanding `_ by Zhilin Yang*, Zihang Dai*, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le.
 6. `XLM `_ (from Facebook) released together with the paper `Cross-lingual Language Model Pretraining `_ by Guillaume Lample and Alexis Conneau.
-7. `DistilBERT `_ (from HuggingFace) released together with the blog post `Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT `_ by Victor Sanh, Lysandre Debut and Thomas Wolf.
+7. `RoBERTa `_ (from Facebook), released together with the paper `RoBERTa: A Robustly Optimized BERT Pretraining Approach `_ by Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov.
+8. `DistilBERT `_ (from HuggingFace) released together with the blog post `Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT `_ by Victor Sanh, Lysandre Debut and Thomas Wolf.
 
 .. toctree::
    :maxdepth: 2