diff --git a/docs/source/examples.rst b/docs/source/examples.rst
index b444009a6d..d978451438 100644
--- a/docs/source/examples.rst
+++ b/docs/source/examples.rst
@@ -459,7 +459,7 @@ The same option as in the original scripts are provided, please refer to the cod
 
 
 Causal LM fine-tuning on GPT/GPT-2, Masked LM fine-tuning on BERT/RoBERTa
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Before running the following examples you should download the `WikiText-2 dataset <https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/>`__ and unpack it to some directory `$WIKITEXT_2_DATASET`
 The following results were obtained using the `raw` WikiText-2 (no tokens were replaced before the tokenization).
@@ -467,6 +467,8 @@
 This example fine-tunes GPT-2 on the WikiText-2 dataset. The loss function is a causal language modeling loss (perplexity).
 
 .. code-block:: bash
+
+    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset
 
     python run_lm_finetuning.py
 
@@ -485,6 +487,8 @@ This example fine-tunes RoBERTa on the WikiText-2 dataset. The loss function is
 The `--mlm` flag is necessary to fine-tune BERT/RoBERTa on masked language modeling.
 
 .. code-block:: bash
+
+    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset
 
     python run_lm_finetuning.py
 
diff --git a/docs/source/model_doc/distilbert.rst b/docs/source/model_doc/distilbert.rst
index cc156c90c2..141d3e151f 100644
--- a/docs/source/model_doc/distilbert.rst
+++ b/docs/source/model_doc/distilbert.rst
@@ -2,35 +2,35 @@ DistilBERT
 ----------------------------------------------------
 
 ``DistilBertConfig``
-~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. autoclass:: pytorch_transformers.DistilBertConfig
     :members:
 
 
 ``DistilBertTokenizer``
-~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. autoclass:: pytorch_transformers.DistilBertTokenizer
     :members:
 
 
 ``DistilBertModel``
-~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. autoclass:: pytorch_transformers.DistilBertModel
     :members:
 
 
 ``DistilBertForMaskedLM``
-~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. autoclass:: pytorch_transformers.DistilBertForMaskedLM
     :members:
 
 
 ``DistilBertForSequenceClassification``
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. autoclass:: pytorch_transformers.DistilBertForSequenceClassification
     :members:
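
For reference, the `export WIKITEXT_2_DATASET` line this patch adds is meant to precede the full fine-tuning command in each code block. A minimal sketch of the GPT-2 causal LM invocation follows; it is not part of the patch, and the flag names (`--output_dir`, `--model_type`, `--model_name_or_path`, `--do_train`, `--train_data_file`, `--do_eval`, `--eval_data_file`) and the `wiki.train.raw`/`wiki.test.raw` file names are assumptions based on the `run_lm_finetuning.py` script and the raw WikiText-2 archive layout, not quoted from the patched docs.

.. code-block:: bash

    # Point WIKITEXT_2_DATASET at the unpacked raw WikiText-2 directory.
    # The path and the wiki.*.raw file names are assumptions; adjust to your layout.
    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset

    # Fine-tune GPT-2 with a causal language modeling loss and report
    # perplexity on the test split.
    python run_lm_finetuning.py \
        --output_dir=output \
        --model_type=gpt2 \
        --model_name_or_path=gpt2 \
        --do_train \
        --train_data_file=$WIKITEXT_2_DATASET/wiki.train.raw \
        --do_eval \
        --eval_data_file=$WIKITEXT_2_DATASET/wiki.test.raw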
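
The masked LM variant differs only in the model flags and the `--mlm` switch, which the patched docs call out as required for BERT/RoBERTa. This is a sketch under the same assumptions; `roberta-base` stands in for whichever checkpoint is being fine-tuned.

.. code-block:: bash

    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset

    # Same script, but --mlm swaps the causal LM objective for masked
    # language modeling, as required for BERT/RoBERTa checkpoints.
    python run_lm_finetuning.py \
        --output_dir=output \
        --model_type=roberta \
        --model_name_or_path=roberta-base \
        --do_train \
        --train_data_file=$WIKITEXT_2_DATASET/wiki.train.raw \
        --do_eval \
        --eval_data_file=$WIKITEXT_2_DATASET/wiki.test.raw \
        --mlm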