Merge pull request #1353 from wendingp/patch-1

Fix some typos
Thomas Wolf 2019-09-27 23:00:34 +02:00 committed by GitHub
commit df7cd9e4e4
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
1 changed file with 3 additions and 3 deletions


@@ -19,12 +19,12 @@ The library was designed with two strong goals in mind:
 A few other goals:
-- expose the models internals as consistently as possible:
+- expose the models' internals as consistently as possible:
   - we give access, using a single API to the full hidden-states and attention weights,
   - tokenizer and base model's API are standardized to easily switch between models.
-- incorporate a subjective selection of promising tools for fine-tuning/investiguating these models:
+- incorporate a subjective selection of promising tools for fine-tuning/investigating these models:
   - a simple/consistent way to add new tokens to the vocabulary and embeddings for fine-tuning,
   - simple ways to mask and prune transformer heads.
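
The goals listed in this hunk map onto concrete library calls. The following is not part of the commit; it is a minimal sketch of those APIs as they looked in the transformers 2.x era (around the time of this merge). The checkpoint name `bert-base-uncased`, the added token `[NEW_TOK]`, and the pruned head indices are illustrative choices, not taken from the diff.

```python
import torch
from transformers import BertModel, BertTokenizer

# Tokenizer and model share the same checkpoint name, so switching models
# largely means swapping the two classes and the name.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased',
                                  output_hidden_states=True,  # expose all hidden-states
                                  output_attentions=True)     # expose all attention weights
model.eval()

input_ids = torch.tensor([tokenizer.encode("Hello, world!")])
with torch.no_grad():
    # With the two flags above, the output tuple also carries the full
    # per-layer hidden-states and attention weights.
    last_hidden_state, pooler_output, hidden_states, attentions = model(input_ids)

# Add new tokens to the vocabulary, then resize the embeddings to match.
tokenizer.add_tokens(['[NEW_TOK]'])  # '[NEW_TOK]' is a hypothetical token
model.resize_token_embeddings(len(tokenizer))

# Prune transformer heads: drop heads 0 and 2 of layer 0 (illustrative indices).
model.prune_heads({0: [0, 2]})
```

`prune_heads` takes a dict mapping layer index to the list of head indices to remove in that layer; head masking, by contrast, is done at call time via the models' `head_mask` argument.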
@@ -51,7 +51,7 @@ We'll finish this quickstart tour by going through a few simple quick-start exam
 Here are two examples showcasing a few `Bert` and `GPT2` classes and pre-trained models.
-See full API reference for examples for each model classe.
+See full API reference for examples for each model class.
 ### BERT example
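
The `### BERT example` heading is cut off by the hunk boundary, so the example itself is not shown in this diff. For context, here is a minimal masked-LM sketch in the spirit of that quickstart section, again assuming the transformers 2.x API; the example sentence and the printed prediction are illustrative.

```python
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForMaskedLM.from_pretrained('bert-base-uncased')
model.eval()

# Mask one token and let BERT fill it in.
text = "[CLS] Who was Jim Henson ? [SEP] Jim Henson was a [MASK] . [SEP]"
tokens = tokenizer.tokenize(text)
masked_index = tokens.index('[MASK]')
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

with torch.no_grad():
    predictions = model(input_ids)[0]  # (batch, seq_len, vocab_size) scores

predicted_id = predictions[0, masked_index].argmax().item()
print(tokenizer.convert_ids_to_tokens([predicted_id])[0])  # e.g. 'puppeteer'
```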