diff --git a/docs/source/en/index.mdx b/docs/source/en/index.mdx
index a9bd25cffb..15ac6aa50a 100644
--- a/docs/source/en/index.mdx
+++ b/docs/source/en/index.mdx
@@ -12,18 +12,18 @@ specific language governing permissions and limitations under the License.
 
 # 🤗 Transformers
 
-State-of-the-art Machine Learning for PyTorch, TensorFlow and JAX.
+State-of-the-art Machine Learning for [PyTorch](https://pytorch.org/), [TensorFlow](https://www.tensorflow.org/), and [JAX](https://jax.readthedocs.io/en/latest/).
 
-🤗 Transformers provides APIs to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs, carbon footprint, and save you time from training a model from scratch. The models can be used across different modalities such as:
+🤗 Transformers provides APIs and tools to easily download and train state-of-the-art pretrained models. Using pretrained models can reduce your compute costs, carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities, such as:
 
-* 📝 Text: text classification, information extraction, question answering, summarization, translation, and text generation in over 100 languages.
-* 🖼️ Images: image classification, object detection, and segmentation.
-* 🗣️ Audio: speech recognition and audio classification.
-* 🐙 Multimodal: table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.
+📝 **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.
+🖼️ **Computer Vision**: image classification, object detection, and segmentation.
+🗣️ **Audio**: automatic speech recognition and audio classification.
+🐙 **Multimodal**: table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.
 
-Our library supports seamless integration between three of the most popular deep learning libraries: [PyTorch](https://pytorch.org/), [TensorFlow](https://www.tensorflow.org/) and [JAX](https://jax.readthedocs.io/en/latest/). Train your model in three lines of code in one framework, and load it for inference with another.
+🤗 Transformers supports framework interoperability between PyTorch, TensorFlow, and JAX. This provides the flexibility to use a different framework at each stage of a model's life: train a model in three lines of code in one framework, and load it for inference in another. Models can also be exported to formats like ONNX and TorchScript for deployment in production environments.
 
-Each 🤗 Transformers architecture is defined in a standalone Python module so they can be easily customized for research and experiments.
+Join the growing community on the [Hub](https://huggingface.co/models), [forum](https://discuss.huggingface.co/), or [Discord](https://discord.com/invite/JfAtkvEtRb) today!
 
 ## If you are looking for custom support from the Hugging Face team
 
@@ -33,19 +33,17 @@ Each 🤗 Transformers architecture is defined in a standalone Python module so
 ## Contents
 
-The documentation is organized in five parts:
+The documentation is organized into five sections:
 
-- **GET STARTED** contains a quick tour and installation instructions to get up and running with 🤗 Transformers.
-- **TUTORIALS** are a great place to begin if you are new to our library. This section will help you gain the basic skills you need to start using 🤗 Transformers.
-- **HOW-TO GUIDES** will show you how to achieve a specific goal like fine-tuning a pretrained model for language modeling or how to create a custom model head.
-- **CONCEPTUAL GUIDES** provides more discussion and explanation of the underlying concepts and ideas behind models, tasks, and the design philosophy of 🤗 Transformers.
-- **API** describes each class and function, grouped in:
+- **GET STARTED** provides a quick tour of the library and installation instructions to get up and running.
+- **TUTORIALS** are a great place to start if you're a beginner. This section will help you gain the basic skills you need to start using the library.
+- **HOW-TO GUIDES** show you how to achieve a specific goal, like finetuning a pretrained model for language modeling or how to write and share a custom model.
+- **CONCEPTUAL GUIDES** offer more discussion and explanation of the underlying concepts and ideas behind models, tasks, and the design philosophy of 🤗 Transformers.
+- **API** describes all classes and functions:
 
-  - **MAIN CLASSES** for the main classes exposing the important APIs of the library.
-  - **MODELS** for the classes and functions related to each model implemented in the library.
-  - **INTERNAL HELPERS** for the classes and functions we use internally.
-
-The library currently contains JAX, PyTorch and TensorFlow implementations, pretrained model weights, usage scripts and conversion utilities for the following models.
+  - **MAIN CLASSES** details the most important classes like configuration, model, tokenizer, and pipeline.
+  - **MODELS** details the classes and functions related to each model implemented in the library.
+  - **INTERNAL HELPERS** details utility classes and functions used internally.
 
 ### Supported models
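To make the framework-interoperability claim in the new text concrete ("train a model in three lines of code in one framework, and load it for inference in another"), here is a minimal sketch of typical 🤗 Transformers usage. It is not part of the diff; it assumes `transformers`, `torch`, and `tensorflow` are installed, and uses `bert-base-uncased` and the local path `./my-model` purely as placeholders.

```python
# Minimal sketch of cross-framework loading; checkpoint name and save path are placeholders.
from transformers import AutoModel, TFAutoModel

# Load (or fine-tune) a model with the PyTorch class and save its weights to disk.
pt_model = AutoModel.from_pretrained("bert-base-uncased")
pt_model.save_pretrained("./my-model")

# Reload the same weights into the TensorFlow class of the architecture for inference.
tf_model = TFAutoModel.from_pretrained("./my-model", from_pt=True)
```

The reverse direction works the same way: passing `from_tf=True` to a PyTorch `from_pretrained` call loads a TensorFlow checkpoint.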