# BLOOM

## Overview
The BLOOM model, in its various sizes, was proposed through the BigScience Workshop. BigScience is inspired by other open science initiatives where researchers have pooled their time and resources to collectively achieve a higher impact. The architecture of BLOOM is essentially the same as GPT-3 (an auto-regressive model trained for next-token prediction), but it has been trained on 46 natural languages and 13 programming languages. Several smaller versions of the model have been trained on the same dataset. BLOOM is available in the following versions:

- [bloom-560m](https://huggingface.co/bigscience/bloom-560m)
- [bloom-1b1](https://huggingface.co/bigscience/bloom-1b1)
- [bloom-1b7](https://huggingface.co/bigscience/bloom-1b7)
- [bloom-3b](https://huggingface.co/bigscience/bloom-3b)
- [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1)
- [bloom](https://huggingface.co/bigscience/bloom) (176B parameters)
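As a quick sketch of the auto-regressive usage described above (the checkpoint here is the smallest one, `bigscience/bloom-560m`; the prompt and generation settings are illustrative, not from the original documentation):

```python
from transformers import BloomForCausalLM, BloomTokenizerFast

# Load the smallest BLOOM checkpoint; the larger versions listed above work identically
tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloom-560m")
model = BloomForCausalLM.from_pretrained("bigscience/bloom-560m")

# BLOOM is a plain auto-regressive decoder: given a prompt, it predicts the next tokens
inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```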
## Resources
A list of official Hugging Face and community (indicated by 🌎) resources to help you get started with BLOOM. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.
- [`BloomForCausalLM`] is supported by this [causal language modeling example script](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling) and [notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/language_modeling.ipynb).
See also:
- Causal language modeling task guide
- Text classification task guide
- Token classification task guide
- Question answering task guide
⚡️ Inference
- A blog on [Optimization story: Bloom inference](https://huggingface.co/blog/bloom-inference-optimization).
- A blog on [Incredibly Fast BLOOM Inference with DeepSpeed and Accelerate](https://huggingface.co/blog/bloom-inference-pytorch-scripts).
⚙️ Training
- A blog on [The Technology Behind BLOOM Training](https://huggingface.co/blog/bloom-megatron-deepspeed).
## BloomConfig

[[autodoc]] BloomConfig
    - all

## BloomTokenizerFast

[[autodoc]] BloomTokenizerFast
    - all

## BloomModel

[[autodoc]] BloomModel
    - forward

## BloomForCausalLM

[[autodoc]] BloomForCausalLM
    - forward

## BloomForSequenceClassification

[[autodoc]] BloomForSequenceClassification
    - forward

## BloomForTokenClassification

[[autodoc]] BloomForTokenClassification
    - forward

## BloomForQuestionAnswering

[[autodoc]] BloomForQuestionAnswering
    - forward

## FlaxBloomModel

[[autodoc]] FlaxBloomModel
    - __call__

## FlaxBloomForCausalLM

[[autodoc]] FlaxBloomForCausalLM
    - __call__
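For the Flax classes, a minimal generation sketch could look like this (again assuming the `bigscience/bloom-560m` checkpoint; `from_pt=True` can be passed to `from_pretrained` if a checkpoint only provides PyTorch weights):

```python
from transformers import BloomTokenizerFast, FlaxBloomForCausalLM

tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloom-560m")
# Add from_pt=True here to convert PyTorch weights on the fly if no Flax weights exist
model = FlaxBloomForCausalLM.from_pretrained("bigscience/bloom-560m")

input_ids = tokenizer("To be or not to be,", return_tensors="np").input_ids
# Flax generate returns an output object; .sequences holds the generated token ids
output_ids = model.generate(input_ids, max_length=32).sequences
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0])
```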