| Name | Last commit message | Last commit date |
| --- | --- | --- |
| benchmark | [Test refactor 1/5] Per-folder tests reorganization (#15725) | 2022-02-23 15:46:28 -05:00 |
| deepspeed | [deepspeed] offload + non-cpuadam optimizer exception (#22043) | 2023-03-09 08:12:57 -08:00 |
| extended | Apply ruff flake8-comprehensions (#21694) | 2023-02-22 09:14:54 +01:00 |
| fixtures | [WIP] add SpeechT5 model (#18922) | 2023-02-03 12:43:46 -05:00 |
| generation | Generate: Add text streamer decoding options (#22544) | 2023-04-04 09:03:13 +01:00 |
| mixed_int8 | [`bnb`] fix bnb failing test (#22439) | 2023-03-29 15:13:00 +02:00 |
| models | 🚨🚨🚨 `[NLLB Tokenizer]` Fix the prefix tokens 🚨🚨🚨 (#22313) | 2023-04-04 14:53:06 +02:00 |
| onnx | Time to Say Goodbye, torch 1.7 and 1.8 (#22291) | 2023-03-21 19:22:01 +01:00 |
| optimization | Make schedulers picklable by making lr_lambda fns global (#21768) | 2023-03-02 12:08:43 -05:00 |
| pipelines | Soft error whisper. (#22475) | 2023-04-04 16:21:57 +02:00 |
| repo_utils | Test fetch v2 (#22367) | 2023-03-31 16:18:43 -04:00 |
| sagemaker | Apply ruff flake8-comprehensions (#21694) | 2023-02-22 09:14:54 +01:00 |
| tokenization | Update quality tooling for formatting (#21480) | 2023-02-06 18:10:56 -05:00 |
| trainer | Implemented safetensors checkpoints save/load for Trainer (#22498) | 2023-04-04 09:05:04 -04:00 |
| utils | Use real tokenizers if tiny version(s) creation has issue(s) (#22428) | 2023-03-29 16:16:23 +02:00 |
| __init__.py | GPU text generation: mMoved the encoded_prompt to correct device | 2020-01-06 15:11:12 +01:00 |
| test_configuration_common.py | Remove set_access_token usage + fail tests if FutureWarning (#22051) | 2023-03-09 09:23:48 -05:00 |
| test_feature_extraction_common.py | Remove set_access_token usage + fail tests if FutureWarning (#22051) | 2023-03-09 09:23:48 -05:00 |
| test_image_processing_common.py | Remove set_access_token usage + fail tests if FutureWarning (#22051) | 2023-03-09 09:23:48 -05:00 |
| test_image_transforms.py | Rescale image back if it was scaled during PIL conversion (#22458) | 2023-03-30 11:29:11 +01:00 |
| test_modeling_common.py | Making sure we can use safetensors to serialize all the time. (#22437) | 2023-03-31 16:07:35 +02:00 |
| test_modeling_flax_common.py | Remove set_access_token usage + fail tests if FutureWarning (#22051) | 2023-03-09 09:23:48 -05:00 |
| test_modeling_tf_common.py | Remove set_access_token usage + fail tests if FutureWarning (#22051) | 2023-03-09 09:23:48 -05:00 |
| test_pipeline_mixin.py | Automatically create/update tiny models (#22275) | 2023-03-23 19:14:17 +01:00 |
| test_sequence_feature_extraction_common.py | Apply ruff flake8-comprehensions (#21694) | 2023-02-22 09:14:54 +01:00 |
| test_tokenization_common.py | Fix llama tokenizer (#22402) | 2023-04-03 09:07:32 -04:00 |