Marc Sun
8366b57241
Fix accelerate failing tests ( #30836 )
...
* Fix accelerate tests
* fix clip
* skip dbrx tests
* fix GPTSan
* fix M2M100Model
* same fix as jamba
* fix mt5
* Fix T5Model
* Fix umt5 model
* fix switch_transformers
* fix whisper
* fix gptsan again
* fix siglip recent test
* skip siglip tests
* wrong place fixed
2024-05-23 17:18:58 +02:00
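The fixes above mostly touch the accelerate-backed big-model tests, which load checkpoints with a device map. A minimal sketch of the pattern those tests exercise (the checkpoint name is illustrative, and `accelerate` must be installed):

```python
# Illustrative only: spread a small seq2seq checkpoint across the available devices.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained(
    "google-t5/t5-small",
    device_map="auto",  # let accelerate place the weights on GPU/CPU
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
)

inputs = tokenizer("translate English to German: Hello.", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```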
Joao Gante
248d5d23a2
Tests: replace `torch.testing.assert_allclose` by `torch.testing.assert_close` ( #29915 )
...
* replace torch.testing.assert_allclose by torch.testing.assert_close
* missing atol rtol
2024-03-28 09:53:31 +00:00
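The change in #29915 is mechanical: `torch.testing.assert_allclose` is deprecated in recent PyTorch releases in favour of `torch.testing.assert_close`, and when non-default tolerances are wanted, `atol` and `rtol` must now be passed explicitly (and together). A minimal sketch:

```python
import torch

expected = torch.tensor([0.1000, 0.2000, 0.3000])
actual = torch.tensor([0.1001, 0.1999, 0.3001])

# Before: torch.testing.assert_allclose(actual, expected, atol=1e-3, rtol=1e-3)
# After:
torch.testing.assert_close(actual, expected, atol=1e-3, rtol=1e-3)
```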
Lysandre Debut
39114c0383
Remove static pretrained maps from the library's internals ( #29112 )
...
* [test_all] Remove static pretrained maps from the library's internals
* Deprecate archive maps instead of removing them
* Revert init changes
* [test_all] Deprecate instead of removing
* [test_all] PVT v2 support
* [test_all] Tests should all pass
* [test_all] Style
* Address review comments
* Update src/transformers/models/deprecated/_archive_maps.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* Update src/transformers/models/deprecated/_archive_maps.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* [test_all] trigger tests
* [test_all] LLAVA
* [test_all] Bad rebase
---------
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
2024-03-25 10:33:38 +01:00
Lysandre Debut
f497f564bb
Update all references to canonical models ( #29001 )
...
* Script & Manual edition
* Update
2024-02-16 08:16:58 +01:00
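After #29001, model references use the canonical, org-prefixed repository ids on the Hub (for example `google-t5/t5-small` rather than the bare `t5-small`); the short ids still resolve, but the library and its tests now spell out the explicit form. Sketch:

```python
from transformers import AutoTokenizer

# Canonical, org-prefixed repository id used throughout the tests after #29001.
tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
# The legacy short id "t5-small" still resolves to the same repository.
```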
jiqing-feng
1d7f406e19
fix assisted decoding assistant model inputs ( #27503 )
...
* fix assisted decoding attention_cat
* fix attention_mask for assisted decoding
* fix attention_mask len
* fix attn len
* Use a cleaner way to prepare assistant model inputs
* fix param meaning
* fix param name
* fix assistant model inputs
* update token type ids
* fix assistant kwargs copy
* add encoder-decoder tests of assisted decoding
* check if assistant kwargs contains updated keys
* revert test
* fix whisper tests
* fix assistant kwargs
* revert whisper test
* delete _extend funcs
2023-11-27 14:23:54 +00:00
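Assisted decoding pairs the main model with a smaller assistant that drafts candidate tokens for the main model to verify; #27503 fixes how the assistant's inputs (attention mask length, token type ids, encoder-decoder kwargs) are prepared. A hedged sketch of the public API, with illustrative checkpoint names:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2-large")
assistant = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")  # small drafter, same tokenizer

inputs = tokenizer("Mixture-of-experts layers route each token to", return_tensors="pt")
# Passing `assistant_model` enables assisted decoding: the drafter proposes several
# tokens which the main model then verifies in a single forward pass.
output_ids = model.generate(**inputs, assistant_model=assistant, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```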
Arthur
651408a077
[`Styling`] stylify using ruff ( #27144 )
...
* try to stylify using ruff
* might need to remove these changes?
* use ruff format and ruff check
* use isinstance instead of type comparison
* use # fmt: skip
* use # fmt: skip
* nits
* some styling changes
* update ci job
* nits isinstance
* more files update
* nits
* more nits
* small nits
* check and format
* revert wrong changes
* actually use formatter instead of checker
* nits
* well docbuilder is overwriting this commit
* revert notebook changes
* try to nuke docbuilder
* style
* fix feature extraction test
* remove `indent-width = 4`
* fixup
* more nits
* update the ruff version that we use
* style
* nuke docbuilder styling
* leave the print for detected changes
* nits
* Remove file I/O
Co-authored-by: charliermarsh <charlie.r.marsh@gmail.com>
* style
* nits
* revert notebook changes
* Add # fmt skip when possible
* Add # fmt skip when possible
* Fix
* More ` # fmt: skip` usage
* More ` # fmt: skip` usage
* More ` # fmt: skip` usage
* Nits
* more fixes
* fix tapas
* Another way to skip
* Recommended way
* Fix two more files
* Remove async
---------
Co-authored-by: charliermarsh <charlie.r.marsh@gmail.com>
2023-11-16 17:43:19 +01:00
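The `# fmt: skip` comments referenced in the commit above are the marker both black and ruff's formatter respect for keeping a single statement out of automatic reformatting (with `# fmt: off` / `# fmt: on` for whole regions). A small sketch:

```python
# Without the trailing marker, the formatter would reflow this hand-aligned literal.
SIZES = {"tiny": 2,   "small": 6,   "base": 12,   "large": 24}  # fmt: skip

# fmt: off
IDENTITY = [
    [1, 0],
    [0, 1],
]
# fmt: on
```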
Arthur
186c077513
[`CI-test_torch`] skip test_tf_from_pt_safetensors and `test_assisted_decoding_sample` ( #27508 )
...
* skip 4 tests
* nits
* style
* wow it's not my day
* skip new failing tests
* style
* skip for NLLB MoE as well
2023-11-15 08:39:29 +01:00
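Skipping a known-failing test for a specific architecture, as done here for NLLB-MoE and others, uses the standard `unittest` machinery in the model-specific test class; a sketch of the pattern (class, test name and reason are illustrative):

```python
import unittest


class ExampleModelTest(unittest.TestCase):
    @unittest.skip(reason="Known failure for this architecture; tracked separately.")
    def test_assisted_decoding_sample(self):
        # In the real suite the body is inherited from the common tester;
        # the override only exists to attach the skip.
        pass
```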
Hz, Ji
50378cbf6c
device agnostic models testing ( #27146 )
...
* device agnostic models testing
* add decorator `require_torch_fp16`
* make style
* apply review suggestion
* Oops, the fp16 decorator was misused
2023-10-31 18:12:14 +01:00
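Device-agnostic testing replaces hard-coded `.cuda()` calls and fp16 assumptions with the `torch_device` constant and decorators such as the `require_torch_fp16` added here, both from `transformers.testing_utils`. A hedged sketch of how a test uses them:

```python
import unittest

import torch
from transformers.testing_utils import require_torch_fp16, torch_device


class ExampleFp16Test(unittest.TestCase):
    @require_torch_fp16  # skipped automatically when the current accelerator lacks fp16 support
    def test_fp16_forward(self):
        # `torch_device` resolves to "cuda", "cpu" or another accelerator at runtime.
        layer = torch.nn.Linear(4, 4).half().to(torch_device)
        x = torch.randn(2, 4, dtype=torch.float16, device=torch_device)
        self.assertEqual(layer(x).shape, (2, 4))
```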
Joao Gante
4692d26194
Switch Transformers: remove overwritten beam sample test ( #25458 )
2023-08-11 13:16:01 +01:00
Yih-Dar
bd90cda9a6
CI with `num_hidden_layers=2` 🚀 🚀 🚀 ( #25266 )
...
* CI with layers=2
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-08-02 20:22:36 +02:00
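Building the common-test models with `num_hidden_layers=2` keeps them tiny and randomly initialized, which is what makes the CI run fast; a hedged sketch of the idea with a generic config (the exact values are illustrative, not the ones used in the suite):

```python
from transformers import BertConfig, BertModel

# Tiny, randomly initialized model: enough to exercise shapes and code paths in CI.
tiny_config = BertConfig(
    hidden_size=32,
    num_hidden_layers=2,  # the depth this PR standardizes across the common tests
    num_attention_heads=4,
    intermediate_size=37,
)
model = BertModel(tiny_config)
print(sum(p.numel() for p in model.parameters()))  # about a million parameters, dominated by the embeddings
```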
Arthur
b15343de6f
[Patch-t5-tokenizer] Patches the changes on T5 to make sure previous behaviour is still valid for beginning of words ( #24622 )
...
* patch `_tokenize` function
* more tests
* properly fix
* fixup
* Update src/transformers/models/t5/tokenization_t5.py
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* fix without ifs
* update
* protect import
* add python processing
* is first needed
* add doc and update with legacy
* update
* fix T5 SPM converter
* styling
* fix T5 warning
* add is_seqio_available
* remove is_first
* revert some changes
* more tests and update
* update llama test battery
* fixup
* refactor T5 spm common tests
* draft the llama tests
* update
* update test
* nits
* refine
* name nit
* fix t5 tests
* fix T5
* update
* revert convert slow to fast changes that fail lots of tests
* legacy support
* fixup
* nits: is_first not defined
* don't use legacy behaviour for switch transformers
* style
* My attempt to check.
* nits
* fixes
* update
* fixup
* Apply suggestions from code review
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* updates
* fixup
* add legacy warning
* fixup
* warning_once nit
* update t5 documentation test
* update llama tok documentation
* add space to warning
* nits
* nit
* Apply suggestions from code review
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
* last nits
---------
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
2023-07-11 15:02:18 +02:00
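The patch in #24622 exposes a `legacy` flag on the T5 tokenizer (reused by the Switch Transformers checkpoints): `legacy=True` keeps the historical handling of spaces around special tokens, `legacy=False` opts into the corrected SentencePiece behaviour. A hedged sketch; the checkpoint name is illustrative, `sentencepiece` must be installed, and the exact tokens depend on the vocabulary:

```python
from transformers import T5Tokenizer

tok_legacy = T5Tokenizer.from_pretrained("google-t5/t5-small", legacy=True)
tok_fixed = T5Tokenizer.from_pretrained("google-t5/t5-small", legacy=False)

text = "Hello <extra_id_0>."
print(tok_legacy.tokenize(text))  # historical behaviour around the sentinel token
print(tok_fixed.tokenize(text))   # corrected behaviour; may differ next to <extra_id_0>
```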
Arthur
b52a03cd3b
⚠️ ⚠️ [`T5Tokenize`] Fix T5 family tokenizers ⚠️ ⚠️ ( #24565 )
...
* don't add space before single letter chars that don't have a merge
* fix the fix
* fixup
* add a test
* more testing
* fixup
* hack to make sure fast is also fixed
* update switch transformers test
* revert convert slow
* Update src/transformers/models/t5/tokenization_t5.py
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* add typechecking
* quality
---------
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
2023-06-30 07:00:43 +02:00
Younes Belkada
3ce3385c47
Revert "Fix gradient checkpointing + fp16 autocast for most models" ( #24420 )
...
Revert "Fix gradient checkpointing + fp16 autocast for most models (#24247 )"
This reverts commit 285a48011d.
2023-06-22 16:11:27 +02:00
Younes Belkada
285a48011d
Fix gradient checkpointing + fp16 autocast for most models ( #24247 )
...
* fix gc bug
* continue PoC on OPT
* fixes
* 🤯
* fix tests
* remove pytest.mark
* fixup
* forward contrib credits from discussions
* forward contrib credits from discussions
* reverting changes on untouched files.
---------
Co-authored-by: zhaoqf123 <zhaoqf123@users.noreply.github.com>
Co-authored-by: 7eu7d7 <7eu7d7@users.noreply.github.com>
2023-06-21 17:04:59 +02:00
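The fix in #24247 (reverted by the entry above) concerned training with gradient checkpointing under an fp16 autocast context; a hedged sketch of the user-facing combination the tests exercise, with an illustrative checkpoint:

```python
import torch
from transformers import AutoModelForSeq2SeqLM

device = "cuda" if torch.cuda.is_available() else "cpu"
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small").to(device)
model.gradient_checkpointing_enable()  # recompute activations in the backward pass to save memory
model.train()

input_ids = torch.randint(0, model.config.vocab_size, (2, 16), device=device)
labels = torch.randint(0, model.config.vocab_size, (2, 16), device=device)

# fp16 autocast only takes effect when a GPU is present; on CPU this context is disabled.
with torch.autocast(device_type="cuda", dtype=torch.float16, enabled=torch.cuda.is_available()):
    loss = model(input_ids=input_ids, labels=labels).loss
loss.backward()
```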
Yih-Dar
fa01127a67
update_pip_test_mapping ( #22606 )
...
* Add TFBlipForConditionalGeneration
* update pipeline_model_mapping
* Add import
* Revert changes in GPTSanJapaneseTest
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-04-06 17:56:06 +02:00
Yih-Dar
871c31a6f1
🔥 Rework pipeline testing by removing `PipelineTestCaseMeta` 🚀 ( #21516 )
...
* Add PipelineTesterMixin
* remove class PipelineTestCaseMeta
* move validate_test_components
* Add for ViT
* Add to SPECIAL_MODULE_TO_TEST_MAP
* style and quality
* Add feature-extraction
* update
* raise instead of skip
* add tiny_model_summary.json
* more explicit
* skip tasks not in mapping
* add availability check
* Add Copyright
* A way to disable irrelevant tests
* update with main
* remove disable_irrelevant_tests
* skip tests
* better skip message
* better skip message
* Add all pipeline task tests
* revert
* Import PipelineTesterMixin
* subclass test classes with PipelineTesterMixin
* Add pipeline_model_mapping
* Fix import after adding pipeline_model_mapping
* Fix style and quality after adding pipeline_model_mapping
* Fix one more import after adding pipeline_model_mapping
* Fix style and quality after adding pipeline_model_mapping
* Fix test issues
* Fix import requirements
* Fix mapping for MobileViTModelTest
* Update
* Better skip message
* pipeline_model_mapping cannot be None
* Remove some PipelineTesterMixin
* Fix typo
* revert tests_fetcher.py
* update
* rename
* revert
* Remove PipelineTestCaseMeta from ZeroShotAudioClassificationPipelineTests
* style and quality
* test fetcher for all pipeline/model tests
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-02-28 19:40:57 +01:00
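After this rework, each model test class declares which pipeline tasks it can back through a `pipeline_model_mapping` class attribute consumed by `PipelineTesterMixin` (imported via a relative import inside the test tree). A hedged sketch of the mapping's shape for this model family; the exact task list in the real test file may differ:

```python
from transformers import (
    SwitchTransformersForConditionalGeneration,
    SwitchTransformersModel,
)

# pipeline task name -> model class able to back that pipeline for this architecture;
# in the test suite this dict is set as a class attribute next to PipelineTesterMixin.
pipeline_model_mapping = {
    "feature-extraction": SwitchTransformersModel,
    "summarization": SwitchTransformersForConditionalGeneration,
    "text2text-generation": SwitchTransformersForConditionalGeneration,
    "translation": SwitchTransformersForConditionalGeneration,
}
```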
Sylvain Gugger
6f79d26442
Update quality tooling for formatting ( #21480 )
...
* Result of black 23.1
* Update target to Python 3.7
* Switch flake8 to ruff
* Configure isort
* Configure isort
* Apply isort with line limit
* Put the right black version
* adapt black in check copies
* Fix copies
2023-02-06 18:10:56 -05:00
Yih-Dar
59d5edef34
Avoid flaky generation sampling tests ( #21445 )
...
* fix
* fix
---------
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
2023-02-03 22:01:25 +01:00
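A common way to keep sampling tests from being flaky (not necessarily the exact change made in #21445) is to fix the random seed so that `do_sample=True` yields the same continuation on every run; a hedged sketch with an illustrative checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")
inputs = tokenizer("Sparse expert models", return_tensors="pt")

set_seed(0)  # seeds python, numpy and torch so the sampled continuation is reproducible
first = model.generate(**inputs, do_sample=True, max_new_tokens=10)

set_seed(0)
second = model.generate(**inputs, do_sample=True, max_new_tokens=10)

assert torch.equal(first, second)
```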
Younes Belkada
74297d0a55
[Switch Transformers] Fix failing slow test ( #20346 )
...
* run slow test on GPU
* remove unnecessary device assignment
* use `torch_device` instead
2022-11-21 15:36:49 +01:00
Younes Belkada
163ac3d3ee
Add Switch transformers ( #19323 )
...
* first commit
* add more comments
* add router v1
* clean up
- remove `tf` modeling files
* clean up
- remove `tf` modeling files
* clean up
* v0 routers
* added more router
- Implemented `ExpertsChooseMaskedRouter`
- added tests
- 2 more routers to implement
* last router
* improved docstring
- completed the docstring in `router.py`
- added more args in the config
* v0 sparse mlp
* replace wrong naming
* forward pass run
* update MOE layer
* small router update
* fixup
* consistency
* remove scatter router
* remove abstract layer
* update test and model for integration testing
* v1 conversion
* update
* hardcode hack
* all keys match
* add gin conversion, without additional libraries
* update conversion script
* delete router file
* update tests wrt router deletion
* fix router issues
* update expert code
* update, logits match, code needs refactoring
* Refactor code
Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
* add generate tests
Co-authored-by: younesbelkada <younesbelkada@gmail.com>
* add support for router loss
Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
* fix forward error
* refactor a bit
* remove `FlaxSwitchTransformers` modules
* more tests pass
* Update code
Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
* fixup
* fix tests
* fix doc
* fix doc + tokenization
* fix tokenizer test
* fix test
* fix loss output
* update code for backward pass
* add loss support
* update documentation
* fix documentation, clean tokenizer
* more doc fix, cleanup example_switch
* fix failing test
* fix test
* fix test
* fix loss issue
* move layer
* update doc and fix router capacity usage
* fixup
* add sparse mlp index for documentation on hub
* fixup
* test sparse mix architecture
* Apply suggestions from code review
* Update docs/source/en/model_doc/switch_transformers.mdx
* fixup on update
* fix tests
* fix another test
* attempt fix
* Update src/transformers/models/switch_transformers/configuration_switch_transformers.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* Update src/transformers/models/switch_transformers/convert_switch_transformers_original_flax_checkpoint_to_pytorch.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* try
* all tests pass
* fix jitter noise
* Apply suggestions from code review
* doc tests pass
* Update src/transformers/models/switch_transformers/modeling_switch_transformers.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* Update src/transformers/models/switch_transformers/modeling_switch_transformers.py
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
* remove assert
* change config order
* fix readme japanese
* Apply suggestions from code review
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
* remove parallelizable tests + add one liners
* remove ONNX config
* fix nits
- add `T5Tokenizer` in auto mapping
- remove `Switch Transformers` from ONNX supported models
* remove `_get_router`
* remove asserts
* add check in test for `router_dtype`
* add `SwitchTransformersConfig` in `run_pipeline_test`
* Update tests/pipelines/test_pipelines_summarization.py
* add huge model conversion script
* fix slow tests
- add better casting for `Linear8bitLt`
- remove `torchscript` tests
* add make dir
* style on new script
* fix nits
- doctest
- remove `_keys_to_ignore_on_load_unexpected`
* Update src/transformers/models/switch_transformers/configuration_switch_transformers.py
* add google as authors
* fix year
* remove last `assert` statements
* standardize vertical spaces
* fix failing import
* fix another failing test
* Remove strange `authorized_keys`
* removing todo and padding that is never used
Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
Co-authored-by: ybelkada <younes@huggingface.co>
Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: Arthur Zucker <arthur@huggingface.co>
2022-11-15 13:06:45 +01:00
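For context on the model this history tracks: Switch Transformers is a T5-style encoder-decoder whose feed-forward blocks are replaced by sparse mixture-of-experts layers with a top-1 router, released under the `google/` organization (e.g. `google/switch-base-8` with 8 experts). A minimal usage sketch along the lines of the model documentation:

```python
from transformers import AutoTokenizer, SwitchTransformersForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/switch-base-8")
model = SwitchTransformersForConditionalGeneration.from_pretrained("google/switch-base-8")

# The checkpoints were trained with the T5 span-corruption objective,
# so sentinel tokens such as <extra_id_0> mark masked spans to be filled in.
input_ids = tokenizer(
    "A <extra_id_0> walks into a bar and orders a <extra_id_1> with <extra_id_2> pinch of salt.",
    return_tensors="pt",
).input_ids
output_ids = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```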