transformers/tests/models/roberta_prelayernorm
Latest commit: 6ea3ee3cd2, "Fix `test_model_parallelism` (#25359)" by Yih-Dar, 2023-08-08 10:48:45 +02:00
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
File                                         Last commit                                                  Last updated
__init__.py                                  Implement Roberta PreLayerNorm (#20305)                      2022-12-19 09:30:17 +01:00
test_modeling_flax_roberta_prelayernorm.py   CI with `num_hidden_layers=2` 🚀🚀🚀 (#25266)                   2023-08-02 20:22:36 +02:00
test_modeling_roberta_prelayernorm.py        Fix `test_model_parallelism` (#25359)                        2023-08-08 10:48:45 +02:00
test_modeling_tf_roberta_prelayernorm.py     Speed up TF tests by reducing hidden layer counts (#24595)   2023-06-30 16:30:33 +01:00
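These files hold the PyTorch, Flax, and TensorFlow test suites for the RobertaPreLayerNorm model. As a minimal sketch of how the PyTorch suite in this directory can be invoked programmatically (assuming a local `transformers` checkout with `pytest` and the test dependencies installed, and that the script runs from the repository root; the path and the `-k` filter below are illustrative choices, not part of this listing):

```python
# Minimal sketch: run the PyTorch RobertaPreLayerNorm tests from a local
# transformers checkout. Assumes pytest and the test extras are installed
# (e.g. `pip install -e ".[testing]"` from the repo root); adjust the test
# path if your checkout is laid out differently.
import sys

import pytest

if __name__ == "__main__":
    exit_code = pytest.main(
        [
            "tests/models/roberta_prelayernorm/test_modeling_roberta_prelayernorm.py",
            "-v",                       # verbose: print each collected test name
            "-k", "model_parallelism",  # optional filter, e.g. the test fixed in #25359
        ]
    )
    sys.exit(exit_code)
```

Dropping the `-k` filter runs the full PyTorch suite for the model; the Flax and TF files in the listing above can be passed to `pytest.main` the same way.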