transformers/tests/models/nllb_moe
jiqing-feng 1d7f406e19
fix assisted decoding assistant model inputs (#27503)
* fix assisted decoding attention_cat

* fix attention_mask for assisted decoding

* fix attention_mask len

* fix attn len

* Use a cleaner way to prepare assistant model inputs

* fix param meaning

* fix param name

* fix assistant model inputs

* update token type ids

* fix assistant kwargs copy

* add encoder-decoder tests of assisted decoding

* check if assistant kwargs contains updated keys

* revert test

* fix whisper tests

* fix assistant kwargs

* revert whisper test

* delete _extend funcs
2023-11-27 14:23:54 +00:00
__init__.py [WIP]`NLLB-MoE` Adds the moe model (#22024) 2023-03-27 19:42:00 +02:00
test_modeling_nllb_moe.py fix assisted decoding assistant model inputs (#27503) 2023-11-27 14:23:54 +00:00