transformers/tests/fixtures

Latest commit 9f3aa46f45 by Patrick von Platen:
Add Unispeech & Unispeech-SAT (#13963)
* unispeech

* add copy from

* remove hubert copy from

* finish for today

* add unispeech-sat

* adapt more

* up

* add modeling

* add tests

* finish

* Apply suggestions from code review

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
2021-10-26 18:59:58 +02:00
tests_samples                    | Fix img classification tests (#13456)                                                           | 2021-09-07 05:58:45 -04:00
dummy-config.json                | AutoConfig + other Auto classes honor model_type                                                | 2020-01-11 02:46:17 +00:00
dummy_feature_extractor_config.json | Add Unispeech & Unispeech-SAT (#13963)                                                       | 2021-10-26 18:59:58 +02:00
empty.txt                        | GPU text generation: mMoved the encoded_prompt to correct device                                | 2020-01-06 15:11:12 +01:00
input.txt                        | GPU text generation: mMoved the encoded_prompt to correct device                                | 2020-01-06 15:11:12 +01:00
merges.txt                       | [AutoTokenizer] Allow creation of tokenizers by tokenizer type (#13668)                         | 2021-09-22 00:29:38 +02:00
preprocessor_config.json         | Add the ImageClassificationPipeline (#11598)                                                    | 2021-05-07 08:08:40 -04:00
sample_text.txt                  | GPU text generation: mMoved the encoded_prompt to correct device                                | 2020-01-06 15:11:12 +01:00
sample_text_no_unicode.txt       | [Dependencies|tokenizers] Make both SentencePiece and Tokenizers optional dependencies (#7659)  | 2020-10-18 20:51:24 +02:00
spiece.model                     | GPU text generation: mMoved the encoded_prompt to correct device                                | 2020-01-06 15:11:12 +01:00
test_sentencepiece.model         | GPU text generation: mMoved the encoded_prompt to correct device                                | 2020-01-06 15:11:12 +01:00
test_sentencepiece_bpe.model     | Conversion from slow to fast for BPE spm vocabs contained an error. (#10120)                    | 2021-02-13 08:24:53 -05:00
test_sentencepiece_no_bos.model  | [pegasus] Faster tokenizer tests (#7672)                                                        | 2020-10-09 11:10:32 -04:00
vocab.json                       | [AutoTokenizer] Allow creation of tokenizers by tokenizer type (#13668)                         | 2021-09-22 00:29:38 +02:00
vocab.txt                        | [AutoTokenizer] Allow creation of tokenizers by tokenizer type (#13668)                         | 2021-09-22 00:29:38 +02:00