transformers/tests/models/llama

Latest commit: b647acdb53 by Younes Belkada (2024-02-29 04:49:01 +01:00)
FIX [`CI`] `require_read_token` in the llama FA2 test (#29361)
Update test_modeling_llama.py
Files:
- __init__.py — LLaMA Implementation (#21955) — 2023-03-16 09:00:53 -04:00
- test_modeling_flax_llama.py — Add Llama Flax Implementation (#24587) — 2023-12-07 07:05:00 +01:00
- test_modeling_llama.py — FIX [`CI`] `require_read_token` in the llama FA2 test (#29361) — 2024-02-29 04:49:01 +01:00
- test_tokenization_llama.py — [`Core tokenization`] `add_dummy_prefix_space` option to help with latest issues (#28010) — 2024-02-20 12:50:31 +01:00