transformers/tests/models/llama

Latest commit: Cache: Static cache as a standalone object (#30476) — Joao Gante (75bbfd5b22), 2024-04-30 16:37:19 +01:00
File                          Last commit                                                  Date
__init__.py                   LLaMA Implementation (#21955)                                2023-03-16 09:00:53 -04:00
test_modeling_flax_llama.py   Add Llama Flax Implementation (#24587)                       2023-12-07 07:05:00 +01:00
test_modeling_llama.py        Cache: Static cache as a standalone object (#30476)          2024-04-30 16:37:19 +01:00
test_tokenization_llama.py    [`LlamaTokenizerFast`] Refactor default llama (#28881)       2024-04-23 23:12:59 +02:00