transformers/tests
Raushan Turganbay (commit 5ad960f1f4)
Add Watermarking LogitsProcessor and WatermarkDetector (#29676)
* add watermarking processor

* remove the other hashing (context width=1 always)

* make style

* Update src/transformers/generation/logits_process.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/logits_process.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/logits_process.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/configuration_utils.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* update watermarking process

* add detector

* update tests to use detector

* fix failing tests

* rename `input_seq`

* make style

* doc for processor

* minor fixes

* docs

* make quality

* Update src/transformers/generation/configuration_utils.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/logits_process.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/watermarking.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/watermarking.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/watermarking.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* add PR suggestions

* let's use lru_cache's default max size (128)

* import processor if torch available

* maybe like this

* let's move the config to a torch-independent file

* add docs

* tiny docs fix to make the test happy

* Update src/transformers/generation/configuration_utils.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* Update src/transformers/generation/watermarking.py

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>

* PR suggestions

* add docs

* fix test

* fix docs

* address pr comments

* style

* Revert "style"

This reverts commit 7f33cc34ff.

* correct style

* make doctest green

---------

Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
2024-05-14 13:31:39 +05:00
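The PR above adds a greenlist-style watermark: at each generation step, the previous token seeds a pseudo-random "green" subset of the vocabulary, and the logits processor biases those tokens; the detector then counts green hits and computes a z-score against the no-watermark expectation. The following is a hedged, self-contained pure-Python sketch of that idea under toy constants — it is not the actual `transformers` implementation, and `green_ids`, `watermark_logits`, and `detect_z_score` are illustrative names. It also caches greenlists with `functools.lru_cache` at its default `maxsize` of 128, echoing the commit message, and uses context width 1 (seed from the single previous token only), as the hashing commit notes.

```python
import functools
import math
import random

VOCAB_SIZE = 50  # toy vocabulary; real models have tens of thousands of tokens
GAMMA = 0.25     # fraction of the vocabulary placed on the greenlist
DELTA = 2.0      # bias added to green-token logits

@functools.lru_cache  # default maxsize=128, as the commit message notes
def green_ids(prev_token: int) -> frozenset:
    """Greenlist for context width 1: seeded only by the previous token id."""
    rng = random.Random(prev_token)
    k = int(GAMMA * VOCAB_SIZE)
    return frozenset(rng.sample(range(VOCAB_SIZE), k))

def watermark_logits(prev_token: int, logits: list) -> list:
    """The logits-processor step: add DELTA to every green token's logit."""
    greens = green_ids(prev_token)
    return [x + DELTA if i in greens else x for i, x in enumerate(logits)]

def detect_z_score(tokens: list) -> float:
    """The detector step: count tokens that fall in their context's greenlist
    and compare the hit count to the GAMMA * n expected without a watermark."""
    hits = sum(1 for prev, tok in zip(tokens, tokens[1:]) if tok in green_ids(prev))
    n = len(tokens) - 1
    expected = GAMMA * n
    variance = n * GAMMA * (1 - GAMMA)
    return (hits - expected) / math.sqrt(variance)
```

A sequence sampled from the biased logits lands in the greenlist far more often than GAMMA of the time, so its z-score is large, while unwatermarked text hovers near zero; this is why detection needs only the tokenized text plus the seeding constants, not the model's logits.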
| Name | Latest commit | Date |
|---|---|---|
| agents | Reboot Agents (#30387) | 2024-05-07 12:59:49 +02:00 |
| benchmark | [Test refactor 1/5] Per-folder tests reorganization (#15725) | 2022-02-23 15:46:28 -05:00 |
| bettertransformer | Fixed malapropism error (#26660) | 2023-10-09 11:04:57 +02:00 |
| deepspeed | 🚨🚨🚨Deprecate `evaluation_strategy` to `eval_strategy`🚨🚨🚨 (#30190) | 2024-04-18 12:49:43 -04:00 |
| extended | CI: update to ROCm 6.0.2 and test MI300 (#30266) | 2024-05-13 18:14:36 +02:00 |
| fixtures | Implementation of SuperPoint and AutoModelForKeypointDetection (#28966) | 2024-03-19 14:43:02 +00:00 |
| fsdp | Add FSDP config for CPU RAM efficient loading through accelerate (#30002) | 2024-04-22 13:15:28 +01:00 |
| generation | Add Watermarking LogitsProcessor and WatermarkDetector (#29676) | 2024-05-14 13:31:39 +05:00 |
| models | Port IDEFICS to tensorflow (#26870) | 2024-05-13 15:59:46 +01:00 |
| optimization | Add WSD scheduler (#30231) | 2024-04-25 12:07:21 +01:00 |
| peft_integration | FIX [`CI`]: Fix failing tests for peft integration (#29330) | 2024-02-29 03:56:16 +01:00 |
| pipelines | enable Pipeline to get device from model (#30534) | 2024-05-13 15:00:39 +01:00 |
| quantization | [awq] replace scale when we have GELU (#30074) | 2024-05-13 11:41:03 +02:00 |
| repo_utils | Allow `# Ignore copy` (#27328) | 2023-12-07 10:00:08 +01:00 |
| sagemaker | Update all references to canonical models (#29001) | 2024-02-16 08:16:58 +01:00 |
| tokenization | Remove static pretrained maps from the library's internals (#29112) | 2024-03-25 10:33:38 +01:00 |
| trainer | CI: update to ROCm 6.0.2 and test MI300 (#30266) | 2024-05-13 18:14:36 +02:00 |
| utils | load_image - decode b64encode and encodebytes strings (#30192) | 2024-04-26 18:21:47 +01:00 |
| __init__.py | GPU text generation: Moved the encoded_prompt to correct device | 2020-01-06 15:11:12 +01:00 |
| test_backbone_common.py | Align backbone stage selection with out_indices & out_features (#27606) | 2023-12-20 18:33:17 +00:00 |
| test_cache_utils.py | Generate: add tests for caches with `pad_to_multiple_of` (#29462) | 2024-03-06 10:57:04 +00:00 |
| test_configuration_common.py | [`PretrainedConfig`] Improve messaging (#27438) | 2023-11-15 14:10:39 +01:00 |
| test_configuration_utils.py | [tests] remove deprecated tests for model loading (#29450) | 2024-03-15 14:18:41 +00:00 |
| test_feature_extraction_common.py | Split common test from core tests (#24284) | 2023-06-15 07:30:24 -04:00 |
| test_feature_extraction_utils.py | [tests] remove deprecated tests for model loading (#29450) | 2024-03-15 14:18:41 +00:00 |
| test_image_processing_common.py | Raise unused kwargs image processor (#29063) | 2024-02-20 16:20:20 +01:00 |
| test_image_processing_utils.py | [tests] remove deprecated tests for model loading (#29450) | 2024-03-15 14:18:41 +00:00 |
| test_image_transforms.py | Normalize floating point cast (#27249) | 2023-11-10 15:35:27 +00:00 |
| test_modeling_common.py | skip low_cpu_mem_usage tests (#30782) | 2024-05-13 18:00:43 +02:00 |
| test_modeling_flax_common.py | fix: Replace deprecated `assertEquals` with `assertEqual` (#30241) | 2024-04-15 09:36:06 +01:00 |
| test_modeling_flax_utils.py | Enable safetensors conversion from PyTorch to other frameworks without the torch requirement (#27599) | 2024-01-23 10:28:23 +01:00 |
| test_modeling_tf_common.py | Port IDEFICS to tensorflow (#26870) | 2024-05-13 15:59:46 +01:00 |
| test_modeling_tf_utils.py | Cast bfloat16 to float32 for Numpy conversions (#29755) | 2024-03-21 14:04:11 +00:00 |
| test_modeling_utils.py | Llama: fix custom 4D masks, v2 (#30348) | 2024-05-13 13:46:06 +02:00 |
| test_pipeline_mixin.py | Image Feature Extraction pipeline (#28216) | 2024-02-05 14:50:07 +00:00 |
| test_processing_common.py | Don't save `processor_config.json` if a processor has no extra attribute (#28584) | 2024-01-19 09:59:14 +00:00 |
| test_sequence_feature_extraction_common.py | Fix typo (#25966) | 2023-09-05 10:12:25 +02:00 |
| test_tokenization_common.py | fix: Replace deprecated `assertEquals` with `assertEqual` (#30241) | 2024-04-15 09:36:06 +01:00 |
| test_tokenization_utils.py | [tests] remove deprecated tests for model loading (#29450) | 2024-03-15 14:18:41 +00:00 |