Set the default cache_enabled to True, aligned with the default value in PyTorch CPU/CUDA amp autocast (#20289)

Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>

Wang, Yi 2022-11-17 22:21:06 +08:00 committed by GitHub
parent 07b8f249cd
commit 8b8b23a8cd
1 changed file with 1 addition and 1 deletion


@@ -2476,7 +2476,7 @@ class Trainer:
         """
         return self.ctx_manager_torchdynamo
 
-    def autocast_smart_context_manager(self, cache_enabled: Optional[bool] = None):
+    def autocast_smart_context_manager(self, cache_enabled: Optional[bool] = True):
         """
         A helper wrapper that creates an appropriate context manager for `autocast` while feeding it the desired
         arguments, depending on the situation.
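
For context, torch.cpu.amp.autocast and torch.cuda.amp.autocast both default cache_enabled to True, so with this change a caller that passes no argument gets the same weight-cast caching behaviour as plain PyTorch autocast. The snippet below is a minimal sketch of such a wrapper, not the Trainer's actual body; the helper name autocast_context and the use_cpu_amp flag are illustrative assumptions.

import contextlib
from typing import Optional

import torch


def autocast_context(cache_enabled: Optional[bool] = True, use_cpu_amp: bool = False):
    # Sketch only: forward cache_enabled to the torch amp context managers,
    # whose own default for cache_enabled is also True.
    if use_cpu_amp:
        return torch.cpu.amp.autocast(dtype=torch.bfloat16, cache_enabled=cache_enabled)
    if torch.cuda.is_available():
        return torch.cuda.amp.autocast(cache_enabled=cache_enabled)
    return contextlib.nullcontext()


model = torch.nn.Linear(4, 4)
with autocast_context(use_cpu_amp=True):  # cache_enabled defaults to True, matching torch amp
    out = model(torch.randn(2, 4))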