[Feature] Enable bf16 in AmpOptimWrapper (#960)
* support bf16 in AmpOptimWrapper
* add docstring
* modify docs
* add unittests for bf16 in AmpOptimWrapper
* fix type
* fix to pass CI
* fix ut skip logic to pass CI
* fix as comment
* add type hints
* fix docstring and add warning information
* remove check for pytorch>=1.6 in unittest
* modify unittest
* modify unittest
* remove torch.float32 && torch.float64 from valid dtypes
* fix as comments
* minor refine docstring
* fix unittest parameterized to pass CI
* fix unittest && add back torch.float32, torch.float64
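A minimal usage sketch of the new option, assuming the `dtype='bfloat16'` argument to `AmpOptimWrapper` introduced by this PR (the optimizer settings below are placeholders, not part of the change):

```python
# Hypothetical config: train with bf16 mixed precision via AmpOptimWrapper.
# dtype='bfloat16' is the option added by this PR (assumed from the docs change);
# the SGD settings are illustrative placeholders.
optim_wrapper = dict(
    type='AmpOptimWrapper',
    dtype='bfloat16',  # use torch.bfloat16 instead of the default float16
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9),
)
```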
Showing 4 changed files with 101 additions and 41 deletions
- docs/en/common_usage/speed_up_training.md: 10 additions, 2 deletions
- docs/zh_cn/common_usage/speed_up_training.md: 10 additions, 2 deletions
- mmengine/optim/optimizer/amp_optimizer_wrapper.py: 28 additions, 2 deletions
- tests/test_optim/test_optimizer/test_optimizer_wrapper.py: 53 additions, 35 deletions