  1. Mar 01, 2023
    • [Feature] Enable bf16 in AmpOptimWrapper (#960) · 2ed8e343
      Qian Zhao authored
      * support bf16 in AmpOptimWrapper
      
      * add docstring
      
      * modify docs
      
      * add unittests for bf16 in AmpOptimWrapper
      
      * fix type
      
      * fix to pass ci
      
      * fix ut skip logic to pass ci
      
      * fix as comment
      
      * add type hints
      
      * fix docstring and add warning information
      
      * remove check for pytorch>=1.6 in unittest
      
      * modify unittest
      
      * modify unittest
      
      * remove torch.float32 && torch.float64 from valid dtypes
      
      * fix as comments
      
      * minor refine docstring
      
      * fix unittest parameterized to pass CI
      
      * fix unittest && add back torch.float32, torch.float64
      2ed8e343
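The feature above can be sketched as a config fragment. This is a hypothetical usage sketch, not the commit's own example; the `dtype` field name is an assumption based on the commit messages ("remove torch.float32 && torch.float64 from valid dtypes", later added back):

```python
# Sketch of an MMEngine optim_wrapper config exercising the bf16 support
# added in #960. Field values are assumptions inferred from the commit log.
optim_wrapper = dict(
    type='AmpOptimWrapper',
    dtype='bfloat16',  # mixed-precision dtype; fp16 was the prior behavior
    optimizer=dict(type='SGD', lr=0.01))
```

bf16 trades mantissa precision for fp32's exponent range, so it typically needs no loss scaling, which is why it is handled separately from the fp16 path.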
  2. Feb 06, 2023
    • [Feature] Add ApexOptimWrapper (#742) · e35ed5fd
      xcnick authored
      
      * add ApexOptimWrapper
      
      * typo fix
      
      * add apex amp.initialize in optim_context
      
      * assert apex_amp
      
      * polish code
      
      * add parameters of apex_amp.initialize
      
      * add docs
      
      * polish code
      
      * polish code
      
      * polish code
      
      * fix calling of apex amp load_state_dict
      
      * polish
      
      * add comments
      
      * Update apex_optimizer_wrapper.py
      
      * Update apex_optimizer_wrapper.py
      
      ---------
      
      Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
      e35ed5fd
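A hypothetical usage sketch for the wrapper above, assuming it forwards these keyword arguments to `apex.amp.initialize` inside `optim_context` as the commit messages describe ("add apex amp.initialize in optim_context", "add parameters of apex_amp.initialize"); `opt_level` and `loss_scale` are real `apex.amp.initialize` parameters, but their placement here is an assumption:

```python
# Sketch of an optim_wrapper config for ApexOptimWrapper (#742). The
# kwargs below would be passed through to apex.amp.initialize; exact
# supported fields are an assumption based on the commit log.
optim_wrapper = dict(
    type='ApexOptimWrapper',
    opt_level='O1',        # apex mixed-precision level (O0-O3)
    loss_scale='dynamic',  # let apex manage the loss scale
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9))
```

Deferring `amp.initialize` to `optim_context` matters because apex must see the model and optimizer together before the first forward pass.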
  3. Oct 18, 2022
  4. Sep 15, 2022
  5. Aug 24, 2022
    • [Refactor] Refactor code structure (#395) · 7e1d7af2
      Zaida Zhou authored
      * Rename data to structure
      
      * adjust the way to import module
      
      * adjust the way to import module
      
      * rename Structure to Data Structures in docs api
      
      * rename structure to structures
      
      * support using some modules of mmengine without torch
      
      * fix circleci config
      
      * fix circleci config
      
      * fix registry ut
      
      * minor fix
      
      * move init method from model/utils to model/weight_init.py
      
      * move init method from model/utils to model/weight_init.py
      
      * move sync_bn to model
      
      * move functions depending on torch to dl_utils
      
      * format import
      
      * fix logging ut
      
      * add weight init in model/__init__.py
      
      * move get_config and get_model to mmengine/hub
      
      * move log_processor.py to mmengine/runner
      
      * fix ut
      
      * Add TimeCounter in dl_utils/__init__.py
      7e1d7af2
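The "support using some modules of mmengine without torch" step above relies on the standard optional-dependency pattern. A minimal sketch (hypothetical names; not the commit's actual code):

```python
# Optional-dependency guard: import torch if present, and let callers
# check availability instead of failing at import time.
try:
    import torch  # noqa: F401
    TORCH_AVAILABLE = True
except ImportError:
    TORCH_AVAILABLE = False


def require_torch(feature_name):
    """Raise a clear error when a torch-only feature is used without torch."""
    if not TORCH_AVAILABLE:
        raise ImportError(
            f'{feature_name} requires PyTorch, which is not installed.')
```

Torch-dependent helpers then live behind modules like `dl_utils` (as the commit does), so importing the lightweight parts of the package never triggers the heavy import.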
  6. Aug 23, 2022
  7. Jul 20, 2022
  8. Jul 06, 2022
  9. Jun 28, 2022
  10. Jun 13, 2022
    • [Refactor] Refactor the accumulate gradient implementation of OptimWrapper (#284) · b7866021
      Mashiro authored
      * merge context
      
      * update unit test
      
      * add docstring
      
      * fix bug in AmpOptimWrapper
      
      * add docstring for backward
      
      * add warning and docstring for accumulate gradient
      
      * fix docstring
      
      * fix docstring
      
      * add params_group method
      
      * fix as comment
      
      * fix as comment
      
      * make the default value of loss_scale 'dynamic'
      
      * Fix docstring
      
      * decouple should update and should no sync
      
      * rename attribute in OptimWrapper
      
      * fix docstring
      
      * fix comment
      
      * fix comment
      
      * fix as comment
      
      * fix as comment and add unit test
      b7866021
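The "decouple should update and should no sync" refactor above can be illustrated with a torch-free sketch. Names like `SimpleOptimWrapper` and `should_update` are hypothetical minimal stand-ins for the real `OptimWrapper`, which wraps a `torch.optim.Optimizer`:

```python
# Minimal sketch of gradient accumulation as refactored in #284:
# scale each loss by the accumulation count, and only step the
# optimizer once every `accumulative_counts` iterations.
class SimpleOptimWrapper:
    def __init__(self, accumulative_counts=1):
        self.accumulative_counts = accumulative_counts
        self._inner_count = 0
        self.step_iters = []  # iterations at which step() would run

    def should_update(self):
        # Decoupled predicate: update once per `accumulative_counts` calls.
        return self._inner_count % self.accumulative_counts == 0

    def update_params(self, loss):
        # Scale so the accumulated gradients average instead of summing.
        scaled_loss = loss / self.accumulative_counts
        # (the real wrapper would call scaled_loss.backward() here)
        self._inner_count += 1
        if self.should_update():
            self.step_iters.append(self._inner_count)
        return scaled_loss
```

Separating `should_update` from the no-sync decision lets DDP skip gradient synchronization on intermediate iterations while still stepping on the final one.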
  11. Jun 01, 2022
    • [Feature] Add optimizer wrapper (#265) · 3e3866c1
      Mashiro authored
      
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers support multiple optimizers
      
      * add optimizer_wrapper
      
      * fix comment and docstring
      
      * fix unit test
      
      * add unit test
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multi learning rates
      
      * resolve comments
      
      * add optimizer_wrapper
      
      * fix mypy
      
      * fix lint
      
      * fix OptimizerWrapperDict docstring and add unit test
      
      * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
      
      * Fix AmpOptimizerWrapper
      
      * rename build_optimizer_wrapper to build_optim_wrapper
      
      * refine optimizer wrapper
      
      * fix AmpOptimWrapper.step, docstring
      
      * resolve conflict
      
      * rename DefaultOptimConstructor
      
      * fix as comment
      
      * rename clip grad arguments
      
      * refactor optim_wrapper config
      
      * fix docstring of DefaultOptimWrapperConstructor
      
      * add get_lr method to OptimWrapper and OptimWrapperDict
      
      * skip some amp unit test
      
      * fix unit test
      
      * fix get_lr, get_momentum docstring
      
      * refactor get_lr, get_momentum, fix as comment
      
      * fix error message
      
      Co-authored-by: zhouzaida <zhouzaida@163.com>
      3e3866c1
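The `OptimWrapperDict` introduced above holds one wrapper per sub-model so that multiple optimizers (e.g. GAN generator and discriminator) share one interface, and `RuntimeInfoHook` can print a learning rate per wrapper. A torch-free sketch with hypothetical minimal names (the real class inherits `OptimWrapper` and returns lists of values):

```python
# Minimal sketch of the OptimWrapperDict idea from #265: a dict of
# per-module "wrappers" with name-prefixed lr/momentum reporting.
class SimpleOptimWrapperDict:
    def __init__(self, **wrappers):
        # Each value stands in for an OptimWrapper; here a plain dict
        # of hyperparameters keeps the sketch runnable without torch.
        self.wrappers = wrappers

    def get_lr(self):
        # Prefix each entry with the wrapper name, mirroring how the
        # commit reports multiple learning rates.
        return {f'{name}.lr': w['lr'] for name, w in self.wrappers.items()}

    def get_momentum(self):
        return {f'{name}.momentum': w.get('momentum', 0.0)
                for name, w in self.wrappers.items()}
```

State-dict save/resume (also covered by this commit's unit tests) would iterate the same mapping, keying each inner optimizer's state by wrapper name.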