  1. Mar 01, 2023
    • [Feature] Enable bf16 in AmpOptimWrapper (#960) · 2ed8e343
      Qian Zhao authored
      * support bf16 in AmpOptimWrapper
      
      * add docstring
      
      * modify docs
      
      * add unittests for bf16 in AmpOptimWrapper
      
      * fix type
      
      * fix to pass ci
      
      * fix ut skip logic to pass ci
      
      * fix as comment
      
      * add type hints
      
      * fix docstring and add warning information
      
      * remove check for pytorch>=1.6 in unittest
      
      * modify unittest
      
      * modify unittest
      
      * remove torch.float32 && torch.float64 from valid dtypes
      
      * fix as comments
      
      * minor refine docstring
      
      * fix unittest parameterized to pass CI
      
      * fix unittest && add back torch.float32, torch.float64
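The bf16 support added here would normally be selected from the training config. A hypothetical sketch of such an `optim_wrapper` config (the `dtype` key mirrors what this PR describes; field names are assumptions for illustration, not verified against the mmengine API):

```python
# Hypothetical config sketch for bf16 mixed precision via AmpOptimWrapper.
# Treat the field names as assumptions based on the commit messages above.
optim_wrapper = dict(
    type='AmpOptimWrapper',
    dtype='bfloat16',                      # 'float16' or 'bfloat16'
    optimizer=dict(type='AdamW', lr=1e-4),
)
```

Note that bf16 keeps fp32's exponent range while giving up mantissa precision, which is why it usually needs no loss scaling, unlike fp16.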
  5. Feb 08, 2023
    • [Docs] Resolve warnings in sphinx build (#915) · c712070c
      Qian Zhao authored
      * add ZeroOptimizer to optim
      
      * resolve `duplicate label` warnings
      
      * upgrade docutils && sphinx to resolve `unknown directive or role` warnings
      
      * fix typo
      
      * resolve literal_block && heading warnings
      
      * resolve json literal_block warnings
      
      * resolve python literal_block warnings
      
      * resolve bunches of reference warnings
      
      * resolve bunches of docstring warnings
      
      * resolve warnings in autosummary
      
      * resolve remaining warnings in en docs
      
      * resolve heading warnings in zh_cn docs
      
      * resolve remaining warnings in zh_cn docs
      
      * fix as comments
      
      * fix as comments
  6. Feb 06, 2023
    • [Feature] Add ApexOptimWrapper (#742) · e35ed5fd
      xcnick authored
      
      * add ApexOptimWrapper
      
      * typo fix
      
      * add apex amp.initialize in optim_context
      
      * assert apex_amp
      
      * polish code
      
      * add parameters of apex_amp.initialize
      
      * add docs
      
      * polish code
      
      * polish code
      
      * polish code
      
      * fix calling of apex amp load_state_dict
      
      * polish
      
      * add comments
      
      * Update apex_optimizer_wrapper.py
      
      * Update apex_optimizer_wrapper.py
      
      ---------
      
      Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
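As with the AMP wrapper, the Apex wrapper would be chosen in the config. A hypothetical sketch (NVIDIA Apex must be installed; `opt_level` values follow `apex.amp` conventions, the other field names are assumptions):

```python
# Hypothetical config sketch for ApexOptimWrapper (PR #742).
# Per the commits above, apex.amp.initialize is called inside optim_context;
# 'O1' enables patched mixed precision. Field names are assumptions.
optim_wrapper = dict(
    type='ApexOptimWrapper',
    opt_level='O1',
    loss_scale='dynamic',
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9),
)
```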
  8. Jan 16, 2023
    • [Feature] Support ReduceOnPlateauParamScheduler (#819) · 0b59a90a
      LEFTeyes authored
      
      * [Feature] Add ReduceOnPlateauParamScheduler and change ParamSchedulerHook
      
      * [Feature] add ReduceOnPlateauLR and ReduceOnPlateauMomentum
      
      * pre-commit check
      
      * add a little docs
      
      * change position
      
      * fix the conflict between isort and yapf
      
      * fix ParamSchedulerHook after_val_epoch execute without train_loop and param_schedulers built
      
      * Apply suggestions from code review
      
      Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
      
      * update ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ParamSchedulerHook
      
      * fix get need_step_args attribute error in ParamSchedulerHook
      
      * fix load_state_dict error for rule in ReduceOnPlateauParamScheduler
      
      * add docs for ParamSchedulerHook and fix a few codes
      
      * [Docs] add ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ReduceOnPlateauLR docs
      
      * [Refactor] adjust the order of import
      
      * [Fix] add init check for threshold in ReduceOnPlateauParamScheduler
      
      * [Test] add test for ReduceOnPlateauParamScheduler, ReduceOnPlateauLR and ReduceOnPlateauMomentum
      
      * [Fix] fix no attribute self.min_value
      
      * [Fix] fix numerical problem in tests
      
      * [Fix] fix error in tests
      
      * [Fix] fix ignore first param in tests
      
      * [Fix] fix bug in tests
      
      * [Fix] fix bug in tests
      
      * [Fix] fix bug in tests
      
      * [Fix] increase coverage
      
      * [Fix] fix count self._global_step bug and docs
      
      * [Fix] fix tests
      
      * [Fix] modified ParamSchedulerHook test
      
      * Update mmengine/optim/scheduler/param_scheduler.py
      
      Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
      
      * Apply suggestions from code review
      
      Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
      
      * [Fix] modified something according to commented
      
      * [Docs] add api for en and zh_cn
      
      * [Fix] fix bug in test_param_scheduler_hook.py
      
      * [Test] support more complicated test modes(less, greater, rel, abs) for ReduceOnPlateauParamScheduler
      
      * [Docs] add docs for rule
      
      * [Fix] fix pop from empty list bug in test
      
      * [Fix] fix check param_schedulers is not built bug
      
      * [Fix] fix step_args bug and without runner._train_loop bug
      
      * [Fix] fix step_args bug and without runner._train_loop bug
      
      * [Fix] fix scheduler type bug
      
      * [Test] rename step_args to step_kwargs
      
      * [Fix] remove redundancy check
      
      * [Test] remove redundancy check
      
      * Apply suggestions from code review
      
      Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
      
      * [Test] fix some defects
      
      Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
      Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
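The scheduler added above implements the classic reduce-on-plateau rule (familiar from `torch.optim.lr_scheduler.ReduceLROnPlateau`): when a monitored metric stops improving for `patience` epochs, the scheduled value is multiplied by a factor. A minimal standalone sketch of that rule, not mmengine's actual implementation:

```python
class PlateauReducer:
    """Minimal reduce-on-plateau sketch: shrink a value when a
    monitored metric fails to improve for more than `patience` steps."""

    def __init__(self, value, factor=0.1, patience=2, rule='less'):
        self.value = value          # e.g. a learning rate
        self.factor = factor
        self.patience = patience
        self.rule = rule            # 'less': lower metric is better
        self.best = None
        self.num_bad = 0

    def step(self, metric):
        improved = (self.best is None or
                    (metric < self.best if self.rule == 'less'
                     else metric > self.best))
        if improved:
            self.best = metric
            self.num_bad = 0
        else:
            self.num_bad += 1
            if self.num_bad > self.patience:
                self.value *= self.factor   # plateau detected: reduce
                self.num_bad = 0
        return self.value
```

The `rule` switch mirrors the less/greater comparison modes exercised by the tests in this PR; the real scheduler also supports relative vs absolute thresholds, omitted here.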
  23. Aug 24, 2022
    • [Refactor] Refactor code structure (#395) · 7e1d7af2
      Zaida Zhou authored
      * Rename data to structure
      
      * adjust the way to import module
      
      * adjust the way to import module
      
      * rename Structure to Data Structures in docs api
      
      * rename structure to structures
      
      * support using some modules of mmengine without torch
      
      * fix circleci config
      
      * fix circleci config
      
      * fix registry ut
      
      * minor fix
      
      * move init method from model/utils to model/weight_init.py
      
      * move init method from model/utils to model/weight_init.py
      
      * move sync_bn to model
      
      * move functions depending on torch to dl_utils
      
      * format import
      
      * fix logging ut
      
      * add weight init in model/__init__.py
      
      * move get_config and get_model to mmengine/hub
      
      * move log_processor.py to mmengine/runner
      
      * fix ut
      
      * Add TimeCounter in dl_utils/__init__.py
  30. Jun 22, 2022
    • [Feature] Add autocast wrapper (#307) · 312f264e
      Mashiro authored
      * add autocast wrapper
      
      * fix docstring
      
      * fix docstring
      
      * fix compare version
      
      * fix unit test
      
      * fix incompatible arguments
      
      * fix as comment
      
      * fix unit test
      
      * rename auto_cast to autocast
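One fix above ("fix compare version") touches version gating: whether autocast is available depends on the installed PyTorch, and naive string comparison gets versions like 1.10 wrong. A simplified sketch of numeric version comparison (a hypothetical helper for illustration, not mmengine's own utility):

```python
def digit_version(version_str):
    """Turn 'X.Y.Z' into a tuple of ints so comparisons are numeric.

    Simplified sketch: non-numeric parts (e.g. 'rc1') are silently
    dropped, which a production implementation would handle explicitly.
    """
    return tuple(int(part) for part in version_str.split('.')[:3]
                 if part.isdigit())


# String comparison is lexicographic, so '1.10.0' < '1.6.0' -- wrong.
# Tuple comparison gets it right: (1, 10, 0) > (1, 6, 0).
```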
  32. Jun 13, 2022
    • [Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284) · b7866021
      Mashiro authored
      * merge context
      
      * update unit test
      
      * add docstring
      
      * fix bug in AmpOptimWrapper
      
      * add docstring for backward
      
      * add warning and docstring for accumulate gradient
      
      * fix docstring
      
      * fix docstring
      
      * add params_group method
      
      * fix as comment
      
      * fix as comment
      
      * make default_value of loss_scale to dynamic
      
      * Fix docstring
      
      * decouple should update and should no sync
      
      * rename attribute in OptimWrapper
      
      * fix docstring
      
      * fix comment
      
      * fix comment
      
      * fix as comment
      
      * fix as comment and add unit test
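The refactor above concerns gradient accumulation: `backward()` runs on every iteration, while the optimizer steps only every N-th call, with each loss scaled by 1/N so gradient magnitudes stay comparable. The bookkeeping can be sketched without torch (an illustration of the idea, not mmengine's code):

```python
class AccumSketch:
    """Bookkeeping sketch for gradient accumulation: scale each loss
    by 1/N and trigger an optimizer step only every N-th update."""

    def __init__(self, accumulative_counts=1):
        self.n = accumulative_counts
        self._inner_count = 0
        self.steps_taken = 0

    def update_params(self, loss):
        self._inner_count += 1
        scaled_loss = loss / self.n        # loss.backward() would go here
        if self._inner_count % self.n == 0:
            self.steps_taken += 1          # optimizer.step(); zero_grad()
        return scaled_loss
```

With `accumulative_counts=4`, eight updates yield two optimizer steps and each loss is scaled by 0.25, emulating a 4x larger batch on the same memory budget.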
    • [Features] Add OneCycleLR (#296) · fd295741
      Miao Zheng authored
      * [Features] Add OneCycleLR
      
      * [Features] Add OneCycleLR
      
      * yapf disable
      
      * build_iter_from_epoch
      
      * add epoch
      
      * fix args
      
      * fix according to comments;
      
      * lr-param
      
      * fix according to comments
      
      * defaults -> default to
      
      * remove epoch and steps per step
      
      * variable names
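The one-cycle policy added here warms the learning rate up from `max_lr / div_factor` to `max_lr` over the first `pct_start` fraction of training, then anneals it down to a small final value. A self-contained sketch of the cosine-annealed schedule (after `torch.optim.lr_scheduler.OneCycleLR`; simplified, momentum cycling omitted):

```python
import math

def one_cycle_lr(step, total_steps, max_lr,
                 pct_start=0.3, div_factor=25.0, final_div_factor=1e4):
    """Sketch of the one-cycle LR policy with cosine annealing."""
    initial_lr = max_lr / div_factor
    min_lr = initial_lr / final_div_factor
    up_steps = int(pct_start * total_steps)

    def cos_anneal(start, end, pct):
        # pct=0 gives `start`, pct=1 gives `end`, cosine in between.
        return end + (start - end) * (1 + math.cos(math.pi * pct)) / 2

    if step < up_steps:                       # warm-up phase
        return cos_anneal(initial_lr, max_lr, step / max(up_steps, 1))
    pct = (step - up_steps) / max(total_steps - up_steps, 1)
    return cos_anneal(max_lr, min_lr, pct)    # annealing phase
```

For 100 steps with `max_lr=0.1`, the schedule starts at 0.004, peaks at 0.1 after 30 steps, and ends at 4e-7.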
  35. Jun 01, 2022
    • [Feature] Add optimizer wrapper (#265) · 3e3866c1
      Mashiro authored
      
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers support multiple optimizers
      
      * add optimizer_wrapper
      
      * fix comment and docstring
      
      * fix unit test
      
      * add unit test
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multi learning rates
      
      * resolve comments
      
      * add optimizer_wrapper
      
      * fix mypy
      
      * fix lint
      
      * fix OptimizerWrapperDict docstring and add unit test
      
      * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
      
      * Fix AmpOptimizerWrapper
      
      * rename build_optmizer_wrapper to build_optim_wrapper
      
      * refine optimizer wrapper
      
      * fix AmpOptimWrapper.step, docstring
      
      * resolve conflict
      
      * rename DefaultOptimConstructor
      
      * fix as comment
      
      * rename clip grad arguments
      
      * refactor optim_wrapper config
      
      * fix docstring of DefaultOptimWrapperConstructor
      
      * add get_lr method to OptimWrapper and OptimWrapperDict
      
      * skip some amp unit test
      
      * fix unit test
      
      * fix get_lr, get_momentum docstring
      
      * refactor get_lr, get_momentum, fix as comment
      
      * fix error message
      
      Co-authored-by: zhouzaida <zhouzaida@163.com>