  1. Jun 13, 2022
    • [Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284) · b7866021
      Mashiro authored
      * merge context
      
      * update unit test
      
      * add docstring
      
      * fix bug in AmpOptimWrapper
      
      * add docstring for backward
      
      * add warning and docstring for accumulate gradient
      
      * fix docstring
      
      * fix docstring
      
      * add params_group method
      
      * fix as comment
      
      * fix as comment
      
      * make the default value of loss_scale 'dynamic'
      
      * Fix docstring
      
      * decouple should update and should no sync
      
      * rename attribute in OptimWrapper
      
      * fix docstring
      
      * fix comment
      
      * fix comment
      
      * fix as comment
      
      * fix as comment and add unit test
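
      The commits above merge the gradient-accumulation context handling into OptimWrapper. A minimal sketch of the resulting usage, assuming the accumulative_counts argument and the optim_context/update_params API of recent mmengine releases:

          import torch
          import torch.nn as nn
          from mmengine.optim import OptimWrapper

          model = nn.Linear(4, 2)
          optim_wrapper = OptimWrapper(
              optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
              accumulative_counts=4)  # step()/zero_grad() only every 4th update_params call

          for _ in range(8):
              data, target = torch.randn(8, 4), torch.randn(8, 2)
              # optim_context handles no_sync (DDP) and autocast (AMP) bookkeeping
              with optim_wrapper.optim_context(model):
                  loss = nn.functional.mse_loss(model(data), target)
              # backward runs every iteration; the optimizer step only on accumulation boundaries
              optim_wrapper.update_params(loss)
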
    • [Features] Add OneCycleLR (#296) · fd295741
      Miao Zheng authored
      * [Features] Add OneCycleLR
      
      * [Features] Add OneCycleLR
      
      * yapf disable
      
      * build_iter_from_epoch
      
      * add epoch
      
      * fix args
      
      * fix according to comments
      
      * lr-param
      
      * fix according to comments
      
      * defaults -> default to
      
      * remove epochs and steps per epoch
      
      * variable names
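
      A hedged sketch of wiring the new scheduler into a config; the eta_max, total_steps and pct_start names follow the current mmengine.optim.OneCycleLR signature and may differ slightly from the version added in #296:

          # OneCycleLR as a param_scheduler entry in an mmengine config (sketch)
          param_scheduler = dict(
              type='OneCycleLR',
              eta_max=0.01,      # peak learning rate reached after the warm-up phase
              total_steps=1000,  # length of the whole cycle, counted in iterations here
              pct_start=0.3,     # fraction of the cycle spent increasing the lr
              by_epoch=False)    # step per iteration instead of per epoch
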
  2. Jun 01, 2022
    • [Feature] Add optimizer wrapper (#265) · 3e3866c1
      Mashiro authored
      
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers support multiple optimizers
      
      * add optimizer_wrapper
      
      * fix comment and docstring
      
      * fix unit test
      
      * add unit test
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multiple learning rates
      
      * resolve comments
      
      * add optimizer_wrapper
      
      * fix mypy
      
      * fix lint
      
      * fix OptimizerWrapperDict docstring and add unit test
      
      * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
      
      * Fix AmpOptimizerWrapper
      
      * rename build_optmizer_wrapper to build_optim_wrapper
      
      * refine optimizer wrapper
      
      * fix AmpOptimWrapper.step, docstring
      
      * resolve conflict
      
      * rename DefaultOptimConstructor
      
      * fix as comment
      
      * rename clip grad arguments
      
      * refactor optim_wrapper config
      
      * fix docstring of DefaultOptimWrapperConstructor
      
      fix docstring of DefaultOptimWrapperConstructor
      
      * add get_lr method to OptimWrapper and OptimWrapperDict
      
      * skip some amp unit test
      
      * fix unit test
      
      * fix get_lr, get_momentum docstring
      
      * refactor get_lr, get_momentum, fix as comment
      
      * fix error message
      
      Co-authored-by: zhouzaida <zhouzaida@163.com>
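
      A hedged sketch of the optim_wrapper config format introduced by this PR, assuming mmengine.optim.build_optim_wrapper and the OptimWrapper/AmpOptimWrapper types; the exact keys at the time of #265 may have differed:

          import torch.nn as nn
          from mmengine.optim import build_optim_wrapper

          model = nn.Linear(4, 2)
          optim_wrapper_cfg = dict(
              type='OptimWrapper',  # or 'AmpOptimWrapper' for mixed-precision training
              optimizer=dict(type='SGD', lr=0.01, momentum=0.9),
              clip_grad=dict(max_norm=1.0))  # gradient clipping handled by the wrapper
          optim_wrapper = build_optim_wrapper(model, optim_wrapper_cfg)
          print(optim_wrapper.get_lr())  # e.g. {'lr': [0.01]}
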
  3. Apr 25, 2022
    • [Fix] Resolve conflict between adapt and main. (#198) · e0d00c5b
      Mashiro authored
      
      * [Docs] Refine registry documentation (#186)
      
      * [Docs] Refine registry documentation
      
      * resolve comments
      
      * minor refinement
      
      * Refine Visualizer docs (#177)
      
      * Refine Visualizer docs
      
      * update
      
      * update
      
      * update featmap
      
      * update docs
      
      * update visualizer docs
      
      * [Refactor] Refine LoggerHook (#155)
      
      * rename global accessible and integrate get_instance and create_instance
      
      * move ManagerMixin to utils
      
      * fix docstring and separate get_instance into get_instance and get_current_instance
      
      * fix lint
      
      * fix docstring, rename and move test_global_meta
      
      * rename LogBuffer to HistoryBuffer, rename MessageHub methods, MessageHub support resume
      
      * refine MMLogger timestamp, update unit test
      
      * MMLogger add logger_name arguments
      
      * Fix docstring
      
      * Add LogProcessor and some unit test
      
      * update unit test
      
      * complete LogProcessor unit test
      
      * refine LoggerHook
      
      * solve circle import
      
      * change default logger_name to mmengine
      
      * refactor eta
      
      * Fix docstring comment and unit test
      
      * Fix with runner
      
      * fix docstring
      
      fix docstring
      
      * fix docstring
      
      * Add by_epoch attribute to LoggerHook and fix docstring
      
      * Please mypy and fix comment
      
      * remove \ in MMLogger
      
      * Fix lint
      
      * roll back pre-commit-hook
      
      * Fix hook unit test
      
      * Fix comments
      
      * remove \t in log and add docstring
      
      * Fix as comment
      
      * should not accept other arguments if corresponding instance has been created
      
      * fix logging ddp file saving
      
      * fix logging ddp file saving
      
      * move log processor to logging
      
      * move log processor to logging
      
      * remove current dataloader
      
      * fix docstring
      
      * fix unit test
      
      * add learning rate in MessageHub
      
      * Support outputting training/validation/testing messages after iterations/epochs
      
      * fix docstring
      
      * Fix IterBasedRunner log string
      
      * Fix IterBasedRunner log string
      
      * Support parsing validation loss in log processor
      
      * [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR (#188)
      
      * [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR
      
      * min_lr -> eta_min, refined docstr
      
      Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
      Co-authored-by: Haian Huang(深度眸) <1286304229@qq.com>
      Co-authored-by: Tong Gao <gaotongxiao@gmail.com>
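
      The LoggerHook refactor above introduces the ManagerMixin get_instance pattern, MMLogger, and a MessageHub backed by HistoryBuffer. A minimal sketch of that pattern, assuming the current mmengine.logging API:

          from mmengine.logging import MMLogger, MessageHub

          # each ManagerMixin subclass keeps a registry of named global instances
          logger = MMLogger.get_instance('mmengine', log_level='INFO')
          logger.info('created the global logger')

          message_hub = MessageHub.get_instance('mmengine')
          message_hub.update_scalar('train/lr', 0.01)          # stored in a HistoryBuffer
          print(message_hub.get_scalar('train/lr').current())  # 0.01
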
    • [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR (#188) · c3aff4fc
      Tong Gao authored
      * [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR
      
      * min_lr -> eta_min, refined docstr
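
      A hedged sketch of the polynomial decay scheduler from #188 as a config entry; power and eta_min follow the current mmengine.optim.PolyLR signature, eta_min being the renamed min_lr mentioned above:

          param_scheduler = dict(
              type='PolyLR',
              power=0.9,      # exponent of the polynomial decay
              eta_min=1e-4,   # lower bound on the lr, formerly min_lr
              by_epoch=True)  # decay once per epoch
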
  4. Feb 16, 2022
    • [Feature]: Add parameter schedulers. (#22) · 7905f039
      RangiLyu authored
      * [Feature]: Add parameter schedulers.
      
      * update
      
      * update
      
      * update
      
      * update
      
      * add docstring to lr and momentum
      
      * resolve comments
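
      A minimal sketch of driving one of these schedulers by hand, assuming the mmengine.optim.MultiStepLR class and its torch-style step() API:

          import torch
          import torch.nn as nn
          from mmengine.optim import MultiStepLR

          model = nn.Linear(4, 2)
          optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
          scheduler = MultiStepLR(optimizer, milestones=[2, 4], gamma=0.1, by_epoch=True)

          for epoch in range(6):
              # ... one epoch of training would go here ...
              optimizer.step()  # placeholder for the real optimizer updates
              scheduler.step()  # lr is multiplied by gamma when a milestone is reached
              print(epoch, optimizer.param_groups[0]['lr'])
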
    • add scheduler unit test (#13) · bbb7d625
      RangiLyu authored
      * tmp
      
      * add scheduler unit test
      
      * disable yapf
      
      * add more test
      
      * add more test
      
      * not use torch test case
      
      * solve comments
      
      * update file
      
      * add more unit tests
      
      * resolve comments
      
      * update cosine ut
      
      * fix typo
      
      * solve comments
      
      * solve comments
      
      * resolve comments
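
      In the same spirit as the tests referenced above (a plain TestCase rather than the torch test case), a hedged sketch of checking that MultiStepLR decays the lr at a milestone; the actual tests in mmengine are more thorough:

          from unittest import TestCase

          import torch
          import torch.nn as nn
          from mmengine.optim import MultiStepLR

          class TestMultiStepLR(TestCase):

              def test_milestone_decay(self):
                  model = nn.Linear(2, 2)
                  optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
                  scheduler = MultiStepLR(optimizer, milestones=[1], gamma=0.1)
                  self.assertAlmostEqual(optimizer.param_groups[0]['lr'], 0.1)
                  optimizer.step()  # schedulers expect an optimizer step first
                  scheduler.step()  # milestone 1 reached, so lr is scaled by gamma
                  self.assertAlmostEqual(optimizer.param_groups[0]['lr'], 0.01)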