  1. Aug 13, 2022
  2. Aug 11, 2022
  3. Aug 09, 2022
  4. Aug 08, 2022
    • [Enhance] Add build function for scheduler. (#372) · a07a0633
      Mashiro authored
      * add build function for scheduler
      
      * add unit test
      
      * handle convert_to_iter in build_scheduler_from_cfg
      
      * restore deleted code
      
      * format import
      
      * fix lint
      a07a0633
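The bullets above describe a build function that looks the scheduler class up from its config and handles a convert-to-iteration option. A minimal sketch of that flow, using a plain dict as the registry; the function name, config keys, and the `build_iter_from_epoch` classmethod are assumptions for illustration, not the project's exact API:

```python
def build_scheduler_from_cfg(cfg, registry, optimizer):
    """Build a scheduler from a config dict (illustrative sketch)."""
    cfg = dict(cfg)  # don't mutate the caller's config
    convert = cfg.pop('convert_to_iter_based', False)
    scheduler_cls = registry[cfg.pop('type')]
    if convert and hasattr(scheduler_cls, 'build_iter_from_epoch'):
        # Convert an epoch-based schedule into an iteration-based one.
        return scheduler_cls.build_iter_from_epoch(optimizer=optimizer, **cfg)
    return scheduler_cls(optimizer=optimizer, **cfg)
```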
    • [Fix] Fix building multiple lists of schedulers for multiple optimizers (#383) · 55805426
      Mashiro authored
      * fix build multiple scheduler
      
      * add new unit test
      
      * fix comment and error message
      
      * fix comment and error message
      
      * extract _parse_scheduler_cfg
      
      * always call build_param_scheduler during train and resume. If there is only one optimizer, the default value for the scheduler will be a list; if there are multiple optimizers, the default value for the scheduler will be a dict
      
      * minor refine
      
      * rename runner test exp name
      
      * fix as comment
      
      * minor refine
      
      * fix ut
      
      * only check parameter scheduler
      
      * minor refine
      55805426
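The default-value behavior described in the last bullets can be sketched as follows; `parse_scheduler_cfg` and its arguments are hypothetical names standing in for the extracted helper, not the project's actual signature:

```python
def parse_scheduler_cfg(scheduler_cfg, multiple_optimizers):
    """Normalize the scheduler config: a list for a single optimizer,
    a dict of lists when there are multiple optimizers (sketch)."""
    if multiple_optimizers:
        if not isinstance(scheduler_cfg, dict):
            raise TypeError(
                'with multiple optimizers, schedulers must be a dict')
        # One scheduler list per named optimizer.
        return {name: cfg if isinstance(cfg, list) else [cfg]
                for name, cfg in scheduler_cfg.items()}
    # Single optimizer: the default container is always a list.
    return scheduler_cfg if isinstance(scheduler_cfg, list) else [scheduler_cfg]
```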
  5. Aug 04, 2022
  6. Jul 19, 2022
  7. Jul 15, 2022
  8. Jul 14, 2022
    • [Fix] Fix resume message_hub (#353) · 78fad67d
      Mashiro authored
      * fix resume message_hub
      
      * add unit test
      
      * support resume from messagehub
      
      * minor refine
      
      * add comment
      
      * fix typo
      
      * update docstring
      78fad67d
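The resume behavior this commit describes — restoring logged runtime info from a checkpoint — can be sketched as below, modeling the message hub as a plain dict; the function name and checkpoint layout are assumptions, not the project's exact implementation:

```python
def resume_message_hub(checkpoint, message_hub):
    """Restore message-hub state (iteration count, metrics, ...) saved in
    a checkpoint so that logging state survives a resume (sketch)."""
    state = checkpoint.get('message_hub')
    if state is not None:
        message_hub.update(state)
    return message_hub
```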
  9. Jul 05, 2022
  10. Jun 30, 2022
  11. Jun 29, 2022
  12. Jun 24, 2022
    • [Feature]: Set different seeds for different ranks (#298) · 03d5c17b
      Yuan Liu authored
      * [Feature]: Set different seed for diff rank
      
      * [Feature]: Add log
      
      * [Fix]: Fix lint
      
      * [Fix]: Fix docstring
      
      * [Fix]: Fix sampler seed
      
      * [Fix]: Fix log bug
      
      * [Fix]: Change diff_seed to diff_rank_seed
      
      * [Fix]: Fix lint
      03d5c17b
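The `diff_rank_seed` option this commit introduces amounts to offsetting the base seed by the process rank. A minimal sketch of that logic; the function name and arguments are illustrative, not the project's exact API:

```python
import random

def set_random_seed(seed, rank, diff_rank_seed=False):
    """Seed the RNG; with diff_rank_seed, offset the base seed by the
    process rank so every rank draws an independent random stream
    (e.g. for data augmentation). Otherwise all ranks share one seed."""
    if diff_rank_seed:
        seed = seed + rank
    random.seed(seed)
    return seed
```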
  13. Jun 22, 2022
  14. Jun 17, 2022
  15. Jun 14, 2022
  16. Jun 13, 2022
    • [Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284) · b7866021
      Mashiro authored
      * merge context
      
      * update unit test
      
      * add docstring
      
      * fix bug in AmpOptimWrapper
      
      * add docstring for backward
      
      * add warning and docstring for accumulate gradient
      
      * fix docstring
      
      * fix docstring
      
      * add params_group method
      
      * fix as comment
      
      * fix as comment
      
      * make the default value of loss_scale 'dynamic'
      
      * Fix docstring
      
      * decouple should update and should no sync
      
      * rename attribute in OptimWrapper
      
      * fix docstring
      
      * fix comment
      
      * fix comment
      
      * fix as comment
      
      * fix as comment and add unit test
      b7866021
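The "decouple should update and should no sync" bullet describes separating two decisions: when to step the optimizer and when to skip DDP gradient synchronization. A sketch of that bookkeeping under assumed names (this is not MMEngine's actual OptimWrapper):

```python
class AccumulationSchedule:
    """Bookkeeping for gradient accumulation, with the optimizer-update
    decision and the gradient-sync decision kept separate (sketch)."""

    def __init__(self, accumulative_counts, max_iters=None):
        self.accumulative_counts = accumulative_counts
        self.max_iters = max_iters

    def should_update(self, cur_iter):
        # Step every N iterations, and also on the final iteration so
        # leftover accumulated gradients are not silently dropped.
        reached = (cur_iter + 1) % self.accumulative_counts == 0
        is_last = (self.max_iters is not None
                   and cur_iter + 1 == self.max_iters)
        return reached or is_last

    def should_no_sync(self, cur_iter):
        # Skip DDP gradient all-reduce on iterations that will not update.
        return not self.should_update(cur_iter)
```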
    • [Fix] Fix building train_loop during test (#295) · 8b0c9c5f
      Mashiro authored
      * fix build train_loop during test
      
      * fix build train_loop during test
      
      * fix build train_loop during test
      
      * fix build train_loop during test
      
      * Fix as comment
      8b0c9c5f
  17. Jun 10, 2022
  18. Jun 09, 2022
  19. Jun 07, 2022
  20. Jun 06, 2022
  21. Jun 05, 2022
  22. Jun 01, 2022
    • [Feature] Add optimizer wrapper (#265) · 3e3866c1
      Mashiro authored
      
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers support multiple optimizers
      
      * add optimizer_wrapper
      
      * fix comment and docstring
      
      * fix unit test
      
      * add unit test
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multi learning rates
      
      * resolve comments
      
      * add optimizer_wrapper
      
      * fix mypy
      
      * fix lint
      
      * fix OptimizerWrapperDict docstring and add unit test
      
      * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
      
      * Fix AmpOptimizerWrapper
      
      * rename build_optmizer_wrapper to build_optim_wrapper
      
      * refine optimizer wrapper
      
      * fix AmpOptimWrapper.step, docstring
      
      * resolve conflict
      
      * rename DefaultOptimConstructor
      
      * fix as comment
      
      * rename clip grad arguments
      
      * refactor optim_wrapper config
      
      * fix docstring of DefaultOptimWrapperConstructor
      
      * add get_lr method to OptimWrapper and OptimWrapperDict
      
      * skip some amp unit test
      
      * fix unit test
      
      * fix get_lr, get_momentum docstring
      
      * refactor get_lr, get_momentum, fix as comment
      
      * fix error message
      
      Co-authored-by: zhouzaida <zhouzaida@163.com>
      3e3866c1
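Several bullets above mention OptimWrapperDict and a `get_lr` method that spans multiple optimizers. A hedged sketch of that idea — a container of named wrappers whose learning rates are merged under name-prefixed keys so a logger can print them all; class names and key format here are assumptions, not MMEngine's exact API:

```python
class ConstLRWrapper:
    """Stand-in for a single optimizer wrapper (illustrative only)."""

    def __init__(self, lr):
        self._lr = lr

    def get_lr(self):
        return {'lr': [self._lr]}


class OptimWrapperDictSketch:
    """Hold several named optimizer wrappers and merge their get_lr()
    results, prefixing each key with the wrapper's name (sketch)."""

    def __init__(self, **wrappers):
        self.wrappers = wrappers

    def get_lr(self):
        merged = {}
        for name, wrapper in self.wrappers.items():
            for key, value in wrapper.get_lr().items():
                merged[f'{name}.{key}'] = value
        return merged
```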
  23. May 31, 2022
    • [Feature] Support multiple optimizers (#235) · f1da9a1d
      Zaida Zhou authored
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers support multiple optimizers
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multi learning rates
      
      * resolve comments
      
      * fix typo
      f1da9a1d
    • [Enhance] Improve exception handling in call_hook (#247) · f2190de7
      Jiazhen Wang authored
      * improve exception in call_hook
      
      * refine unit test
      
      * add test_call_hook
      
      * refine
      
      * update docstring and ut
      f2190de7
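The improvement this commit describes — making a failure inside a hook easy to attribute — can be sketched as a dispatcher that re-raises with the hook class and method name attached. The function name and the wrapping exception type are assumptions for illustration, not the project's exact code:

```python
def call_hook(hooks, fn_name, **kwargs):
    """Invoke fn_name on every hook; annotate failures with the
    offending hook's class and method name (sketch)."""
    for hook in hooks:
        method = getattr(hook, fn_name, None)
        if method is None:
            continue
        try:
            method(**kwargs)
        except Exception as err:
            # Name the failing hook so it is obvious in the traceback.
            raise RuntimeError(
                f'{type(hook).__name__}.{fn_name} failed: {err}') from err
```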
  24. May 27, 2022
  25. May 26, 2022
  26. May 25, 2022
  27. May 24, 2022
  28. May 20, 2022
  29. May 18, 2022