  1. Jun 07, 2022
    • [Feature]: add base model, ddp model wrapper and unit test (#268) · f04fec73
      Mashiro authored
      * add base model, ddp model and unit test
      
      * add unit test
      
      * fix unit test
      
      * fix docstring
      
      * fix cpu unit test
      
      * refine base data preprocessor
      
      * refine base data preprocessor
      
      * refine interface of ddp module
      
      * remove optimizer hook
      
      * add forward
      
      * fix as comment
      
      * fix unit test
      
      * fix as comment
      
      * fix build optimizer wrapper
      
      * rebase main and fix unit test
      
      * stack_batch supports stacking ndim tensors, add docstring for merge dict
      
      * fix lint
      
      * fix test loop
      
      * make precision_context effective for data_preprocessor
      
      * fix as comment
      
      * fix as comment
      
      * refine docstring
      
      * change collate_data output type hints
      
      * rename to_rgb to bgr_to_rgb and rgb_to_bgr
      
      * support building BaseModel with a built DataPreprocessor
      
      * fix as comment
      
      * fix docstring
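      The entry above lands BaseModel, a DDP wrapper and the base data preprocessor. Below is a minimal sketch of how those pieces fit together, assuming mmengine's public BaseModel/ImgDataPreprocessor API; ToyModel and its tensor shapes are hypothetical.

      ```python
      # Hypothetical ToyModel illustrating the BaseModel interface from #268.
      import torch
      import torch.nn.functional as F
      from mmengine.model import BaseModel, ImgDataPreprocessor

      class ToyModel(BaseModel):
          def __init__(self):
              # A pre-built preprocessor can be passed directly (see the
              # "support building BaseModel with a built DataPreprocessor"
              # commit); bgr_to_rgb is the renamed `to_rgb` flag.
              super().__init__(data_preprocessor=ImgDataPreprocessor(
                  mean=[127.5, 127.5, 127.5],
                  std=[127.5, 127.5, 127.5],
                  bgr_to_rgb=True))
              self.linear = torch.nn.Linear(3 * 32 * 32, 10)

          def forward(self, inputs, data_samples=None, mode='tensor'):
              # By the time forward runs in a train/val step, the preprocessor
              # has normalized the batch and stack_batch has padded and
              # stacked the (possibly ndim) tensors.
              logits = self.linear(inputs.flatten(1))
              if mode == 'loss':
                  # data_samples is simplified to a plain label tensor here.
                  return {'loss': F.cross_entropy(logits, data_samples)}
              if mode == 'predict':
                  return logits.argmax(dim=1)
              return logits  # mode == 'tensor'
      ```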
  2. Jun 01, 2022
    • [Feature] Add optimizer wrapper (#265) · 3e3866c1
      Mashiro authored
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers supports multiple optimizers
      
      * add optimizer_wrapper
      
      * fix comment and docstring
      
      * fix unit test
      
      * add unit test
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multiple learning rates
      
      * resolve comments
      
      * add optimizer_wrapper
      
      * fix mypy
      
      * fix lint
      
      * fix OptimizerWrapperDict docstring and add unit test
      
      * rename OptimizerWrapper to OptimWrapper, make OptimWrapperDict inherit from OptimWrapper, and fix as comment
      
      * Fix AmpOptimizerWrapper
      
      * rename build_optimizer_wrapper to build_optim_wrapper
      
      * refine optimizer wrapper
      
      * fix AmpOptimWrapper.step, docstring
      
      * resolve conflict
      
      * rename DefaultOptimConstructor
      
      * fix as comment
      
      * rename clip grad arguments
      
      * refactor optim_wrapper config
      
      * fix docstring of DefaultOptimWrapperConstructor
      
      * add get_lr method to OptimWrapper and OptimWrapperDict
      
      * skip some amp unit tests
      
      * fix unit test
      
      * fix get_lr, get_momentum docstring
      
      * refactor get_lr, get_momentum, fix as comment
      
      * fix error message
      
      Co-authored-by: zhouzaida <zhouzaida@163.com>
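      The entry above replaces bare optimizers with OptimWrapper (and AmpOptimWrapper for mixed precision), with OptimWrapperDict grouping several named wrappers. A minimal sketch, assuming mmengine's public API; the linear models and random batch are placeholders.

      ```python
      import torch
      from mmengine.optim import OptimWrapper, OptimWrapperDict

      model = torch.nn.Linear(2, 1)
      optim_wrapper = OptimWrapper(
          optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
          clip_grad=dict(max_norm=1.0))  # cf. the clip grad arguments rename

      loss = model(torch.randn(4, 2)).sum()
      optim_wrapper.update_params(loss)  # backward() + step() + zero_grad()
      print(optim_wrapper.get_lr())      # {'lr': [0.01]}

      # OptimWrapperDict keeps one named OptimWrapper per optimizer; get_lr
      # then reports every learning rate under its key.
      head, backbone = torch.nn.Linear(2, 1), torch.nn.Linear(2, 2)
      wrappers = OptimWrapperDict(
          head=OptimWrapper(torch.optim.SGD(head.parameters(), lr=0.1)),
          backbone=OptimWrapper(torch.optim.SGD(backbone.parameters(), lr=0.01)))
      print(wrappers.get_lr())  # e.g. {'head.lr': [0.1], 'backbone.lr': [0.01]}
      ```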
  3. May 31, 2022
    • [Feature] Support multiple optimizers (#235) · f1da9a1d
      Zaida Zhou authored
      * Support multiple optimizers
      
      * minor refinement
      
      * improve unit tests
      
      * minor fix
      
      * Update unit tests for resuming or saving ckpt for multiple optimizers
      
      * refine docstring
      
      * refine docstring
      
      * fix typo
      
      * update docstring
      
      * refactor the logic to build multiple optimizers
      
      * resolve comments
      
      * ParamSchedulers supports multiple optimizers
      
      * refine docstring
      
      * RuntimeInfoHook supports printing multiple learning rates
      
      * resolve comments
      
      * fix typo
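      This entry is the groundwork that #265 later wrapped in OptimWrapper: one optimizer per named submodule, so ParamSchedulers and RuntimeInfoHook can track each key. A hedged sketch in plain PyTorch (the generator/discriminator names are hypothetical):

      ```python
      # Hypothetical multi-optimizer setup in the spirit of #235: one
      # optimizer per named submodule.
      import torch

      model = torch.nn.ModuleDict({
          'generator': torch.nn.Linear(8, 8),
          'discriminator': torch.nn.Linear(8, 1),
      })
      optimizers = {
          'generator': torch.optim.Adam(model['generator'].parameters(), lr=1e-4),
          'discriminator': torch.optim.Adam(model['discriminator'].parameters(), lr=4e-4),
      }
      # RuntimeInfoHook-style logging of every learning rate:
      for name, optim in optimizers.items():
          print(f'{name}.lr', [g['lr'] for g in optim.param_groups])
      ```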