- Oct 18, 2022
  Mashiro authored
  * [Enhance] add documents for , and support clip grad by value
  * refine docstring
  * fix as comment
  * Fix as comment
  * minor refine
  * minor refine
  * remove error comment for clip grad
  * refine docstring
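The "clip grad by value" support mentioned above is distinct from the more common clip-by-norm. As a minimal sketch of the two behaviors (plain-Python stand-ins for illustration, not MMEngine's actual implementation, which delegates to `torch.nn.utils.clip_grad_value_` and `clip_grad_norm_`):

```python
import math

def clip_grads_by_value(grads, clip_value):
    # Clamp every gradient element into [-clip_value, clip_value],
    # mirroring what torch.nn.utils.clip_grad_value_ does.
    return [max(-clip_value, min(clip_value, g)) for g in grads]

def clip_grads_by_norm(grads, max_norm):
    # Rescale the whole gradient vector so its L2 norm is at most
    # max_norm, mirroring torch.nn.utils.clip_grad_norm_.
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        return [g * scale for g in grads]
    return list(grads)

print(clip_grads_by_value([0.5, -2.0, 3.0], 1.0))  # [0.5, -1.0, 1.0]
print(clip_grads_by_norm([3.0, 4.0], 1.0))         # roughly [0.6, 0.8]
```

Clip-by-value changes the direction of the gradient (each coordinate is clamped independently), while clip-by-norm preserves the direction and only shrinks the magnitude.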

- Aug 26, 2022
  Mashiro authored
  * add args to OptimWrapper.step, backward, zero_grad
  * minor refine
  * minor refine
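The change above lets the wrapper's `step`/`backward`/`zero_grad` pass extra arguments through to the wrapped optimizer. A minimal sketch of the forwarding pattern (hypothetical class names, not MMEngine's real classes):

```python
class RecordingOptimizer:
    # Stand-in optimizer that records the kwargs it receives.
    def __init__(self):
        self.calls = []

    def step(self, **kwargs):
        self.calls.append(('step', kwargs))

    def zero_grad(self, **kwargs):
        self.calls.append(('zero_grad', kwargs))

class TinyOptimWrapper:
    # Sketch of the pattern: wrapper methods accept arbitrary kwargs and
    # forward them unchanged, so callers can use optimizer-specific
    # options (e.g. PyTorch's zero_grad(set_to_none=True)).
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def step(self, **kwargs):
        self.optimizer.step(**kwargs)

    def zero_grad(self, **kwargs):
        self.optimizer.zero_grad(**kwargs)

opt = RecordingOptimizer()
wrapper = TinyOptimWrapper(opt)
wrapper.zero_grad(set_to_none=True)
print(opt.calls)  # [('zero_grad', {'set_to_none': True})]
```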

- Aug 24, 2022
  Zaida Zhou authored
  * Rename data to structure
  * adjust the way to import module
  * adjust the way to import module
  * rename Structure to Data Structures in docs api
  * rename structure to structures
  * support using some modules of mmengine without torch
  * fix circleci config
  * fix circleci config
  * fix registry ut
  * minor fix
  * move init method from model/utils to model/weight_init.py
  * move init method from model/utils to model/weight_init.py
  * move sync_bn to model
  * move functions depending on torch to dl_utils
  * format import
  * fix logging ut
  * add weight init in model/__init__.py
  * move get_config and get_model to mmengine/hub
  * move log_processor.py to mmengine/runner
  * fix ut
  * Add TimeCounter in dl_utils/__init__.py

- Jul 20, 2022
  Mashiro authored
  * fix save scheduler state dict with optim wrapper
  * remove for loop and inherit TestParameterScheduler
  * remove for loop and inherit TestParameterScheduler
  * minor refine

- Jul 06, 2022
  Mashiro authored
  * fix optimizer wrapper counts
  * fix ut

- Jul 05, 2022
  RangiLyu authored
  * [Enhance] Support scheduling betas with MomentumScheduler.
  * enhance ut
  * test adam betas
  * enhance ut
  * enhance ut
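Scheduling "betas" matters because Adam-family optimizers do not expose a `momentum` key: the momentum-like quantity is the first element of the `betas` tuple. A sketch of the dispatch that makes a momentum scheduler work for both cases (illustrative helper, not MMEngine's actual API):

```python
def set_momentum(param_group, value):
    # Write a new momentum value into an optimizer param group.
    # SGD-style groups store it under 'momentum'; Adam-style groups store
    # it as the first element of the 'betas' tuple, so beta2 must be kept.
    if 'betas' in param_group:
        _, beta2 = param_group['betas']
        param_group['betas'] = (value, beta2)
    else:
        param_group['momentum'] = value

sgd_group = {'lr': 0.1, 'momentum': 0.9}
adam_group = {'lr': 0.001, 'betas': (0.9, 0.999)}
set_momentum(sgd_group, 0.5)
set_momentum(adam_group, 0.5)
print(sgd_group['momentum'])  # 0.5
print(adam_group['betas'])    # (0.5, 0.999)
```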

- Jun 13, 2022
  Mashiro authored
  * merge context
  * update unit test
  * add docstring
  * fix bug in AmpOptimWrapper
  * add docstring for backward
  * add warning and docstring for accumulate gradient
  * fix docstring
  * fix docstring
  * add params_group method
  * fix as comment
  * fix as comment
  * make default value of loss_scale dynamic
  * Fix docstring
  * decouple should_update and should_no_sync
  * rename attribute in OptimWrapper
  * fix docstring
  * fix comment
  * fix comment
  * fix as comment
  * fix as comment and add unit test
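The "accumulate gradient" and "should update" items above refer to gradient-accumulation bookkeeping: the optimizer only steps every N calls, and the loss is scaled by 1/N so the accumulated gradient matches one large batch. A minimal counting sketch under those assumptions (illustrative names, not MMEngine's implementation):

```python
class AccumulatingWrapper:
    # Sketch of gradient-accumulation bookkeeping: update_params scales
    # the loss and only triggers a real optimizer step every
    # `accumulative_counts` calls.
    def __init__(self, accumulative_counts):
        self.accumulative_counts = accumulative_counts
        self._count = 0
        self.steps_taken = 0

    def should_update(self):
        # True once every `accumulative_counts` iterations.
        return self._count % self.accumulative_counts == 0

    def update_params(self, loss):
        # Scale so N accumulated backward passes sum to one batch's grad.
        scaled_loss = loss / self.accumulative_counts
        # (a real wrapper would call backward(scaled_loss) here)
        self._count += 1
        if self.should_update():
            self.steps_taken += 1  # stands in for step() + zero_grad()
        return scaled_loss

w = AccumulatingWrapper(accumulative_counts=4)
for _ in range(8):
    w.update_params(1.0)
print(w.steps_taken)  # 2: one step after every 4 calls
```

Decoupling `should_update` from the no-sync decision (as the commit does) matters in distributed training, where gradient synchronization can be skipped on non-update iterations.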

- Jun 01, 2022
  Mashiro authored
  * Support multiple optimizers
  * minor refinement
  * improve unit tests
  * minor fix
  * Update unit tests for resuming or saving ckpt for multiple optimizers
  * refine docstring
  * refine docstring
  * fix typo
  * update docstring
  * refactor the logic to build multiple optimizers
  * resolve comments
  * ParamSchedulers supports multiple optimizers
  * add optimizer_wrapper
  * fix comment and docstring
  * fix unit test
  * add unit test
  * refine docstring
  * RuntimeInfoHook supports printing multi learning rates
  * resolve comments
  * add optimizer_wrapper
  * fix mypy
  * fix lint
  * fix OptimizerWrapperDict docstring and add unit test
  * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
  * Fix AmpOptimizerWrapper
  * rename build_optmizer_wrapper to build_optim_wrapper
  * refine optimizer wrapper
  * fix AmpOptimWrapper.step, docstring
  * resolve conflict
  * rename DefaultOptimConstructor
  * fix as comment
  * rename clip grad arguments
  * refactor optim_wrapper config
  * fix docstring of DefaultOptimWrapperConstructor
  * add get_lr method to OptimWrapper and OptimWrapperDict
  * skip some amp unit test
  * fix unit test
  * fix get_lr, get_momentum docstring
  * refactor get_lr, get_momentum, fix as comment
  * fix error message

  Co-authored-by: zhouzaida <zhouzaida@163.com>
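The OptimWrapperDict introduced here holds one optimizer wrapper per named sub-model (useful for GAN-style training with separate generator and discriminator optimizers) and delegates per-key so checkpoints cover every optimizer. A minimal sketch of that idea (hypothetical stand-in classes, not the real MMEngine types):

```python
class FakeWrapper:
    # Stand-in optimizer wrapper with just enough state to round-trip
    # through save/load.
    def __init__(self, lr):
        self.lr = lr

    def state_dict(self):
        return {'lr': self.lr}

    def load_state_dict(self, state):
        self.lr = state['lr']

class TinyWrapperDict:
    # Sketch of the OptimWrapperDict idea: several wrappers under string
    # keys, with state_dict/load_state_dict delegated per key so a single
    # checkpoint entry saves and restores all optimizers.
    def __init__(self, **wrappers):
        self.wrappers = wrappers

    def state_dict(self):
        return {name: w.state_dict() for name, w in self.wrappers.items()}

    def load_state_dict(self, state):
        for name, sub_state in state.items():
            self.wrappers[name].load_state_dict(sub_state)

wd = TinyWrapperDict(generator=FakeWrapper(0.001),
                     discriminator=FakeWrapper(0.004))
ckpt = wd.state_dict()
print(ckpt)  # {'generator': {'lr': 0.001}, 'discriminator': {'lr': 0.004}}
```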