- Mar 01, 2023

  Qian Zhao authored

  * support bf16 in AmpOptimWrapper
  * add docstring
  * modify docs
  * add unittests for bf16 in AmpOptimWrapper
  * fix type
  * fix to pass CI
  * fix UT skip logic to pass CI
  * fix as comment
  * add type hints
  * fix docstring and add warning information
  * remove check for pytorch>=1.6 in unittest
  * modify unittest
  * modify unittest
  * remove torch.float32 and torch.float64 from valid dtypes
  * fix as comments
  * minor refine docstring
  * fix unittest parameterized to pass CI
  * fix unittest and add back torch.float32, torch.float64
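A minimal sketch of the dtype-validation idea behind this change (pure Python, hypothetical helper names; the real AmpOptimWrapper works with torch dtypes and a different API): the wrapper accepts a dtype string and rejects anything outside the valid set, which per the log above now includes `bfloat16` alongside `float16`, `float32`, and `float64`.

```python
# Hypothetical sketch of AMP dtype validation; not mmengine's actual code.
VALID_DTYPES = ('float16', 'bfloat16', 'float32', 'float64')

def resolve_amp_dtype(dtype: str = 'float16') -> str:
    """Validate and return the autocast dtype for an AMP-style wrapper."""
    if dtype not in VALID_DTYPES:
        raise ValueError(
            f'dtype must be one of {VALID_DTYPES}, but got {dtype!r}')
    return dtype
```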
- Feb 23, 2023

  Zaida Zhou authored
- Feb 15, 2023

  whcao authored

  * fix the bug when the params in shared modules do not require grad
  * test DefaultOptimWrapperConstructor when the params in shared modules do not require grad
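A simplified sketch of the failure mode this fix addresses (hypothetical classes, not DefaultOptimWrapperConstructor itself): when two modules share a submodule, its parameters are visited twice, so the constructor must deduplicate by identity and must not assume every parameter requires grad.

```python
# Hypothetical sketch: dedupe parameters shared between modules,
# including parameters with requires_grad=False.
class Param:
    def __init__(self, name, requires_grad=True):
        self.name = name
        self.requires_grad = requires_grad

def collect_params(modules):
    """Collect each unique parameter exactly once across modules."""
    seen, params = set(), []
    for module in modules:
        for p in module:
            if id(p) in seen:
                continue  # parameter belongs to a shared module
            seen.add(id(p))
            params.append(p)
    return params
```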
- Feb 13, 2023

  Zaida Zhou authored

  * Fix docstring
  * fix
- Feb 08, 2023

  Qian Zhao authored

  * add ZeroOptimizer to optim
  * resolve `duplicate label` warnings
  * upgrade docutils and sphinx to resolve `unknown directive or role` warnings
  * fix typo
  * resolve literal_block and heading warnings
  * resolve json literal_block warnings
  * resolve python literal_block warnings
  * resolve bunches of reference warnings
  * resolve bunches of docstring warnings
  * resolve warnings in autosummary
  * resolve remaining warnings in en docs
  * resolve heading warnings in zh_cn docs
  * resolve remaining warnings in zh_cn docs
  * fix as comments
  * fix as comments
- Feb 06, 2023

  xcnick authored

  * add ApexOptimWrapper
  * typo fix
  * add apex amp.initialize in optim_context
  * assert apex_amp
  * polish code
  * add parameters of apex_amp.initialize
  * add docs
  * polish code
  * polish code
  * polish code
  * fix calling of apex amp load_state_dict
  * polish
  * add comments
  * Update apex_optimizer_wrapper.py
  * Update apex_optimizer_wrapper.py

  Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
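A sketch of the lazy-initialization pattern suggested by "add apex amp.initialize in optim_context" (hypothetical, heavily simplified; `init_fn` stands in for `apex.amp.initialize`, and the real ApexOptimWrapper API differs): deferring initialization to the first `optim_context` call lets the wrapper be constructed before the model is available.

```python
# Hypothetical sketch: defer an expensive one-time initialization
# (apex-style amp.initialize) to the first optim_context entry.
from contextlib import contextmanager

class LazyAmpWrapper:
    def __init__(self, optimizer, init_fn):
        self.optimizer = optimizer
        self._init_fn = init_fn      # stands in for apex.amp.initialize
        self._initialized = False

    @contextmanager
    def optim_context(self, model):
        if not self._initialized:
            model, self.optimizer = self._init_fn(model, self.optimizer)
            self._initialized = True
        yield model
```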
- Feb 03, 2023

  takuoko authored

  * add dadaptation
  * Update mmengine/optim/optimizer/builder.py
  * update dadaptation docs

  Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
- Dec 19, 2022

  Qian Zhao authored

  * fix zero_optimizer error with param groups when pytorch < 1.12.0
  * add docstring
  * fix docstring
  * add unittest
  * change ut to use a valid paramwise_cfg
  * modify ut
  * fix as comments
- Dec 16, 2022

  RangiLyu authored

  * [Fix] Fix bias decay mult of depth-wise conv.
  * support flatten param weight decay multiplier
  * add unit tests
  * remove TODO
  * update doc
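A simplified sketch of the decay-multiplier logic this fix concerns (hypothetical function and parameter names; the real constructor reads these multipliers from `paramwise_cfg`): biases and flattened (1-D) parameters get their own weight-decay multipliers, so for example a depth-wise conv bias is not decayed like a regular weight.

```python
# Hypothetical sketch of per-parameter weight-decay multipliers.
def weight_decay_for(name, ndim, base_wd, bias_mult=0.0, flat_mult=0.0):
    """Return the effective weight decay for one named parameter."""
    if name.endswith('.bias'):
        return base_wd * bias_mult   # biases use the bias multiplier
    if ndim == 1:
        return base_wd * flat_mult   # flattened params (e.g. norm weights)
    return base_wd                   # ordinary weights keep the base decay
```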
- Dec 12, 2022

  Zaida Zhou authored

  * [Docs] Fix docstring format
  * minor refine
  * minor refine
- Dec 08, 2022

  Ming-Hsuan-Tu authored

  * [Enhance] Support step arguments and zero arguments with update_params
  * Update mmengine/optim/optimizer/optimizer_wrapper.py
  * Update mmengine/optim/optimizer/optimizer_wrapper.py

  Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
  Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
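A minimal sketch of this enhancement (hypothetical stub classes; the real OptimWrapper also handles loss scaling and gradient accumulation): `update_params` forwards per-call keyword arguments to `step()` and `zero_grad()` instead of always calling them with no arguments.

```python
# Hypothetical sketch: forward step/zero_grad kwargs through update_params.
class StubOptimizer:
    def __init__(self):
        self.step_kwargs = None
        self.zero_kwargs = None
    def step(self, **kwargs):
        self.step_kwargs = kwargs
    def zero_grad(self, **kwargs):
        self.zero_kwargs = kwargs

class MiniOptimWrapper:
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def update_params(self, loss, step_kwargs=None, zero_kwargs=None):
        """Backward (elided here), then step and zero_grad with kwargs."""
        self.optimizer.step(**(step_kwargs or {}))
        self.optimizer.zero_grad(**(zero_kwargs or {}))
```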
- Oct 27, 2022

  Hakjin Lee authored

  * [Feature] Support torch ZeRORedundancyOptimizer
  * lint
  * Fix saving optimizer state_dict
  * Fix handling import error
  * Add test case
  * fix UT
  * Revert "fix UT" (reverts commit dd64538960ff7440c6020f533d43945ffc23f2d2)
  * fix handling import in UT
  * Fix saving zero checkpoint and delete redundant master_only
  * lint
  * test unittest
  * Fix handling import error
  * Fix UT condition
  * Edit docstrings
  * Fix typo
  * Skip redundant procedure in checkpoint hook
  * fix typo again
  * Update mmengine/optim/optimizer/zero_optimizer.py
  * Add api info
  * lint
  * Fix lint
  * Handling AmpOptimWrapper case
  * handling overlap_with_ddp
  * Fix error

  Signed-off-by: Junhwa Song <ethan9867@gmail.com>
  Signed-off-by: Hakjin Lee <nijkah@gmail.com>
  Co-authored-by: Junhwa Song <ethan9867@gmail.com>
  Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
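A sketch of the core idea behind ZeroRedundancyOptimizer (hypothetical, heavily simplified; torch's implementation partitions real parameters and synchronizes them across processes): each rank keeps optimizer state only for its own shard of the parameters, here partitioned round-robin by index.

```python
# Hypothetical sketch: round-robin parameter sharding across ranks,
# the state-partitioning idea behind ZeRO-style optimizers.
def shard_params(num_params, rank, world_size):
    """Indices of parameters whose optimizer state lives on `rank`."""
    return [i for i in range(num_params) if i % world_size == rank]
```

Every parameter belongs to exactly one rank, so the shards are disjoint and jointly cover all parameters, which is what lets each rank drop the redundant state.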
- Oct 26, 2022

  tripleMu authored

  * Fix typo
- Oct 24, 2022

  wangjiangben-hw authored

  * init npu
  * Update mmengine/optim/optimizer/amp_optimizer_wrapper.py
  * Update mmengine/dist/dist.py
  * change to is_hccl_backend
  * Update mmengine/optim/optimizer/amp_optimizer_wrapper.py
  * add comment with AmpOptimWrapper
  * Update mmengine/runner/amp.py
  * Update mmengine/runner/amp.py
  * add npu fn in base_model
  * Update mmengine/optim/optimizer/amp_optimizer_wrapper.py
  * clean lint
  * Update mmengine/optim/optimizer/amp_optimizer_wrapper.py
  * Update mmengine/model/base_model/base_model.py
  * add is_npu_available
  * try to fix
  * Add comments
  * Refine grammar

  Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
  Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
  Co-authored-by: HAOCHENYE <21724054@zju.edu.cn>
- Oct 18, 2022

  Mashiro authored

  * [Enhance] add documents for , and support clip grad by value
  * refine docstring
  * fix as comment
  * Fix as comment
  * minor refine
  * minor refine
  * remove error comment for clip grad
  * refine docstring
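A minimal sketch of clip-by-value (hypothetical function, plain floats instead of tensors; torch's `clip_grad_value_` operates on parameter gradients in place): unlike clipping by norm, each gradient component is independently clamped into `[-clip_value, clip_value]`.

```python
# Hypothetical sketch: gradient clipping by value (contrast with
# clipping by global norm, which rescales all gradients together).
def clip_grads_by_value(grads, clip_value):
    """Clamp each gradient component into [-clip_value, clip_value]."""
    return [max(-clip_value, min(clip_value, g)) for g in grads]
```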
- Aug 26, 2022

  Mashiro authored

  * add args to OptimWrapper.step, backward, zero_grad
  * minor refine
  * minor refine
- Aug 24, 2022

  Zaida Zhou authored

  * Rename data to structure
  * adjust the way to import module
  * adjust the way to import module
  * rename Structure to Data Structures in docs api
  * rename structure to structures
  * support using some modules of mmengine without torch
  * fix circleci config
  * fix circleci config
  * fix registry ut
  * minor fix
  * move init method from model/utils to model/weight_init.py
  * move init method from model/utils to model/weight_init.py
  * move sync_bn to model
  * move functions depending on torch to dl_utils
  * format import
  * fix logging ut
  * add weight init in model/__init__.py
  * move get_config and get_model to mmengine/hub
  * move log_processor.py to mmengine/runner
  * fix ut
  * Add TimeCounter in dl_utils/__init__.py
- Jul 20, 2022

  Mashiro authored

  * fix save scheduler state dict with optim wrapper
  * remove for loop and inherit TestParameterScheduler
  * remove for loop and inherit TestParameterScheduler
  * minor refine
- Jul 06, 2022

  Mashiro authored

  * fix optimizer wrapper counts
  * fix ut
- Jul 05, 2022

  RangiLyu authored

  * [Enhance] Support scheduling betas with MomentumScheduler.
  * enhance ut
  * test adam betas
  * enhance ut
  * enhance ut
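A simplified sketch of why scheduling betas needs special handling (hypothetical function working on plain param-group dicts): SGD-style optimizers expose a `momentum` key, while Adam-style optimizers store the equivalent quantity as `betas[0]`, so a momentum scheduler must write to whichever the group has.

```python
# Hypothetical sketch: write a scheduled momentum value into a
# param group, handling both SGD-style and Adam-style optimizers.
def set_momentum(param_group, value):
    if 'betas' in param_group:
        # Adam-style: momentum is the first beta coefficient.
        param_group['betas'] = (value, param_group['betas'][1])
    else:
        # SGD-style: a plain momentum key.
        param_group['momentum'] = value
    return param_group
```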
- Jun 22, 2022

  Mashiro authored

  * add autocast wrapper
  * fix docstring
  * fix docstring
  * fix compare version
  * fix unit test
  * fix incompatible arguments
  * fix as comment
  * fix unit test
  * rename auto_cast to autocast
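A minimal sketch of a version-guarded autocast wrapper like the one described above (hypothetical, pure Python; mmengine's `autocast` delegates to `torch.autocast` and checks the installed PyTorch version): a context manager that refuses to enable mixed precision when the detected framework version is too old.

```python
# Hypothetical sketch: a context manager guarding mixed precision
# behind a minimum framework version (the "fix compare version" item).
from contextlib import contextmanager

@contextmanager
def autocast(enabled=True, framework_version=(1, 10)):
    if enabled and framework_version < (1, 6):
        raise RuntimeError('autocast requires framework version >= 1.6')
    state = {'autocast_enabled': enabled}
    try:
        yield state
    finally:
        state['autocast_enabled'] = False
```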
- Jun 13, 2022

  Mashiro authored

  * merge context
  * update unit test
  * add docstring
  * fix bug in AmpOptimWrapper
  * add docstring for backward
  * add warning and docstring for accumulate gradient
  * fix docstring
  * fix docstring
  * add params_group method
  * fix as comment
  * fix as comment
  * make default_value of loss_scale to dynamic
  * Fix docstring
  * decouple should update and should no sync
  * rename attribute in OptimWrapper
  * fix docstring
  * fix comment
  * fix comment
  * fix as comment
  * fix as comment and add unit test
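A minimal sketch of the "should update" decision mentioned above (hypothetical function; the real OptimWrapper also decouples this from gradient synchronization): with gradient accumulation, the optimizer only steps every `accumulative_counts` iterations, or at the final iteration so leftover gradients are not dropped.

```python
# Hypothetical sketch: when should a gradient-accumulating wrapper
# actually step the optimizer?
def should_update(cur_iter, accumulative_counts, max_iters):
    """Whether to step at 0-based iteration `cur_iter`."""
    return ((cur_iter + 1) % accumulative_counts == 0
            or cur_iter + 1 == max_iters)
```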
- Jun 09, 2022

  Yixiao Fang authored
- Jun 05, 2022

  Mashiro authored

  * fix build optimizer wrapper without type
  * refine logic
  * fix as comment
  * fix optim_wrapper config error in docstring and unit test
  * refine docstring of build_optim_wrapper
- Jun 01, 2022

  Mashiro authored

  * Support multiple optimizers
  * minor refinement
  * improve unit tests
  * minor fix
  * Update unit tests for resuming or saving ckpt for multiple optimizers
  * refine docstring
  * refine docstring
  * fix typo
  * update docstring
  * refactor the logic to build multiple optimizers
  * resolve comments
  * ParamSchedulers supports multiple optimizers
  * add optimizer_wrapper
  * fix comment and docstring
  * fix unit test
  * add unit test
  * refine docstring
  * RuntimeInfoHook supports printing multi learning rates
  * resolve comments
  * add optimizer_wrapper
  * fix mypy
  * fix lint
  * fix OptimizerWrapperDict docstring and add unit test
  * rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
  * Fix AmpOptimizerWrapper
  * rename build_optmizer_wrapper to build_optim_wrapper
  * refine optimizer wrapper
  * fix AmpOptimWrapper.step, docstring
  * resolve conflict
  * rename DefaultOptimConstructor
  * fix as comment
  * rename clip grad arguments
  * refactor optim_wrapper config
  * fix docstring of DefaultOptimWrapperConstructor
  * add get_lr method to OptimWrapper and OptimWrapperDict
  * skip some amp unit test
  * fix unit test
  * fix get_lr, get_momentum docstring
  * refactor get_lr, get_momentum, fix as comment
  * fix error message

  Co-authored-by: zhouzaida <zhouzaida@163.com>
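A minimal sketch of the OptimWrapperDict idea introduced above (hypothetical toy classes; the real OptimWrapperDict inherits from OptimWrapper and wraps torch optimizers): a dict-like container that fans out queries such as `get_lr` to every wrapped optimizer, which is what lets RuntimeInfoHook print multiple learning rates.

```python
# Hypothetical sketch: a dict of optimizer wrappers that aggregates
# per-wrapper queries, in the spirit of OptimWrapperDict.
class ToyOptimWrapper:
    def __init__(self, lr):
        self.lr = lr
    def get_lr(self):
        return {'lr': [self.lr]}

class ToyOptimWrapperDict:
    def __init__(self, **wrappers):
        self.wrappers = wrappers

    def get_lr(self):
        # Collect the learning rate of every wrapper, keyed by name.
        return {name: w.get_lr() for name, w in self.wrappers.items()}
```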
- May 31, 2022

- Apr 01, 2022

  RangiLyu authored
- Mar 08, 2022

  RangiLyu authored

  * Support default_scope when building optimizer and evaluator.
  * add docstring
  * fix
  * fix
- Feb 19, 2022

  RangiLyu authored

  * [Feature]: Add optimizer and constructor.
  * refactor unit tests
  * add optimizer doc
  * add parrots wrapper
  * add parrots wrapper
  * solve comments
  * resolve comments