- Aug 11, 2022
- Aug 09, 2022
Ma Zerun authored
* [Enhance] Add `preprocess_cfg` as an argument of runner.
* Rename `preprocess_cfg` to `data_preprocessor`
* Fix docstring
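After the rename, preprocessing settings are passed under a `data_preprocessor` key instead of `preprocess_cfg`. A config-style sketch, assuming an image model; the normalization constants are illustrative placeholders:

```python
# Sketch: `data_preprocessor` replaces the old `preprocess_cfg` key.
# Mean/std are the usual ImageNet constants, used here only as an example.
data_preprocessor = dict(
    mean=[123.675, 116.28, 103.53],
    std=[58.395, 57.12, 57.375],
    bgr_to_rgb=True,
)
```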
- Aug 08, 2022
Mashiro authored
* Support logging environment information when initiating the runner
* Fix unit test
* fix as comment, save world_size
* log gpu num
* clear code and reformat log
* minor refine
* fix as comment
* minor refine
* clean the code
* clean the code
* remove save world_size in meta
Mashiro authored
* add build function for scheduler
* add unit test
* handle convert_to_iter in build_scheduler_from_cfg
* restore deleted code
* format import
* fix lint
Mashiro authored
* fix build multiple scheduler
* add new unit test
* fix comment and error message
* fix comment and error message
* extract _parse_scheduler_cfg
* always call build_param_scheduler during train and resume. If there is only one optimizer, the default value for the scheduler will be a list; otherwise, with multiple optimizers, the default value will be a dict
* minor refine
* rename runner test exp name
* fix as comment
* minor refine
* fix ut
* only check parameter scheduler
* minor refine
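The two default shapes of the `param_scheduler` field described above, sketched with illustrative scheduler settings and optimizer names:

```python
# Single optimizer: the scheduler config is (and defaults to) a list.
param_scheduler = [
    dict(type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),
    dict(type='MultiStepLR', by_epoch=True, milestones=[8, 11], gamma=0.1),
]

# Multiple optimizers: a dict keyed by optimizer name ('generator' and
# 'discriminator' are illustrative), each value a scheduler or list of them.
param_scheduler = dict(
    generator=dict(type='LinearLR', start_factor=0.001, by_epoch=False,
                   begin=0, end=500),
    discriminator=[dict(type='MultiStepLR', by_epoch=True, milestones=[8, 11])],
)
```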
- Aug 04, 2022
Mashiro authored
* make scheduler default to None
* fix bc breaking
* refine warning message
* fix as comment
* fix as comment
* fix lint
- Jul 30, 2022
RangiLyu authored
- Jul 22, 2022
RangiLyu authored
- Jul 21, 2022
RangiLyu authored
- Jul 19, 2022
RangiLyu authored
* [Fix] Fix weight initialization and registry logging.
* sync params
* resolve comments
- Jul 15, 2022
Ma Zerun authored
* [Enhance] Auto set the `end` of param schedulers.
* Add log output and unit test
* Update docstring
* Update unit tests of `CosineAnnealingParamScheduler`.
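In user terms, a scheduler's `end` can now be omitted and is inferred from the training duration. A hedged config sketch; values are illustrative:

```python
param_scheduler = [
    # iteration-based warm-up over the first 500 iterations
    dict(type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),
    # cosine decay whose `end` is auto-set to the end of training
    dict(type='CosineAnnealingLR', by_epoch=True, begin=0),
]
```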
- Jul 14, 2022
- Jul 08, 2022
Mashiro authored
* fix missing device ids in wrap_model
* clean the code
* use default broadcast_buffers
* refine MMSeparateDistributedDataParallel
* rename tmp variable
* refine docstring
* add type hints
* refactor docstring of ddp model
* add arg in docstring
* minor refine
* better ddp link
- Jul 05, 2022
Mashiro authored
* fix error when building multiple runners
* fix comments
* fix cpu ci
- Jun 24, 2022
Yuan Liu authored
* [Feature]: Set different seed for different ranks
* [Feature]: Add log
* [Fix]: Fix lint
* [Fix]: Fix docstring
* [Fix]: Fix sampler seed
* [Fix]: Fix log bug
* [Fix]: Change diff_seed to diff_rank_seed
* [Fix]: Fix lint
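The option settled on the name `diff_rank_seed` and is controlled through the runner's `randomness` settings. A minimal sketch:

```python
# Derive a different seed per rank (useful e.g. for rank-dependent data
# augmentation) instead of sharing one seed across all processes.
randomness = dict(seed=42, diff_rank_seed=True)
```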
- Jun 23, 2022
Jiazhen Wang authored
- Jun 22, 2022
Alex Yang authored
* [Feat] Support reverting SyncBN
* use logger.info instead of logger.warning
* fix info string
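A short sketch of the helper in use, e.g. before single-GPU fine-tuning or inference; the toy module is a placeholder:

```python
import torch.nn as nn
from mmengine.model import revert_sync_batchnorm

model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.SyncBatchNorm(8))
model = revert_sync_batchnorm(model)  # SyncBatchNorm -> BatchNorm2d
```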
Mashiro authored
* add autocast wrapper
* fix docstring
* fix docstring
* fix version comparison
* fix unit test
* fix incompatible arguments
* fix as comment
* fix unit test
* rename auto_cast to autocast
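After the rename, the wrapper is used as `autocast`. A minimal sketch of a mixed-precision forward pass:

```python
import torch
from mmengine.runner import autocast

layer = torch.nn.Linear(3, 4)
# Unified AMP context; enable it only where AMP is actually supported.
with autocast(enabled=torch.cuda.is_available()):
    out = layer(torch.randn(2, 3))
```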
Alex Yang authored
* [Feat] Support saving best ckpt
* reformat code
* rename function and reformat code
* fix logging info
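Best-checkpoint saving is configured on the checkpoint hook. A config-style sketch; which metric `'auto'` tracks depends on what the evaluator reports:

```python
default_hooks = dict(
    checkpoint=dict(
        type='CheckpointHook',
        interval=1,
        save_best='auto',  # track the first metric the evaluator reports
        rule='greater',    # or 'less', depending on the metric
    ),
)
```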
Jiazhen Wang authored
* modify cuda() to to()
* rollback load_checkpoint
* refine runner
* add TODO
Mashiro authored
- Jun 21, 2022
Zaida Zhou authored
- Jun 17, 2022
Mashiro authored
* move import resource
Mashiro authored
* [Enhance] dump messagehub in runner.resume
* delete unnecessary code
* delete debugging code

Co-authored-by: imabackstabber <312276423@qq.com>
Jiazhen Wang authored
* support resume from ceph
* move func and refine
* delete symlink
* fix unittest
* preserve _allow_symlink and symlink
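With the symlink handling removed, checkpoints can be resumed straight from object storage. A hedged sketch; the bucket path is purely a placeholder:

```python
# Resume from a ceph/petrel-style URI instead of a local file.
load_from = 's3://my-bucket/work_dirs/exp1/epoch_10.pth'  # placeholder path
resume = True
```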
- Jun 16, 2022
Jiazhen Wang authored
* support mlu
* add ut and refine docstring
- Jun 15, 2022
Mashiro authored
- Jun 13, 2022
Mashiro authored
* merge context
* update unit test
* add docstring
* fix bug in AmpOptimWrapper
* add docstring for backward
* add warning and docstring for accumulating gradients
* fix docstring
* fix docstring
* add params_group method
* fix as comment
* fix as comment
* make the default value of loss_scale 'dynamic'
* Fix docstring
* decouple should update and should no sync
* rename attribute in OptimWrapper
* fix docstring
* fix comment
* fix comment
* fix as comment
* fix as comment and add unit test
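The merged context and gradient accumulation surface in the public API roughly as sketched below (toy model, illustrative loss); for mixed precision, `AmpOptimWrapper` now defaults `loss_scale` to `'dynamic'`:

```python
import torch
import torch.nn as nn
from mmengine.optim import OptimWrapper

model = nn.Linear(3, 1)
optim_wrapper = OptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
    accumulative_counts=4,  # accumulate gradients over 4 iterations
)

for _ in range(8):
    # optim_context merges gradient-sync and autocast handling in one place
    with optim_wrapper.optim_context(model):
        loss = model(torch.randn(2, 3)).abs().sum()
    optim_wrapper.update_params(loss)  # backward + conditional step + zero_grad
```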
- Jun 10, 2022
Alex Yang authored
* [Feat]: support customizing the evaluator
* fix the key name that determines whether to use the default evaluator
* add assertion
* fix typo
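The evaluator field accepts either plain metric configs (building the default evaluator) or a customized evaluator class. A sketch; `Accuracy` and `MyEvaluator` are hypothetical registered names:

```python
# Default: the standard evaluator is built around the listed metric(s).
val_evaluator = dict(type='Accuracy')

# Customized: a whole evaluator class wrapping the same metrics.
val_evaluator = dict(type='MyEvaluator', metrics=[dict(type='Accuracy')])
```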
- Jun 09, 2022
jbwang1997 authored
* Replace `auto_scale_lr_cfg` with `auto_scale_lr`
* Update
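After the rename the key is simply `auto_scale_lr`. A sketch:

```python
# Scale the configured LR by (actual total batch size / base_batch_size).
auto_scale_lr = dict(enable=True, base_batch_size=256)
```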
- Jun 07, 2022
Mashiro authored
* add base model, ddp model and unit test
* add unit test
* fix unit test
* fix docstring
* fix cpu unit test
* refine base data preprocessor
* refine base data preprocessor
* refine interface of ddp module
* remove optimizer hook
* add forward
* fix as comment
* fix unit test
* fix as comment
* fix build optimizer wrapper
* rebase main and fix unit test
* stack_batch supports stacking ndim tensors; add docstring for merge dict
* fix lint
* fix test loop
* make precision_context effective for data_preprocessor
* fix as comment
* fix as comment
* refine docstring
* change collate_data output typehints
* rename to_rgb to bgr_to_rgb and rgb_to_bgr
* support building BaseModel with a built DataPreprocessor
* fix as comment
* fix docstring
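A compact sketch of the resulting pieces: a `BaseModel` subclass whose `forward` switches on `mode`, with preprocessing delegated to the data preprocessor. Toy layers, illustrative only:

```python
import torch
import torch.nn as nn
from mmengine.model import BaseModel

class ToyModel(BaseModel):
    def __init__(self):
        super().__init__()  # default data preprocessor unless one is passed
        self.linear = nn.Linear(3, 1)

    def forward(self, inputs, data_samples=None, mode='tensor'):
        out = self.linear(inputs)
        if mode == 'loss':     # training: return a dict of losses
            return dict(loss=out.abs().mean())
        if mode == 'predict':  # validation/test: return predictions
            return out
        return out             # 'tensor': raw network outputs

model = ToyModel()
print(model(torch.randn(2, 3), mode='loss'))
```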
RangiLyu authored
- Jun 06, 2022
jbwang1997 authored
* Add auto scale lr function
* Update

Co-authored-by: wangjiabao1.vendor <wangjiabao@pjlab.org.cn>
Jiazhen Wang authored
* support custom runner
* change build_runner_from_cfg
* refine docstring
* refine docstring
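Custom runners go through the `RUNNERS` registry; `build_runner_from_cfg` is the registry's build function. A sketch; `MyRunner` and the `runner_type` config key follow the conventional pattern and are shown as assumptions:

```python
from mmengine.registry import RUNNERS
from mmengine.runner import Runner

@RUNNERS.register_module()
class MyRunner(Runner):  # hypothetical custom runner
    """Override loops or hooks as needed."""

# Typical dispatch in a training script (sketch):
# runner = RUNNERS.build(cfg) if 'runner_type' in cfg else Runner.from_cfg(cfg)
```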
RangiLyu authored
* Modify val_interval and val_begin to be attributes of TrainLoop.
* update doc
* fix lint
* type hint
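Both values now live on the training loop and are set through `train_cfg`. A sketch:

```python
# Validate every epoch, but only from epoch 5 onwards.
train_cfg = dict(by_epoch=True, max_epochs=12, val_begin=5, val_interval=1)
```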
- Jun 05, 2022
Mashiro authored
* fix building the optimizer wrapper without type
* refine logic
* fix as comment
* fix optim_wrapper config error in docstring and unit test
* refine docstring of build_optim_wrapper
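With the fix, omitting `type` falls back to a plain `OptimWrapper`, so the two spellings below should be equivalent (sketch):

```python
# `type` omitted: defaults to OptimWrapper.
optim_wrapper = dict(optimizer=dict(type='SGD', lr=0.01, momentum=0.9))

# Fully spelled out.
optim_wrapper = dict(type='OptimWrapper',
                     optimizer=dict(type='SGD', lr=0.01, momentum=0.9))
```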
- Jun 01, 2022
Mashiro authored
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers support multiple optimizers
* add optimizer_wrapper
* fix comment and docstring
* fix unit test
* add unit test
* refine docstring
* RuntimeInfoHook supports printing multiple learning rates
* resolve comments
* add optimizer_wrapper
* fix mypy
* fix lint
* fix OptimizerWrapperDict docstring and add unit test
* rename OptimizerWrapper to OptimWrapper, make OptimWrapperDict inherit from OptimWrapper, and fix as comment
* Fix AmpOptimizerWrapper
* rename build_optimizer_wrapper to build_optim_wrapper
* refine optimizer wrapper
* fix AmpOptimWrapper.step, docstring
* resolve conflict
* rename DefaultOptimConstructor
* fix as comment
* rename clip grad arguments
* refactor optim_wrapper config
* fix docstring of DefaultOptimWrapperConstructor
* add get_lr method to OptimWrapper and OptimWrapperDict
* skip some amp unit tests
* fix unit test
* fix get_lr, get_momentum docstring
* refactor get_lr, get_momentum, fix as comment
* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
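For models with several trainable components, the optimizer config becomes a dict of wrapper configs that is built into an `OptimWrapperDict`. A sketch; the submodule names and the constructor name are assumptions (downstream repos register their own multi-optimizer constructors):

```python
# One wrapper per top-level submodule (e.g. a GAN), gathered into an
# OptimWrapperDict keyed by submodule name.
optim_wrapper = dict(
    constructor='MultiOptimWrapperConstructor',  # assumed constructor name
    generator=dict(type='OptimWrapper',
                   optimizer=dict(type='Adam', lr=2e-4)),
    discriminator=dict(type='OptimWrapper',
                       optimizer=dict(type='Adam', lr=1e-4)),
)
```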
- May 31, 2022
Zaida Zhou authored
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers support multiple optimizers
* refine docstring
* RuntimeInfoHook supports printing multiple learning rates
* resolve comments
* fix typo