aiaccel.torch.lr_schedulers.SequentialLR
- class aiaccel.torch.lr_schedulers.SequentialLR(optimizer: Optimizer, schedulers_fn: list[Callable[[Optimizer], _LRScheduler]], milestones: list[int])
A wrapper of torch.optim.lr_scheduler.SequentialLR that takes a list of functions to create the schedulers.
- Parameters:
optimizer – Optimizer.
schedulers_fn – List of functions, each taking the optimizer and returning a scheduler.
milestones – List of epoch indices. Must be increasing.
.. code-block:: yaml

    scheduler_generator:
      _partial_: True
      _convert_: "all"
      _target_: aiaccel.torch.lr_schedulers.SequentialLR
      schedulers_fn:
        - _target_: torch.optim.lr_scheduler.LinearLR
          _partial_: True
          start_factor: 1.e-3
          end_factor: 1.0
          total_iters: 5000
        - _target_: torch.optim.lr_scheduler.CosineAnnealingLR
          _partial_: True
          T_max: 95000
      milestones: [5000]
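For reference, the same schedule can be built directly in Python. This is a minimal sketch, not part of the upstream docstring; the linear model and SGD optimizer are placeholder choices:

.. code-block:: python

    from functools import partial

    import torch
    from torch import nn
    from torch.optim.lr_scheduler import CosineAnnealingLR, LinearLR

    from aiaccel.torch.lr_schedulers import SequentialLR

    # Placeholder model and optimizer for illustration only.
    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=1.0)

    # Each entry is a callable that takes the optimizer and returns a
    # scheduler, so the schedulers are instantiated lazily by the wrapper.
    scheduler = SequentialLR(
        optimizer,
        schedulers_fn=[
            partial(LinearLR, start_factor=1e-3, end_factor=1.0, total_iters=5000),
            partial(CosineAnnealingLR, T_max=95000),
        ],
        milestones=[5000],
    )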
- __init__(optimizer: Optimizer, schedulers_fn: list[Callable[[Optimizer], _LRScheduler]], milestones: list[int])
Methods
__init__(optimizer, schedulers_fn, milestones)

get_last_lr()
    Return last computed learning rate by current scheduler.

get_lr()
    Compute learning rate using chainable form of the scheduler.

load_state_dict(state_dict)
    Load the scheduler's state.

recursive_undo([sched])
    Recursively undo any step performed by the initialisation of schedulers.

state_dict()
    Return the state of the scheduler as a dict.

step()
    Perform a step.
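Continuing the sketch above, a hypothetical training loop calls step() once per iteration; at the milestone (iteration 5000) the wrapper hands over from LinearLR to CosineAnnealingLR:

.. code-block:: python

    # Hypothetical loop; the iteration counts mirror the config above.
    for it in range(100_000):
        optimizer.step()   # update parameters first
        scheduler.step()   # then advance the learning-rate schedule
        if it in (0, 4_999, 5_000):
            # get_last_lr() reports the rate set by the currently
            # active scheduler, so a jump is visible at the milestone.
            print(it, scheduler.get_last_lr())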