aiaccel.torch.lightning.OptimizerConfig

class aiaccel.torch.lightning.OptimizerConfig(optimizer_generator: Callable[..., optim.Optimizer], params_transformer: Callable[..., Iterator[tuple[str, Any]]] | None = None, scheduler_generator: Callable[..., optim.lr_scheduler.LRScheduler] | None = None, scheduler_interval: str | None = 'step', scheduler_monitor: str | None = 'validation/loss')

Configuration for the optimizer and scheduler in a LightningModule.

Parameters:
  • optimizer_generator (Callable[..., optim.Optimizer]) – A callable that generates the optimizer.

  • params_transformer (Callable[..., Iterator[tuple[str, Any]]] | None) – A callable that transforms the model parameters into a format suitable for the optimizer. If None, the parameters are passed to the optimizer unchanged. Defaults to None.

  • scheduler_generator (Callable[..., optim.lr_scheduler.LRScheduler] | None) – A callable that generates the learning rate scheduler. If None, no scheduler is used. Defaults to None.

  • scheduler_interval (str | None) – The interval at which the scheduler is stepped, either “step” or “epoch”. Defaults to “step”.

  • scheduler_monitor (str | None) – The logged metric the scheduler monitors, relevant for schedulers such as ReduceLROnPlateau. Defaults to “validation/loss”.

__init__(optimizer_generator: Callable[..., optim.Optimizer], params_transformer: Callable[..., Iterator[tuple[str, Any]]] | None = None, scheduler_generator: Callable[..., optim.lr_scheduler.LRScheduler] | None = None, scheduler_interval: str | None = 'step', scheduler_monitor: str | None = 'validation/loss') → None

Methods

__init__(optimizer_generator[, ...])

Attributes