mlx.optimizers.step_decay
- step_decay(init: float, decay_rate: float, step_size: int) → Callable
Make a step decay scheduler.
- Parameters:
  - init (float) – Initial value.
  - decay_rate (float) – Multiplicative factor to decay by.
  - step_size (int) – Decay every step_size steps.
Example
```python
>>> lr_schedule = optim.step_decay(1e-1, 0.9, 10)
>>> optimizer = optim.SGD(learning_rate=lr_schedule)
>>> optimizer.learning_rate
array(0.1, dtype=float32)
>>>
>>> for _ in range(21): optimizer.update({}, {})
...
>>> optimizer.learning_rate
array(0.081, dtype=float32)
```
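The decayed rate in the example follows the standard step decay rule, `init * decay_rate ** (step // step_size)`. The sketch below is a minimal standalone version of that rule, inferred from the example output; it is not MLX's internal implementation.

```python
# Sketch of step decay (assumed form, not MLX's internal code):
# the rate is multiplied by decay_rate once every step_size steps.
def step_decay_sketch(init: float, decay_rate: float, step_size: int):
    def schedule(step: int) -> float:
        return init * decay_rate ** (step // step_size)
    return schedule

lr = step_decay_sketch(1e-1, 0.9, 10)
lr(0)   # 0.1
lr(21)  # 0.1 * 0.9**2 ≈ 0.081, matching the example above
```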