mlx.nn.losses.smooth_l1_loss#

smooth_l1_loss(predictions: array, targets: array, beta: float = 1.0, reduction: Literal['none', 'mean', 'sum'] = 'mean')#

Computes the smooth L1 loss.

The smooth L1 loss is a variant of the L1 loss which replaces the absolute difference with a squared difference when the absolute difference is less than beta.

The formula for the smooth L1 Loss is:

\[\begin{split}l = \begin{cases} 0.5 (x - y)^2 / \beta, & \text{if } |x - y| < \beta \\ |x - y| - 0.5 \beta, & \text{otherwise} \end{cases}\end{split}\]
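
As an illustration only, the piecewise rule above can be written element-wise with mx.where. This is a minimal sketch of the formula, not the library's implementation; the helper name smooth_l1_formula is hypothetical:

import mlx.core as mx

def smooth_l1_formula(predictions, targets, beta=1.0):
    # Element-wise absolute error between predictions and targets.
    diff = mx.abs(predictions - targets)
    # Quadratic branch for small errors, linear branch otherwise.
    return mx.where(diff < beta, 0.5 * mx.square(diff) / beta, diff - 0.5 * beta)
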
Parameters:
  • predictions (array) – Predicted values.

  • targets (array) – Ground truth values.

  • beta (float, optional) – The threshold at which the loss changes from the squared difference to the absolute difference. Default: 1.0.

  • reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. Default: 'mean'.

Returns:

The computed smooth L1 loss.

Return type:

array
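
A minimal usage sketch, assuming mlx is installed and imported as below; the input arrays are illustrative:

import mlx.core as mx
import mlx.nn as nn

predictions = mx.array([1.5, 0.2, -0.7, 3.0])
targets = mx.array([1.0, 0.0, -1.0, 0.0])

# Default reduction ('mean') averages the element-wise losses into a scalar.
loss_mean = nn.losses.smooth_l1_loss(predictions, targets)

# 'none' returns the per-element losses; 'sum' adds them into a scalar.
loss_none = nn.losses.smooth_l1_loss(predictions, targets, beta=1.0, reduction="none")
loss_sum = nn.losses.smooth_l1_loss(predictions, targets, reduction="sum")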