Functions

Layers without parameters (e.g. activation functions) are also provided as simple functions.
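Because these activations carry no learnable state, each one is just a pure function of its input. A minimal pure-Python sketch of that idea (illustrative only; the real MLX functions operate on mlx.core.array, not Python floats):

```python
import math

def relu(x: float) -> float:
    # relu(x) = max(x, 0): no parameters, so a plain function suffices.
    return max(x, 0.0)

def sigmoid(x: float) -> float:
    # sigmoid(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-2.0), relu(3.0))  # 0.0 3.0
print(sigmoid(0.0))           # 0.5
```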

elu

elu(x, alpha=1.0)

celu

celu(x, alpha=1.0)

gelu

gelu(x) -> mlx.core.array

gelu_approx

gelu_approx(x)

gelu_fast_approx

gelu_fast_approx(x)
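The three GELU entries differ in how they trade accuracy for speed: the exact form uses the Gaussian CDF, while the approximations replace it with cheaper tanh- and sigmoid-based formulas. A pure-Python sketch of these formulas (the tanh and 1.702-sigmoid variants are the standard approximations and are assumed here to match `gelu_approx` and `gelu_fast_approx`):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_approx(x: float) -> float:
    # Common tanh-based approximation.
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def gelu_fast_approx(x: float) -> float:
    # Cheaper sigmoid-based approximation: x * sigmoid(1.702 * x).
    return x / (1.0 + math.exp(-1.702 * x))

for x in (-2.0, 0.0, 1.0):
    print(x, gelu(x), gelu_approx(x), gelu_fast_approx(x))
```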

glu

glu(x, axis=-1)

Applies the gated linear unit function.
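The gated linear unit splits the input in half along the given axis and gates the first half with the sigmoid of the second: out = a * sigmoid(b). A pure-Python sketch over a flat list (illustrative only; mlx.nn.glu splits an array along `axis`):

```python
import math

def glu(x: list[float]) -> list[float]:
    # Split into halves (a, b) and compute a * sigmoid(b).
    n = len(x)
    assert n % 2 == 0, "glu needs an even-sized dimension"
    a, b = x[: n // 2], x[n // 2:]
    return [ai / (1.0 + math.exp(-bi)) for ai, bi in zip(a, b)]

print(glu([1.0, 2.0, 0.0, 0.0]))  # [0.5, 1.0]
```

Note the output has half the size of the input along the split axis.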

hard_shrink

hard_shrink(x, lambd=0.5)

hard_tanh

hard_tanh(x, min_val=-1.0, max_val=1.0)

hardswish

hardswish(x)

leaky_relu

leaky_relu(x, negative_slope=0.01)

log_sigmoid

log_sigmoid(x)

log_softmax

log_softmax(x, axis=-1)
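log_softmax is preferred over log(softmax(x)) because it can be computed in a numerically stable way. A pure-Python sketch of the standard max-subtraction trick (illustrative only; the MLX function reduces along `axis` of an array):

```python
import math

def log_softmax(x: list[float]) -> list[float]:
    # Subtract the max before exponentiating so large inputs
    # do not overflow exp().
    m = max(x)
    log_sum = m + math.log(sum(math.exp(v - m) for v in x))
    return [v - log_sum for v in x]

print(log_softmax([0.0, 0.0]))      # both entries are -log(2)
print(log_softmax([1000.0, 0.0]))   # stable even for large inputs
```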

mish

mish(x: mlx.core.array) -> mlx.core.array

prelu

prelu(x: mlx.core.array, alpha: mlx.core.array) -> mlx.core.array

relu

relu(x)

relu6

relu6(x)

selu

selu(x)

sigmoid

sigmoid(x)

silu

silu(x)
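SiLU (also called swish) is simply the input scaled by its own sigmoid. A pure-Python sketch of the formula:

```python
import math

def silu(x: float) -> float:
    # SiLU / swish: x * sigmoid(x).
    return x / (1.0 + math.exp(-x))

print(silu(0.0))  # 0.0
```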

softmax

softmax(x, axis=-1)

softmin

softmin(x, axis=-1)

softplus

softplus(x)
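Softplus is log(1 + exp(x)), a smooth approximation to ReLU. Evaluating it naively overflows for large x, so implementations typically rewrite it; a pure-Python sketch of the stable form:

```python
import math

def softplus(x: float) -> float:
    # log(1 + exp(x)) rewritten as max(x, 0) + log1p(exp(-|x|))
    # so exp() never receives a large positive argument.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(softplus(0.0))     # log(2), about 0.6931
print(softplus(1000.0))  # about 1000.0, no overflow
```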

softshrink

softshrink(x, lambd=0.5)

step

step(x, threshold=0.0)

tanh

tanh(x)

Applies the hyperbolic tangent function.