Functions

Layers without parameters (e.g. activation functions) are also provided as simple functions.
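
The functional forms can be called directly on arrays, without constructing a layer object first. The sketch below is a minimal usage example; it assumes this listing corresponds to the mlx.nn functional API with arrays created via mlx.core, which is an inference from the function names rather than something stated on this page.

```python
# Minimal sketch of calling parameter-free activations as plain functions.
# Assumes the functions listed here live in mlx.nn and arrays come from
# mlx.core (an assumption; adjust the imports to your actual module paths).
import mlx.core as mx
import mlx.nn as nn

x = mx.array([-2.0, -0.5, 0.0, 1.5])

y_relu = nn.relu(x)                              # max(x, 0), element-wise
y_gelu = nn.gelu(x)                              # Gaussian Error Linear Unit
y_leaky = nn.leaky_relu(x, negative_slope=0.01)  # small slope for x < 0

print(y_relu, y_gelu, y_leaky)
```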

elu(x[, alpha])

Applies the Exponential Linear Unit.

celu(x[, alpha])

Applies the Continuously Differentiable Exponential Linear Unit.

gelu(x)

Applies the Gaussian Error Linear Unit (GELU) function.

gelu_approx(x)

An approximation to the Gaussian Error Linear Unit.

gelu_fast_approx(x)

A fast approximation to the Gaussian Error Linear Unit.

glu(x[, axis])

Applies the gated linear unit function.

hard_shrink(x[, lambd])

Applies the HardShrink activation function.

hard_tanh(x[, min_val, max_val])

Applies the HardTanh function.

hardswish(x)

Applies the hardswish function, element-wise.

leaky_relu(x[, negative_slope])

Applies the Leaky Rectified Linear Unit.

log_sigmoid(x)

Applies the Log Sigmoid function.

log_softmax(x[, axis])

Applies the Log Softmax function.

mish(x)

Applies the Mish function, element-wise.

prelu(x, alpha)

Applies the element-wise parametric ReLU.

relu(x)

Applies the Rectified Linear Unit.

relu2(x)

Applies the ReLU² activation function.

relu6(x)

Applies the Rectified Linear Unit 6.

selu(x)

Applies the Scaled Exponential Linear Unit.

sigmoid(x)

Applies the sigmoid function.

silu(x)

Applies the Sigmoid Linear Unit.

softmax(x[, axis])

Applies the Softmax function.

softmin(x[, axis])

Applies the Softmin function.

softplus(x)

Applies the Softplus function.

softshrink(x[, lambd])

Applies the Softshrink activation function.

step(x[, threshold])

Applies the step activation function.

tanh(x)

Applies the hyperbolic tangent function.
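
Several of the functions above (glu, log_softmax, softmax, softmin) take an optional axis argument selecting the dimension they operate over. The short sketch below illustrates this; as before, the mlx.core / mlx.nn module paths are assumptions, not something this page states.

```python
# Sketch of the optional `axis` argument on softmax-style functions and glu.
# Module paths are assumed (mlx.core / mlx.nn).
import mlx.core as mx
import mlx.nn as nn

logits = mx.array([[1.0, 2.0, 3.0],
                   [0.5, 0.5, 0.5]])

probs = nn.softmax(logits, axis=-1)          # each row sums to 1
log_probs = nn.log_softmax(logits, axis=-1)  # log of the softmax over the last axis

# glu splits the input in half along `axis` and gates one half with the
# sigmoid of the other, so that dimension must have an even size.
gated = nn.glu(mx.array([[1.0, 2.0, 3.0, 4.0]]), axis=-1)  # result has shape (1, 2)
```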