Functions

Layers without parameters (e.g. activation functions) are also provided as simple functions.
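For example, a parameter-free activation can be called directly on an array, with no layer object required. The sketch below assumes this reference describes the mlx.nn module and uses the standard mlx.core import; the module-class form (nn.ReLU) is shown only for comparison.

    import mlx.core as mx
    import mlx.nn as nn

    x = mx.array([-2.0, -0.5, 0.0, 1.5])

    # Functional form: stateless, applied directly to the array.
    y = nn.relu(x)

    # Equivalent module form, useful when composing layers in a model.
    y_module = nn.ReLU()(x)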

elu(x[, alpha])

Applies the Exponential Linear Unit.

gelu(x)

Applies the Gaussian Error Linear Units function.

gelu_approx(x)

An approximation to the Gaussian Error Linear Unit.

gelu_fast_approx(x)

A fast approximation to the Gaussian Error Linear Unit.

glu(x[, axis])

Applies the gated linear unit function.

hardswish(x)

Applies the hardswish function, element-wise.

leaky_relu(x[, negative_slope])

Applies the Leaky Rectified Linear Unit.

log_sigmoid(x)

Applies the Log Sigmoid function.

log_softmax(x[, axis])

Applies the Log Softmax function.

mish(x)

Applies the Mish function, element-wise.

prelu(x, alpha)

Applies the element-wise parametric ReLU.

relu(x)

Applies the Rectified Linear Unit.

relu6(x)

Applies the Rectified Linear Unit 6.

selu(x)

Applies the Scaled Exponential Linear Unit.

sigmoid(x)

Applies the sigmoid function.

silu(x)

Applies the Sigmoid Linear Unit.

softmax(x[, axis])

Applies the Softmax function.

softplus(x)

Applies the Softplus function.

softshrink(x[, lambd])

Applies the Softshrink activation function.

step(x[, threshold])

Applies the step activation function.

tanh(x)

Applies the hyperbolic tangent function.