Functions
Layers without parameters (e.g. activation functions) are also provided as simple functions.
- Applies the Exponential Linear Unit.
- Applies the Continuously Differentiable Exponential Linear Unit.
- Applies the Gaussian Error Linear Units function.
- An approximation to Gaussian Error Linear Unit.
- A fast approximation to Gaussian Error Linear Unit.
- Applies the gated linear unit function.
- Applies the HardShrink activation function.
- Applies the HardTanh function.
- Applies the hardswish function, element-wise.
- Applies the Leaky Rectified Linear Unit.
- Applies the Log Sigmoid function.
- Applies the Log Softmax function.
- Applies the Mish function, element-wise.
- Applies the element-wise parametric ReLU.
- Applies the Rectified Linear Unit.
- Applies the Rectified Linear Unit 6.
- Applies the Scaled Exponential Linear Unit.
- Applies the sigmoid function.
- Applies the Sigmoid Linear Unit.
- Applies the Softmax function.
- Applies the Softmin function.
- Applies the Softplus function.
- Applies the Softshrink activation function.
- Applies the Step Activation Function.
- Applies the hyperbolic tangent function.
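For reference, the standard mathematical definitions behind a few of the functions listed above can be sketched in plain NumPy. This is an illustrative sketch only, not this library's implementation; the function and parameter names here (e.g. `negative_slope`, `alpha`) are assumptions for the example, not the library's actual identifiers.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: max(x, 0)
    return np.maximum(x, 0.0)

def leaky_relu(x, negative_slope=0.01):
    # Leaky ReLU: x for x > 0, otherwise a small negative slope
    return np.where(x > 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: x for x > 0, otherwise alpha * (exp(x) - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def silu(x):
    # Sigmoid Linear Unit (swish): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def softplus(x):
    # Softplus: log(1 + exp(x)), a smooth approximation to ReLU
    return np.log1p(np.exp(x))

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating
    z = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))            # [0. 0. 2.]
print(softmax(x).sum())   # sums to 1, up to floating point
```

Because these functions carry no learnable parameters, the functional forms above behave identically to their corresponding layer classes; the layer wrappers mainly exist for convenience when composing models.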