Layers
| Layer | Description |
| --- | --- |
| `AvgPool1d` | Applies 1-dimensional average pooling. |
| `AvgPool2d` | Applies 2-dimensional average pooling. |
| `BatchNorm` | Applies Batch Normalization over a 2D or 3D input. |
| `CELU` | Applies the Continuously Differentiable Exponential Linear Unit. |
| `Conv1d` | Applies a 1-dimensional convolution over the multi-channel input sequence. |
| `Conv2d` | Applies a 2-dimensional convolution over the multi-channel input image. |
| `Conv3d` | Applies a 3-dimensional convolution over the multi-channel input image. |
| `ConvTranspose1d` | Applies a 1-dimensional transposed convolution over the multi-channel input sequence. |
| `ConvTranspose2d` | Applies a 2-dimensional transposed convolution over the multi-channel input image. |
| `ConvTranspose3d` | Applies a 3-dimensional transposed convolution over the multi-channel input image. |
| `Dropout` | Randomly zero a portion of the elements during training. |
| `Dropout2d` | Apply 2D channel-wise dropout during training. |
| `Dropout3d` | Apply 3D channel-wise dropout during training. |
| `Embedding` | Implements a simple lookup table that maps each input integer to a high-dimensional vector. |
| `ELU` | Applies the Exponential Linear Unit. |
| `GELU` | Applies the Gaussian Error Linear Units. |
| `GLU` | Applies the gated linear unit function. |
| `GroupNorm` | Applies Group Normalization [1] to the inputs. |
| `GRU` | A gated recurrent unit (GRU) RNN layer. |
| `HardShrink` | Applies the HardShrink function. |
| `HardTanh` | Applies the HardTanh function. |
| `Hardswish` | Applies the hardswish function, element-wise. |
| `InstanceNorm` | Applies instance normalization [1] on the inputs. |
| `LayerNorm` | Applies layer normalization [1] on the inputs. |
| `LeakyReLU` | Applies the Leaky Rectified Linear Unit. |
| `Linear` | Applies an affine transformation to the input. |
| `LogSigmoid` | Applies the Log Sigmoid function. |
| `LogSoftmax` | Applies the Log Softmax function. |
| `LSTM` | An LSTM recurrent layer. |
| `MaxPool1d` | Applies 1-dimensional max pooling. |
| `MaxPool2d` | Applies 2-dimensional max pooling. |
| `Mish` | Applies the Mish function, element-wise. |
| `MultiHeadAttention` | Implements the scaled dot product attention with multiple heads. |
| `PReLU` | Applies the element-wise parametric ReLU. |
| `QuantizedEmbedding` | The same as `Embedding` but with a quantized weight matrix. |
| `QuantizedLinear` | Applies an affine transformation to the input using a quantized weight matrix. |
| `RMSNorm` | Applies Root Mean Square normalization [1] to the inputs. |
| `ReLU` | Applies the Rectified Linear Unit. |
| `ReLU6` | Applies the Rectified Linear Unit 6. |
| `RNN` | An Elman recurrent layer. |
| `RoPE` | Implements the rotary positional encoding. |
| `SELU` | Applies the Scaled Exponential Linear Unit. |
| `Sequential` | A layer that calls the passed callables in order. |
| `Sigmoid` | Applies the sigmoid function, element-wise. |
| `SiLU` | Applies the Sigmoid Linear Unit. |
| `SinusoidalPositionalEncoding` | Implements sinusoidal positional encoding. |
| `Softmin` | Applies the Softmin function. |
| `Softshrink` | Applies the Softshrink function. |
| `Softsign` | Applies the Softsign function. |
| `Softmax` | Applies the Softmax function. |
| `Softplus` | Applies the Softplus function. |
| `Step` | Applies the Step Activation Function. |
| `Tanh` | Applies the hyperbolic tangent function. |
| `Transformer` | Implements a standard Transformer model. |
| `Upsample` | Upsample the input signal spatially. |
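The layers above are composable callables. As a minimal sketch, assuming the table describes the `mlx.nn` module of Apple's MLX (with arrays from `mlx.core`), a small multilayer perceptron can be built from `Linear`, `ReLU`, `Dropout`, and `Sequential`:

```python
# Sketch assuming the mlx.nn API; layer names correspond to the table above.
import mlx.core as mx
import mlx.nn as nn

# A two-layer perceptron: affine -> ReLU -> dropout -> affine.
mlp = nn.Sequential(
    nn.Linear(784, 256),  # affine transformation from 784 to 256 features
    nn.ReLU(),            # rectified linear unit
    nn.Dropout(0.2),      # randomly zero 20% of elements during training
    nn.Linear(256, 10),   # project to 10 output classes
)

x = mx.random.normal((32, 784))  # a batch of 32 input vectors
logits = mlp(x)                  # shape (32, 10)
```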
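A convolutional stack can be sketched the same way with `Conv2d` and `MaxPool2d`. This assumes the channels-last (N, H, W, C) layout that MLX convolutions use:

```python
# Sketch assuming mlx.nn Conv2d/MaxPool2d with channels-last (N, H, W, C) inputs.
import mlx.core as mx
import mlx.nn as nn

features = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 3 -> 16 channels, same spatial size
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),                  # halve the spatial resolution
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 16 -> 32 channels
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),
)

images = mx.random.normal((8, 32, 32, 3))  # a batch of 8 RGB images, 32x32 pixels
out = features(images)                     # shape (8, 8, 8, 32)
```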
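`MultiHeadAttention` follows the usual query/key/value calling convention. A self-attention sketch, again assuming the `mlx.nn` signatures and pairing it with `RMSNorm` in a pre-norm residual block:

```python
# Sketch assuming nn.MultiHeadAttention(dims, num_heads) is called as attn(q, k, v).
import mlx.core as mx
import mlx.nn as nn

dims, num_heads = 64, 4
attention = nn.MultiHeadAttention(dims, num_heads)
norm = nn.RMSNorm(dims)              # pre-normalization, as in many Transformer blocks

x = mx.random.normal((2, 10, dims))  # (batch, sequence length, model dims)
h = norm(x)
y = x + attention(h, h, h)           # residual connection around self-attention
```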
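For sequence models, `Embedding` maps integer token ids to vectors that the recurrent layers (`RNN`, `GRU`, `LSTM`) can consume. A hedged sketch, assuming `Embedding(num_embeddings, dims)` and `GRU(input_size, hidden_size)`:

```python
# Sketch assuming mlx.nn Embedding and GRU; token ids have shape (batch, length).
import mlx.core as mx
import mlx.nn as nn

embed = nn.Embedding(1000, 32)  # vocabulary of 1000 tokens, 32-dimensional vectors
rnn = nn.GRU(32, 64)            # 32-dimensional inputs, 64-dimensional hidden state

tokens = mx.array([[1, 5, 42, 7]])  # shape (1, 4)
hidden = rnn(embed(tokens))         # per-step hidden states, shape (1, 4, 64)
```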