lasagne.layers

Helper functions

get_output: Computes the output of the network at one or more given layers.
get_output_shape: Computes the output shape of the network at one or more given layers.
get_all_layers: Gathers all layers below one or more given Layer instances, including the given layer(s).
get_all_params: Returns a list of Theano shared variables or expressions that parameterize the layer.
count_params: Counts all parameters (i.e., the number of scalar values) of all layers below one or more given Layer instances.
get_all_param_values: Returns the values of the parameters of all layers below one or more given Layer instances.
set_all_param_values: Given a list of numpy arrays, sets the parameters of all layers below one or more given Layer instances.
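
To see how these helpers fit together, here is a minimal sketch on a small multilayer perceptron; all shapes and unit counts are arbitrary illustration values:

    import lasagne

    # A small MLP; the sizes here are arbitrary.
    l_in = lasagne.layers.InputLayer(shape=(None, 784))
    l_hid = lasagne.layers.DenseLayer(l_in, num_units=100)
    l_out = lasagne.layers.DenseLayer(
        l_hid, num_units=10, nonlinearity=lasagne.nonlinearities.softmax)

    # Symbolic Theano expression for the network output.
    prediction = lasagne.layers.get_output(l_out)

    # Trainable parameters of all layers below (and including) l_out.
    params = lasagne.layers.get_all_params(l_out, trainable=True)
    print(lasagne.layers.count_params(l_out))  # total number of scalar values

    # Snapshot and restore all weights as numpy arrays.
    values = lasagne.layers.get_all_param_values(l_out)
    lasagne.layers.set_all_param_values(l_out, values)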

Layer base classes

Layer: The Layer class represents a single layer of a neural network; it should be subclassed when implementing new types of layers.
MergeLayer: This class represents a layer that aggregates input from multiple layers.

Network input

InputLayer: This layer holds a symbolic variable that represents a network input.

Dense layers

DenseLayer: A fully connected layer.
NINLayer: Network-in-network layer.

Convolutional layers

Conv1DLayer: 1D convolutional layer.
Conv2DLayer: 2D convolutional layer.
Conv3DLayer: 3D convolutional layer.
TransposedConv2DLayer: 2D transposed convolution layer.
Deconv2DLayer: alias of TransposedConv2DLayer.
DilatedConv2DLayer: 2D dilated convolution layer.

Local layers

LocallyConnected2DLayer: 2D locally connected layer.
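
A short sketch of how the 2D convolution variants compose on a 4D (batch, channels, rows, columns) input; the filter counts and sizes below are arbitrary:

    from lasagne.layers import (InputLayer, Conv2DLayer, TransposedConv2DLayer,
                                DilatedConv2DLayer, get_output_shape)

    l_in = InputLayer((None, 3, 64, 64))  # (batch, channels, rows, columns)
    l_conv = Conv2DLayer(l_in, num_filters=32, filter_size=(3, 3), pad='same')
    l_down = Conv2DLayer(l_conv, num_filters=32, filter_size=(3, 3), stride=2)
    # Transposed convolution ("deconvolution") roughly inverts the striding.
    l_up = TransposedConv2DLayer(l_down, num_filters=32, filter_size=(3, 3),
                                 stride=2)
    # Dilated convolution enlarges the receptive field without striding.
    l_dil = DilatedConv2DLayer(l_conv, num_filters=32, filter_size=(3, 3),
                               dilation=(2, 2))
    print(get_output_shape(l_up))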

Pooling layers

MaxPool1DLayer: 1D max-pooling layer.
MaxPool2DLayer: 2D max-pooling layer.
MaxPool3DLayer: 3D max-pooling layer.
Pool1DLayer: 1D pooling layer.
Pool2DLayer: 2D pooling layer.
Pool3DLayer: 3D pooling layer.
Upscale1DLayer: 1D upscaling layer.
Upscale2DLayer: 2D upscaling layer.
Upscale3DLayer: 3D upscaling layer.
GlobalPoolLayer: Global pooling layer.
FeaturePoolLayer: Feature pooling layer.
FeatureWTALayer: 'Winner Take All' layer.
SpatialPyramidPoolingLayer: Spatial Pyramid Pooling layer.
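
For example, a typical convolution-plus-pooling block, with global pooling to collapse the remaining spatial dimensions (all sizes arbitrary):

    import theano.tensor as T
    from lasagne.layers import (InputLayer, Conv2DLayer, MaxPool2DLayer,
                                GlobalPoolLayer, get_output_shape)

    l_in = InputLayer((None, 16, 32, 32))
    l_conv = Conv2DLayer(l_in, num_filters=32, filter_size=(3, 3), pad='same')
    l_pool = MaxPool2DLayer(l_conv, pool_size=(2, 2))      # (None, 32, 16, 16)
    l_gap = GlobalPoolLayer(l_pool, pool_function=T.mean)  # (None, 32)
    print(get_output_shape(l_gap))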

Recurrent layers

CustomRecurrentLayer: A layer which implements a recurrent connection.
RecurrentLayer: Dense recurrent neural network (RNN) layer.
LSTMLayer: A long short-term memory (LSTM) layer.
GRULayer: Gated recurrent unit (GRU) layer.
Gate: Simple class to hold the parameters for a gate connection.
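
A minimal recurrent sketch, assuming inputs of shape (batch size, sequence length, number of features); the unit counts and the forget-gate bias value are arbitrary illustration choices:

    import lasagne.init
    from lasagne.layers import InputLayer, LSTMLayer, GRULayer, Gate

    l_in = InputLayer((None, 20, 50))  # (batch, sequence length, features)
    l_lstm = LSTMLayer(l_in, num_units=128, grad_clipping=100.)
    # A Gate bundles the W_in, W_hid and b parameters of one gate connection;
    # here it sets the forget-gate bias to 1, a common but optional choice.
    l_lstm2 = LSTMLayer(l_in, num_units=128,
                        forgetgate=Gate(b=lasagne.init.Constant(1.)))
    # only_return_final=True keeps only the last time step's output.
    l_gru = GRULayer(l_in, num_units=128, only_return_final=True)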

Noise layers

DropoutLayer: Dropout layer.
dropout: alias of DropoutLayer.
dropout_channels: Convenience function to drop full channels of feature maps.
spatial_dropout: Convenience function to drop full channels of feature maps.
dropout_locations: Convenience function to drop full locations of feature maps.
GaussianNoiseLayer: Gaussian noise layer.
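
The noise layers are only active during training; passing deterministic=True to get_output disables them, as sketched here:

    from lasagne.layers import InputLayer, DenseLayer, DropoutLayer, get_output

    l_in = InputLayer((None, 100))
    l_drop = DropoutLayer(l_in, p=0.5)
    l_out = DenseLayer(l_drop, num_units=10)

    train_expr = get_output(l_out)                     # dropout active
    test_expr = get_output(l_out, deterministic=True)  # dropout disabled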

Shape layers

ReshapeLayer: A layer reshaping its input tensor to another tensor of the same total number of elements.
reshape: alias of ReshapeLayer.
FlattenLayer: A layer that flattens its input.
flatten: alias of FlattenLayer.
DimshuffleLayer: A layer that rearranges the dimensions of its input tensor, maintaining the same total number of elements.
dimshuffle: alias of DimshuffleLayer.
PadLayer: Pad all dimensions except the first batch_ndim with a constant value.
pad: alias of PadLayer.
SliceLayer: Slices the input at a specific axis and at specific indices.
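
A few of the shape layers in action on an arbitrary (None, 16, 8, 8) input:

    from lasagne.layers import (InputLayer, FlattenLayer, ReshapeLayer,
                                DimshuffleLayer, SliceLayer, get_output_shape)

    l_in = InputLayer((None, 16, 8, 8))
    l_flat = FlattenLayer(l_in)                    # (None, 1024)
    l_resh = ReshapeLayer(l_in, ([0], 16, -1))     # [0] copies input dim 0
    l_swap = DimshuffleLayer(l_in, (0, 2, 3, 1))   # move the channel axis last
    l_first = SliceLayer(l_in, indices=0, axis=1)  # first channel; axis dropped
    print(get_output_shape(l_resh))                # (None, 16, 64)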

Merge layers

ConcatLayer: Concatenates multiple inputs along the specified axis.
concat: alias of ConcatLayer.
ElemwiseMergeLayer: This layer performs an elementwise merge of its input layers.
ElemwiseSumLayer: This layer performs an elementwise sum of its input layers.
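
A small sketch merging two parallel branches, once by concatenation and once by elementwise summation:

    from lasagne.layers import (InputLayer, DenseLayer, ConcatLayer,
                                ElemwiseSumLayer)

    l_in = InputLayer((None, 50))
    l_a = DenseLayer(l_in, num_units=30)
    l_b = DenseLayer(l_in, num_units=30)
    l_cat = ConcatLayer([l_a, l_b], axis=1)  # (None, 60)
    l_sum = ElemwiseSumLayer([l_a, l_b])     # (None, 30)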

Normalization layers

LocalResponseNormalization2DLayer: Cross-channel local response normalization for 2D feature maps.
BatchNormLayer: Batch normalization.
batch_norm: Apply batch normalization to an existing layer.
StandardizationLayer: Standardize inputs to zero mean and unit variance.
instance_norm: Apply instance normalization to an existing layer.
layer_norm: Apply layer normalization to an existing layer.
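
The batch_norm convenience function wraps an existing layer rather than being stacked like a layer itself, as in this sketch:

    from lasagne.layers import InputLayer, DenseLayer, batch_norm
    from lasagne.nonlinearities import rectify

    l_in = InputLayer((None, 100))
    # batch_norm inserts a BatchNormLayer between the dense layer and its
    # nonlinearity and removes the layer's now-redundant bias.
    l_hid = batch_norm(DenseLayer(l_in, num_units=200, nonlinearity=rectify))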

Embedding layers

EmbeddingLayer: A layer for word embeddings.

Special-purpose layers

NonlinearityLayer: A layer that just applies a nonlinearity.
BiasLayer: A layer that just adds a (trainable) bias term.
ScaleLayer: A layer that scales its inputs by learned coefficients.
standardize: Convenience function for standardizing inputs by applying a fixed offset and scale.
ExpressionLayer: This layer provides boilerplate for a custom layer that applies a simple transformation to the input.
InverseLayer: The InverseLayer class performs inverse operations for a single layer of a neural network by applying the partial derivative of the layer to be inverted with respect to its input.
TransformerLayer: Spatial transformer layer.
TPSTransformerLayer: Spatial transformer layer (thin plate spline variant).
ParametricRectifierLayer: A layer that applies a parametric rectify nonlinearity to its input following [R33].
prelu: Convenience function to apply parametric rectify to a given layer's output.
RandomizedRectifierLayer: A layer that applies a randomized leaky rectify nonlinearity to its input.
rrelu: Convenience function to apply randomized rectify to a given layer's output.
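
For illustration, ExpressionLayer with an arbitrary elementwise function, followed by prelu applied to a dense layer:

    from lasagne.layers import InputLayer, DenseLayer, ExpressionLayer, prelu

    l_in = InputLayer((None, 100))
    # ExpressionLayer applies a custom (here elementwise) Theano expression.
    l_sq = ExpressionLayer(l_in, lambda X: X ** 2)
    # prelu swaps the layer's nonlinearity for a trainable leaky rectifier.
    l_hid = prelu(DenseLayer(l_sq, num_units=50, nonlinearity=None))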

lasagne.layers.corrmm

Conv2DMMLayer: 2D convolutional layer (GEMM-based GPU implementation).

lasagne.layers.cuda_convnet

Conv2DCCLayer: 2D convolutional layer (cuda-convnet implementation).
MaxPool2DCCLayer: 2D max-pooling layer (cuda-convnet implementation).
ShuffleBC01ToC01BLayer: Shuffles a 4D tensor from bc01 to c01b axis order.
bc01_to_c01b: alias of ShuffleBC01ToC01BLayer.
ShuffleC01BToBC01Layer: Shuffles a 4D tensor from c01b to bc01 axis order.
c01b_to_bc01: alias of ShuffleC01BToBC01Layer.
NINLayer_c01b: Network-in-network layer with c01b axis ordering.

lasagne.layers.dnn

Conv2DDNNLayer: 2D convolutional layer.
Conv3DDNNLayer: 3D convolutional layer.
MaxPool2DDNNLayer: 2D max-pooling layer.
Pool2DDNNLayer: 2D pooling layer.
MaxPool3DDNNLayer: 3D max-pooling layer.
Pool3DDNNLayer: 3D pooling layer.
SpatialPyramidPoolingDNNLayer: Spatial Pyramid Pooling layer.
BatchNormDNNLayer: Batch normalization.
batch_norm_dnn: Apply cuDNN batch normalization to an existing layer.
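
These cuDNN layers mirror the corresponding layers above and require Theano with a CUDA backend and cuDNN available; a minimal sketch:

    # Requires Theano compiled with CUDA support and cuDNN installed.
    from lasagne.layers import InputLayer
    from lasagne.layers.dnn import Conv2DDNNLayer, MaxPool2DDNNLayer

    l_in = InputLayer((None, 3, 32, 32))
    # Same interface as Conv2DLayer/MaxPool2DLayer, but always backed by cuDNN.
    l_conv = Conv2DDNNLayer(l_in, num_filters=32, filter_size=(3, 3),
                            pad='same')
    l_pool = MaxPool2DDNNLayer(l_conv, pool_size=(2, 2))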