nn#
Modules#
- A sequence of modules applied in order.
- A layer in which the pytree leaves are not considered as differentiable parameters in Quantax computations.
- Creates a function that computes the gradient of a given function, like equinox.filter_grad.
- Like equinox.filter_vjp.
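The filtering behaviour described above can be sketched in plain JAX: partition the parameters into differentiable and frozen leaves, differentiate only the former, and treat the rest as constants. This is an illustrative sketch of the idea, not Quantax's or Equinox's actual implementation; the function and leaf names are made up.

```python
import jax
import jax.numpy as jnp

def filtered_grad(fun, params, frozen):
    """Differentiate `fun` w.r.t. `params` only, treating `frozen` as constants.

    Mimics the idea behind filtered gradients: leaves marked as
    non-differentiable never receive gradients.
    """
    def wrapped(p):
        return fun(p, frozen)
    return jax.grad(wrapped)(params)

# Toy model: y = w * x + b, where x and b are frozen (non-differentiable).
def loss(p, frozen):
    return jnp.sum((p["w"] * frozen["x"] + frozen["b"]) ** 2)

grads = filtered_grad(
    loss,
    {"w": jnp.array(2.0)},
    {"x": jnp.array(3.0), "b": jnp.array(1.0)},
)
print(grads)  # a gradient is returned only for "w"
```

The frozen leaves simply never enter the differentiated function's arguments, so JAX never traces gradients through them.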
Activation functions#
- Rescale the input: \(f(x) = x \cdot \mathrm{scale}\).
- Apply a function to a rescaled input: \(f(x) = \mathrm{fn}(x \cdot \mathrm{scale})\).
- An activation layer with output \(f(x) = g(x) \exp(\theta_0)\), where \(\theta_0\) is a trainable parameter.
- \(f(x) = (\sinh(x) + 1) \exp(\theta_0)\)
- \(f(x) = \exp(\theta_0) \prod_i x_i\)
- \(f(x) = \exp(x + \theta_0)\)
- Make a real input complex by splitting it into two halves, one taken as the real part and the other as the imaginary part.
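A minimal numerical sketch of these activations, assuming \(\theta_0\) is a scalar parameter initialized to zero (the function names below are illustrative, not Quantax's API):

```python
import numpy as np

theta0 = 0.0  # scalar parameter; exp(theta0) rescales the output

def sinh_shift(x):
    # f(x) = (sinh(x) + 1) * exp(theta0)
    return (np.sinh(x) + 1.0) * np.exp(theta0)

def prod_layer(x):
    # f(x) = exp(theta0) * prod_i x_i
    return np.exp(theta0) * np.prod(x)

def exp_shift(x):
    # f(x) = exp(x + theta0)
    return np.exp(x + theta0)

def pair_complex(x):
    # Split a real vector in half: first half -> real part, second -> imaginary part.
    re, im = np.split(x, 2)
    return re + 1j * im

x = np.array([1.0, 2.0, 3.0, 4.0])
print(prod_layer(x))    # 24.0 when theta0 = 0
print(pair_complex(x))  # [1.+3.j 2.+4.j]
```

With \(\theta_0 = 0\) the \(\exp(\theta_0)\) factor is 1, so it only matters once \(\theta_0\) is trained; it lets the network learn an overall scale of the wavefunction amplitude.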
Initializers#
- Apply the LeCun normal initializer to the weights of the layer.
- Apply the He normal initializer to the weights of the layer.
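The two initializers differ only in the variance scale: LeCun normal draws weights with \(\mathrm{Var}(W) = 1/\mathrm{fan\_in}\), He normal with \(\mathrm{Var}(W) = 2/\mathrm{fan\_in}\). A NumPy sketch of the sampling rule (illustrative, not Quantax's implementation):

```python
import numpy as np

def lecun_normal(rng, fan_in, fan_out):
    # LeCun normal: std = sqrt(1 / fan_in); common for selu/tanh-like units.
    return rng.normal(0.0, np.sqrt(1.0 / fan_in), size=(fan_out, fan_in))

def he_normal(rng, fan_in, fan_out):
    # He normal: std = sqrt(2 / fan_in); common for ReLU-like units.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))

rng = np.random.default_rng(0)
W = he_normal(rng, fan_in=1024, fan_out=256)
print(W.std())  # close to sqrt(2/1024) ≈ 0.0442
```

The extra factor of 2 in He initialization compensates for ReLU zeroing roughly half of the pre-activations, keeping the output variance stable across layers.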
NQS layers#
- Reshape the input to the shape suitable for convolutional layers.
- Symmetrize the output of a convolutional network according to the given symmetry.
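Symmetrizing over a group amounts to averaging the network output over all symmetry-transformed inputs. For translation symmetry on a 1D chain this can be sketched as follows (an illustrative toy, assuming the network maps a configuration to a scalar; not Quantax's implementation):

```python
import numpy as np

def symmetrize(net, config):
    """Average a network's output over all translations of a 1D configuration.

    This enforces translation invariance: shifting `config` leaves the
    symmetrized output unchanged.
    """
    n = len(config)
    return np.mean([net(np.roll(config, shift)) for shift in range(n)])

# Toy "network": a fixed linear functional of the spin configuration.
w = np.array([0.5, -1.0, 2.0, 0.25])
net = lambda s: float(np.dot(w, s))

s = np.array([1.0, -1.0, 1.0, -1.0])
out = symmetrize(net, s)
# Invariance check: every translation of s gives the same symmetrized value.
print(np.isclose(out, symmetrize(net, np.roll(s, 1))))  # True
```

In practice one can also average with group characters instead of plain means to project onto a chosen symmetry sector, but the averaging structure is the same.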