nn
Modules
A sequence of modules applied in order.
The layer that takes not only the output of the previous layer, but also the raw input basis state.
The model that allows an accelerated forward pass through local updates and internal quantities.
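As a rough illustration of the raw-input-layer idea above, here is a minimal NumPy sketch: a dense layer that concatenates the previous layer's output with the raw basis state before its own affine transform. The class name `RawInputDense` and its signature are hypothetical, not the library's API.

```python
import numpy as np

# Hypothetical sketch of a "raw input" layer: besides the previous layer's
# output, it also receives the raw input basis state (e.g. a spin
# configuration) and concatenates the two before a dense transform.
class RawInputDense:
    def __init__(self, in_features, raw_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        fan_in = in_features + raw_features
        self.w = rng.normal(0.0, 1.0 / np.sqrt(fan_in), (fan_in, out_features))
        self.b = np.zeros(out_features)

    def __call__(self, x, raw):
        # x: output of the previous layer; raw: the raw input basis state
        h = np.concatenate([x, raw])
        return h @ self.w + self.b
```

Feeding the raw state into every layer in this way keeps direct access to the physical configuration even deep in the network.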
Activation functions
\(f(x) = \sinh(x) + 1\).
\(f(x) = \prod_i x_i\).
\(f(x) = \exp(x)\).
\(f(x) = \exp(x)\).
\(f(x) = x_1 + i x_2\), where \(x = (x_1, x_2)\).
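The activation functions listed above can be written as plain NumPy helpers. This is an illustrative sketch (the function names and the convention of splitting the input in half for the complex pairing are assumptions, and the library versions act on JAX arrays):

```python
import numpy as np

def sinh_shift(x):
    # f(x) = sinh(x) + 1
    return np.sinh(x) + 1.0

def prod(x):
    # f(x) = prod_i x_i  -- reduces the input to a single product
    return np.prod(x)

def exp(x):
    # f(x) = exp(x)
    return np.exp(x)

def pair_cpl(x):
    # f(x) = x_1 + i x_2, combining the two halves of the input into one
    # complex output; here x = (x_1, x_2) is split down the middle
    # (an assumed convention).
    x1, x2 = np.split(x, 2)
    return x1 + 1j * x2
```

Note that `prod` and `pair_cpl` change the output shape, unlike ordinary elementwise activations.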
Initializers
Apply the LeCun normal initializer.
Apply the He normal initializer.
Apply the Glorot normal initializer.
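The three initializers differ only in the variance of the zero-mean normal distribution they sample from. A minimal NumPy sketch of the standard conventions (LeCun: \(1/\text{fan\_in}\), He: \(2/\text{fan\_in}\), Glorot: \(2/(\text{fan\_in}+\text{fan\_out})\)); the function name is illustrative:

```python
import numpy as np

def normal_init(shape, mode, rng):
    # shape = (fan_in, fan_out) for a dense weight matrix
    fan_in, fan_out = shape
    if mode == "lecun":
        std = np.sqrt(1.0 / fan_in)
    elif mode == "he":
        std = np.sqrt(2.0 / fan_in)
    elif mode == "glorot":
        std = np.sqrt(2.0 / (fan_in + fan_out))
    else:
        raise ValueError(mode)
    return rng.normal(0.0, std, shape)
```

He scaling is the usual choice for ReLU-like activations, while Glorot balances forward and backward signal variance.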
Sign structures
Compute the sign, phase, or cosine value based on the provided kernel and spin configuration.
Marshall sign rule for bipartite lattices.
Stripe sign rule for bipartite lattices.
120° Néel phase for triangular lattices.
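As a concrete example of a sign structure, here is a NumPy sketch of the Marshall sign rule on a bipartite chain: the sign of a configuration is \((-1)^{N_A}\), where \(N_A\) counts the down spins on sublattice A (taken here as the even sites — an assumption about the sublattice convention). The stripe and 120° Néel rules follow the same pattern with a different sublattice/phase assignment.

```python
import numpy as np

def marshall_sign(spins):
    # spins: array of +1 / -1; even sites form sublattice A (assumed).
    sublattice_a = np.arange(len(spins)) % 2 == 0
    n_down_a = np.sum(spins[sublattice_a] == -1)
    return (-1) ** int(n_down_a)
```

Multiplying a positive-amplitude ansatz by this sign fixes the known sign structure of antiferromagnetic ground states on bipartite lattices, so the network only has to learn the amplitudes.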
Conv layers
Reshape the input to the shape suitable for convolutional layers.
Symmetrize the output of a convolutional network according to the given symmetry.
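A minimal sketch of these two helpers, assuming a periodic square lattice and translation symmetry (function names and the scalar-output convention are illustrative): reshape a flat spin configuration into the `(height, width, channels)` layout a convolution expects, and symmetrize a scalar network output by averaging it over all lattice translations of the input.

```python
import numpy as np

def reshape_for_conv(spins, side):
    # (side * side,) -> (side, side, 1): add spatial layout and a channel axis
    return spins.reshape(side, side, 1)

def symmetrized_output(f, spins2d):
    # Average the scalar output f over every periodic translation of the
    # input, projecting onto the translation-invariant sector.
    side = spins2d.shape[0]
    vals = [f(np.roll(spins2d, (dx, dy), axis=(0, 1)))
            for dx in range(side) for dy in range(side)]
    return np.mean(vals)
```

Averaging over the group makes the symmetrized output exactly invariant under any translation of the input, even when `f` itself is not.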
Fermions
Get the indices of occupied fermion sites.
Get the indices of the hopping fermions.
Get the sign change due to fermion hopping.
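These three helpers can be sketched in NumPy, assuming occupations are encoded as a 0/1 array over sites in a fixed ordering (the function names are illustrative). The hopping sign is the Jordan-Wigner parity: \((-1)^k\), where \(k\) is the number of occupied sites strictly between the two sites involved in the hop.

```python
import numpy as np

def occupied_indices(n):
    # indices of occupied fermion sites
    return np.nonzero(n)[0]

def hopping_indices(n_old, n_new):
    # the two sites whose occupation changed: one emptied, one filled
    changed = np.nonzero(n_old != n_new)[0]
    src = changed[n_old[changed] == 1]
    dst = changed[n_new[changed] == 1]
    return int(src[0]), int(dst[0])

def hopping_sign(n, i, j):
    # Jordan-Wigner parity: (-1)^(occupied sites strictly between i and j)
    lo, hi = sorted((i, j))
    crossed = np.sum(n[lo + 1:hi])
    return (-1) ** int(crossed)
```

Tracking this sign is what makes local updates to a fermionic wavefunction consistent with the antisymmetry of the underlying second-quantized operators.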