quantax.nn.Theta0Layer
- class quantax.nn.Theta0Layer
Bases: NoGradLayer
An activation layer with output \(f(x) = g(x) \exp(\theta_0)\), where \(g\) is the wrapped activation function. One can tune \(\theta_0\) to adjust the norm of the output state and avoid possible overflow.
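As a rough illustration of this output rule (a sketch in plain JAX, not quantax's implementation; `g`, `theta0_forward`, and the choice of `jnp.cosh` as the wrapped activation are placeholders):

```python
import jax.numpy as jnp

# Hypothetical sketch of the Theta0Layer output rule, f(x) = g(x) * exp(theta0).
# `g` stands in for the wrapped activation; jnp.cosh is a placeholder choice.
theta0 = jnp.array(0.0)  # tunable log-scale parameter

def g(x):
    return jnp.cosh(x)

def theta0_forward(x):
    # Multiplying by exp(theta0) rescales the whole output state; tuning
    # theta0 downward prevents overflow when g(x) grows large.
    return g(x) * jnp.exp(theta0)

print(theta0_forward(jnp.array([0.0, 1.0, 2.0])))
```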
- __init__()
- rescale(maximum: Array) → Theta0Layer
Rescale the function output by adjusting \(\theta_0\).
- Parameters:
maximum – The maximum output \(m\) obtained from this activation function. \(\theta_0\) is adjusted as \(\theta'_0 = \theta_0 - \log(m)\) so that the maximum output is rescaled to 1.
- Returns:
The layer with adjusted \(\theta_0\).
Note
This method returns a new layer and does not modify the existing one.
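A minimal sketch of the update rule applied by rescale, assuming a current \(\theta_0\) and an observed maximum \(m\) (the variable names here are illustrative, not quantax API):

```python
import jax.numpy as jnp

# Illustrative update performed by rescale (names are hypothetical):
theta0 = jnp.array(5.0)           # current log-scale parameter
m = jnp.array(20.0)               # maximum output observed from the layer
theta0_new = theta0 - jnp.log(m)  # theta0' = theta0 - log(m)

# The old maximum m carried a factor exp(theta0); under theta0_new it
# becomes m * exp(theta0_new) / exp(theta0) = m / m = 1.
print(m * jnp.exp(theta0_new) / jnp.exp(theta0))  # -> 1.0
```

Because rescale returns a new layer rather than mutating in place, callers would typically rebind the result, e.g. `layer = layer.rescale(m)`.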
Attributes
theta0