NonLinearityDef
#include <popnn/NonLinearityDef.hpp>
Definitions for non-linearity operations.
-
namespace popnn
Functions used in neural networks.
Enums
-
enum class NonLinearityType
Values:
-
enumerator SIGMOID
Sigmoid:
y = 1 / (1 + e^(-x))
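For reference, a minimal host-side C++ sketch of this formula (an illustration only, not the popnn device implementation):

    #include <cmath>

    // Reference sigmoid: y = 1 / (1 + e^(-x))
    float sigmoid(float x) {
      return 1.0f / (1.0f + std::exp(-x));
    }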
-
enumerator HARD_SIGMOID
Hard Sigmoid:
y = max(0, min(1, 0.2*x + 0.5))
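A matching host-side sketch, clamping a linear approximation of sigmoid to [0, 1] (illustrative only):

    #include <algorithm>

    // Reference hard sigmoid: y = max(0, min(1, 0.2*x + 0.5))
    float hardSigmoid(float x) {
      return std::max(0.0f, std::min(1.0f, 0.2f * x + 0.5f));
    }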
-
enumerator RELU
Rectified Linear Unit:
x >= 0 -> y = x
x < 0 -> y = 0
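The same piecewise definition as a host-side sketch (illustrative only):

    // Reference ReLU: passes non-negative inputs through, zeroes the rest
    float relu(float x) {
      return x >= 0.0f ? x : 0.0f;
    }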
-
enumerator TANH
Hyperbolic tangent:
y = tanh(x)
-
enumerator GELU
Gaussian Error Linear Unit:
y = x * Phi(x), where Phi(x) is the cumulative distribution function of the standard normal distribution. Phi(x) is approximated as:
Phi(x) = 0.5 * (1 + tanh(0.7978845608 * (x + 0.044715 * x^3)))
where 0.7978845608 ≈ sqrt(2/pi).
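A host-side sketch of this tanh approximation (illustrative only, not the popnn device implementation):

    #include <cmath>

    // Tanh-based GELU approximation: y = x * Phi(x)
    float geluApprox(float x) {
      const float k = 0.7978845608f; // sqrt(2 / pi)
      float phi = 0.5f * (1.0f + std::tanh(k * (x + 0.044715f * x * x * x)));
      return x * phi;
    }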
-
enumerator SWISH
Swish:
y = x * sigmoid(x)
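A host-side sketch, assuming the common beta = 1 form of Swish (illustrative only):

    #include <cmath>

    // Swish with beta = 1: y = x * sigmoid(x) = x / (1 + e^(-x))
    float swish(float x) {
      return x / (1.0f + std::exp(-x));
    }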
-
enumerator SOFTMAX
Softmax:
Always applied over the innermost dimension of the given tensor. Outer dimensions are independent of one another.
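A host-side sketch of softmax applied independently over the innermost dimension (illustrative only):

    #include <cmath>
    #include <vector>

    // Softmax over the innermost dimension: each row is normalised
    // independently of the others.
    void softmaxRows(std::vector<std::vector<float>> &rows) {
      for (auto &row : rows) {
        float sum = 0.0f;
        for (float &v : row) { v = std::exp(v); sum += v; }
        for (float &v : row) v /= sum;
      }
    }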
-
enumerator SOFTMAX_STABLE
Same as SOFTMAX, but a slower, more numerically stable algorithm is used.
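One common stable algorithm subtracts the row maximum before exponentiating; a host-side sketch of that technique (an assumption about the approach, not necessarily the popnn implementation):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Stable softmax over one row: subtracting the maximum keeps every
    // exponent non-positive, so std::exp cannot overflow.
    void softmaxStableRow(std::vector<float> &row) {
      float m = *std::max_element(row.begin(), row.end());
      float sum = 0.0f;
      for (float &v : row) { v = std::exp(v - m); sum += v; }
      for (float &v : row) v /= sum;
    }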
-
enumerator SOFTMAX_SCALED
Same as SOFTMAX, but a slower, more numerically stable algorithm is used.
Outputs are scaled to allow the use of a greater dynamic range in the outputs.