Leaky ReLU

Leaky ReLU (Leaky Rectified Linear Unit) is an activation function used in neural networks and is a variation of the standard ReLU function. Whereas ReLU outputs zero for every negative input, Leaky ReLU addresses this limitation by introducing a small non-zero slope for negative input values: f(x) = x for x > 0 and f(x) = αx otherwise, where α is a small constant such as 0.01. Because a small gradient still flows when inputs are less than zero, Leaky ReLU reduces the problem of "dead" neurons, which stop updating once their gradients become permanently zero.
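A minimal NumPy sketch of the idea is shown below; the slope value alpha=0.01 is a common default used here for illustration, not a fixed part of the definition.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: returns x for positive inputs and alpha * x otherwise."""
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    """Gradient of Leaky ReLU: 1 for positive inputs, alpha otherwise."""
    return np.where(x > 0, 1.0, alpha)

# Example: negative inputs are scaled down rather than zeroed out,
# so their gradient stays non-zero and the neuron can keep learning.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))       # [-0.02  -0.005  0.     1.5 ]
print(leaky_relu_grad(x))  # [0.01   0.01   0.01   1.  ]
```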
