elu (exponential linear unit)

The Exponential Linear Unit (ELU) is a mathematical function commonly used as an activation function in artificial neural networks. It is similar to the Rectified Linear Unit (ReLU): for positive inputs it returns the input unchanged, but instead of outputting zero for negative inputs, it returns a smooth exponential curve that takes negative values and saturates at a fixed minimum. Allowing negative outputs pushes the mean activation closer to zero, which can speed up learning, and the smooth negative side keeps gradients flowing where ReLU would output exactly zero.
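The standard definition uses a scale parameter α (commonly 1): f(x) = x for x > 0, and f(x) = α(eˣ − 1) for x ≤ 0, so the output saturates at −α for very negative inputs. A minimal sketch in plain Python, assuming this standard formulation:

```python
import math

def elu(x, alpha=1.0):
    """Exponential Linear Unit.

    Returns x unchanged for positive inputs; for non-positive inputs,
    returns alpha * (exp(x) - 1), a smooth curve that approaches -alpha
    as x goes to negative infinity.
    """
    if x > 0:
        return x
    return alpha * (math.exp(x) - 1.0)
```

For example, `elu(2.0)` is `2.0`, while `elu(-1.0)` is about `-0.632`, and even extremely negative inputs never fall below `-alpha`.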
