elu (exponential linear unit)
The Exponential Linear Unit (ELU) is an activation function used in artificial neural networks. Like the Rectified Linear Unit (ReLU), it returns the input unchanged for positive values; but instead of outputting zero for negative inputs, it follows the smooth curve alpha * (exp(x) - 1), which saturates toward -alpha. Allowing small negative outputs pushes mean activations closer to zero and keeps gradients nonzero for negative inputs, which can help training in complex learning tasks.
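A minimal sketch of the function described above, using NumPy (the parameter name `alpha` and the default value 1.0 follow common convention and are assumptions, not part of this page):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0."""
    x = np.asarray(x, dtype=float)
    # expm1(x) computes exp(x) - 1 accurately for small x
    return np.where(x > 0, x, alpha * np.expm1(x))

# Positive inputs pass through; negative inputs curve toward -alpha
print(elu(np.array([-3.0, -1.0, 0.0, 2.0])))
```

Note that the output approaches -alpha as the input becomes more negative, rather than being clipped to zero as with ReLU.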
Similar Concepts
- elasticity coefficient
- electroluminescence
- euler's formula
- experimental units
- exponential form of a complex number
- exponential growth
- exponential loss
- exponential representation
- exponents
- lambda symbol
- lyapunov exponent
- parametric rectified linear unit (prelu)
- relu (rectified linear unit)
- units
- units of measurement