relu (rectified linear unit)
ReLU, short for rectified linear unit, is an activation function commonly used in artificial neural networks. It outputs the input value if the input is positive and zero otherwise, i.e. f(x) = max(0, x).
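In code this is a one-line elementwise maximum. A minimal sketch, assuming NumPy (not mentioned in the original entry):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: max(0, x), applied elementwise."""
    return np.maximum(0, x)

# Negative inputs map to 0; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```

Variants such as leaky ReLU and PReLU (listed below) replace the zero output for negative inputs with a small slope.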
Similar Concepts
- convolutional neural networks (cnn)
- elu (exponential linear unit)
- fractional linear transformation
- gated recurrent units (gru)
- l1 regularization
- l2 regularization
- leaky relu
- light ray
- linear algebra
- linear fractional transformation
- linear regression models
- lqr control (linear quadratic regulator)
- parametric rectified linear unit (prelu)
- recurrent neural networks (rnn)