relu (rectified linear unit)
ReLU, short for rectified linear unit, is an activation function commonly used in artificial neural networks. It outputs the input value when the input is positive, and zero otherwise: f(x) = max(0, x).
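As a concrete illustration, here is a minimal NumPy sketch of the function (the use of NumPy is an assumption for convenience; ReLU itself is framework-agnostic):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0, and 0 elsewhere."""
    return np.maximum(0, x)

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```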
Similar Concepts
- convolutional layers
- convolutional neural networks (cnn)
- elu (exponential linear unit)
- gated recurrent unit (gru)
- l1 regularization
- l2 regularization
- leaky relu
- light ray
- linear algebra
- linear regression models
- lqr control (linear quadratic regulator)
- nonlinear image processing
- parametric rectified linear unit (prelu)
- recurrent neural networks (rnn)