parametric rectified linear unit (prelu)
Parametric rectified linear unit (PReLU) is an activation function commonly used in artificial neural networks. It is similar to the rectified linear unit (ReLU), but replaces ReLU's fixed zero slope for negative inputs with a learnable coefficient: f(x) = x for x > 0 and f(x) = a·x for x ≤ 0, where a is learned jointly with the network's other weights during training. Positive values pass through unchanged, while negative values are scaled rather than zeroed out, which gives the activation extra flexibility and can improve accuracy at the cost of only a small number of additional parameters.
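A minimal sketch of the idea in NumPy is shown below; the function names and the use of a single shared coefficient alpha are illustrative assumptions rather than a reference implementation (frameworks typically allow one coefficient per channel).

```python
import numpy as np

def prelu(x, alpha):
    # Positive inputs pass through unchanged; negative inputs are
    # scaled by the learnable coefficient alpha.
    return np.where(x > 0, x, alpha * x)

def prelu_grad_alpha(x, upstream_grad):
    # Gradient of the loss with respect to alpha: only negative inputs
    # contribute, since f(x) = alpha * x there and df/dalpha = x.
    return np.sum(np.where(x > 0, 0.0, x) * upstream_grad)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(prelu(x, alpha=0.25))  # [-0.5   -0.125  0.     1.5  ]
```

In training, alpha would be updated by gradient descent alongside the other weights, using a gradient like the one computed by prelu_grad_alpha.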
Similar Concepts
- convolutional layers
- elu (exponential linear unit)
- gated recurrent unit (gru)
- gated recurrent unit (gru) layers
- gated recurrent units (gru)
- l2 regularization
- leaky relu
- linear algebra
- linear regression models
- lqr control (linear quadratic regulator)
- multilayer perceptron
- multilayer perceptrons
- prime lens
- pruning and quantization
- relu (rectified linear unit)