Parametric rectified linear unit (PReLU)

The parametric rectified linear unit (PReLU) is an activation function commonly used in artificial neural networks. It is similar to the rectified linear unit (ReLU), but adds an adjustable parameter: negative input values are multiplied by a learnable coefficient, while positive values pass through unchanged. Because this coefficient is learned jointly with the network's other weights during training, PReLU is more flexible than ReLU (which zeroes all negative inputs) and can improve model accuracy.
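The function can be written as f(x) = x for x ≥ 0 and f(x) = a·x for x < 0, where a is the learnable coefficient. Below is a minimal sketch in Python using NumPy; the fixed value of a (0.25) is only for illustration, since in a real network a would be a trainable parameter (often one per channel), as provided for example by torch.nn.PReLU.

```python
import numpy as np

def prelu(x, alpha=0.25):
    # Identity for non-negative inputs, scale negative inputs by alpha.
    # In an actual network, alpha is learned during training rather than fixed.
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x))  # [-0.5   -0.125  0.     1.     3.   ]
```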
