ReLU (rectified linear unit)

ReLU, short for rectified linear unit, is a mathematical function commonly used as an activation function in artificial neural networks. It outputs the input value if the input is positive, and zero otherwise: ReLU(x) = max(0, x).
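The definition above can be sketched directly, for example with NumPy (a minimal illustration, not tied to any particular framework):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0, else 0."""
    return np.maximum(0, x)

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```

Deep learning libraries provide this as a built-in (e.g. `torch.nn.functional.relu` in PyTorch), but the operation itself is just this element-wise maximum.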