activation functions
Activation functions are mathematical functions that determine the output of a node in a neural network based on the weighted sum of its inputs. By introducing non-linearities into the network, they allow it to model complex relationships between inputs and outputs.
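As an illustrative sketch (assuming NumPy; the input, weight, and bias values are made up), the snippet below applies a few of the activation functions listed under Related Concepts to a single node's weighted input sum:

```python
import numpy as np

# Weighted sum of a node's inputs (pre-activation); values are illustrative.
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.8, 0.4, -0.6])   # weights
b = 0.1                          # bias
z = np.dot(w, x) + b

# A few common activation functions applied to the pre-activation z.
def relu(z):
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

for name, fn in [("relu", relu), ("leaky_relu", leaky_relu),
                 ("sigmoid", sigmoid), ("tanh", tanh)]:
    print(f"{name}(z) = {fn(z):.4f}")
```

Each function maps the same pre-activation value to a different output range (for example, sigmoid to (0, 1) and tanh to (-1, 1)), which is the non-linearity that lets stacked layers represent more than a single linear transformation.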
Related Concepts (17)
- arctan function
- backpropagation
- binary step function
- elu (exponential linear unit)
- fully connected layers
- gaussian function
- hyperbolic tangent function
- identity function
- leaky relu
- logistic function
- maxout function
- parametric rectified linear unit (prelu)
- piecewise linear function
- relu (rectified linear unit)
- sigmoid function
- softmax function
- swish function