Activation-Functions

07-15-2022 || 00:01
Tags: #deep-learning

activation-functions

A neural layer with a weight and a bias can simply be defined as,
$$
y = Wx + b
$$
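
As a quick sketch of the same equation in NumPy (the layer sizes below are arbitrary, chosen only for illustration):

```python
import numpy as np

# Minimal sketch of a single linear (dense) layer, y = Wx + b.
# Dimensions (3 inputs -> 2 outputs) are arbitrary, for demonstration only.
rng = np.random.default_rng(0)

x = rng.normal(size=(3,))    # input vector
W = rng.normal(size=(2, 3))  # weight matrix
b = rng.normal(size=(2,))    # bias vector

y = W @ x + b                # layer output
print(y.shape)               # (2,)
```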

Why do we use an activation function?

Without an activation function, no matter how many layers we add, the whole network is still just a linear regression model and fails to learn complex patterns. In deep learning, non-linear activation functions are used because, without the non-linearity, a stack of linear layers collapses into a single linear layer: for two layers, $W_2(W_1x + b_1) + b_2 = (W_2 W_1)x + (W_2 b_1 + b_2)$, which is again of the form $Wx + b$.
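
A minimal NumPy sketch of this collapse (all sizes and values below are arbitrary):

```python
import numpy as np

# Sketch: two stacked linear layers are equivalent to one linear layer.
rng = np.random.default_rng(0)

x = rng.normal(size=(4,))
W1, b1 = rng.normal(size=(5, 4)), rng.normal(size=(5,))
W2, b2 = rng.normal(size=(3, 5)), rng.normal(size=(3,))

# Two layers applied in sequence, with no activation in between.
y_stacked = W2 @ (W1 @ x + b1) + b2

# The same computation as a single layer: W = W2 W1, b = W2 b1 + b2.
W, b = W2 @ W1, W2 @ b1 + b2
y_single = W @ x + b

print(np.allclose(y_stacked, y_single))  # True

# A non-linearity between the layers (here ReLU) breaks the equivalence.
y_nonlinear = W2 @ np.maximum(W1 @ x + b1, 0) + b2
print(np.allclose(y_nonlinear, y_single))  # False in general
```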

Non-Linear Activation Functions

  1. sigmoid-function
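
For reference, a minimal sketch of the sigmoid itself, $\sigma(x) = \frac{1}{1 + e^{-x}}$ (details belong in the linked sigmoid-function note):

```python
import numpy as np

# Sigmoid activation: squashes any real input into the range (0, 1).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # approximately [0.119, 0.5, 0.881]
```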

References