Dying ReLU

  • Dying ReLU is a well-known disadvantage of the ReLU activation function, ReLU(x) = max(0, x)
  • Because ReLU maps all negative inputs to 0 and its gradient for negative inputs is also 0, a neuron whose pre-activation stays negative receives no gradient updates and effectively stops learning
  • To combat Dying ReLU, we use
    • Leaky ReLU, which replaces the flat zero region with a small slope, LeakyReLU(x) = x if x > 0 else αx for a small α (e.g. 0.01), so negative inputs still pass a gradient (see the sketch below)
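
A minimal NumPy sketch of the idea (the sample inputs and α = 0.01 are illustrative assumptions, not values from the notes): for negative pre-activations, ReLU's gradient is exactly 0, while Leaky ReLU still passes a small gradient, which is why the neuron can keep learning.

```python
import numpy as np

def relu_grad(x):
    # ReLU gradient: 0 for every negative input,
    # so a neuron stuck in the negative region gets no weight updates
    return (x > 0).astype(float)

def leaky_relu_grad(x, alpha=0.01):
    # Leaky ReLU gradient: negative inputs still get a small slope alpha,
    # so gradient flow never fully dies
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.5, 2.0])  # illustrative pre-activations
print("ReLU grads:      ", relu_grad(x))        # [0.   0.   1.   1.  ]
print("Leaky ReLU grads:", leaky_relu_grad(x))  # [0.01 0.01 1.   1.  ]
```

The printed gradients make the contrast concrete: under ReLU the two negative inputs contribute nothing to learning, while under Leaky ReLU they still contribute a small nonzero gradient.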