Dropout

  • Dropout means randomly dropping out (deactivating) some of the nodes of the neural network during training to avoid overfitting
    • Each node is dropped independently with a given probability
  • Dropout can be thought of as training an ensemble of many thinned sub-networks
  • One classical way to combat overfitting is to train an ensemble of multiple neural networks and average their predictions
    • But for a big neural network, this is inefficient and time consuming
  • Dropout gives a similar regularizing effect at a fraction of the cost.
    • Dropout randomly drops nodes from the parent network at training time
    • Hence, each mini-batch effectively trains a different sub-network (a different "model")
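The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration of the common "inverted dropout" variant (the function name and signature are my own, not from the notes): during training each unit is zeroed with probability `p` and the survivors are scaled by `1/(1-p)`, so at test time the full network can be used without any rescaling.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, zero each unit independently with probability p and
    scale the surviving units by 1/(1-p) so the expected activation is
    unchanged. At test time, return the input as-is.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep a unit with probability 1-p
    return x * mask / (1.0 - p)

# Each call to dropout() samples a fresh mask, so each mini-batch
# effectively trains a different thinned sub-network.
a = np.ones((4, 3))
train_out = dropout(a, p=0.5, training=True)   # entries are 0.0 or 2.0
test_out = dropout(a, p=0.5, training=False)   # identical to a
```

Averaging over all the random masks is what gives dropout its implicit-ensemble interpretation: the test-time network approximates the averaged prediction of the many sub-networks seen during training.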