Ways to reduce overfitting in NN

Hello everybody,

Today I want to write a short summary of ways to reduce overfitting in neural networks. Here it goes:

  1. Weight decay.
  2. Weight sharing.
  3. Early stopping of training.
  4. Model averaging.
  5. Bayesian fitting of NN.
  6. Dropout.
  7. Generative pre-training.

Some explanations of the points above:

  1. Weight decay means keeping the weights small, for example by adding an L2 penalty to the loss.
  2. Weight sharing insists that groups of weights stay equal to each other, as in convolutional layers.
  3. Early stopping means stopping training before the network fully memorizes the training set, typically when the validation error starts to rise.
  4. Model averaging combines the predictions of several different models.
  5. Bayesian fitting is a form of model averaging in which the models are weighted according to their posterior probabilities.
  6. Dropout randomly omits hidden units during training so that units do not co-adapt too much.
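To make point 1 concrete, here is a minimal sketch of weight decay in a plain SGD update (the function name `sgd_step` and the toy zero-gradient loop are my own illustration, not from any particular library):

```python
import numpy as np

def sgd_step(w, grad, lr=0.1, weight_decay=0.01):
    # Weight decay adds weight_decay * w to the gradient,
    # pulling every weight a little toward zero on each update.
    return w - lr * (grad + weight_decay * w)

w = np.array([5.0, -3.0])
for _ in range(100):
    # zero task gradient here, so only the decay term acts:
    # the weights shrink geometrically toward zero
    w = sgd_step(w, grad=np.zeros_like(w))
```

With a real loss gradient added in, the decay term acts as a constant pressure that keeps weights from growing large unless the data justifies it.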
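Point 3 (early stopping) can be sketched as a simple rule over the validation-loss curve; the function `early_stopping` and the `patience` parameter below are my own illustrative names:

```python
def early_stopping(val_losses, patience=3):
    """Return the epoch at which to stop training: when the
    validation loss has not improved for `patience` epochs."""
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses) - 1

# A typical curve: validation loss falls, then rises again as the
# network starts memorizing the training set.
losses = [1.0, 0.8, 0.6, 0.5, 0.55, 0.6, 0.7, 0.8]
stop_epoch = early_stopping(losses)
```

In practice you would also restore the weights saved at the best epoch, not just halt.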
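Point 4 (model averaging) is just averaging the outputs of several models; here is a toy sketch with three stand-in "models" (the function `ensemble_predict` is my own illustration):

```python
import numpy as np

def ensemble_predict(models, x):
    # Model averaging: average the predictions of several
    # independently trained models to reduce variance.
    return np.mean([m(x) for m in models], axis=0)

# Toy "models": three noisy versions of the same underlying function.
models = [lambda x, b=b: x + b for b in (-0.1, 0.0, 0.1)]
pred = ensemble_predict(models, np.array([1.0]))
```

The individual errors (the `b` offsets) partly cancel in the average, which is the whole point of ensembling.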
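And point 6 (dropout) as a minimal sketch, using the common "inverted dropout" variant so no rescaling is needed at test time (the function name and arguments are my own illustration):

```python
import numpy as np

def dropout(activations, p=0.5, training=True, seed=0):
    # Inverted dropout: zero each hidden unit with probability p
    # during training, and scale the survivors by 1/(1-p) so the
    # expected activation stays the same at test time.
    if not training:
        return activations
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

h = np.ones(10000)          # a layer of hidden activations
out = dropout(h, p=0.5)     # roughly half are zeroed, rest doubled
```

Because each unit can vanish at any moment, no unit can rely on a specific partner being present, which discourages co-adaptation.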
