Contents tagged with overfitting

  • Ways to reduce overfitting in NN

    Hello everybody,

    today I want to write a short summary of how to reduce overfitting. Here it goes:

    Weight decay

    Weight sharing

    Early stopping of training

    Model averaging

    Bayesian fitting of NN

    Dropout

    Generative pre-training

    Some explanations of these points.

    Weight decay stands for keeping the weights small by penalizing large weights
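
    A minimal sketch in plain NumPy (my own toy example, not code from the post): L2 weight decay simply adds a shrink-toward-zero term to each gradient step.

    ```python
    import numpy as np

    # Sketch of L2 weight decay: the update shrinks every weight toward zero
    # on top of the usual gradient step. Names here are illustrative.
    def sgd_step(w, grad, lr=0.1, decay=0.01):
        return w - lr * (grad + decay * w)

    w = np.array([1.0, -2.0, 3.0])
    # With a zero data gradient, decay alone multiplies weights by (1 - lr * decay).
    print(sgd_step(w, np.zeros_like(w)))  # → [ 0.999 -1.998  2.997]
    ```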

    Weight sharing insists that groups of weights stay similar (or equal) to each other
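
    As a sketch (again my own toy example): a 1-D convolution is the classic case of weight sharing, because the same few weights are reused at every position instead of learning separate weights per position.

    ```python
    import numpy as np

    # Weight sharing sketch: one small filter, applied everywhere.
    w = np.array([0.25, 0.5, 0.25])      # the SAME 3 shared weights
    x = np.arange(8, dtype=float)        # a toy input signal
    y = np.convolve(x, w, mode="valid")  # w is reused across all positions
    print(y)  # → [1. 2. 3. 4. 5. 6.]
    ```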

    Early stopping stands for not training the NN until it fully memorizes the training set
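
    A minimal early-stopping sketch (the `train_epoch` and `val_loss` callables are hypothetical, standing in for a real training loop): stop once the validation loss has not improved for `patience` epochs.

    ```python
    # Early stopping sketch: track the best validation loss and stop
    # when it has not improved for `patience` consecutive epochs.
    def train_with_early_stopping(train_epoch, val_loss, max_epochs=100, patience=3):
        best, since_best = float("inf"), 0
        for epoch in range(max_epochs):
            train_epoch()
            loss = val_loss()
            if loss < best:
                best, since_best = loss, 0
            else:
                since_best += 1
                if since_best >= patience:
                    break
        return best

    # Toy usage: validation loss falls, then starts rising (overfitting).
    losses = iter([5, 4, 3, 4, 5, 6, 7, 8])
    print(train_with_early_stopping(lambda: None, lambda: next(losses)))  # → 3
    ```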

    Model averaging, in other words, means using several different models and combining their predictions
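
    Model averaging can be sketched like this (toy stand-in predictors, not real NNs):

    ```python
    import numpy as np

    # Average the predictions of several different models.
    def average_predictions(models, x):
        return np.mean([m(x) for m in models], axis=0)

    # Three toy "models" whose individual errors cancel out on average.
    models = [lambda x: x + 0.1, lambda x: x - 0.1, lambda x: x]
    print(average_predictions(models, np.array([1.0, 2.0])))  # → [1. 2.]
    ```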

    Bayesian fitting is a slightly different kind of model averaging, where predictions are averaged over models weighted by their posterior probability

    Dropout is the random omitting of hidden units during training, so that units do not co-adapt
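
    A minimal dropout sketch in plain NumPy (an illustrative toy, not the post's code): during training each hidden unit is kept with probability p; at test time the activations are scaled by p instead.

    ```python
    import numpy as np

    # Classic dropout sketch: zero out hidden units at random while training,
    # use the expected (scaled) activation at test time.
    def dropout(h, p=0.5, train=True, rng=np.random.default_rng(0)):
        if train:
            mask = rng.random(h.shape) < p  # keep each unit with probability p
            return h * mask
        return h * p                        # test time: expected activation

    h = np.ones(8)
    print(dropout(h))               # some units zeroed at random
    print(dropout(h, train=False))  # → all 0.5 at test time
    ```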


  • Principal Component Analysis in Machine Learning

    Hello everybody,

    today I want to note some details of Machine Learning that are important for me.

    So, the first and very important usage of PCA is visualizing data. If you have 10 dimensions, can you visualize that data? If you can, I'm happy for you, but I can't. I can imagine only 1, 2, 3 D :). But with principal component analysis it's possible to visualize the data.

    The second application is reducing the memory/disk space needed to store data. That's quite self-explanatory: storing and training on 10 000 dimensions versus 100 dimensions is very different.
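
    The memory difference is easy to see with a toy calculation (illustrative sizes):

    ```python
    import numpy as np

    # Same 1000 examples, stored in 10 000 dimensions vs. reduced to 100.
    X_full = np.zeros((1000, 10000))
    X_reduced = np.zeros((1000, 100))
    print(X_full.nbytes // X_reduced.nbytes)  # → 100, i.e. 100x less memory
    ```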

    The third is speeding up the learning algorithm. It's actually related to the second.

    Another important detail: it's a bad idea to use PCA in order to avoid overfitting. Actually everybody …