I'm taking a Coursera course on neural networks.
Today I discovered for myself why normalization and scaling of the data make neural networks learn faster. It all comes down to the error surface and optimization. Put simply,
the task of training a neural network is to find the global minimum of the error surface. Learning algorithms move step by step along the error surface until they finally reach its global minimum.
Descending to the global minimum on a circular error surface is faster than descending on an elliptical or otherwise elongated one.
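The circle-vs-ellipse claim can be checked with plain gradient descent on two quadratic bowls. This is a minimal sketch (the functions, learning rates, and tolerance are my own illustrative choices, not from the course): on the elongated bowl the learning rate must shrink to stay stable along the steep axis, which slows progress along the shallow axis, so many more steps are needed.

```python
import numpy as np

def gradient_descent(grad, x0, lr, tol=1e-6, max_iter=10000):
    """Plain gradient descent; returns the number of steps until the
    gradient norm drops below tol."""
    x = np.array(x0, dtype=float)
    for i in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return i
        x -= lr * g
    return max_iter

# Circular bowl: f(x, y) = x^2 + y^2, gradient (2x, 2y).
circle_iters = gradient_descent(lambda x: 2 * x, [1.0, 1.0], lr=0.1)

# Elongated bowl: f(x, y) = x^2 + 100 y^2, gradient (2x, 200y).
# Stability along the steep y-axis forces lr < 0.01, so the flat
# x-axis is traversed in tiny steps.
ellipse_iters = gradient_descent(
    lambda x: np.array([2 * x[0], 200 * x[1]]), [1.0, 1.0], lr=0.009)

print(circle_iters, ellipse_iters)
```

With these settings the circular bowl converges in a few dozen steps while the elongated one needs hundreds, which is exactly the slowdown an unnormalized error surface causes.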
Suppose we train a neural network on two samples:
101, 101 -> 2
101, 99 -> 0
Then the error …
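The effect of shifting inputs like these can be seen directly in the curvature of a linear neuron's squared-error surface, which is given by X^T X: its condition number measures how elongated the error ellipse is. A minimal sketch (the shift of 100 is my assumption, roughly the mean of the sample components; the post itself breaks off before this step):

```python
import numpy as np

# The two training samples from the post, as inputs to a linear neuron.
X = np.array([[101.0, 101.0],
              [101.0,  99.0]])

# Condition number of X^T X before shifting: the error surface over the
# two weights is an extremely elongated ellipse.
before = np.linalg.cond(X.T @ X)

# Shift the inputs toward zero, turning the samples into (1, 1) and
# (1, -1); now X^T X is a multiple of the identity, i.e. a circle.
X_shifted = X - 100.0
after = np.linalg.cond(X_shifted.T @ X_shifted)

print(before, after)
```

Before shifting the condition number is in the tens of thousands; after shifting it is 1, the perfectly circular case where gradient descent points straight at the minimum.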