Hello everybody,
today I want to document one simple feature of the Deeplearning4j library. Recently I had an assignment to feed data into a neural network built with Deeplearning4j.
If your learning set is not big (later I'll explain what "big" means), then you can put all your data into an INDArray and then bas...
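The post is cut off above, but the basic idea can be sketched. A minimal sketch, assuming ND4J (the numeric library that ships with Deeplearning4j) is on the classpath; the XOR-style toy data below is my own illustration, not from the original assignment:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.factory.Nd4j;

public class FeedSmallSet {
    public static void main(String[] args) {
        // hypothetical small learning set: 4 examples, 2 features each
        INDArray features = Nd4j.create(new double[][] {
            {0.0, 0.0}, {0.0, 1.0}, {1.0, 0.0}, {1.0, 1.0}
        });
        // one label per example
        INDArray labels = Nd4j.create(new double[][] {
            {0.0}, {1.0}, {1.0}, {0.0}
        });
        // a DataSet holds the whole set in memory at once; when the set
        // is small it can be passed to a network in one piece, e.g. net.fit(dataSet)
        DataSet dataSet = new DataSet(features, labels);
        System.out.println(dataSet.numExamples());
    }
}
```

When the set does not fit in memory, the usual route is a DataSetIterator that yields mini-batches instead of one big INDArray.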
Hello everybody,
today I want to write a short note about normalization for neural networks.
So, first comes the formula for normalizing input into the range [0, 1] (taken from here):
Another example I found useful goes below (taken from here):
p = [4 4 3 3 4;
2...
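The matrix example above is truncated, but the [0, 1] formula itself is the standard min-max one: x' = (x - min) / (max - min). A minimal sketch (the sample values below are my own, not the ones from the linked example):

```java
public class MinMaxNormalize {
    // map x from [min, max] onto [0, 1]: (x - min) / (max - min)
    static double normalize(double x, double min, double max) {
        return (x - min) / (max - min);
    }

    public static void main(String[] args) {
        double[] p = {4, 4, 3, 3, 4, 2, 9, 1}; // hypothetical inputs in [1, 9]
        double min = 1, max = 9;
        for (double x : p) {
            System.out.printf("%.3f ", normalize(x, min, max)); // e.g. 4 -> 0.375
        }
        System.out.println();
    }
}
```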
Hello everybody,
today I want to write a short summary of how to reduce overfitting. Here it goes:
Weight decay
Weight sharing
Early stopping of training
Model averaging
Bayesian fitting of NN
Dropout
Generative pre-training
Some explanations of these points.
Weight decay stands for keeping w...
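The explanation above is cut off, but the mechanics of weight decay can be sketched: each update subtracts a penalty term λw in addition to the error gradient, so weights are pulled toward zero. A toy sketch with made-up numbers (η, λ and the starting weight are mine):

```java
public class WeightDecaySketch {
    // one weight-decay update repeated `steps` times:
    // w <- w - eta * (dE/dw + lambda * w)
    static double decay(double w, double grad, double eta, double lambda, int steps) {
        for (int i = 0; i < steps; i++) {
            w -= eta * (grad + lambda * w);
        }
        return w;
    }

    public static void main(String[] args) {
        // with a zero error gradient only the penalty acts,
        // so the weight just shrinks geometrically toward zero
        double w = decay(5.0, 0.0, 0.1, 0.01, 100);
        System.out.println(w); // ~4.52, i.e. 5.0 * (1 - 0.001)^100
    }
}
```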
Hello everybody,
today I want to write a few words about why mathematicians believe that neural networks can be taught something. Recently I've read the book Fundamentals of Artificial Neural Networks by Mohamad Hassoun, and I want to share some thoughts in a more digestible manner, omitting some...
A small note on how to use neural networks more effectively:
1. Use different numbers of hidden layers
2. Different numbers of units per layer
3. Different types of unit
4. Different types of strengths of weight penalty
5. Different learning algorithms
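One common way to exploit such variations is to train several differently configured networks and average their predictions — the model averaging mentioned in the overfitting note above. A toy sketch where three lambdas stand in for three trained models (the functions and the input value are made up):

```java
import java.util.List;
import java.util.function.DoubleUnaryOperator;

public class ModelAveraging {
    // average the predictions of several models on one input
    static double averagePrediction(List<DoubleUnaryOperator> models, double x) {
        double sum = 0;
        for (DoubleUnaryOperator m : models) {
            sum += m.applyAsDouble(x);
        }
        return sum / models.size();
    }

    public static void main(String[] args) {
        // stand-ins for nets with different hidden layers / units / training
        List<DoubleUnaryOperator> models = List.of(
            (DoubleUnaryOperator) x -> 0.9 * x + 0.10, // "model" 1
            x -> 1.1 * x - 0.05,                       // "model" 2
            x -> x                                     // "model" 3
        );
        System.out.println(averagePrediction(models, 0.5));
    }
}
```

The averaged prediction tends to be more stable than any single model's, because the individual errors partly cancel.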
Hello everybody,
today I want to share a few words about my learning of Encog.
Let's say you have an array of 15 doubles:
double []s = new double[15];
Then for a simple case you can use the BasicMLData class:
IMLData data = new BasicMLData(s);
Now data can be used to feed any neural network.
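A minimal sketch of where such an MLData object can go next, assuming Encog 3.x on the classpath; the network shape (15-5-1) and the sigmoid activations are my own choice for illustration:

```java
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLData;
import org.encog.ml.data.basic.BasicMLData;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;

public class EncogFeedSketch {
    public static void main(String[] args) {
        double[] s = new double[15];      // the 15-double array from above
        MLData data = new BasicMLData(s); // wrap it for Encog

        // a small feed-forward net whose input layer matches the 15 inputs
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 15));                    // input
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 5));  // hidden
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1)); // output
        network.getStructure().finalizeStructure();
        network.reset(); // random initial weights

        MLData output = network.compute(data); // feed the data forward
        System.out.println(output.getData(0)); // a sigmoid output in (0, 1)
    }
}
```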
Hello everybody.
I'm taking a Coursera course on neural networks.
Today I discovered for myself the reason why normalization and scaling in neural networks provide faster learning. Everything is related to the error surface and optimization. To put it simply,
the task of a neural network is to fi...
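The thought is cut off above, but the gist can be shown numerically: unscaled inputs give the error surface an elongated elliptical shape, the largest stable learning rate is then dictated by the steepest direction, and gradient descent crawls along the shallow one. A toy quadratic bowl E(w1, w2) = a*w1^2 + b*w2^2 (the curvatures and learning rates below are made up):

```java
public class ErrorSurfaceSketch {
    // gradient-descent steps needed to bring E(w1, w2) = a*w1^2 + b*w2^2
    // below 1e-6, starting from (1, 1)
    static int stepsToConverge(double a, double b, double lr) {
        double w1 = 1.0, w2 = 1.0;
        int steps = 0;
        while (a * w1 * w1 + b * w2 * w2 > 1e-6 && steps < 1_000_000) {
            w1 -= lr * 2 * a * w1; // dE/dw1 = 2*a*w1
            w2 -= lr * 2 * b * w2; // dE/dw2 = 2*b*w2
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        // elongated surface (a >> b): lr is capped by the steep axis,
        // so the shallow axis converges very slowly
        int slow = stepsToConverge(100, 1, 0.004);
        // after scaling, a roughly circular surface tolerates a much bigger lr
        int fast = stepsToConverge(1, 1, 0.4);
        System.out.println(slow + " steps vs " + fast + " steps");
    }
}
```

The same threshold is reached in a handful of steps on the circular bowl and in hundreds on the elongated one — which is exactly why scaling the inputs speeds up learning.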
Hello everybody,
today I watched a video by Dr. James McCaffrey about training neural networks.
So, a few interesting details which were useful for me.
1. Usually one hidden layer is more than enough.
2. Imagine that you need to feed some data into your neural network. Let's say...