Encog Propagation Training Algorithms

 

Hello everybody,

today I want to describe in simple words some of the training algorithms of Encog.

Before I continue, I want to show the general block schema of these training algorithms:
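In words, the schema is roughly: init NN -> compute the NN error on the training data -> if the NN error < acceptable error, stop; otherwise update the weights according to the learning algorithm and go back to computing the error.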

The "Init NN" step can look like this:

public BasicNetwork CreateNetwork()
{
   var network = new BasicNetwork();
   // input layer: one neuron per value in the input window
   network.AddLayer(new BasicLayer(WindowSize));
   // hidden layer with 10 neurons
   network.AddLayer(new BasicLayer(10));
   // output layer with a single neuron
   network.AddLayer(new BasicLayer(1));
   network.Structure.FinalizeStructure();
   network.Reset();
   return network;
}
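By the way, BasicLayer also has a constructor which takes the activation function and a bias flag explicitly. A minimal sketch, assuming tanh activations (ActivationTANH is just one possible choice, and WindowSize is the same field as above):

public BasicNetwork CreateNetworkWithActivations()
{
   var network = new BasicNetwork();
   // input layer: no activation function, bias neuron enabled
   network.AddLayer(new BasicLayer(null, true, WindowSize));
   // hidden layer: tanh activation, bias neuron enabled
   network.AddLayer(new BasicLayer(new ActivationTANH(), true, 10));
   // output layer: tanh activation, no bias neuron after the last layer
   network.AddLayer(new BasicLayer(new ActivationTANH(), false, 1));
   network.Structure.FinalizeStructure();
   network.Reset();
   return network;
}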

Steps "NN error < acceptable error" -> "Update weights according to learning algorithim" can look like this:

public void Train(BasicNetwork network, IMLDataSet training)
{
    var train = new ResilientPropagation(network, training);
    int epoch = 1;
    do
    {
        // one iteration (epoch) over the training set
        train.Iteration();
        Console.WriteLine(@"Epoch #" + epoch + @" Error:" + train.Error);
        epoch++;
    } while (train.Error > acceptableError); // acceptableError is a field you define, e.g. 0.01
}
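Putting the two pieces together (trainingSet stands for whatever IMLDataSet you have prepared, for example the XOR set from my BasicMLDataSet note):

var network = CreateNetwork();
Train(network, trainingSet);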

First in my description goes the Backpropagation algorithm. A few characteristics of it:

1. It is probably the oldest training algorithm.

2. As the name says, the error is propagated back through the network.

3. Usage:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);
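For example (0.7 and 0.3 here are just illustrative starting values, not recommendations from Encog):

var train = new Backpropagation(network, trainingSet, 0.7, 0.3);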

The next learning algorithm is the Manhattan update rule. A few characteristics of it:

1. It uses only the sign of the gradient.

2. It updates each weight by a constant value (depending on the sign of the gradient, the constant is added to or subtracted from the weight).

3. Usage:

var train = new ManhattanPropagation(network, trainingSet, constantValue);

Usually constantValue is very low (around 0.00001).
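So a typical call might look like this, using the low constant just mentioned:

var train = new ManhattanPropagation(network, trainingSet, 0.00001);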

The next one is the Quick propagation algorithm. Features of this method:

1. It is based on Newton's method of error minimization.

2. It uses a learning rate with a higher value than usual (I used values starting from 2).

3. Usage:

var train = new QuickPropagation(network, trainingSet, learningRate);

Yet one more training algorithm is the Resilient propagation algorithm. Some details of this method:

1. It is one of the fastest algorithms in Encog.

2. It is the easiest to use (you don't have to care about learning rate, momentum or a constant value).

3. It uses the sign of the gradient value.

4. It computes a separate delta value for each weight and uses it to update that weight.

5. Example of usage:

var train = new ResilientPropagation(network, trainingSet);

6. It has four variants of resilient propagation: RPROP+, RPROP-, iRPROP+ and iRPROP-. iRPROP+ is considered the best.

Next comes Scaled conjugate gradient (SCG). Features of it:

1. It uses the conjugate gradient method.

2. It is not applicable to every dataset.

3. It doesn't require any learning parameters.

4. Example of usage:

var train = new ScaledConjugateGradient(network, trainingSet);

And the last propagation algorithm is the Levenberg-Marquardt algorithm (aka LMA). This is my favorite method. Features of it:

1. It is something in between the Gauss-Newton algorithm (GNA) and the gradient descent method.

2. It is easy to use (no learning parameters).

3. Example of creating:

var train = new LevenbergMarquardtTraining(network, trainingSet);

4. Sometimes it can be very efficient.


Backpropagation Encog

 

Here is the declaration of the Backpropagation algorithm in Encog:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.

[Figure: an error function with one global minimum and three local minima]

Here we have an error function with a global minimum and three local minima. In order to jump out of a local minimum and get into the global minimum, the neural network can take into account the previous modification of the weights. Momentum is the coefficient which controls how much of the previous iteration's update is taken into account. If it is 1, the previous update will be taken into account completely. If it is 0, the previous update will be ignored.
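A minimal sketch of the idea (the classic gradient descent with momentum update; illustrative only, not Encog's actual internals):

// illustrative weight update with momentum, not Encog's internal code
double UpdateWeight(double weight, double gradient,
                    double learningRate, double momentum,
                    ref double previousDelta)
{
    // new delta = plain gradient step plus a fraction of the previous delta
    double delta = -learningRate * gradient + momentum * previousDelta;
    previousDelta = delta;
    return weight + delta;
}

With momentum > 0 the update keeps some of its previous direction, which is what lets it roll through small local minima.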


Encog Training

 

Hello everybody,

suppose you have read two of my previous notes about creating a network and feeding basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right way. One of the options you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);

Backpropagation is one of the training algorithms. Other training algorithms of Encog are: LMA, simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient and others which I haven't tried yet.

In the mentioned code, train is the Backpropagation training algorithm, which takes _network and trainSet as parameters and applies backpropagation to them.

0.01 in our case means an error of 1%.

Another term which is often used as a substitute for iteration is epoch, so don't be confused if somebody interchanges epoch and iteration.


Encog Compute

 

Hello.

Some more general notes on how to use Encog.

To get the output of a network you can use the Compute method:

var output = network.Compute(input);

If we want to get results for a bigger number of items, we can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}
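For example, with the XOR training set from my BasicMLDataSet note you can print the actual output next to the ideal one (item is an IMLDataPair, which exposes Input and Ideal):

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
    Console.WriteLine(item.Input[0] + @", " + item.Input[1]
        + @" -> actual: " + output[0] + @", ideal: " + item.Ideal[0]);
}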


Encog BasicMLDataSet

 

Hello everybody,

today I want to share a few words about my learning of Encog.

Let's say you have an array of 15 doubles:

double[] s = new double[15];

Then for the simple case you can use the BasicMLData class:

IMLData data = new BasicMLData(s); 

Now data can be used as input to any neural network.
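For example, assuming network is a BasicNetwork whose input layer has 15 neurons to match the array above:

var output = network.Compute(data);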

The next point to consider is inputting a bigger amount of data.

Suppose you want to have the XOR truth table as input:

double[][] xorInput =
{
   new []{0.0, 0.0},
   new []{1.0, 0.0},
   new []{0.0, 1.0},
   new []{1.0, 1.0}
};
// ideal (expected) output
double[][] xorIdeal =
{
   new []{0.0},
   new []{1.0},
   new []{1.0},
   new []{0.0}
};

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

Now you can use this trainSet (a BasicMLDataSet) in order to feed the neural network.
