Contents tagged with Encog

  • Encog propagation training algorithms

    Hello everybody,

    today I want to describe in simple words some of the training algorithms in Encog.

    Before I continue, I want to show the general block schema of the training algorithms:

    The "Init NN" step can look like this:

    public BasicNetwork CreateNetwork()
    {
        var network = new BasicNetwork();
        network.AddLayer(new BasicLayer(WindowSize)); // input layer, WindowSize neurons
        network.AddLayer(new BasicLayer(10));         // hidden layer, 10 neurons
        network.AddLayer(new BasicLayer(1));          // output layer, 1 neuron
        network.Structure.FinalizeStructure();
        network.Reset();                              // randomize the weights
        return network;
    }

    Steps "NN error < acceptable error" -> "Update weights according to learning algorithim" can look like this:

    public void Train(BasicNetwork network, IMLDataSet training)
    {
        ITrain train = … more
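
    A minimal sketch of such a training loop, assuming resilient propagation as the trainer and 0.01 as the acceptable error (both are my own assumptions, not necessarily what the post continues with), could look like this:

    using Encog.ML.Data;
    using Encog.Neural.Networks;
    using Encog.Neural.Networks.Training.Propagation.Resilient;

    public void Train(BasicNetwork network, IMLDataSet training)
    {
        // ResilientPropagation is one of Encog's propagation trainers
        var train = new ResilientPropagation(network, training);
        do
        {
            train.Iteration();         // one pass over the training set
        } while (train.Error > 0.01);  // repeat while NN error > acceptable error
        train.FinishTraining();
    }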

  • Encog Training

    Hello everybody,

    suppose you have read two of my previous notes about creating a network and basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right way. One of the options you can try is the following:

    var train = new Backpropagation(_network, trainSet);
    double error;
    do
    {
        train.Iteration();
        error = train.Error;
    } while (error > 0.01);

    Backpropagation is one of the training algorithms. Other training algorithms in Encog include LMA (Levenberg-Marquardt), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient and others, which I haven't tried yet.

    In the mentioned code, train is a Backpropagation … more
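
    As a small sketch of my own (not from the post itself), switching to one of the other algorithms mentioned above, for example the Manhattan update rule, only changes the line that constructs the trainer:

    using Encog.Neural.Networks.Training.Propagation.Manhattan;

    // the fixed step size 0.0001 is an assumed example value
    var train = new ManhattanPropagation(_network, trainSet, 0.0001);
    do
    {
        train.Iteration();
    } while (train.Error > 0.01);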

  • Encog create simple network

    Hello everybody,

    today I want to share how to create a simple neural network in Encog. It's a very simple process:

    var network = new BasicNetwork();

    Each neural network has layers.

    Example of creating a layer:

    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));

    The first parameter of BasicLayer is the activation function, which in our case is ActivationSigmoid.

    The second parameter is the bias neuron. True means that the layer will also have a bias neuron.

    The third parameter represents the number of neurons in the layer.

    If you think that creating the network is enough for training, you are wrong. As with a lot of stuff in our world, in Encog you need to call FinalizeStructure. It looks like … more
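
    A minimal sketch of the whole creation sequence as I understand it, assuming a 2-5-1 sigmoid network (the layer sizes and activations are my own example, not the post's):

    using Encog.Engine.Network.Activation;
    using Encog.Neural.Networks;
    using Encog.Neural.Networks.Layers;

    var network = new BasicNetwork();
    // input layer: no activation on raw input, bias neuron, 2 neurons
    network.AddLayer(new BasicLayer(null, true, 2));
    // hidden layer: sigmoid activation, bias neuron, 5 neurons
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));
    // output layer: sigmoid activation, no bias, 1 neuron
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
    network.Structure.FinalizeStructure(); // builds the internal structure
    network.Reset();                       // randomizes the weights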

  • Backpropagation Encog

    Here is the declaration of Encog's Backpropagation algorithm:

    var train = new Backpropagation(network, trainingSet, learningRate, momentum);

    Today I discovered for myself the purpose of the momentum parameter.

    [Figure: error function with one global minimum and three local minima]

    Here we have an error function with a global minimum and three local minima. In order to jump out of a local minimum and reach the global minimum, the neural network can take into account the previous modification of the weights. Momentum is a coefficient which controls how much of the previous iteration is taken into account. If it is 1, then the previous weight update is taken into account completely. If it is 0, then the previous update is ignored. more
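
    For example (a sketch of my own, with assumed values), a learning rate of 0.7 and a momentum of 0.3 mean that 30% of the previous iteration's weight change is added on top of the current update:

    using Encog.Neural.Networks.Training.Propagation.Back;

    // 0.7 = learning rate, 0.3 = momentum (both values are assumptions for this example)
    var train = new Backpropagation(network, trainingSet, 0.7, 0.3);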

  • Encog compute

    Hello.

    Some more general notes on how to use Encog.

    To get the result of the network, you can use the Compute method:

    var output = network.Compute(input);

    If we want to get results for a bigger number of items, we can use the following construction:

    foreach (var item in trainingSet)
    {
        var output = network.Compute(item.Input);
    } more
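
    As a small follow-up sketch of my own (not from the post), the same loop can also compare each computed output with the ideal value stored in the data set:

    using System;

    foreach (var pair in trainingSet)
    {
        var output = network.Compute(pair.Input);
        // each pair in the data set carries both the input and the ideal (expected) output
        Console.WriteLine($"predicted = {output[0]}, ideal = {pair.Ideal[0]}");
    }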

  • Encog BasicMLDataSet

    Hello everybody,

    today I want to share a few words about my learning of Encog.

    Let's say you have an array of 15 doubles:

    double[] s = new double[15];

    Then for the simple case you can use the BasicMLData class:

    IMLData data = new BasicMLData(s); 

    Now data can be used to feed input to any neural network. 

    The next point to consider is inputting a bigger amount of data.

    Suppose you want to have XOR as input:

     

    double[][] xorInput =
    {
        new []{0.0, 0.0},
        new []{1.0, 0.0},
        new []{0.0, 1.0},
        new []{1.0, 1.0}
    };

    // output
    double[][] xorIdeal =
    {
        new []{0.0},
        new []{1.0},
        new []{1.0},
        new []{0.0}
    };

    var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

    Now you can use … more
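
    As a quick sketch of my own continuation (not the post's), trainSet implements IMLDataSet, so you can iterate its IMLDataPair items or hand it straight to one of the trainers described above:

    using System;
    using Encog.ML.Data;

    // each element is an IMLDataPair with Input and Ideal parts
    foreach (IMLDataPair pair in trainSet)
    {
        Console.WriteLine($"input: {pair.Input[0]}, {pair.Input[1]} -> ideal: {pair.Ideal[0]}");
    }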