Manhattan update rule
Suppose you have read my two previous notes about creating a network and feeding basic input into a neural network, and now you want to train a neural network with Encog. You are on the right track. One of the options you can try is the following:
var train = new Backpropagation(_network, trainSet);
double error;
do
{
    // One training iteration adjusts the weights via backpropagation
    train.Iteration();
    error = train.Error;
} while (error > 0.01); // stop once the error drops below the threshold
Backpropagation is one of Encog's training algorithms. Others include LMA (Levenberg-Marquardt), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient, and more, which I have not yet tried.
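To give an idea of what the Manhattan update rule does, here is a minimal self-contained sketch (in Python, not Encog's actual implementation): each weight moves by a fixed step size in the direction opposite the gradient's sign, and the gradient's magnitude is ignored entirely. The function name and step value are illustrative assumptions, not part of the Encog API.

```python
def manhattan_update(weights, gradients, step=0.01):
    # Manhattan rule: move each weight by a fixed step against the
    # sign of its gradient; the gradient's magnitude is ignored.
    updated = []
    for w, g in zip(weights, gradients):
        if g > 0:
            updated.append(w - step)
        elif g < 0:
            updated.append(w + step)
        else:
            updated.append(w)  # zero gradient: leave the weight alone
    return updated

new_weights = manhattan_update([0.5, -0.2, 0.0], [1.3, -0.7, 0.0])
```

Because only the sign matters, the rule is robust to very large or very small gradients, at the cost of needing a well-chosen step size.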
In the code above, train is a Backpropagation … more