
Encog Training

Hello everybody,

suppose you have read two of my previous notes about creating a network and feeding basic input into a neural network, and now you want to train a neural network with Encog. You are on the right track. One option you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);
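One practical caveat about the loop above: if the error never drops below the threshold, it will run forever, so in practice you usually add a cap on the number of epochs. Here is a minimal, self-contained sketch of that pattern (in Java rather than C#, and without Encog); the `GradientDescent` class is a hypothetical stand-in for a trainer, whose `iteration()` simply moves a single weight halfway toward its optimum so the loop has something to converge on.

```java
// Stand-in "trainer": each iteration halves the distance of one weight
// from its optimum, so the error shrinks geometrically.
class GradientDescent {
    double weight = 5.0;            // current parameter value
    static final double TARGET = 0.0;

    void iteration() {              // one epoch: move halfway to the optimum
        weight += 0.5 * (TARGET - weight);
    }

    double error() {                // squared distance from the optimum
        return (weight - TARGET) * (weight - TARGET);
    }
}

public class TrainLoop {
    public static void main(String[] args) {
        GradientDescent train = new GradientDescent();
        double error;
        int epoch = 0;
        final int MAX_EPOCHS = 1000;    // safety cap: never loop forever
        do {
            train.iteration();
            error = train.error();
            epoch++;
        } while (error > 0.01 && epoch < MAX_EPOCHS);
        System.out.println("epochs=" + epoch + ", error=" + error);
    }
}
```

The same cap works with the Encog loop: just add an epoch counter and an upper bound to the `while` condition.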

Backpropagation is one of the training algorithms. Other training algorithms in Encog include: LMA (Levenberg-Marquardt), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient, and others, which I haven't tried yet.
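To give an idea of what backpropagation actually computes under the hood, here is a tiny from-scratch illustration (again in Java, not Encog): a single sigmoid neuron trained on one sample. Each epoch it computes the output, measures the error, and uses the chain rule to push the weight and bias down the error gradient. All the numbers here (weight, target, learning rate) are made up for the illustration.

```java
public class TinyBackprop {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    public static void main(String[] args) {
        double w = 0.5, b = 0.0;          // initial weight and bias
        double x = 1.0, target = 0.9;     // one training sample
        double lr = 0.5;                  // learning rate

        for (int epoch = 0; epoch < 500; epoch++) {
            double out = sigmoid(w * x + b);
            // Chain rule for squared error E = 0.5*(out - target)^2:
            // dE/dw = (out - target) * out * (1 - out) * x
            double delta = (out - target) * out * (1.0 - out);
            w -= lr * delta * x;          // gradient step on the weight
            b -= lr * delta;              // gradient step on the bias
        }
        System.out.println("output after training: " + sigmoid(w * x + b));
    }
}
```

After training, the neuron's output sits close to the 0.9 target. Encog's trainers do the same kind of gradient computation, just across every weight of the whole network and every row of the training set.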

In the code above, train is the Backpropagation training algorithm; it takes _network and trainSet as parameters and applies backpropagation to them.

The threshold 0.01 in our case means 1%: the loop keeps running iterations until the network's error drops below that value.

Another term often used interchangeably with iteration is epoch: one epoch is one full pass over the training set. So don't be confused if somebody swaps epoch and iteration — for Encog's propagation trainers, each call to Iteration() performs one such pass.
