
Backpropagation in Encog

Here is the declaration of the Backpropagation training algorithm in Encog:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);
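For context, here is a minimal sketch of how this call fits into a full training run, assuming Encog 3.x for .NET and the classic XOR data set; the learning rate 0.7 and momentum 0.3 are just illustrative values:

using Encog.Engine.Network.Activation;
using Encog.ML.Data;
using Encog.ML.Data.Basic;
using Encog.Neural.Networks;
using Encog.Neural.Networks.Layers;
using Encog.Neural.Networks.Training.Propagation.Back;

// XOR input and expected output
double[][] xorInput =
{
    new[] { 0.0, 0.0 }, new[] { 0.0, 1.0 },
    new[] { 1.0, 0.0 }, new[] { 1.0, 1.0 }
};
double[][] xorIdeal =
{
    new[] { 0.0 }, new[] { 1.0 }, new[] { 1.0 }, new[] { 0.0 }
};

// Simple 2-3-1 feed-forward network
var network = new BasicNetwork();
network.AddLayer(new BasicLayer(null, true, 2));
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
network.Structure.FinalizeStructure();
network.Reset();

IMLDataSet trainingSet = new BasicMLDataSet(xorInput, xorIdeal);

// learningRate = 0.7, momentum = 0.3 (illustrative values)
var train = new Backpropagation(network, trainingSet, 0.7, 0.3);
do
{
    train.Iteration();
} while (train.Error > 0.01);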

Today I discovered for myself the purpose of the momentum parameter.

 

(Figure: momentum.png — an error function with one global minimum and three local minima)

Here we have an error function with a global minimum and three local minima. In order to jump out of a local minimum and move toward the global minimum, the neural network can take into account the previous modification of its weights. Momentum is the coefficient that controls what portion of the previous weight update is carried over into the current one: if it is 1, the previous update is taken into account completely; if it is 0, the previous update is ignored.
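To make that concrete, here is a rough sketch of a single weight update with momentum. This is not Encog's internal code, just the textbook rule delta(t) = -learningRate * gradient + momentum * delta(t-1), with all names and values chosen for illustration:

// Hypothetical helper, not part of Encog
double learningRate = 0.1;  // illustrative value
double momentum = 0.9;      // illustrative value
double previousDelta = 0.0; // weight change from the previous iteration

double UpdateWeight(double weight, double gradient)
{
    // New change = gradient step plus a fraction of the previous change.
    // momentum = 1 keeps the previous change entirely, momentum = 0 ignores it.
    double delta = -learningRate * gradient + momentum * previousDelta;
    previousDelta = delta; // remember this step for the next iteration
    return weight + delta;
}

Because part of the previous step is reused, a string of updates in the same direction builds up speed, which is what lets the weights roll through a shallow local minimum instead of settling in it.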
