Encog: create a simple network

Hello everybody,

today I want to share how to create a simple neural network in Encog. It's a very simple process:

var network = new BasicNetwork();

Each neural network has layers.

Here is an example of creating a layer:

network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));

The first parameter of BasicLayer is the activation function, which in our case is ActivationSigmoid.

The second parameter is the bias neuron. True means that the layer will also have a bias neuron.

The third parameter is the number of neurons in the layer.

If you think that creating the network is enough to start training, you are wrong. As with a lot of stuff in our world, in Encog you also need to call FinalizeStructure. It looks like this:

network.Structure.FinalizeStructure();

If you think that FinalizeStructure is the last step, here is one more disappointment. You also need to call:

network.Reset();

This last call initializes the network's weights with random values.

Putting it all together in one function, it looks like this:

public static BasicNetwork CreateNeuralNetwork()
{
    var network = new BasicNetwork();
    network.AddLayer(new BasicLayer(null, true, 2));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 3));
    network.Structure.FinalizeStructure();
    network.Reset();
    return network;
}

This gives you a neural network with three layers. The first layer has two neurons and doesn't use any activation function. The second layer uses sigmoid as its activation function and has a bias neuron. The third layer also uses ActivationSigmoid but doesn't have a bias neuron. FinalizeStructure can also be thought of as telling Encog that we are done constructing the network.
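As a quick sanity check, the freshly created network can already produce output (a sketch — it assumes the Encog NuGet package is referenced; the input values are arbitrary):

```csharp
// Assumes Encog namespaces such as Encog.Neural.Networks and Encog.ML.Data.Basic.
var network = CreateNeuralNetwork();

// The input layer has 2 neurons, so we feed 2 values;
// the output layer has 3 neurons, so we get 3 values back.
IMLData input = new BasicMLData(new[] { 0.5, 0.5 });
IMLData output = network.Compute(input);

// Until the network is trained, these 3 values are effectively random.
Console.WriteLine(output.Count);
```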


Backpropagation Encog

Here is the declaration of the Backpropagation algorithm in Encog:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.

Imagine an error function with a global minimum and three local minima. In order to jump out of a local minimum and reach the global minimum, the neural network can take into account the previous modification of the weights. Momentum is a coefficient that controls how much of the previous update is taken into account. If it is 1, the previous update is applied in full; if it is 0, the previous update is ignored.
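A minimal sketch of how momentum enters the weight update (plain C#, illustrative only — this is not Encog's internal code, and the variable names are mine):

```csharp
// Gradient descent update with momentum.
double learningRate = 0.7;
double momentum = 0.3;
double previousDelta = 0.0;

double weight = 0.5;
double gradient = 0.2; // dError/dWeight at the current point

// momentum * previousDelta keeps part of the previous update's direction,
// which helps the search roll through shallow local minima.
double delta = -learningRate * gradient + momentum * previousDelta;
weight += delta;
previousDelta = delta;
```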

Encog compute


Here are some other notes on how to use Encog.

To get the result of the network, you can use the Compute method:

var output = network.Compute(input);

If we want to get results for a number of items, we can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}

How to add menu to button in Acumatica

Hello everybody,

today I want to share a trick which I call converting an Acumatica button into a menu.

Let's say a button is created in the graph APBillManager in the following way:

public PXAction<APBill> Report;

If you want to convert it into a menu with one item, you can do the following in the graph constructor:

public APBillManager()
{
    this.Report.AddMenuAction(this.bankStatementReport);
}

public PXAction<APBill> bankStatementReport;
[PXButton]
[PXUIField(DisplayName = "Bank Statement")]
protected void BankStatementReport()
{
}

Encog BasicMLDataSet

Hello everybody,

today I want to share a few words about my learning of Encog.

Let's say you have an array of 15 doubles:

double[] s = new double[15];

Then for a simple case you can use the BasicMLData class:

IMLData data = new BasicMLData(s); 

Now data can be used to feed any neural network.

The next point to consider is inputting bigger sets of values.

Suppose you want to have XOR as input:

double[][] xorInput =
{
   new []{0.0, 0.0},
   new []{1.0, 0.0},
   new []{0.0, 1.0},
   new []{1.0, 1.0}
};

// output
double[][] xorIdeal =
{
   new []{0.0},
   new []{1.0},
   new []{1.0},
   new []{0.0}
};

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

Now you can use the BasicMLDataSet instance trainSet to feed a neural network.
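Putting this together with the Backpropagation trainer from the earlier note (a sketch assuming the Encog NuGet package is referenced; the hidden layer size, iteration count, and learning parameters are arbitrary choices of mine):

```csharp
var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

// Network shaped for XOR: 2 inputs, one hidden layer, 1 output.
var network = new BasicNetwork();
network.AddLayer(new BasicLayer(null, true, 2));
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));
network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
network.Structure.FinalizeStructure();
network.Reset();

// Backpropagation with learning rate 0.7 and momentum 0.3.
var train = new Backpropagation(network, trainSet, 0.7, 0.3);
for (int epoch = 0; epoch < 5000; epoch++)
{
    train.Iteration();
}

// After training, the error should be small and the outputs close to xorIdeal.
Console.WriteLine(train.Error);
```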

Normalization and scaling in neural networks

Hello everybody.

I'm taking a Coursera course about neural networks.

Today I discovered for myself the reason why normalization and scaling in neural networks provide faster learning. Everything is related to the error surface and optimization. Simply put, the task of a neural network is to find the global minimum of the error surface. Learning algorithms gradually move along the error surface in order to finally find its global minimum. Moving toward the global minimum on a circular error surface goes faster than on an elliptical or other kind of error surface.

Suppose we have training data for a neural network with two samples:

101, 101 -> 2

101, 99 -> 0

Then the error surface will be oval, and convergence will be relatively slow. But if we normalize the data from the range [99; 101] to the range [-1; 1], we will get an error surface close to a circle, which converges much faster.

See the picture:

The same is true for scaling. Let's say we have two inputs and one output:

0.1, 10 -> 2

0.1, -10 -> 0.

If we scale the second input, look how the error surface changes:
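The mean-centering described above can be sketched like this (plain C#; the ranges come from the example, but the helper function itself is mine, not an Encog API — Encog has its own normalization classes):

```csharp
// Map values from [min; max] to [-1; 1] by subtracting the midpoint
// and dividing by half the range.
static double Normalize(double x, double min, double max)
{
    double mid = (min + max) / 2.0;       // for [99; 101] this is 100
    double halfRange = (max - min) / 2.0; // for [99; 101] this is 1
    return (x - mid) / halfRange;
}

// 101 maps to 1.0 and 99 maps to -1.0, centering the inputs around zero.
Console.WriteLine(Normalize(101.0, 99.0, 101.0));
Console.WriteLine(Normalize(99.0, 99.0, 101.0));
```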

Enable or disable a grid button (PXToolBarButton) depending on a column value in Acumatica

Hello everybody.

My next note is about the following case.

Suppose you have form PR301000, which has a grid with id "grid" and a button Calculate. The grid also has a column bound to the field Calculated, and you need the following:

If in the selected row the field Calculated is checked, the Calculate button should be disabled. If in the selected row the field Calculated is unchecked, the Calculate button should be enabled.

In order to implement this, the following should be done:

1. In the grid on page PR301000:

        <ActionBar ActionsText="True">
            <CustomItems>
                <px:PXToolBarButton Text="Calculate" DependOnGrid="grid" StateColumn="Calculated">
                    <AutoCallBack Command="Calculate" Target="ds" />
                </px:PXToolBarButton>
            </CustomItems>
        </ActionBar>

2. In the ds section write the following:

<px:PXDataSource ID="ds" runat="server" Visible="True" Width="100%" PrimaryView="PayRolls" SuspendUnloading="False" TypeName="DS.PayRollManager">
    <CallbackCommands>
        <px:PXDSCallbackCommand Name="Calculate" Visible="False" DependOnGrid="grid" />
    </CallbackCommands>
</px:PXDataSource>
3. The declaration in the DAC class should be the following:

#region Calculated

public abstract class calculated : PX.Data.IBqlField
{
}

protected bool? _Calculated;

[PXDBBool]
[PXDefault(false, PersistingCheck = PXPersistingCheck.Nothing)]
[PXUIField(DisplayName = "Calculated")]
public virtual bool? Calculated
{
    get
    {
        return this._Calculated;
    }
    set
    {
        this._Calculated = value;
    }
}

#endregion
After I implemented those changes, the Calculate button took the Calculated field into account.

Maintenance pages in Acumatica

Hello everybody.

Here go some of my notes on maintenance pages.

The first convention is that maintenance page numbers start with 20. For example, pr203000.aspx is a maintenance page for the pr module; I make this conclusion based on the fact that its number starts with 20.

Usually they are placed under the Manage group in the sitemap and are used for entering auxiliary data, not the main data.

Make grid fill the whole screen

Hello everybody,

today I want to share how to make a grid fill its whole container. For this purpose you can just use the AutoSize property. It makes the grid fill the entire area of its parent container.
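In page markup it can look like this (a sketch; the grid id, data source id, and view are illustrative):

```aspx
<px:PXGrid ID="grid" runat="server" DataSourceID="ds" Width="100%">
    <%-- AutoSize stretches the grid to the parent container --%>
    <AutoSize Enabled="True" Container="Window" />
</px:PXGrid>
```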

Create graph instance

Hello everybody,

today I want to note how to create a graph instance. There are two ways:





If you want to get an extension class from a base class instance, you can use the following function:
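The code snippets did not survive in this post; here is a sketch of what is likely meant, assuming the standard Acumatica APIs (PXGraph.CreateInstance and GetExtension are real; the graph APBillManager and the extension class name are illustrative):

```csharp
// Way 1: the plain constructor.
var graph1 = new APBillManager();

// Way 2: the factory method, which Acumatica generally recommends.
var graph2 = PXGraph.CreateInstance<APBillManager>();

// Getting an extension class from the base graph instance
// (APBillManagerExtension is a hypothetical extension name):
var ext = graph2.GetExtension<APBillManagerExtension>();
```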