Order of views in Graph in Acumatica

Hello everybody,

note of today is about the importance of order. In the world of C#, when you work with a class, you usually don't care about the order in which you declare its members. But in an Acumatica graph the order of data views is important, because it defines the order in which data is saved to the database. If you think the surprises are over, here is another one: the order of views does not define the order in which the data views are executed. And a third one: the view that is bound to PrimaryView should be declared first in the graph.
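To illustrate, here is a minimal sketch of such a graph. The graph name and the SOOrder/SOLine DACs are hypothetical placeholders, not code from a real customization:

```csharp
// A minimal sketch; OrderEntrySample, SOOrder and SOLine are hypothetical names.
public class OrderEntrySample : PXGraph<OrderEntrySample, SOOrder>
{
    // The view bound to PrimaryView ("Orders") is declared first.
    public PXSelect<SOOrder> Orders;

    // The detail view is declared after the primary view, so on save
    // the order headers are persisted to the database before their lines.
    public PXSelect<SOLine,
        Where<SOLine.orderNbr, Equal<Current<SOOrder.orderNbr>>>> OrderLines;
}
```

Declaring Orders before OrderLines is what guarantees that header records reach the database before the detail records that reference them.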

Enabling Reusable Grid Filters

Hello everybody,

today I want to note reusable grid filters. Acumatica has an interesting dialog window named Filter Settings, in which a user can define and save custom filters and then use them every time this user opens the page. They are recommended for inquiry and processing pages, so users can customize these pages to show the specific data that is most relevant to their needs and responsibilities. If you wonder how to convert an ordinary view into a filterable one, here is the way: the PXFilterable attribute.

[PXFilterable]
public PXSelectReadonly<AnticipatedPayrollDetail> AnticipatedPayrollDetails;

I applied it to my page, and the filtering option appeared on the grid.

So, if you need to add a filtering option to your screen, the PXFilterable attribute is your friend.

Order columns in Grid in Acumatica

Hello everybody,

Today I want to share with you an interesting trick for how to order data in a grid by default. The simplest way is to mark the relevant DAC fields with IsKey = true. If you do this, Acumatica will add an ORDER BY clause for each key field of the DAC.
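As a sketch (the DAC and field names here are hypothetical), a key field is declared like this:

```csharp
// Hypothetical DAC; marking OrderNbr with IsKey = true makes Acumatica
// include it in the ORDER BY clause generated for the grid.
public class SampleOrder : IBqlTable
{
    [PXDBString(15, IsKey = true, IsUnicode = true)]
    [PXUIField(DisplayName = "Order Nbr.")]
    public virtual string OrderNbr { get; set; }
    public abstract class orderNbr : IBqlField { }
}
```

If several fields are marked as keys, they all end up in the ORDER BY clause, in declaration order.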

Acumatica predefined width values


I want to share the predefined options for the ColumnWidth property:


• XXS (100 px)

• XS (150 px)

• S (200 px)

• M (250 px)

• XM (300 px)

• L (350 px)

• XL (400 px)

• XXL (450 px)

Restore original condition of Acumatica

I want to note how to restore the original condition of Acumatica if unpublishing a customization project has failed:

1. Restore the content of the App_Data\RollbackFiles folder to the root folder of the website.

2. Clear the content of the CstPublished folder.

3. Delete the files placed in the Caches subfolder of the App_Code website folder.

4. Remove all external files deployed on the website.

Acumatica can have two views with main DAC

Hello everybody,

today I want to share an option which was shocking for me. I discovered that a graph can have more than one data view for the same main DAC. Here is a sample of code from the T200 manual:

public PXSelect<Product> Products;

public PXSelect<Product,
    Where<Product.productID, Equal<Current<Product.productID>>>> ProductDetails;


Here is also the continuation of the manual, explaining why on Earth you could need this:

These two data views can only be used as data members for UI containers that display the same data record at a time. In such a definition of data views, the first one is used to display brief information of a product on a form, and the second one is used to display the detail information of the same product on a tab. For both data views, the Current property of the PXCache cache object returns the same Product data record. If a user selects a data record in one UI container, the same data record appears in the second container.
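Put together, a graph holding both views might look like this sketch (the graph name ProductBrowse and the second view name are my own placeholders; Product is the DAC from the manual):

```csharp
public class ProductBrowse : PXGraph<ProductBrowse, Product>
{
    // Data member for the form showing brief product info.
    public PXSelect<Product> Products;

    // Data member for the tab showing details of the same product;
    // Current<> keeps both views pointed at the same record.
    public PXSelect<Product,
        Where<Product.productID, Equal<Current<Product.productID>>>> ProductDetails;
}
```

Because the second view filters on Current&lt;Product.productID&gt;, selecting a record through one data member automatically makes it the record shown by the other.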

Encog Training

Hello everybody,

suppose you have read two of my previous notes about creating a network and basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right way. One of the options you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);

Backpropagation is one of the training algorithms. Other training algorithms in Encog are: LMA (Levenberg-Marquardt), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient and others, which I haven't tried yet.

In the code above, train is the Backpropagation training algorithm, which takes _network and trainSet as parameters and applies backpropagation to them.

0.01 in our case means 1 %.

Another term which is often used as a substitute for iteration is epoch. So don't be confused if somebody interchanges epoch and iteration.
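Here is a slightly more complete sketch of the same loop, with an epoch counter and a safety cap on the number of epochs (the cap of 10000 is my own addition, not something Encog requires):

```csharp
// Assumes _network is a finalized BasicNetwork and trainSet is an
// IMLDataSet with input/ideal pairs, as in the snippet above.
var train = new Backpropagation(_network, trainSet);
int epoch = 0;
do
{
    train.Iteration();   // one epoch (pass over the training set)
    epoch++;
    Console.WriteLine($"Epoch {epoch}, error {train.Error}");
} while (train.Error > 0.01 && epoch < 10000); // stop at 1% error or at the cap
```

The cap protects you from an endless loop when the network cannot reach the target error with the chosen topology.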

Encog create simple network

Hello everybody,

today I want to share how to create a simple neural network in Encog. It's a very simple process:

var network = new BasicNetwork();

Each neural network has layers.

Example of layer creating: 

network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));

The first parameter of BasicLayer is the activation function, which in our case is ActivationSigmoid.

The second parameter is the bias neuron. True means that the layer will also have a bias neuron.

The third parameter represents the number of neurons in the layer.

If you think that creating the network is enough for training, you are wrong. As with a lot of stuff in our world, in Encog you need to call FinalizeStructure. It looks like this:

network.Structure.FinalizeStructure();

If you think that FinalizeStructure is the last step, then here is one more disappointment. You also need to call:

network.Reset();

This last call will initialize the network with an initial set of random weights.

If we collect it all together in one function, it will look like this:

public static BasicNetwork CreateNeuralNetwork()
{
    var network = new BasicNetwork();
    network.AddLayer(new BasicLayer(null, true, 2));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 3));
    network.Structure.FinalizeStructure();
    network.Reset();
    return network;
}

It will give you a neural network with three layers. The first layer has two neurons and doesn't use any activation function. The second layer uses sigmoid as the activation function and has a bias neuron. The third layer also uses ActivationSigmoid but doesn't have a bias neuron. FinalizeStructure can also be considered as telling Encog that we are done constructing the NN.


Backpropagation Encog

Here is the declaration of the Backpropagation algorithm in Encog:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.

Imagine an error function with a global minimum and three local minima. In order to jump out of a local minimum and get into the global minimum, the neural network can take into account the previous modification of the weights. Momentum is the coefficient which controls what part of the previous weight update is taken into account. If it is 1, the previous update is taken into account completely. If it is 0, the previous update is ignored.
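For example, a call with concrete values (the 0.7 learning rate and 0.3 momentum are illustrative numbers, not recommendations):

```csharp
// 0.7 is the learning rate; 0.3 is the momentum, meaning 30% of the
// previous weight update is added to the current one, which helps the
// search roll through shallow local minima.
var train = new Backpropagation(network, trainingSet, 0.7, 0.3);
```

Reasonable values are problem-specific; momentum close to 1 can overshoot, while 0 disables the effect entirely.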

Encog compute


Some other notes on how to use Encog.

To get the result of the network for a single input, you can use the Compute method:

var output = network.Compute(input);

If we want to get results for multiple items, we can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}