Acumatica Predefined Width Values



I want to share the predefined options for the ColumnWidth property:

• XXS (100px)

• XS (150px)

• S (200px)

• M (250px)

• XM (300px)

• L (350px)

• XL (400px)

• XXL (450px)
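For reference, the same mapping can be written down as a C# dictionary. This is just an illustration of the name-to-pixel mapping above, not an Acumatica API:

```csharp
using System.Collections.Generic;

// predefined ColumnWidth names and the pixel widths they stand for
var columnWidths = new Dictionary<string, int>
{
    ["XXS"] = 100, ["XS"] = 150, ["S"] = 200, ["M"] = 250,
    ["XM"] = 300, ["L"] = 350, ["XL"] = 400, ["XXL"] = 450
};
```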



Restore Original Condition Of Acumatica


I want to note how to restore the original condition of Acumatica if an Unpublish of a customization project has failed:

1. Restore the content of the App_Data\RollbackFiles folder to the root folder of the website.

2. Clear the content of the CstPublished folder.

3. Delete the files placed in the Caches subfolder of the App_Code website folder.

4. Remove all external files deployed on the website.
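The first three steps above can be sketched in code. This is a minimal, hypothetical C# sketch: the sitePath value is mine, and the assumption that plain file copy/delete operations are enough is also mine:

```csharp
using System.IO;

var sitePath = @"C:\AcumaticaSites\MySite"; // hypothetical site root

// 1. Restore the content of App_Data\RollbackFiles to the site root.
var rollback = Path.Combine(sitePath, "App_Data", "RollbackFiles");
foreach (var file in Directory.GetFiles(rollback, "*", SearchOption.AllDirectories))
{
    var target = Path.Combine(sitePath, file.Substring(rollback.Length + 1));
    Directory.CreateDirectory(Path.GetDirectoryName(target));
    File.Copy(file, target, overwrite: true);
}

// 2. Clear the content of the CstPublished folder.
foreach (var file in Directory.GetFiles(Path.Combine(sitePath, "CstPublished"), "*", SearchOption.AllDirectories))
    File.Delete(file);

// 3. Delete the files in the Caches subfolder of App_Code.
foreach (var file in Directory.GetFiles(Path.Combine(sitePath, "App_Code", "Caches")))
    File.Delete(file);

// 4. Removing external files deployed on the website stays a manual step,
//    because only you know which files were deployed.
```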



Acumatica Can Have Two Views With Main DAC


Hello everybody,

today I want to share an option which was shocking for me. I discovered that a graph can have more than one data view for the same main DAC. Here is a sample of code from the T200 manual:

public PXSelect<Product> Products;

public PXSelect<Product,
    Where<Product.productID, Equal<Current<Product.productID>>>>
    CurrentProduct; // the view name here is an assumption; the original omits it


Here is also a continuation of the manual explaining why on Earth you could need this:

These two data views can only be used as data members for UI containers that display the same data record at a time. In such a definition of data views, the first one is used to display brief information of a product on a form, and the second one is used to display the detail information of the same product on a tab. For both data views, the Current property of the PXCache cache object returns the same Product data record. If a user selects a data record in one UI container, the same data record appears in the second container.



Encog Training


Hello everybody,

suppose you have read two of my previous notes about network creation and basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right way. One of the options which you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);

Backpropagation is one of the training algorithms. Other training algorithms of Encog are: LMA (Levenberg-Marquardt), simulated annealing, quick propagation, Manhattan update rule, scaled conjugate gradient and others, which I haven't tried yet.

In the mentioned code, train is the Backpropagation training algorithm, which takes _network and trainSet as parameters and applies backpropagation to them.

0.01 in our case means an error of 1%.

Another term which is often used as a substitute for iteration is epoch. So don't be confused if somebody interchanges epoch and iteration.
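Since the loop above runs until the error drops below the threshold, it can spin forever if the network never converges. A common safeguard (my addition, not something from the Encog docs) is to also cap the number of epochs:

```csharp
var train = new Backpropagation(_network, trainSet);
int epoch = 0;
const int maxEpochs = 10000; // the cap is an arbitrary choice
do
{
    train.Iteration();
    epoch++;
} while (train.Error > 0.01 && epoch < maxEpochs);
```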



Backpropagation Encog


Here is the Backpropagation algorithm declaration of Encog:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.


Here we have an error function with one global minimum and three local minima. In order to jump out of a local minimum and run into the global minimum, the neural network can take into account the previous modification of the weights. Momentum is a coefficient which controls what part of the previous iteration's update is taken into account. If it is 1, then the previous update is taken into account completely. If it is 0, then the previous update is ignored.
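As a plain C# sketch of the idea (this is the textbook momentum update rule, not Encog internals; all names and values are my own):

```csharp
double learningRate = 0.7;
double momentum = 0.3;
double previousDelta = 0.0; // remembered from the previous iteration
double weight = 0.5;        // one weight of the network
double gradient = 0.2;      // made-up gradient of the error w.r.t. this weight

// momentum update: part of the previous change is carried into the new one
double delta = -learningRate * gradient + momentum * previousDelta;
weight += delta;
previousDelta = delta; // stored for the next iteration
```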



Encog Compute



Here are some other generalizations of how to use Encog.

For getting the result of a network you can use the Compute method:

var output = network.Compute(input);

If we want to get results for a bigger number of items, we can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}



Encog BasicMLDataSet


Hello everybody,

today I want to share a few words about my learning of Encog.

Let's say you have an array of 15 doubles:

double[] s = new double[15];

Then for a simple case you can use the BasicMLData class:

IMLData data = new BasicMLData(s); 

Now data can be used as input to any neural network.

The next point to consider is inputting a bigger set of values.

Suppose you want to have input for XOR:

double[][] xorInput =
{
    new[] {0.0, 0.0},
    new[] {1.0, 0.0},
    new[] {0.0, 1.0},
    new[] {1.0, 1.0}
};
// output
double[][] xorIdeal =
{
    new[] {0.0},
    new[] {1.0},
    new[] {1.0},
    new[] {0.0}
};

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

Now you can use the trainSet BasicMLDataSet in order to feed a neural network.
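Putting it together with the earlier training note, a sketch of feeding trainSet into a network might look like this (the layer sizes and the sigmoid activation are my assumptions for the XOR case):

```csharp
// build a small 2-3-1 feedforward network
var network = new BasicNetwork();
network.AddLayer(new BasicLayer(null, true, 2));                     // input: 2 neurons
network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 3));  // hidden layer
network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1)); // output: 1 neuron
network.Structure.FinalizeStructure();
network.Reset(); // randomize weights

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);
var train = new Backpropagation(network, trainSet);
do
{
    train.Iteration();
} while (train.Error > 0.01);
```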



How To Add Menu To Button In Acumatica


Hello everybody,

today I want to share a trick which I call converting an Acumatica button into a menu.

Let's say in the graph APBillManager a button is created in the following way:

public PXAction<APBill> Report;

If you want to convert it to a menu with one item you can do the following:

public APBillManager()
{
    // attach the menu item to the Report button
    Report.AddMenuAction(bankStatementReport);
}

public PXAction<APBill> bankStatementReport;
[PXUIField(DisplayName = "Bank Statement")]
[PXButton]
protected void BankStatementReport()
{
    // handler of the menu item
}



Normalization And Scaling In Neural Networks


Hello everybody.

I'm taking a Coursera course about neural networks.

Today I discovered for myself the reason why normalization and scaling in neural networks provide faster learning. Everything is related to the error surface and optimization. To put it simply, the task of a neural network is to find the global minimum on the error surface. Learning algorithms of neural networks gradually move along the error surface in order to finally find the global minimum of the error surface. Going to the global minimum on a circle is faster than going to the global minimum on an ellipse or another kind of error surface.

Suppose we have training for neural network with two samples:

101, 101 -> 2

101, 99 -> 0

Then the error surface will be oval, and convergence will be relatively slow. But if we normalize the data from the range [99; 101] to the range [-1; 1], we will get an error surface like a circle, which converges much faster.

See the picture:

The same is true for the case of scaling. Let's say we have two inputs, and two outputs:

0.1, 10 -> 2

0.1, -10 -> 0

If we scale the second input, look how the error surface changes:
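The normalization itself can be sketched as a small helper (plain C#, my own function, not an Encog call) that linearly maps values from [min; max] to [-1; 1]:

```csharp
static double Normalize(double value, double min, double max)
{
    // linear mapping from [min; max] to [-1; 1]
    return 2.0 * (value - min) / (max - min) - 1.0;
}

// For the sample above, the range [99; 101] maps as:
// Normalize(99, 99, 101)  -> -1
// Normalize(100, 99, 101) ->  0
// Normalize(101, 99, 101) ->  1
```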



Enable Disable Button Of Grid Or PXToolBarButton Which Depends On Value Of Column In Acumatica


Hello everybody.

My next note is about the following case.

Suppose you have form PR301000, which has a grid with id "grid" and a button Calculate. The grid also has a column which is bound to the field Calculated, and you need the following:

If in the selected row the field "Calculated" is checked, then the button Calculate should be disabled. If in the selected row the field "Calculated" is unchecked, then the button Calculate should be enabled.

In order to implement this, the following should be done:

1. In grid at page pr301000:

        <ActionBar ActionsText="True">


                <px:PXToolBarButton Text="Calculate" DependOnGrid="grid" StateColumn="Calculated">

   <AutoCallBack Command="Calculate" Target="ds" >





2. In the ds section write the following:

<px:PXDataSource ID="ds" runat="server" Visible="True" Width="100%" PrimaryView="PayRolls" SuspendUnloading="False" TypeName="DS.PayRollManager">
    <CallbackCommands>
        <px:PXDSCallbackCommand Name="Calculate" Visible="False" DependOnGrid="grid" />
    </CallbackCommands>
</px:PXDataSource>

3. The declaration in the DAC class should be the following:

#region Calculated
public abstract class calculated : PX.Data.IBqlField
{
}
protected bool? _Calculated;
[PXDBBool] // assuming a bound field, since the grid column is bound to it
[PXDefault(false, PersistingCheck = PXPersistingCheck.Nothing)]
[PXUIField(DisplayName = "Calculated")]
public virtual bool? Calculated
{
    get
    {
        return this._Calculated;
    }
    set
    {
        this._Calculated = value;
    }
}
#endregion

After I implemented those changes, the button Calculate took the field Calculated into account.
