Acumatica can have two views with the same main DAC

Hello everybody,

today I want to share an option which was shocking for me. I discovered that a graph can have more than one data view for the same main DAC. Here is a sample of code from the T200 manual:

public PXSelect<Product> Products;

public PXSelect<Product,
    Where<Product.productID, Equal<Current<Product.productID>>>>
    ProductDetails;

Here is also the continuation of the manual, explaining why on Earth you could need this:

These two data views can only be used as data members for UI containers that display the same data record at a time. In such a definition of data views, the first one is used to display brief information of a product on a form, and the second one is used to display the detail information of the same product on a tab. For both data views, the Current property of the PXCache object returns the same Product data record. If a user selects a data record in one UI container, the same data record appears in the second container.
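To make it concrete, here is roughly how the two containers could be wired up on the page (a trimmed sketch; the control IDs and the ds data source are my placeholders, not code from the manual):

<px:PXFormView ID="form" runat="server" DataSourceID="ds" DataMember="Products">
    ...
</px:PXFormView>

<px:PXTab ID="tab" runat="server" DataSourceID="ds" DataMember="ProductDetails">
    ...
</px:PXTab>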

Encog Training

Hello everybody,

suppose you have read two of my previous notes, about creating a network and basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right way. One of the options you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);

Backpropagation is one of the training algorithms. Other training algorithms in Encog are LMA (Levenberg-Marquardt), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient and others, which I haven't tried yet.

In the code above, train is the Backpropagation training algorithm, which takes _network and trainSet as parameters and applies backpropagation to them.

0.01 in our case means an error of 1%.

Another term which is often used as a substitute for iteration is epoch. So don't be confused if somebody uses epoch and iteration interchangeably.
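To tie the pieces together, here is a minimal sketch of a training loop that also counts epochs (assuming _network and trainSet are built as in my previous notes):

var train = new Backpropagation(_network, trainSet);
int epoch = 1;
do
{
    train.Iteration();
    // one Iteration() call is one epoch over the training set
    Console.WriteLine("Epoch #" + epoch + " error: " + train.Error);
    epoch++;
} while (train.Error > 0.01);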

Encog create simple network

Hello everybody,

today I want to share how to create a simple neural network in Encog. It's a very simple process:

var network = new BasicNetwork();

Each neural network has layers.

Example of creating a layer:

network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));

The first parameter of BasicLayer is the activation function, which in our case is ActivationSigmoid.

The second parameter is the bias neuron. True means that the layer will also have a bias neuron.

The third parameter represents the number of neurons in the layer.

If you think that creating the network is enough for training, you are wrong. As with a lot of stuff in our world, in Encog you need to call FinalizeStructure. It looks like this:

network.Structure.FinalizeStructure();

If you think that FinalizeStructure is the last step, then here is one more disappointment. You also need to call:

network.Reset();

The last call initializes the network with an initial set of random weights.

Collecting it all together in one function, it will look like this:

public static BasicNetwork CreateNeuralNetwork()
{
    var network = new BasicNetwork();
    network.AddLayer(new BasicLayer(null, true, 2));                     // input layer: 2 neurons, bias, no activation
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 5));  // hidden layer: 5 neurons, bias, sigmoid
    network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 3)); // output layer: 3 neurons, no bias, sigmoid
    network.Structure.FinalizeStructure();
    network.Reset();
    return network;
}

It will give you a neural network with three layers. The first layer has two neurons and doesn't use any activation function. The second layer uses sigmoid as the activation function and has a bias neuron. The third layer also uses ActivationSigmoid but doesn't have a bias neuron. FinalizeStructure can also be considered as telling Encog that we are done with constructing the NN.

 

Backpropagation Encog

Here is the declaration of Encog's Backpropagation algorithm:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.

Imagine an error function with a global minimum and three local minima. In order to jump out of a local minimum and reach the global minimum, the neural network can take into account the previous modification of the weights. Momentum is a coefficient which controls what part of the previous iteration's update is taken into account. If it is 1, then the previous update is taken into account completely. If it is 0, then the previous update is ignored.
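Schematically, the classic momentum weight update looks like this (a textbook formula, not necessarily Encog's exact internals):

weightChange(t) = -learningRate * gradient(t) + momentum * weightChange(t - 1)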

Encog compute

Hello.

Some more general notes on how to use Encog.

To get the output of a network you can use the Compute method:

var output = network.Compute(input);

If we want to get results for a bigger number of items, we can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}
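If the set also contains ideal values, as a training set does, you can compare them with the actual output. A small sketch, assuming a network with a single output neuron:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
    // item.Ideal holds the expected values of the data pair
    Console.WriteLine("ideal: " + item.Ideal[0] + ", actual: " + output[0]);
}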

How to add a menu to a button in Acumatica

Hello everybody,

today I want to share a trick which I call converting an Acumatica button into a menu.

Let's say in the graph APBillManager a button is created in the following way:

public PXAction<APBill> Report;

If you want to convert it into a menu with one item, you can do the following:

public APBillManager()
{
    this.Report.AddMenuAction(bankStatementReport);
}

public PXAction<APBill> bankStatementReport;
[PXButton]
[PXUIField(DisplayName = "Bank Statement")]
protected void BankStatementReport()
{
}

Encog BasicMLDataSet

Hello everybody,

today I want to share a few words about my learning of Encog.

Let's say you have an array of 15 doubles:

double[] s = new double[15];

Then for a simple case you can use the BasicMLData class:

IMLData data = new BasicMLData(s); 

Now data can be used to feed any neural network.

The next point to consider is inputting bigger amounts of data.

Suppose you want to have XOR as input:

double [][] xorInput = 
{
   new []{0.0, 0.0},
   new []{1.0, 0.0},
   new []{0.0, 1.0},
   new []{1.0, 1.0}
};
// output
double [][] xorIdeal = 
{
   new []{0.0},
   new []{1.0},
   new []{1.0},
   new []{0.0}
};

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

Now you can use the trainSet BasicMLDataSet in order to feed a neural network.
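As a quick check, you can enumerate the pairs of the data set and print them (a small sketch for the XOR set above; each data pair exposes Input and Ideal):

foreach (var pair in trainSet)
{
    Console.WriteLine(pair.Input[0] + ", " + pair.Input[1] + " -> " + pair.Ideal[0]);
}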

Normalization and scaling in neural networks

Hello everybody.

I'm taking a Coursera course about neural networks.

Today I discovered for myself the reason why normalization and scaling in neural networks provide faster learning. Everything is related to the error surface and optimization. To put it simply, the task of a neural network is to find a global minimum on the error surface. Neural network learning algorithms gradually move along the error surface in order to finally find the global minimum of the error surface. Moving towards the global minimum on a circle goes faster than moving towards it on an ellipse or another kind of error surface.

Suppose we have a training set for a neural network with two samples:

101, 101 -> 2

101, 99 -> 0

Then the error surface will be an oval, and convergence will be relatively slow. But if we normalize the data from the range [99; 101] to the range [-1; 1], we will get an error surface like a circle, which converges much faster.
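A generic min-max normalization to [-1; 1] can be written like this (my own helper for illustration, not an Encog API):

static double Normalize(double value, double min, double max)
{
    // maps [min; max] linearly onto [-1; 1]
    return 2.0 * (value - min) / (max - min) - 1.0;
}

// Normalize(99, 99, 101)  -> -1
// Normalize(101, 99, 101) -> 1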


The same is true for scaling. Let's say we have two samples, each with two inputs and one output:

0.1, 10 -> 2

0.1, -10 -> 0

If we scale the second input down to the same range as the first one, the error surface changes in the same way: it becomes closer to a circle and converges faster.

Enable or disable a grid button (PXToolBarButton) depending on the value of a column in Acumatica

Hello everybody.

My next note is about the following case.

Suppose you have form PR301000, which has a grid with id "grid" and a button Calculate. The grid also has a column bound to the field Calculated, and you need the following:

If in the selected row the field Calculated is checked, then disable the button Calculate. If in the selected row the field Calculated is unchecked, then enable the button Calculate.

In order to implement this, the following should be done:

1. In the grid on page PR301000:

<ActionBar ActionsText="True">
    <CustomItems>
        <px:PXToolBarButton Text="Calculate" DependOnGrid="grid" StateColumn="Calculated">
            <AutoCallBack Command="Calculate" Target="ds" />
        </px:PXToolBarButton>
    </CustomItems>
</ActionBar>

2. In the ds section write the following:

<px:PXDataSource ID="ds" runat="server" Visible="True" Width="100%" PrimaryView="PayRolls" SuspendUnloading="False" TypeName="DS.PayRollManager">

        <CallbackCommands>

            <px:PXDSCallbackCommand Name="Calculate" Visible="False" DependOnGrid="grid">

            </px:PXDSCallbackCommand>

        </CallbackCommands>

</px:PXDataSource>

3. The declaration in the DAC class should be the following:

#region Calculated
public abstract class calculated : PX.Data.IBqlField
{
}
protected bool? _Calculated;
[PXDBBool()]
[PXDefault(false, PersistingCheck = PXPersistingCheck.Nothing)]
[PXUIField(DisplayName = "Calculated")]
public virtual bool? Calculated
{
    get
    {
        return this._Calculated;
    }
    set
    {
        this._Calculated = value;
    }
}
#endregion

After I implemented those changes, the button Calculate took the field Calculated into account.

Maintenance pages in Acumatica

Hello everybody.

Here go some of my notes on maintenance pages.

The first convention is that maintenance page numbers start with 20. For example, pr203000.aspx means that it is a maintenance page for PR; I make this conclusion on the basis that its number starts with 20.

Usually they are placed under the Manage group in the site map and are used for input of helper data, not the main data.