Backpropagation Encog

Here is the declaration of Encog's Backpropagation training algorithm:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

Today I discovered for myself the purpose of the momentum parameter.

Imagine an error function with one global minimum and three local minima. In order to jump out of a local minimum and reach the global minimum, a neural network can take into account the previous modification of its weights. Momentum is the coefficient which controls how much of the previous iteration's update is taken into account. If it is 1, the previous update is taken into account completely; if it is 0, the previous update is ignored.
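
To make this concrete, here is a minimal sketch of a single weight update with momentum in plain C#. This is not Encog's internal code; the learning rate, gradient and previous delta are example values.

// A sketch of one weight update with momentum (not Encog internals).
// With momentum = 1 the previous step is reused in full; with 0 it is ignored.
double learningRate = 0.7;
double momentum = 0.3;

double weight = 0.5;        // current weight (example value)
double gradient = 0.2;      // dE/dw for this weight (assumed given)
double previousDelta = 0.1; // weight change from the previous iteration

double delta = -learningRate * gradient + momentum * previousDelta;
weight += delta;
previousDelta = delta;      // remembered for the next iteration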


Encog Compute

Hello.

Here are some more general notes on how to use Encog.

To get the output of a network, you can use the Compute method:

var output = network.Compute(input);

If you want to get results for a larger number of items, you can use the following construction:

foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
}
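
As a small usage sketch, assuming trainingSet also carries ideal values (for example a BasicMLDataSet), you can print the computed output next to the ideal one:

// Compare the network's output with the ideal value for every item.
foreach (var item in trainingSet)
{
    var output = network.Compute(item.Input);
    Console.WriteLine("ideal = {0}, actual = {1}", item.Ideal[0], output[0]);
}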


Encog Training

Hello everybody,

suppose you have read two of my previous notes, about creating a network and about basic input into a neural network, and now you have a huge desire to train a neural network with Encog. You are on the right track. One of the options you can try is the following:

var train = new Backpropagation(_network, trainSet);
double error;
do
{
    train.Iteration();
    error = train.Error;
} while (error > 0.01);

Backpropagation is one of the training algorithms. Other training algorithms in Encog include Levenberg-Marquardt (LMA), simulated annealing, quick propagation, the Manhattan update rule, scaled conjugate gradient, and others, which I haven't tried yet.
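
Switching trainers usually only changes the constructor line. For example, a sketch with resilient propagation (Encog's ResilientPropagation class; the rest of the loop from above stays the same):

// Same training loop, different trainer: resilient propagation.
var train = new ResilientPropagation(_network, trainSet);
do
{
    train.Iteration();
} while (train.Error > 0.01);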

In the code above, train is a Backpropagation training object which takes _network and trainSet as parameters and applies backpropagation to them.

The 0.01 in our case means 1% error.

Another term which is often used as a substitute for iteration is epoch, so don't be confused if somebody uses epoch and iteration interchangeably.
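
A common way to watch training progress is to number the iterations and print the error once per epoch, for example:

// Log the error after every epoch (iteration).
int epoch = 1;
do
{
    train.Iteration();
    Console.WriteLine("Epoch {0}, error {1}", epoch, train.Error);
    epoch++;
} while (train.Error > 0.01);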


Encog BasicMLDataSet

Hello everybody,

today I want to share a few words about my learning of Encog.

Let's say you have an array of 15 doubles:

double[] s = new double[15];

Then for a simple case you can use the BasicMLData class:

IMLData data = new BasicMLData(s); 

Now data can be used as input to any neural network.

The next point to consider is inputting larger amounts of data.

Suppose you want to use XOR as input:

// input
double[][] xorInput =
{
    new []{0.0, 0.0},
    new []{1.0, 0.0},
    new []{0.0, 1.0},
    new []{1.0, 1.0}
};
// output
double[][] xorIdeal =
{
    new []{0.0},
    new []{1.0},
    new []{1.0},
    new []{0.0}
};

var trainSet = new BasicMLDataSet(xorInput, xorIdeal);

Now you can use the trainSet BasicMLDataSet to feed a neural network.
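
A quick way to check what ended up in the data set is to enumerate its pairs; each IMLDataPair exposes Input and Ideal:

// Print every input/ideal pair of the XOR data set.
foreach (IMLDataPair pair in trainSet)
{
    Console.WriteLine("{0}, {1} -> {2}",
        pair.Input[0], pair.Input[1], pair.Ideal[0]);
}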


How to add menu to button in Acumatica

Hello everybody,

today I want to share a trick which I call converting an Acumatica button into a menu.

Let's say that in the graph APBillManager a button is created in the following way:

public PXAction<APBill> Report;

If you want to convert it into a menu with one item, you can do the following:

public APBillManager()
{
    // Turn the Report button into a menu with one item.
    this.Report.AddMenuAction(bankStatementReport);
}

// The action that becomes the menu item. Note that the type parameter
// should match the graph's primary DAC (APBill here, not another DAC).
public PXAction<APBill> bankStatementReport;
[PXButton]
[PXUIField(DisplayName = "Bank Statement")]
protected void BankStatementReport()
{
}
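
If the menu needs more than one item, AddMenuAction can simply be called several times. A sketch, where secondAction is a hypothetical action declared the same way as bankStatementReport:

public APBillManager()
{
    this.Report.AddMenuAction(bankStatementReport);
    this.Report.AddMenuAction(secondAction); // hypothetical second menu item
}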


Normalization and scaling in neural networks

Hello everybody.

I'm taking a Coursera course about neural networks.

Today I discovered the reason why normalization and scaling in neural networks provide faster learning. It all comes down to the error surface and optimization. To put it simply, the task of a neural network is to find the global minimum of the error surface. Learning algorithms gradually move across the error surface in order to finally reach the global minimum, and moving toward the global minimum on a circular error surface goes faster than on an ellipse or some other elongated surface.

Suppose we have a training set for a neural network with two samples:

101, 101 -> 2

101, 99 -> 0

Then the error surface will be oval and convergence will be relatively slow. But if we normalize the data from the range [99; 101] to the range [-1; 1] (for example, by subtracting the mean of 100), we get an error surface close to a circle, which converges much faster.


The same is true for scaling. Let's say we have two samples with two inputs and one output:

0.1, 10 -> 2

0.1, -10 -> 0

If we scale the second input down (for example, divide it by 10 so that it also falls into [-1; 1]), the error surface changes in the same way: from a stretched ellipse toward a circle, and learning converges faster.
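
As a minimal sketch in plain C# (not an Encog API): mean-center and scale each input column to roughly [-1; 1] before building the data set:

using System.Linq;

// Normalize each input column to about [-1; 1] by subtracting the
// column mean and dividing by the half-range.
static void NormalizeColumns(double[][] input)
{
    int cols = input[0].Length;
    for (int c = 0; c < cols; c++)
    {
        double min = input.Min(row => row[c]);
        double max = input.Max(row => row[c]);
        double mean = (min + max) / 2.0;
        double halfRange = (max - min) / 2.0;
        if (halfRange == 0) continue; // constant column, nothing to scale
        foreach (var row in input)
            row[c] = (row[c] - mean) / halfRange;
    }
}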


Enable or disable a grid button (PXToolBarButton) depending on the value of a column in Acumatica

Hello everybody.

My next note is about the following case.

Suppose you have form PR301000, which has a grid with id "grid" and a button Calculate. The grid also has a column bound to the field Calculated, and you need the following behavior:

If the Calculated field in the selected row is checked, the Calculate button should be disabled; if it is unchecked, the button should be enabled.

In order to implement this, the following should be done:

1. In the grid on page PR301000:

<ActionBar ActionsText="True">
    <CustomItems>
        <px:PXToolBarButton Text="Calculate" DependOnGrid="grid" StateColumn="Calculated">
            <AutoCallBack Command="Calculate" Target="ds" />
        </px:PXToolBarButton>
    </CustomItems>
</ActionBar>

2. In the ds section, write the following:

<px:PXDataSource ID="ds" runat="server" Visible="True" Width="100%"
    PrimaryView="PayRolls" SuspendUnloading="False" TypeName="DS.PayRollManager">
    <CallbackCommands>
        <px:PXDSCallbackCommand Name="Calculate" Visible="False" DependOnGrid="grid" />
    </CallbackCommands>
</px:PXDataSource>

3. The declaration in the DAC class should be the following:

#region Calculated
public abstract class calculated : PX.Data.IBqlField
{
}
protected bool? _Calculated;

[PXDBBool()]
[PXDefault(false, PersistingCheck = PXPersistingCheck.Nothing)]
[PXUIField(DisplayName = "Calculated")]
public virtual bool? Calculated
{
    get
    {
        return this._Calculated;
    }
    set
    {
        this._Calculated = value;
    }
}
#endregion
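
For completeness, the Calculate command in the ds section presupposes a Calculate action in the graph. A hedged sketch of how it might be declared (the DAC name PRPayroll and the body are assumptions; the real action performs the actual calculation):

public PXAction<PRPayroll> Calculate;
[PXButton]
[PXUIField(DisplayName = "Calculate")]
protected virtual IEnumerable calculate(PXAdapter adapter)
{
    // ... perform the calculation and set Calculated = true ...
    return adapter.Get();
}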

After I implemented those changes, the Calculate button started taking the Calculated field into account.


Maintenance pages in Acumatica

Hello everybody.

Here are some of my notes about maintenance pages.

The first convention is that maintenance page numbers start with 20. For example, pr203000.aspx is a maintenance page of the PR module; I draw this conclusion from the fact that its number starts with 20.

They are usually placed under the Manage group in the sitemap and are used for input of helper data, not the main documents.


Make grid fill the whole screen

Hello everybody,

today I want to share how to make a grid fill its whole container. For this purpose you can simply use the AutoSize property: it makes the grid fill the entire area of its parent container. See the sketch below.
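
A sketch of what this looks like in page markup (typical values, adjust for your screen):

<px:PXGrid ID="grid" runat="server" DataSourceID="ds" Width="100%" SkinID="Details">
    <AutoSize Container="Window" Enabled="True" MinHeight="150" />
</px:PXGrid>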


Create graph instance

Hello everybody,

today I want to note how to create a graph instance. There are two ways:

PXGraph.CreateInstance<BaseBLC>();

PXGraph.CreateInstance(typeof(BaseBLC));

If you want to get an extension class from a base class instance, you can use the following function:

GraphInstance.GetExtension<ExtentionClass>();
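
Putting the two together, a small usage sketch (BaseBLC and ExtentionClass stand for your own graph and its extension):

// Create the graph instance, then ask it for its extension.
var graph = PXGraph.CreateInstance<BaseBLC>();
var ext = graph.GetExtension<ExtentionClass>();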
