How to get top 1 record from db in Acumatica

Hello everybody,

today I want to share a trick that can sometimes be useful. Imagine that you constructed some BQL query and want to get only one record from the db, omitting the others; in other words, you need TOP 1. For this purpose you can use the SelectSingle method, which generates a SQL statement that returns TOP 1 record and therefore executes faster.
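
A minimal sketch of how it can be used (the SOOrder DAC, the graph context and the order number variable here are assumptions for illustration, not part of the original post):

// sketch: fetch a single record; SelectSingle emits TOP 1 in the generated SQL
SOOrder order = PXSelect<SOOrder,
        Where<SOOrder.orderNbr, Equal<Required<SOOrder.orderNbr>>>>
    .SelectSingle(this, orderNbr);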

Asserts in NUnit

Hello everybody,

just a short note about the NUnit function Assert.That:

[Test]
public void CheckSubtraction()
{
    // new constraint-based style
    Assert.That(CalculatorClass.Minus(5, 2), Is.EqualTo(3));
}

[Test]
public void CheckSubtractionOldStyle()
{
    // old classic style
    Assert.AreEqual(3, CalculatorClass.Minus(5, 2));
}

Unit test abbreviations

Hello everybody,

some abbreviations:

SUT - system under test, also known as AUT (application under test), MUT (module/method under test), or CUT (class/code under test)

DUT - device under test

DOC - depended-on component

What makes a good Unit test

Hello everybody,

today, some notes on what is considered to be a good unit test.

1. Tests should be independent and isolated.

For example, if you have functions a, b, c tested, then the order in which the tests run shouldn't affect the result.

2. Each test should test a single behaviour or logical concern (see the sketch after this list).

To use a phone as an example, calling and sending an SMS shouldn't be tested in one function.

3. A clear purpose that is easily understood.

4. Don't test the compiler (e.g. plain writing/reading to the db).

5. Reliable and repeatable (gives the same result every run).

6. The same quality as other parts of the solution.

7. Valuable for developers
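
To illustrate point 2, here is a minimal sketch (the Phone class, its members and the test data are hypothetical, just to show one behaviour per test):

[TestFixture]
public class PhoneTests
{
    [Test]
    public void Call_DialsTheNumber()
    {
        var phone = new Phone();

        phone.Call("555-0100");

        // only the calling behaviour is verified here
        Assert.That(phone.LastDialedNumber, Is.EqualTo("555-0100"));
    }

    [Test]
    public void SendSms_StoresTheMessage()
    {
        var phone = new Phone();

        phone.SendSms("555-0100", "Hello");

        // only the SMS behaviour is verified here
        Assert.That(phone.SentMessages.Count, Is.EqualTo(1));
    }
}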

Email control in Acumatica

Hello everybody,

just a short glimpse of how to add an email control in Acumatica. It's a very simple task: declare a string field in a DAC class and bind it to the page.
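
A minimal sketch of such a field (a hypothetical UsrPersonalMail field on a DAC extension; the attribute parameters are assumptions):

public abstract class usrPersonalMail : IBqlField { }

[PXDBString(255, IsUnicode = true)]
[PXUIField(DisplayName = "Personal Email")]
public virtual string UsrPersonalMail { get; set; }

On the page, the field is bound in the following way: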

<px:PXMailEdit ID="edUsrPersonalMail" runat="server" DataField="UsrPersonalMail" CommitChanges="True" ></px:PXMailEdit>

That's it, you'll get an email control.

Encog propagation training algorithms

Hello everybody,

today I want to describe in simple words some of the training algorithms in Encog.

Before I continue, here is the general block schema that the training algorithms follow: initialize the NN, then, while the NN error is greater than the acceptable error, update the weights according to the learning algorithm.

Initializing the NN can look like this:

public BasicNetwork CreateNetwork()
{
   var network = new BasicNetwork();
   network.AddLayer(new BasicLayer(WindowSize)); // input layer, WindowSize neurons
   network.AddLayer(new BasicLayer(10));         // hidden layer
   network.AddLayer(new BasicLayer(1));          // output layer
   network.Structure.FinalizeStructure();
   network.Reset();                              // randomize the weights
   return network;
}

Steps "NN error < acceptable error" -> "Update weights according to learning algorithim" can look like this:

public void Train(BasicNetwork network, IMLDataSet training, double acceptableError)
{
    // resilient propagation is used here as the learning algorithm
    ITrain train = new ResilientPropagation(network, training);
    int epoch = 1;
    do
    {
        train.Iteration();
        Console.WriteLine(@"Epoch #" + epoch + @" Error:" + train.Error);
        epoch++;
    } while (train.Error > acceptableError);
}

The first algorithm in my description is the Backpropagation algorithm. A few of its characteristics:

1. probably the oldest training algorithm

2. as the name says, errors are propagated back through the network

3. usage:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

The next learning algorithm is the Manhattan update rule. A few of its characteristics:

1. uses only the sign of the gradient

2. updates each weight by a constant value (depending on the sign, it is added to or subtracted from the weight)

3. usage:

var train = new ManhattanPropagation(network, trainingSet, constantValue);

Usually constantValue is very small (around 0.00001).

Another one is the Quick propagation algorithm. Features of this method:

1. based on Newton's method of error minimization

2. uses a learning rate with a higher value (I used values starting from 2)

3. usage:

var train = new QuickPropagation(network, trainingSet, learningRate);

One more training algorithm is the Resilient propagation algorithm. Some details of this method:

1. one of the fastest algorithms in Encog

2. easiest usage (you don't need to care about a learning rate, momentum or constant value)

3. it uses the sign of the gradient value

4. it computes a separate delta value for each weight and uses it to update that weight

5. example of usage:

var train = new ResilientPropagation(network, trainingSet);

6. it has four variants of resilient propagation: RPROP+, RPROP-, iRPROP+, iRPROP-. iRPROP+ is considered the best.

Next is the Scaled conjugate gradient. Its features:

1. uses the conjugate gradient method

2. not applicable to all datasets

3. doesn't require any learning parameters

4. example of usage:

var train = new ScaledConjugateGradient(network, trainingSet);

And the last propagation algorithm is the Levenberg Marquardt algorithm (aka LMA). This is my favorite method. Its features:

1. something in between the Gauss-Newton algorithm (GNA) and gradient descent

2. easy to use (no learning parameters)

3. example of creation:

var train = new LevenbergMarquardtTraining(network, trainingSet);

4. sometimes it can be very efficient.

Transfer to new Acumatica version

Hello,

today I had the task of switching to a new version of Acumatica: from 4.2 to 5.1.

The first surprise I faced was the lack of .NET Framework 4.5.1. This surprised me especially because I had Visual Studio 2012 with Update 4 installed. Then I found that I needed the "Microsoft .NET Framework 4.5.1 Developer Pack for Windows Vista SP2, Windows 7 SP1, Windows 8, Windows 8.1, Windows Server 2008 SP2, Windows Server 2008 R2 SP1, Windows Server 2012 and Windows Server 2012 R2".

T200 Acumatica certificate

Hello everybody,

I want to boast that I finally received the T200 Acumatica certificate! Now I can proudly say that I'm a certified Acumatica developer who has earned the following certificates:

  • T100

  • T101

  • T200

  • T300

  • T900

How to get tstamp in Acumatica

Hello everybody,

today I want to share a small note on how to generate a timestamp for Acumatica objects (in case you use PXDatabase.Insert or PXDatabase.Update for some reason).

PXDatabase has a public method SelectTimeStamp:

public static byte[] SelectTimeStamp()
{
    return Provider.SelectTimeStamp();
}

So, in case you need to put the timestamp into a variable t, you can do the following:

var t = PXDatabase.SelectTimeStamp();

and the variable t will contain the timestamp.
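
As a quick illustration, the value can then be passed along with the other fields in a direct insert (a hypothetical sketch: the MyTable DAC and its fields are assumptions, not part of the original post):

// hypothetical sketch: insert a record into a custom table and
// set its timestamp column explicitly
PXDatabase.Insert<MyTable>(
    new PXDataFieldAssign<MyTable.description>("Test record"),
    new PXDataFieldAssign<MyTable.tStamp>(PXDatabase.SelectTimeStamp()));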

Indiegogo

I want to start a campaign to build a machine for my neural network investigations.

Here is the link if you want to participate:

http://igg.me/p/neural-network-time-series-forecaster/x/10380153