What Makes A Good Unit Test

Hello everybody,

today some notes on what is considered to be a good unit test.

1. Tests should be independent and isolated.

For example, if you have functions a, b, and c tested, the sequence in which the tests run shouldn't affect the results.

2. Each test should test a single behaviour or logical unit.

To take a phone example, calling and sending an SMS shouldn't be tested in one function.

3. The test should have a clear, easily understood purpose.

4. Don't test the compiler or the framework (e.g., writing to or reading from a database).

5. Tests should be reliable and repeatable (always give the same result).

6. Test code should have the same quality as the other parts of the solution.

7. Tests should be valuable for developers.
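To illustrate points 1 and 2, here is a minimal NUnit sketch; the Phone class and its members are hypothetical:

```csharp
using NUnit.Framework;

[TestFixture]
public class PhoneTests
{
    // Each test builds its own Phone, so the tests stay independent
    // and can run in any order (point 1).
    [Test]
    public void Call_ConnectsToDialedNumber()
    {
        var phone = new Phone();
        phone.Call("555-0100");
        Assert.That(phone.IsConnected, Is.True);
    }

    // Sending an SMS is a separate behaviour, so it gets its own
    // test instead of being mixed into the call test (point 2).
    [Test]
    public void SendSms_ReturnsTrueOnSuccess()
    {
        var phone = new Phone();
        Assert.That(phone.SendSms("555-0100", "Hi"), Is.True);
    }
}
```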

Unit Test Abbreviations

Hello everybody,

some abbreviations:

SUT - system under test. Also known as AUT (application under test), MUT (module under test), or CUT (class/code under test).

DUT - device under test

DOC - depended-on component

What Are Asserts In Nunit

Hello everybody,

just a short note on the NUnit assertion Assert.That:


[Test]
public void CheckSubtraction()
{
    // constraint-based style
    Assert.That(CalculatorClass.Minus(5, 2), Is.EqualTo(3));

    // classic style (old)
    Assert.AreEqual(3, CalculatorClass.Minus(5, 2));
}

Email Control In Acumatica

Hello everybody,

just a short glimpse of how to add an email control in Acumatica. It's a very simple task: add a string field to your DAC class and bind it to the page in the following way:

<px:PXMailEdit ID="edUsrPersonalMail" runat="server" DataField="UsrPersonalMail" CommitChanges="True" ></px:PXMailEdit>
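On the DAC side, the bound field could be declared like this (a sketch; the string length and display name are assumptions):

```csharp
// Hypothetical DAC field backing the mail control above.
public abstract class usrPersonalMail : PX.Data.IBqlField { }

[PXDBString(255, IsUnicode = true)]
[PXUIField(DisplayName = "Personal Mail")]
public virtual string UsrPersonalMail { get; set; }
```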

That's it, you'll get an email control.

Encog Propagation Training Algorithms

Hello everybody,

today I want to describe in simple words some of the training algorithms in Encog.

Before I continue, here is the general flow of a training algorithm: initialize the NN, calculate the NN error, and while the error is greater than the acceptable error, update the weights according to the learning algorithm.

Initializing the NN can look like this:

public BasicNetwork CreateNetwork()
{
    var network = new BasicNetwork();
    network.AddLayer(new BasicLayer(WindowSize));
    network.AddLayer(new BasicLayer(10));
    network.AddLayer(new BasicLayer(1));
    network.Structure.FinalizeStructure();
    network.Reset();
    return network;
}

The loop "while NN error > acceptable error, update weights according to the learning algorithm" can look like this:

public void Train(BasicNetwork network, IMLDataSet training)
{
    ITrain train = new ResilientPropagation(network, training);
    int epoch = 1;
    do
    {
        train.Iteration();
        Console.WriteLine(@"Epoch #" + epoch + @" Error:" + train.Error);
        epoch++;
    } while (train.Error > acceptableError); // acceptableError: target threshold defined elsewhere
}

The first algorithm in my description is Backpropagation. A few characteristics of it:

1. Probably the oldest training algorithm.

2. As the name says, errors are propagated back through the network.

3. Usage:

var train = new Backpropagation(network, trainingSet, learningRate, momentum);

The next learning algorithm is the Manhattan update rule. A few characteristics of it:

1. Uses only the sign of the gradient.

2. Updates each weight by a constant value (depending on the sign, it is added to or subtracted from the weight).

3. Usage:

var train = new ManhattanPropagation(network, trainingSet, constantValue);

Usually constantValue is very low (around 0.00001).

Another one is the Quickpropagation algorithm. Features of this method:

1. Based on Newton's method of error minimization.

2. Uses a learning rate with a higher value (I used values starting from 2).

3. Usage:

var train = new QuickPropagation(network, trainingSet, learningRate);

One more training algorithm is Resilient propagation. Some details of this method:

1. One of the fastest algorithms in Encog.

2. Easiest to use (you don't have to care about a rate, momentum, or constant value).

3. It uses the sign of the gradient value.

4. It computes a separate delta value for each weight and uses it to update that weight.

5. Example of usage:

var train = new ResilientPropagation(network, trainingSet);

6. It has four variants: RPROP+, RPROP-, iRPROP+, iRPROP-. iRPROP+ is considered the best.

Next comes Scaled conjugate gradient. Features of it:

1. Uses the conjugate gradient method.

2. Not applicable to all datasets.

3. Doesn't require any learning parameters.

4. Example of usage:

var train = new ScaledConjugateGradient(network, trainingSet);

And the last propagation algorithm is the Levenberg-Marquardt algorithm (aka LMA). This is my favorite method. Features of it:

1. Something in between the Gauss-Newton algorithm (GNA) and the gradient method.

2. Easy to use (no learning parameters).

3. Example of creating it:

var train = new LevenbergMarquardtTraining(network, trainingSet);

4. Sometimes it can be very efficient.
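Putting the pieces together, here is a sketch of a complete training run on the XOR problem (the layer sizes, error threshold, and epoch cap are arbitrary choices of mine, not Encog requirements):

```csharp
using System;
using Encog.Engine.Network.Activation;
using Encog.ML.Data;
using Encog.ML.Data.Basic;
using Encog.Neural.Networks;
using Encog.Neural.Networks.Layers;
using Encog.Neural.Networks.Training.Propagation.Resilient;

public static class XorTrainingDemo
{
    public static void Main()
    {
        // XOR truth table as the training set
        double[][] input = { new[] { 0.0, 0.0 }, new[] { 0.0, 1.0 },
                             new[] { 1.0, 0.0 }, new[] { 1.0, 1.0 } };
        double[][] ideal = { new[] { 0.0 }, new[] { 1.0 },
                             new[] { 1.0 }, new[] { 0.0 } };
        IMLDataSet trainingSet = new BasicMLDataSet(input, ideal);

        // 2-4-1 network with sigmoid activations
        var network = new BasicNetwork();
        network.AddLayer(new BasicLayer(null, true, 2));
        network.AddLayer(new BasicLayer(new ActivationSigmoid(), true, 4));
        network.AddLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
        network.Structure.FinalizeStructure();
        network.Reset();

        // Resilient propagation: no learning rate or momentum to tune
        var train = new ResilientPropagation(network, trainingSet);
        int epoch = 1;
        do
        {
            train.Iteration();
            Console.WriteLine(@"Epoch #" + epoch + @" Error:" + train.Error);
            epoch++;
        } while (train.Error > 0.01 && epoch < 5000);
    }
}
```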

Transfer To New Acumatica Version


today I had the task of switching to a new version of Acumatica: from 4.2 to 5.1.

The first surprise I faced was the lack of .NET Framework 4.5.1. This was especially surprising given that I had Visual Studio 2012 installed with Update 4. Then I found that I needed the "Microsoft .NET Framework 4.5.1 Developer Pack for Windows Vista SP2, Windows 7 SP1, Windows 8, Windows 8.1, Windows Server 2008 SP2, Windows Server 2008 R2 SP1, Windows Server 2012 and Windows Server 2012 R2".

T200 Acumatica Certificate

Hello everybody,

I want to boast that I finally received the T200 Acumatica certificate! And now I can proudly say that I'm a certified Acumatica developer,

having earned the following certificates:

  • T100

  • T101

  • T200

  • T300

  • T900

How To Get Tstamp In Acumatica

Hello everybody,

today I want to share a small note on how to generate a timestamp for Acumatica objects (in case you use PXDatabase.Insert or PXDatabase.Update for some reason).

PXDatabase has a public method SelectTimeStamp:

public static byte[] SelectTimeStamp()
{
    return Provider.SelectTimeStamp();
}

So, if you need to put the timestamp into a variable t, you can do the following:

var t = PXDatabase.SelectTimeStamp();

And the variable t will contain the timestamp.



I want to start a campaign for building a machine for my neural network investigations.

Here is the link if you want to participate


PXAccumulatorAttribute In Acumatica

A few notes about PXAccumulatorAttribute:

  1. If you inherit from PXAccumulatorAttribute, you'll have access to the member _SingleRecord. If you set it to true in the constructor, you'll configure single record update mode.
  2. There is a PrepareInsert method. This method is intended for setting the update policy for the data fields.
  3. PrepareInsert is invoked within the Persist method, before the Acumatica framework generates SQL commands for the inserted data records.
  4. Among the parameters of the PrepareInsert method there is a PXAccumulatorCollection, which has an Update method. With it you can configure which fields will be updated during PrepareInsert.
  5. The Update method supports the following policies:

  • PXDataFieldAssign.AssignBehavior.Initialize: the new value is inserted into the database column only if the stored value is null.

  • PXDataFieldAssign.AssignBehavior.Replace: the new value replaces the old value.

  • PXDataFieldAssign.AssignBehavior.Summarize: the new value is added to the value stored in the database.

  • PXDataFieldAssign.AssignBehavior.Maximize: the maximum of the new value and the value from the database is saved.

  • PXDataFieldAssign.AssignBehavior.Minimize: the minimum of the new value and the value from the database is saved.

So, in order to implement an accumulator attribute, the following steps are needed:

1. Inherit a class from PXAccumulatorAttribute.

2. Implement the PrepareInsert method.

3. Implement the PersistInserted method.
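The steps above can be sketched as follows; the MyTotals DAC and its Amount field are hypothetical, and only PrepareInsert is shown:

```csharp
using PX.Data;

// Hypothetical accumulator attribute: sums Amount into the stored row.
public class MyAccumulatorAttribute : PXAccumulatorAttribute
{
    public MyAccumulatorAttribute()
    {
        // Step 1: inherit and configure single record update mode
        _SingleRecord = true;
    }

    // Step 2: set the update policy for the data fields
    protected override bool PrepareInsert(PXCache sender, object row,
        PXAccumulatorCollection columns)
    {
        if (!base.PrepareInsert(sender, row, columns))
            return false;

        var totals = (MyTotals)row;
        // Summarize: the new value is added to the value in the database
        columns.Update(typeof(MyTotals.amount), totals.Amount,
            PXDataFieldAssign.AssignBehavior.Summarize);
        return true;
    }
}
```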
