How To Call RecordCCPayment Action From Screen-Based Web API In Acumatica For Payments And Applications Screen


Hello everybody,

today I want to write a few words, with code samples, on how to work with the screen-based web API in Acumatica.

Stage preparation

Before you begin, the first step is to have an Acumatica instance ready, so install Acumatica with the sales demo database.

As mentioned in the title of the article, I'm going to work with the "Payments and Applications" screen ( AR302000 ) and with the action Record CC Payment:

As usual with web API calls in .NET, you'll need to create a Web Reference somehow.

For Record CC Payment, the following sequence of steps is needed.

  1. In the Acumatica instance, navigate to page AR302000 and click Help -> Web Service:

2. Copy the URL of the opened window to the clipboard.

3. Go to Visual Studio and create an application there, for example a console app.

4. Right-click on References and choose Add Service Reference in the pop-up:

5. In the window that appears, click Advanced.

6. In the next window click "Add Web Reference... ", which will open one more window:

Initially the left window will be opened, and then the window at the right will be opened.

After all of those steps, you can type the following code for your Main function:

		static void Main(string[] args)
		{
			ServicePointManager.ServerCertificateValidationCallback += new System.Net.Security.RemoteCertificateValidationCallback(
				(s, c, ch, pe) => { return true; });
			using (Screen scr = new Screen())
			{
				scr.CookieContainer = new CookieContainer();
				var lr = scr.Login("admin", "123");
				if (lr.Code != ErrorCode.OK)
					return;
				try
				{
					var schema = scr.GetSchema();
					var commands = new Command[]
					{
						new Value { Value = "BESTYPEIMG",
							LinkedCommand = schema.PaymentSummary.Customer },
						new Value { Value = DateTime.Now.ToString("yyyy/MM/dd hh:mm:ss"),
							LinkedCommand = schema.PaymentSummary.Description },
						new Value { Value = "770",
							LinkedCommand = schema.PaymentSummary.PaymentAmount },
						new Value { Value = "paymentreffdsafas",
							LinkedCommand = schema.PaymentSummary.PaymentRef },
						new Value { Value = "xxx",
							LinkedCommand = schema.RecordCCPaymentCCPaymentData.PCTranNumber },
						new Value { Value = "xxx",
							LinkedCommand = schema.RecordCCPaymentCCPaymentData.AuthNumber },
						// if configured to prompt, then must set DialogAnswer
						// (Base code shows that it is PaymentSummary -- Document.Ask())
						new Value { Value = "OK",
							LinkedCommand = schema.PaymentSummary.ServiceCommands.DialogAnswer,
							Commit = true },
						schema.Actions.RecordCCPayment
					};
					scr.Submit(commands);
					var status = scr.GetProcessStatus();
					while (status.Status == ProcessStatus.InProcess)
					{
						status = scr.GetProcessStatus();
					}
					if (status.Status == ProcessStatus.Completed)
					{
						// read back created values with one more sync request,
						// e.g. the reference number (illustrative field)
						commands = new Command[] { schema.PaymentSummary.ReferenceNbr };
						var data = scr.Submit(commands);
					}
				}
				finally
				{
					scr.Logout();
				}
			}
		}


In the presented code I want to point your attention to these details:

1. This line

using (Screen scr = new Screen())


will help you make efficient use of memory, especially in cases when you need to submit a lot of payments.

2. A try/finally or try/catch/finally combination will help you to track error messages.

3. This line


while (status.Status == ProcessStatus.InProcess)


will let you wait the necessary amount of time for the request result.

4. If status.Status is equal to Aborted, or something similar, you can check status.Message and see what is there.

5. Quite often you need to know some values that were created during the process. For this purpose you may use the commands list one more time with a sync request, as done in the block


if (status.Status == ProcessStatus.Completed)


6. Always, I repeat, always call the Logout method. Otherwise you'll get an error message that you've reached the maximum allowed connections for your account.

Submission Types In Acumatica


Hello everybody,

today I want to leave a really short note on how you can submit data into Acumatica. There are three ways:

  • Contract-based REST API
  • Contract-based SOAP API
  • Screen-based SOAP API

Historically the first was the screen-based SOAP API, and with time the two others were added. Later on I hope to add a description of the others as well.


How To Refresh Cache Of Acumatica


Hello everybody,

today I want to leave a comment on how to refresh the cache of Acumatica.

As usual I start with this method:


ViewName.View.RequestRefresh();


But I found that it does not always work. To my surprise, RequestRefresh does not work for some reason. I think the reason may be that Acumatica has two caches: caching of data and caching of queries.

If that is the case, I use another approach ( more hardcore ):


ViewName.View.Cache.Clear(); // clearing cached data

ViewName.View.Cache.ClearQueryCache(); // clearing cached queries.


The second approach works a bit better, because it clears not only the data but also the cached queries, and as an outcome the cached results.


Where Log Of Visual Studio Is Located


Hello everybody,

today I want to leave a short note on where to search for the log file of Visual Studio.

It lives here:

C:\Users\{USER}\AppData\Roaming\Microsoft\VisualStudio .....\{visual studio version}\ActivityLog.xml

Whenever I speak about log files with any kind of developer, I see round eyes and a trembling voice with the question: why on earth should I look into a log file?

The reason is simple: sometimes you may get an error message like "Error HRESULT E_FAIL has been returned from a call to a COM component" while adding a reference to your class library.

What may stand behind that error message? No ideas? The same was with me, but after looking into the log file, we found there the following statement: within the Microsoft.VisualStudio.Shell.Interop.11.0.dll

and some additional close-to-abracadabra statements, but googling that error message was much easier.

Finally we applied the following steps:

#1 Open "Developer Command Prompt for VS 2017" as Admin

#2 CD into "C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\PublicAssemblies"

#3 Run "gacutil -i Microsoft.VisualStudio.Shell.Interop.11.0.dll"


Remember the location where the log of Visual Studio lives. With its help you'll be able to work miracles related to bugs. Another idea to keep in mind: take Microsoft as a pattern, and have your own log, especially of errors, in your apps.


How To Display Images In Grid Of Acumatica


Hello everybody,

today I want to leave a short post on how to display images in a grid in Acumatica.

I'll demonstrate it using the Sales Orders page, which is known as SO301000.

End result will look like this:


In order to make it work, I've done the following:

  1. Created an extension for SOOrderEntry and added there some kind of reading of image URLs
  2. Created a DAC class for demo purposes
  3. Added a few controls and css/js on the customized SO301000 page.

Now a bit more details. For SOOrderEntry I've created the following extension:

    public class SOOrderEntryExt : PXGraphExtension<SOOrderEntry>
    {
	    public PXSelect<SOImageItem> Images;

	    protected virtual IEnumerable images()
	    {
		    var result = new List<SOImageItem>();
		    result.Add(new SOImageItem
		    {
			    ID = 1,
			    ImageUrl = "",
			    ID2 = 2,
			    ImageUrl2 = ",255,255&fit=bounds&height=&width="
		    });
		    result.Add(new SOImageItem
		    {
			    ID = 3,
			    ImageUrl = ",255,255&fit=bounds&height=&width=",
			    ID2 = 4,
			    ImageUrl2 = ",255,255&fit=bounds&height=&width="
		    });
		    result.Add(new SOImageItem
		    {
			    ID = 5,
			    ImageUrl = ",255,255&fit=bounds&height=&width=",
			    ID2 = 6,
			    ImageUrl2 = ",255,255&fit=bounds&height=&width="
		    });
		    result.Add(new SOImageItem
		    {
			    ID = 7,
			    ImageUrl = ",255,255&fit=bounds&height=&width=",
			    ID2 = null,
			    ImageUrl2 = null
		    });
		    return result;
	    }
    }

and the following DAC class:

public class SOImageItem : IBqlTable
{
	public abstract class iD : IBqlField { }
	[PXInt(IsKey = true)]
	public virtual int? ID { get; set; }

	public abstract class imageUrl : IBqlField { }
	[PXString(IsKey = true)]
	public virtual string ImageUrl { get; set; }

	public abstract class iD2 : IBqlField { }
	[PXInt(IsKey = true)]
	public virtual int? ID2 { get; set; }

	public abstract class imageUrl2 : IBqlField { }
	[PXString(IsKey = true)]
	public virtual string ImageUrl2 { get; set; }
}

Then on page SO301000, the following stuff was added:

    img {

And the final piece, the tab:

	<px:PXTabItem Visible="True" Text="Images" BindingContext="form">
		<Template>
			<px:PXGrid runat="server" ID="grdImages" Width="100%" DataSourceID="ds">
				<Levels>
					<px:PXGridLevel DataMember="Images">
						<RowTemplate>
							<px:PXNumberEdit runat="server" ID="CstPXNumberEdit5" DataField="ID" Width="100" />
							<px:PXTextEdit runat="server" ID="CstPXTextEdit4" DataField="ImageUrl" />
							<px:PXNumberEdit runat="server" ID="PXNumberEdit2" DataField="ID2" Width="100" />
							<px:PXTextEdit runat="server" ID="PXTextEdit1" DataField="ImageUrl2" />
						</RowTemplate>
						<Columns>
							<px:PXGridColumn DataField="ID" Width="100" />
							<px:PXGridColumn DataField="ImageUrl" Type="Icon" Width="350" />
							<px:PXGridColumn DataField="ID2" Width="100" />
							<px:PXGridColumn DataField="ImageUrl2" Type="Icon" Width="350" />
						</Columns>
					</px:PXGridLevel>
				</Levels>
				<CallbackCommands>
					<Search CommitChanges="True" PostData="Page" />
					<Refresh CommitChanges="True" PostData="Page" />
				</CallbackCommands>
				<AutoSize Enabled="True" Container="Window" />
			</px:PXGrid>
		</Template>
	</px:PXTabItem>

After that I was able to see the result that is shown on the screenshot at the beginning of the article.



Most Useful Git Commands


Hello everybody,

today I want to make a post about the most useful (IMHO) git commands. Honestly speaking, as usual I prefer to use Tortoise Git with its GUI. But quite often it happens that even the best GUI tool can't give you the necessary flexibility. For this purpose git commands come to your rescue. In this post I'll describe some basics of git and some of the most useful commands, so it's going to be one of my longest articles, and hopefully not the most useless.

Configuring Git

In git you can have three levels of configuration:

  • --local  - configuration of a single repository
  • --global - configuration for your user account ( for example user Administrator on your local machine )
  • --system - configuration for all users ( all users on your machine )

For example, if I want to see what global email is set, I can use the following command in the git console window:

$ git config --global user.email

It will give the email as output.

If you want to set the working email for a single repository, you can do it like this:

$ git config --local user.email <your-email>

after that the local repository email will be set to the value you provided.
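To see how the levels interact, here is a minimal runnable sketch (the directory and email address are illustrative): it sets an email at the --local level of a scratch repository and reads it back.

```shell
# Scratch repository: the --local value has the highest precedence,
# so a plain lookup returns it.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config --local user.email "local@example.com"
git config user.email   # prints local@example.com
```

Reading without a level flag merges all three levels, with --local winning over --global, which in turn wins over --system.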

User name checking

Ever wondered what is your user name in git? Actually it can be recorded at three levels: local, global, system. 

You can get what is there like this:

$ git config --system user.name

If you want to set it at the global level to the value Yuriy Zaletskyy, you can do something like this:

$ git config --global user.name "Yuriy Zaletskyy"

It will set your global user name to Yuriy Zaletskyy.

Checking global configuration information

Ever wondered what the global configuration values are? The following command can tell you:

$ git config --global --list

Similarly you can check the local and system lists.

Also, for the local level you can go to the file .git\config and see the local settings.

Setting line endings

Have you ever been in a situation when somebody changed each line in a file, you open it, and to your surprise you don't see any change visually? That can happen if that somebody uses a different line-ending style than you. For dealing with it I recommend the following command:

$ git config --global core.autocrlf true

This command says to git the following: please add CRs back when you check out a file to the working directory.
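The autocrlf setting above targets Windows checkouts. As a hedged illustration of the same normalization idea that runs on any platform, the related input mode strips CRs when a file enters the index (the file name here is illustrative):

```shell
# core.autocrlf=input: CRLF in the working tree is stored as LF in the repository.
cd "$(mktemp -d)" && git init -q .
git config core.autocrlf input
printf 'line one\r\nline two\r\n' > file.txt
git add file.txt
git ls-files --eol file.txt   # i/lf = stored with LF, w/crlf = CRLF in working tree
```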

Two types of push defaults

I don't know if you have ever heard about it, but in the past, by default, Git used to push all branches. Now it pushes only the currently selected branch. Those two kinds of pushes are named:

  1. Matching
  2. Simple ( default starting from Git 2.0 )

Do you know how to configure it?

$ git config --global push.default simple

With this setting only the one currently selected branch will be pushed. Others will not. For you it means that if you made changes in ten branches, then only one will travel to the server, and the others will not. It also means that the push will happen faster.

Avoid merging on each pull

If you want to avoid merge commits on each pull, I recommend you the following:

$git config --global pull.rebase true

Cooperation on source control

In GitLab, unlike GitHub, you don't have the role of collaborator. You have two options:

  1. Work through branches
  2. Work through Fork and Merge request

Workflow for branches is the following:

  1. You clone the repository
  2. Make a branch based on master
  3. Make changes in the code
  4. Push your changes to your branch
  5. Somebody, or you, merges the branch into master
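The branch workflow can be sketched end to end with a local bare repository standing in for the server (all repository and branch names are illustrative):

```shell
cd "$(mktemp -d)"
git init -q --bare server.git                   # stands in for the GitLab server
git clone -q server.git work && cd work         # 1. clone the repository
git config user.email "dev@example.com" && git config user.name "Dev"
git commit --allow-empty -qm "initial commit"
git push -q origin HEAD                         # publish the default branch
git checkout -qb feature/demo                   # 2. branch based on the default branch
git commit --allow-empty -qm "change the code"  # 3. make changes
git push -q origin feature/demo                 # 4. push your branch
git checkout -q -                               # 5. merge the branch back
git merge -q feature/demo
git push -q origin HEAD
```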

Workflow for Fork-> Merge request is like this:

1. You create a fork. During the fork you can give some name to your repository

2. Clone the forked repository on your dev machine

3. Make changes to the code

4. Push to your repository

5. You create a Merge request, and then somebody ( or maybe you ) fulfills your request

Useful commands for Pull Requests

$ git fetch

This command will download all branches.

$ git branch -a

This command will show you all branches.

$ git checkout <branch_name>

This will give you a working copy of branch <branch_name>.

Fast forward vs Recursive

In git you can have two kinds of history: fast forward and recursive. Difference you can see on the picture below:

The difference between recursive and fast-forward is how changes will be displayed. A question for meditation: which tree is more informative, recursive or fast-forward?

For me personally, recursive is much more informative than fast-forward.

Tagging, branching, releasing

Another interesting feature of git+GitLab is tagging. A question which often arises is how often a tag should be added. The general rule of thumb is to add a tag for those commits that will go to production.

The only exception can be the case when you have CI/CD configured; in that case there is no sense in adding tags.

In Gitlab you can use two kinds of tags:

  1. lightweight
  2. annotated

Take note that this set of tags is different from GitHub. GitHub has three kinds of tags:

  1. Lightweight
  2. Annotated
  3. Signed

A lightweight tag stands for a tag without any message. An annotated tag stands for adding info on who tagged, possibly when and why. A signed tag ( not available in GitLab ) has a public key to prove the identity of the tagger.
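The difference is easy to see in a scratch repository (the tag names below are illustrative): a lightweight tag is just a ref pointing straight at a commit, while an annotated tag is a separate object with its own metadata.

```shell
cd "$(mktemp -d)" && git init -q .
git config user.email "dev@example.com" && git config user.name "Dev"
git commit --allow-empty -qm "release commit"
git tag v1.5.2                        # lightweight: no message, no tagger info
git tag -a v1.5.3 -m "Tag message"    # annotated: stores tagger, date and message
git cat-file -t v1.5.2   # prints commit
git cat-file -t v1.5.3   # prints tag
```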

Consider the following example of tag usage:

$git tag -a v1.5.3 -m "Tag message"

this command will add tag v1.5.3 with the message "Tag message".

An important detail: after you've created a tag, you need to tell git that tags should be pushed. You can do it like this:

$git push --tags

There are some commonly accepted standards in naming tags related to versions. Often the following format is used:

Release: major.minor.patch. In the case of the command $git tag -a v1.5.3 -m "Tag message" the following is understood: major version 1, minor version 5 and patch version 3.

The main difference between major and minor is backward compatibility. Quite often ( not always ) a change in the major version means breaking backward compatibility, while a minor version keeps compatibility and adds functionality, and the patch version means bug fixes. This principle is not written in stone, but I've observed it myself in plenty of products.


Honestly speaking, I don't use git commands from the console very often when I work with source code in GitLab. I prefer to use Tortoise Git for source code management. But sometimes, when I can't grasp where a menu item is in Tortoise Git, I jump to bash and enter commands there.



How To Deal With Read Committed Snapshot Error Message In Acumatica


Hello everybody,

today I want to leave a SQL fix for the error message:

"There are problems on database server side:

READ_COMMITTED_SNAPSHOT is not set for current database."

On your Acumatica instance it may look like this:

To fix it, run this SQL against the Acumatica database ( YourDatabase below is a placeholder for the actual database name ):


ALTER DATABASE [YourDatabase] SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;


How To Use Autofac In Acumatica With Global Graph And Single Registration


Hello everybody,

today I want to leave a short note on how to use Autofac in Acumatica, but with single registration. In this article I've described how to use Autofac for resolving interface < -- > class implementation.

But one of my colleagues, Deebhan Hari, pointed out that with my usage, registration of classes will happen on each roundtrip and for each graph, leading to a potential memory leak.

Definitely not something we would like to have. After a small conversation, we managed to add a singleton to Autofac, which allows registration to happen only once per lifetime of the process.



using System;
using PX.Data;
using Autofac;

namespace SingleRegistrationDemo
{
    public class AllGraphsExtension : PXGraphExtension<PXGraph>
    {
        public override void Initialize()
        {
            // Implemented Singleton, so that our CustomCaseCommonEmailProcessor is registered only once when needed.
            var single = SingletonCustomCaseCommon.Instance;
        }
    }

    public sealed class SingletonCustomCaseCommon
    {
        private static readonly Lazy<SingletonCustomCaseCommon> lazy =
            new Lazy<SingletonCustomCaseCommon>(() => new SingletonCustomCaseCommon());

        public static SingletonCustomCaseCommon Instance { get { return lazy.Value; } }

        private SingletonCustomCaseCommon()
        {
            PX.Objects.EP.EmailProcessorManager.Register(new ClosedCasesReOpenV2.CustomCaseCommonEmailProcessor());
        }
    }
}


How To Debug Acumatica With Dnspy


Hello everybody,

today I want to share with you a few words about debugging Acumatica. There are plenty of wonderful posts on doing this with the help of the already provided PDB files of existing graphs, and first of all I definitely recommend using them.

But there are scenarios when the default PDBs provided are not enough. One recent example where I faced this was debugging web API REST calls.

Below I'll provide you with the steps you can follow in order to debug an error stack trace like this ( screenshot from Postman ):