Archives / 2016 / October
  • ASP .Net LifeCycle

    Hello everybody,

    today I want to write a few words about the ASP .Net lifecycle.

    It is essentially just a sequence of events:

    Begin Request

    Resolve request cache

    Map request handler

    Acquire request state

    Request handler execute

    Update request cache

    Log request

    End request

    These are the 8 main steps of the ASP .Net lifecycle. MVC controllers are processed in step 5, Request handler execute. … more

  • How closures are implemented by .Net

    Hello everybody,

    today I want to share a piece of knowledge which is interesting, but hard to explain why somebody may need it: how closures are implemented.

    Consider the following code:


    using System;
    using System.Threading.Tasks;

    class Program
    {
        static void Main(string[] args)
        {
            int a = 54;

            Task t = new Task(
                () =>
                {
                    Console.WriteLine("Inside task");
                    Console.WriteLine(a); // the local variable a is captured by the lambda
                });

            t.Start();
            t.Wait();
        }
    }
    Now you may wonder: how will a be passed to the closure? How will Console.WriteLine be executed?

    To my surprise, for closures .Net generates behind the … more

  • C# Task, ContinueWith

    Hello everybody,

    today I want to write a few words about the Task Parallel Library, and in particular two methods: Task.Start and Task.ContinueWith.


    System.Threading.Tasks.Task t = new System.Threading.Tasks.Task(
        () => {
            // some heavy code like intensive calculations, reading from db, etc.
        });

    System.Threading.Tasks.Task t2 = t.ContinueWith(
        (a) => {
            // code to update UI
        });

    t.Start();
    This template has the following structure: you need to split your code … more

  • Ways to reduce overfitting in NN

    Hello everybody,

    today I want to write a short summary of how to reduce overfitting. Here it goes:

    Weight decay.

    Weight sharing

    Early stopping of training

    Model averaging

    Bayesian fitting of NN

    Dropout

    Generative pre-training

    Some explanations about the points above.

    Weight decay stands for keeping the weights small.

    Weight sharing means insisting that weights be similar to each other.

    Early stopping stands for not training the NN until it fully memorizes the training set.

    Model averaging, in other words, means usage of different models.

    Bayesian fitting of NN is a slightly different usage of model averaging, according to some rules.

    Dropout is random omitting of hidden units in order to validate results.
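    To make the first point concrete, here is a minimal sketch of weight decay, assuming plain gradient descent on a linear least-squares model (the random data and the decay coefficient are my own illustration, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                   # 100 samples, 5 features
true_w = np.array([1.0, -2.0, 0.0, 0.0, 0.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def fit(X, y, decay, lr=0.1, steps=500):
    """Gradient descent on squared error, with optional L2 weight decay."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of the data loss
        grad += decay * w                  # weight decay: pull weights toward zero
        w -= lr * grad
    return w

w_plain = fit(X, y, decay=0.0)
w_decay = fit(X, y, decay=1.0)
# The decayed weight vector ends up with a smaller norm than the plain one.
```

    With decay > 0 every step shrinks the weights a little before the data gradient pushes back, which is exactly the "keeping weights small" idea.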


  • Types of fields in Dynamics CRM

    Hello everybody,

    today I want to write a short note on types of fields in Dynamics CRM.

    So, below goes list of types:

    Single line of text

    Multiple lines of text

    Option set

    Two options

    Whole numbers

    Floating point

    Date and Time

    Their names are pretty self-explanatory, but I'd like to add a few words.

    Single line of text is good for storing small portions of text like a first name, last name, book title, product title, etc. Because of this it is limited to 4K of data.

    Multiple lines of text is larger: up to 1 MB of data.

    Option set is a list of specific items. Option sets can be local or global. Local means that it can be applied to only one entity. … more

  • Dynamics CRM licensing

    Hello everybody,

    another note for today, about Dynamics CRM licensing elements.

    So, let's get started. 

    External and Internal users for humans and computers

    First of all, it's worth mentioning that Dynamics CRM licenses can be divided into two main groups: external and internal users. An external license is applicable to your customers: people who aren't your employees but to whom you would like to give permission to communicate with your CRM instance. An internal license has two divisions as well: device license or user license. The names are self-explanatory, but to summarize, a user license, or user CAL, is applicable to an employee who has plenty of devices and from each of the devices he provides … more

  • Perceptron learning algorithm

    Hello everybody,

    today I want to document the perceptron learning algorithm for classification.

    Below are the steps for training a perceptron.

    Add one more column with value 1 to each input row (this plays the role of the bias).

    Pick training cases according to some rule that guarantees that every training case will eventually be picked.

    If the output is correct, leave the weights unchanged.

    If the output is 0 but 1 is expected, then add the input vector to the weights of the perceptron.

    If the output is 1 but 0 is expected, then subtract the input vector from the weights of the perceptron.

    This simple algorithm will find a set of weights that correctly classifies your vector space, if such a set of weights exists. It will depend only on … more
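
    The steps above can be sketched in Python; the AND dataset and the epoch limit are my own illustration, not part of the original post:

```python
import numpy as np

def train_perceptron(inputs, targets, epochs=20):
    """Perceptron learning rule for binary targets (0/1)."""
    # Step 1: add one more column with value 1 to each input row (the bias).
    X = np.hstack([inputs, np.ones((len(inputs), 1))])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        # Step 2: sweeping over all rows guarantees every case gets picked.
        for x, t in zip(X, targets):
            y = 1 if np.dot(w, x) >= 0 else 0
            if y == t:
                pass          # output correct: leave weights unchanged
            elif t == 1:
                w += x        # output 0 but 1 expected: add the input vector
            else:
                w -= x        # output 1 but 0 expected: subtract the input vector
    return w

# Linearly separable example: logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])
w = train_perceptron(X, t)
preds = [1 if np.dot(w, np.append(x, 1)) >= 0 else 0 for x in X]
```

    Because AND is linearly separable, the convergence guarantee applies and the loop settles on a consistent set of weights.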