Contents tagged with recall

  • How to merge precision and recall

    Hello everybody,

    today I continue writing notes about measuring the quality of learning. You can read my previous article 1 and article 2 about measures of quality of learning. To summarize those two articles, we have the following:

    accuracy is a good measure, but if the sample is unbalanced, accuracy can show great numbers while the model itself is very bad.

    precision tells you how accurate the positive answers of the model a(x) are, and recall tells you how many of the relevant objects it can find

    precision and recall work fine on unbalanced samples (see the sketch below)
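
    As a quick illustration of that last point (a minimal sketch, not taken from the original articles; the class sizes are made up), here is what happens on an unbalanced sample when a model simply always answers with the majority class:

    ```python
    # Unbalanced sample: 95 negatives, 5 positives.
    y_true = [0] * 95 + [1] * 5
    # A model a(x) that always answers "negative".
    y_pred = [0] * 100

    # Count true positives, false positives and false negatives by hand.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0

    print(accuracy)   # 0.95 -- looks great on paper
    print(precision)  # 0.0  -- but the model never finds the rare class
    print(recall)     # 0.0
    ```

    So accuracy alone says the model is good, while precision and recall immediately expose that it is useless for the rare class.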

    And then the question arises: is it possible to merge them somehow, not like accuracy, but into something more meaningful than accuracy?

    I'll describe different ways to do it, going from the worst to the better ones, and hopefully … more

  • How to measure quality of learning part 2

    Hello everybody,

    today I want to add a few more notes about measuring the quality of learning, this time about classification tasks.

    So, one of the ways is to measure the number of wrong answers, for example with a formula like the one below.
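
    A plausible way to write it (a reconstruction, assuming a(x) denotes the model's answer for object x_i, y_i the true label, ℓ the number of objects, and [·] the indicator function) is the share of wrong answers:

    ```latex
    \frac{1}{\ell}\sum_{i=1}^{\ell}\left[a(x_i) \neq y_i\right]
    ```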

    Imagine that your classification set has three possible labels: a (10 elements), b (15 elements), c (20 elements). And let's say that your model wrongly classified 2 out of a, 3 out of b, and 4 out of c. In that case, the formula is applied as shown below.
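
    Plugging the example numbers into that formula (a worked check under the same assumptions as above):

    ```latex
    \frac{2 + 3 + 4}{10 + 15 + 20} = \frac{9}{45} = 0.2
    ```

    That is, 20% of the objects are classified wrongly.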

    Historically, it turned out that in classification tasks it is common to maximize the quality function, while in regression learning it is the other way around: the loss is minimized.
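
    One way to see the connection with the formula above (a note of my own, under the same assumptions): maximizing accuracy and minimizing the share of wrong answers are equivalent, since

    ```latex
    \text{accuracy} = 1 - \frac{1}{\ell}\sum_{i=1}^{\ell}\left[a(x_i) \neq y_i\right]
    ```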

    Another common measure of classification quality is … more