# Calculate classification accuracy from dbf?

Question asked by ziziz on Feb 9, 2015
Latest reply on Mar 10, 2015 by blake.terhune

I have dozens of DBFs, and each contains a classification error matrix. From each matrix I would like to calculate the overall, user's, and producer's accuracies.

What would be the best method to do that? One additional obstacle makes the problem a little more challenging: sometimes (but not always) there are more or fewer predicted classes than "truth" classes. See the example below, where the predicted (Class) column contains class "B" but there is no matching "_B" column among the truth classes.

```
OrderedDict([(u'Class', u'A'), (u'_A', 14), (u'_C', 0), (u'_D', 3), (u'_E', 9), (u'_F', 8)])
OrderedDict([(u'Class', u'B'), (u'_A', 0), (u'_C', 0), (u'_D', 29), (u'_E', 1), (u'_F', 0)])
OrderedDict([(u'Class', u'C'), (u'_A', 0), (u'_C', 149), (u'_D', 101), (u'_E', 0), (u'_F', 2)])
OrderedDict([(u'Class', u'D'), (u'_A', 33), (u'_C', 0), (u'_D', 594), (u'_E', 41), (u'_F', 96)])
OrderedDict([(u'Class', u'E'), (u'_A', 62), (u'_C', 1), (u'_D', 38), (u'_E', 12), (u'_F', 28)])
OrderedDict([(u'Class', u'F'), (u'_A', 95), (u'_C', 34), (u'_D', 665), (u'_E', 38), (u'_F', 47)])
```
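One possible approach, sketched in plain Python (assuming each DBF's rows arrive as OrderedDicts like those above, with `Class` holding the predicted class and the `_X` fields holding the reference/"truth" counts): build the confusion matrix over the union of predicted and truth labels, so a class such as B that appears on only one axis is simply zero-filled on the other. Overall accuracy is the diagonal sum over the grand total; user's accuracy divides each diagonal cell by its row (predicted) total, producer's accuracy by its column (truth) total.

```python
from collections import OrderedDict

# Example rows as read from one DBF (same data as in the question).
rows = [
    OrderedDict([(u'Class', u'A'), (u'_A', 14), (u'_C', 0), (u'_D', 3), (u'_E', 9), (u'_F', 8)]),
    OrderedDict([(u'Class', u'B'), (u'_A', 0), (u'_C', 0), (u'_D', 29), (u'_E', 1), (u'_F', 0)]),
    OrderedDict([(u'Class', u'C'), (u'_A', 0), (u'_C', 149), (u'_D', 101), (u'_E', 0), (u'_F', 2)]),
    OrderedDict([(u'Class', u'D'), (u'_A', 33), (u'_C', 0), (u'_D', 594), (u'_E', 41), (u'_F', 96)]),
    OrderedDict([(u'Class', u'E'), (u'_A', 62), (u'_C', 1), (u'_D', 38), (u'_E', 12), (u'_F', 28)]),
    OrderedDict([(u'Class', u'F'), (u'_A', 95), (u'_C', 34), (u'_D', 665), (u'_E', 38), (u'_F', 47)]),
]

def accuracies(rows):
    """Return (overall, user's-per-class, producer's-per-class) accuracies."""
    predicted = [r[u'Class'] for r in rows]
    truth = {k.lstrip(u'_') for r in rows for k in r if k != u'Class'}
    # Union of labels so extra/missing classes on either axis are handled.
    labels = sorted(set(predicted) | truth)
    # matrix[pred][true] = count; cells absent from the DBF stay 0.
    matrix = {p: {t: 0 for t in labels} for p in labels}
    for r in rows:
        for k, v in r.items():
            if k != u'Class':
                matrix[r[u'Class']][k.lstrip(u'_')] = v
    total = sum(sum(row.values()) for row in matrix.values())
    diag = sum(matrix[c][c] for c in labels)
    overall = float(diag) / total
    users, producers = {}, {}
    for c in labels:
        row_total = sum(matrix[c].values())            # predicted as c
        col_total = sum(matrix[p][c] for p in labels)  # truly c
        users[c] = float(matrix[c][c]) / row_total if row_total else None
        producers[c] = float(matrix[c][c]) / col_total if col_total else None
    return overall, users, producers

overall, users, producers = accuracies(rows)
print(overall)
print(users)
print(producers)
```

Note that class B gets a user's accuracy of 0.0 (30 pixels predicted as B, none correct) but no producer's accuracy (its truth column total is zero, so `None` marks it as undefined). The function names here are just for illustration; the same loop can be wrapped around whatever DBF reader yields these OrderedDicts.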