I resolved the issue. The classes in the confusion matrix corresponded to the "Value" column of the Training Sample Manager, and those values were apparently random, so I changed them to match the Class ID in both the training and testing data.
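To illustrate why the mismatch produced so many classes: a confusion matrix lists every distinct code found in either the ground-truth points or the classified raster, so if the testing points use Class IDs while the raster uses the Training Sample Manager's "Value" codes, the class list roughly doubles and nothing lands on the diagonal. A minimal sketch with made-up label codes (not the poster's actual data):

```python
# Hedged sketch: shows how mismatched codings between ground truth
# (Class IDs) and the classified raster ("Value" codes) inflate the
# class list and zero out the matches. All codes here are hypothetical.
ground_truth = [1, 1, 2, 2, 3]       # testing points coded by Class ID
classified   = [11, 11, 12, 22, 13]  # raster coded by the "Value" column

# Every distinct code from either source becomes a row/column.
classes = sorted(set(ground_truth) | set(classified))
matrix = {(t, c): 0 for t in classes for c in classes}
for t, c in zip(ground_truth, classified):
    matrix[(t, c)] += 1

diagonal = sum(matrix[(k, k)] for k in classes)
print(len(classes))  # 7 classes listed instead of 3
print(diagonal)      # 0 agreements, so per-class accuracies come out empty
```

Once both datasets use the same Class ID codes, the class list collapses back to the true number of classes and the diagonal counts become meaningful.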
Using ArcMap 10.4.1 Interactive Supervised Classification, I have classified a Sentinel-2 image of Malawi, Africa, and am trying to assess its accuracy, but I am having issues with the confusion matrix. I initially identified 14 classes; I understand this is more than needed and will try to amalgamate some of them, but first I would like to see the per-class and overall accuracy so I know how to improve the classification.

My training samples are as follows:

My testing samples are below:

I then constructed the confusion matrix by running Create Accuracy Assessment Points (inputting the testing data, target field Ground Truth), Update Accuracy Assessment Points (inputting the classified image layer, target field Classified), and Compute Confusion Matrix. This is the result of the confusion matrix:

I don't understand how the classes listed in the confusion matrix relate to the classes I produced in the training and testing data (there are many more than 14 listed, and they are not labelled 1-14 as my accuracy assessment points were), nor why no Producer's and User's accuracy values are shown. Any suggestions for correcting the confusion matrix would be much appreciated. Thank you.
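For reference, the Producer's and User's accuracy figures that Compute Confusion Matrix should report are simple ratios over the matrix: Producer's accuracy is correct counts over the column (ground-truth) total, and User's accuracy is correct counts over the row (classified) total. A small sketch with invented counts for three classes (not the poster's 14-class data):

```python
# Hedged sketch of per-class and overall accuracy from a confusion matrix.
# The counts below are hypothetical, for illustration only.
# Rows = classified class, columns = ground-truth class.
matrix = [
    [50,  2,  3],
    [ 4, 40,  1],
    [ 1,  3, 45],
]
n = len(matrix)

row_totals = [sum(matrix[i]) for i in range(n)]
col_totals = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
correct = [matrix[i][i] for i in range(n)]

# User's accuracy: correct / row total (reflects commission errors).
users = [correct[i] / row_totals[i] for i in range(n)]
# Producer's accuracy: correct / column total (reflects omission errors).
producers = [correct[j] / col_totals[j] for j in range(n)]
# Overall accuracy: diagonal sum over all points.
overall = sum(correct) / sum(row_totals)

print([round(u, 3) for u in users])
print([round(p, 3) for p in producers])
print(round(overall, 3))
```

When the ground-truth and classified codes don't line up (the issue resolved above), the diagonal is empty, so all of these ratios are zero or undefined, which is why the accuracy columns appeared blank.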