I'm working on an evaluation of the Data Reviewer extension for our organization, and after walking through a few examples of running checks and batch jobs I'm curious to know the limits of the Reviewer table. I see how you can push the results of your batch job(s) into it, and then how you can manually step through each record to fix the problem (and update the correction/verification status) and/or just delete the record.
Perhaps I missed something in the Help section, but I don't see an obvious way to re-run a batch job and have it remove any errors that have since been corrected in the GIS data; in other words, automated removal of a Reviewer record once it no longer applies as an error. (Example: my first run produces REVIEWSTATUS = "Material: Invalid domain value"; after fixing the record's "Material" field value in the feature class, I re-run the batch job and that Reviewer table record no longer exists.)
Is the only way to remove it by selecting the record and pressing Delete, or else some sort of Python geoprocessing script?
An interesting thought. But Data Reviewer does not do what you are describing.
May I suggest creating a new session and re-running the batch job to re-validate your data. Any errors that were not fixed, along with any new errors that may have been introduced, will be written to that new session.
And if you want to delete old sessions that are no longer needed, you can use ReviewerConsole.exe to delete them. Also, at 10.2 there is a new geoprocessing tool to delete sessions.
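In case a script sketch helps with the Python angle you mentioned, here is roughly how that workflow looks with the Reviewer geoprocessing tools. This is a minimal sketch assuming the 10.2 Data Reviewer toolbox; the paths, session names, and session-string format are hypothetical placeholders, so check the tool documentation before relying on them:

```python
import arcpy

# Data Reviewer geoprocessing tools require the extension to be checked out.
arcpy.CheckOutExtension("datareviewer")

reviewer_ws = r"C:\Data\Reviewer.gdb"  # hypothetical Reviewer workspace
batch_job = r"C:\Data\Checks.rbj"      # hypothetical batch job file

# Create a fresh session and re-run the batch job against it; only errors
# that still exist (plus any newly introduced ones) land in this session.
session = arcpy.CreateReviewerSession_Reviewer(reviewer_ws, "Run 2")
arcpy.ExecuteReviewerBatchJob_Reviewer(reviewer_ws, session, batch_job)

# Once the old session is no longer needed, remove it and its Reviewer
# table records with the Delete Reviewer Session tool mentioned above.
# The "Session 1 : Run 1" identifier format is an assumption on my part.
arcpy.DeleteReviewerSession_Reviewer(reviewer_ws, "Session 1 : Run 1")

arcpy.CheckInExtension("datareviewer")
```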