There is no limitation that I know of on the number of domains per feature class that Reviewer can check. But there was a limit (5,000) on the number of values in a domain that the Reviewer Domain check could validate. This has been addressed at 10.
There is a limitation on how many records the Browse Features dialog can load; I think it's about 15,000.

Yep, I concur, tested...
Was the Reviewer workspace you were writing the errors to a personal geodatabase? I know that there is a 2GB limit for an MS Access database. You can try writing to a file geodatabase as your Reviewer workspace.

File GDB; who uses PGDB anymore 🙂
BTW, what version are you using?

10
When running a batch job, errors returned from the checks are stored in a temporary workspace before they are written to the Reviewer table. That temporary workspace was a personal geodatabase. We made a fix in the 10 release so that the temporary workspace is now a file geodatabase, because people with very large numbers of errors being returned were hitting that 2GB MS Access database limit.

This explains a lot of the issues I have had with Reviewer all the way back to 9.2. We test big datasets, so good... Hopefully with SP1 for core, performance will get a lot better.
Also, consider running the Domain check on a smaller area to find systematic domain errors that can be fixed. Another reason large result sets could be returned is that the Search for Null Values option is checked on and you have unpopulated fields that have domains.

For our parcel layer's issues with domains, I am actually going to go through all 582 books and calc the values that need it, like NULL where they are currently ''. So yep, pretty stoked so far outside of the size limitations: Parcel Lines with domains, over 2 million features, ran for a real short time and found 9 issues. That's awesome... Parcels, not so sweet, but hopefully in a month, with the SP and our internal fixes, we can revisit these benchmarks.
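As a sketch of that cleanup step, the core logic is just mapping empty strings to NULL so the Domain check (with Search for Null Values off) stops flagging them. Field names and values here are hypothetical, and in practice this logic would sit inside an arcpy.da.UpdateCursor loop (or a Field Calculator expression) over the parcel feature class rather than plain lists:

```python
# Sketch: normalize empty-string attribute values to NULL (None) so a
# coded-value domain check no longer flags them as invalid domain values.
# Field names/values are made up for illustration; with arcpy the same
# function would be applied per row via arcpy.da.UpdateCursor.

def normalize_domain_value(value):
    """Return None for empty or whitespace-only strings, else the value unchanged."""
    if isinstance(value, str) and value.strip() == "":
        return None
    return value

def clean_rows(rows):
    """Apply the normalization to every value in every row."""
    return [[normalize_domain_value(v) for v in row] for row in rows]

# Example: two parcel rows where a hypothetical ZONING field is sometimes ''.
rows = [["123-45", ""], ["123-46", "R1"]]
print(clean_rows(rows))  # [['123-45', None], ['123-46', 'R1']]
```

The point of pushing '' to true NULL (rather than leaving empty strings) is that the two are distinct to the check: '' is an invalid domain value, while NULL is only reported when Search for Null Values is turned on.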