
Domain Check limitation

10-27-2010 03:41 PM
TedCronin
MVP Alum
Is there a Domain Check limitation, or a known limitation on checking domains on polygon datasets?

I just ran the Domain Check on my parcel lines layer, which has over 2 million features and 5 domains associated with it, and it was successful after about 30 seconds.

On my parcel polygon layer, which has 770,000+ records, I ran a check that involves 12 domains, and memory usage shot up to 1.2 GB in about 30 seconds before I had to kill it. So either there is a limitation on the number of domains that can be accessed, or there is a memory leak when the Domain Check is run against polygon layers with high feature counts.
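
(For context, here is a minimal arcpy sketch of the kind of comparison involved: list the coded-value domains in a workspace and count attribute values that fall outside them. The paths and field names are hypothetical, and arcpy.da requires a 10.1 or later release.)

import arcpy

gdb = r"C:\data\parcels.gdb"            # hypothetical workspace
fc = gdb + r"\ParcelLines"              # hypothetical feature class

# Map each field that uses a coded-value domain to that domain
domains = {d.name: d for d in arcpy.da.ListDomains(gdb)}
domain_fields = {f.name: domains[f.domain]
                 for f in arcpy.ListFields(fc)
                 if f.domain and domains[f.domain].domainType == "CodedValue"}

bad = 0
with arcpy.da.SearchCursor(fc, list(domain_fields)) as cursor:
    for row in cursor:
        for value, dom in zip(row, domain_fields.values()):
            # NULLs are skipped here; in Reviewer it is the
            # "Search for Null Values" option that flags those
            if value is not None and value not in dom.codedValues:
                bad += 1
print(bad, "values outside their coded-value domains")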
3 Replies
TedCronin
MVP Alum
Running on a subset of the data, around 200,000 parcel records, memory does not scream up to 1.5 GB in 30 seconds, but after about 10 minutes I am at 1.6 GB and expecting a crash any second now.

Ok, it ran through and did not crash.  It found 439,938 potential domain issues, which I already know is too many to write to the Table, so I suppose I will try to browse them; I guess we will see if that works with this many records.  For sure, this check will need to be done on far fewer parcels, and I can cross this check off the list of those that can be run at night on the Service.  Bummer, another one that cannot be automated.

Ok, it seems to be frozen, stuck at 1 GB memory usage, so Browse can't handle this many records.  Good to know.
MichelleJohnson
Esri Contributor
There is not a limitation, that I know of, on the number of domains per feature class that Reviewer can check.  But there was a limit on the number of values (5,000) in a domain that the Reviewer Domain Check could validate.  This has been addressed at 10.

There is a limitation on how many records the Browse Features dialog can load; I think it's about 15,000.

Was the Reviewer workspace you were writing the errors to a personal geodatabase?  I know there is a 2 GB limit for an MS Access database.  You can try writing to a file geodatabase as your Reviewer workspace.
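
(If it helps, the file geodatabase itself is a one-liner to create with arcpy; the folder and name below are just placeholders.  It can then be designated as the Reviewer workspace so results are not capped by the 2 GB Access limit.)

import arcpy

# Hypothetical output folder and name; adjust to your environment
arcpy.CreateFileGDB_management(r"C:\data", "ReviewerWorkspace.gdb")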

BTW, what version are you using? 

When running a batch job, errors returned from the checks are stored in a temporary workspace before they are written to the Reviewer table.  That temporary workspace used to be a personal geodatabase.  We made a fix in the 10 release so that the temporary workspace is now a file geodatabase, because people were hitting that 2 GB MS Access database limit when they had so many errors being returned.

Also, consider running the Domain Check on a smaller area to find systematic domain errors that can be fixed.  Another reason large results could be returned is that the Search for Null Values option is turned on and you have fields with domains that are not populated.
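
(One rough way to gauge whether that option explains the volume is to count NULL and empty values in the fields that carry domains before running the check.  A sketch with arcpy, using a hypothetical path and requiring 10.1+ for arcpy.da:)

import arcpy

fc = r"C:\data\parcels.gdb\Parcels"   # hypothetical feature class
fields = [f.name for f in arcpy.ListFields(fc) if f.domain]

# Count NULL / empty-string values per domain-constrained field
counts = dict.fromkeys(fields, 0)
with arcpy.da.SearchCursor(fc, fields) as cursor:
    for row in cursor:
        for name, value in zip(fields, row):
            if value is None or value == "":
                counts[name] += 1
print(counts)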
TedCronin
MVP Alum
There is not a limitation, that I know of, on the number of domains per feature class that Reviewer can check.  But there was a limit on the number of values (5,000) in a domain that the Reviewer Domain Check could validate.  This has been addressed at 10.

  Ok, good to know.

There is a limitation on how many records the Browse Features dialog can load; I think it's about 15,000.
  Yep, I concur, tested...

Was the Reviewer workspace you were writing the errors to a personal geodatabase?  I know there is a 2 GB limit for an MS Access database.  You can try writing to a file geodatabase as your Reviewer workspace.
  File GDB, who uses PGDB anymore 🙂

BTW, what version are you using? 
  10

When running a batch job, errors returned from the checks are stored in a temporary workspace before they are written to the Reviewer table.  That temporary workspace used to be a personal geodatabase.  We made a fix in the 10 release so that the temporary workspace is now a file geodatabase, because people were hitting that 2 GB MS Access database limit when they had so many errors being returned.
This explains a lot about why I have had issues with Reviewer all the way back to 9.2.  We test big datasets, so good...  Hopefully with SP1 for core, performance will get a lot better.

Also, consider running the Domain Check on a smaller area to find systematic domain errors that can be fixed.  Another reason large results could be returned is that the Search for Null Values option is turned on and you have fields with domains that are not populated.
  For our Parcel Layer issues with Domains, I am actually going to go through all 582 books and calc the values that need calcing, e.g. to Null where they are currently '', so yep, pretty stoked so far outside of the size limitations.  Parcel Lines with Domains, over 2 million features, ran for a very short time and found 9 issues.  That's awesome...  Parcels, not so sweet, but hopefully in a month, with the SP and our internal fixes, we can revisit these benchmarks.
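
  (A rough sketch of that cleanup with arcpy, setting empty strings to NULL in the text fields that carry domains.  Dataset and field selection here are placeholders, and I'd run it on a copy first; arcpy.da requires 10.1 or later.)

import arcpy

fc = r"C:\data\parcels.gdb\Parcels"   # hypothetical feature class
fields = [f.name for f in arcpy.ListFields(fc)
          if f.domain and f.type == "String"]

# Replace empty strings with NULL so they no longer trip the Domain Check
with arcpy.da.UpdateCursor(fc, fields) as cursor:
    for row in cursor:
        new_row = [None if value == "" else value for value in row]
        if new_row != row:
            cursor.updateRow(new_row)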


Are there any fixes for Anno?