Our organization maintains a parcel fabric of about 2.4 million parcels. We have strict business logic that we enforce via a number of validation-style attribute rules, and we have also implemented a custom reconcile-and-post tool to ensure mappers have successfully validated all rules prior to posting. Since our business logic is so critical to our data quality, we also want to ensure that no mapper accidentally edits default directly, so we have a constraint attribute rule in place that prevents this:
However, we have discovered that the evaluate rules process does not appear to honour this attribute rule, and mappers can accidentally evaluate rules against the default version of the database. In our case, this ended up crashing our entire production service. Typically, we would run evaluate rules only on features modified in the current version - but in this case, since the mapper was accidentally running evaluate rules against default, there were no features modified in the version, so the entire database was evaluated (and it was goodnight for our service).
We would like to request that the evaluate rules process honour the constraint rule to prevent editing against default.
Are you able to share the attribute rule you have created to protect against edits to default?
Has your organization considered using a protected default to prevent edits to default in your workflows?
It is in the screenshot, but here is the code. It is an extremely simple rule, and it works for all other tables - just not for Records in the parcel fabric:
return !(GdbVersion($feature) == 'sde.DEFAULT');
We have not set up a protected default, since that would involve a lot more overhead. It is also likely impossible to manage versions correctly that way without a ton of human involvement to deal with conflicts, since we create a version per work package, and a work package can have multiple items in the same area touching the same geometry.
Evaluate Attribute Rules only works with batch calculation and validation rules; it writes to the features being evaluated and may generate error features when failures are detected. Constraint and immediate calculation rules, by contrast, are triggered when a user edit is made.
To prevent Evaluate Attribute Rules from changing data on DEFAULT, the following Arcade condition must be added to all batch calculation and validation rules. It will skip execution of the rule during evaluation and proceed with no changes. Note that all the features within the extent/selection will still be read and processed, but because of this condition they will process faster.
if (GdbVersion($feature) == 'sde.DEFAULT') return; // quit evaluation on default
// process the rest of the rule
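Putting the pieces together, a guarded batch calculation rule might look like the sketch below. This is a minimal illustration, not production code: the `ParcelName` field and the normalization logic are hypothetical placeholders for your actual rule body; only the first line (the DEFAULT guard) is the condition described above.

```
// Hypothetical batch calculation rule with the DEFAULT guard at the top.
// When evaluated against sde.DEFAULT, bail out before making any changes.
if (GdbVersion($feature) == 'sde.DEFAULT') return; // quit evaluation

// The rest of the rule runs only on named versions.
// Illustrative logic only: normalize a (hypothetical) ParcelName field.
var name = $feature.ParcelName;
if (IsEmpty(name)) {
    return 'UNNAMED';
}
return Upper(name);
```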
Another suggestion to minimize accidental edits/evaluation is to enable editing from the Editing tab. This introduces an explicit step to start editing, which prevents accidental edits.
Also, as Sean suggested, making default protected and assigning the version-admin portal role to those users with higher privileges may solve the accidental issues. I do understand that it introduces a lot of friction in some cases.
@HusseinNasser2 - interesting! I am still not totally following why eval rules would allow batch calcs and validation rules to execute on default when you have a constraint rule to prevent editing on default... batch calcs and validation rules are still edits, no?
But the script condition you've listed seems like something we can add - @SeanLyons do you agree?
I am not a fan of the suggestion to enable editing from the Editing tab (no offense!). I think it may be OK if you have only one source of data in your map, but if you have multiple sources of data it becomes a huge headache workflow-wise. Also, you may not be aware of BUG-000168717 (The Object Error table shows as a separate workspace in a branch version when added first to Default), which results in further grief with this idea.
@SarahSibbett Yes, that is something we can add. I would prefer not to add more conditions to check and to just have it work the same on all features, since, as you say, the constraint rule should take priority. But I will test this in a development environment and see how it goes, and whether there are any performance impacts for day-to-day use.
@HusseinNasser2 - just so I understand - are you thinking of the script condition as a workaround, or as a solution? Will validation rules and batch field calcs be enhanced to honour the constraint rule so that they can't run against default?
Reason I ask: Sean's organization has 2.4 million parcels. We have another parcel customer in Canada with 18 million (this amounts to about 200 million features in the database when you factor in all the associated lines and points, etc.).
As you point out in your original post, even with the additional condition, it will still read all the features, which will still take time. So your suggestion seems like a good workaround, but I think we still need a solution, particularly for those customers with really large databases. Thoughts?