Database Triggers vs. Calculation Attribute Rules

02-27-2022 02:37 PM
Bud
Esteemed Contributor

What are the pros & cons of database triggers vs. calculation attribute rules?

Example: a trigger or calculation attribute rule that would modify the M-values of polylines after edits (in real-time).

SDE.St_Geometry; Oracle 18c; versioned with moving edits to base
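
For that example, a rough, untested Arcade sketch of the calculation-attribute-rule approach might look like the following -- a rule assigned to the Shape field that rebuilds each polyline with M-values set to the running distance from the first vertex. The distance unit, the use of planar Distance(), the Push() function (newer Arcade versions), and the assumption that the data has no Z-values are all simplifications on my part:

// Calculation attribute rule on the Shape field (untested sketch).
// Rebuilds the polyline with M = cumulative 2D distance from the first vertex of each path.
var geo = Geometry($feature)
if (IsEmpty(geo)) {
    return geo
}

var newPaths = []
for (var p in geo.paths) {
    var path = geo.paths[p]
    var newPath = []
    var m = 0
    for (var i in path) {
        if (i > 0) {
            // planar distance between consecutive vertices; the unit is an assumption
            m += Distance(path[i - 1], path[i], 'meters')
        }
        Push(newPath, [path[i].X, path[i].Y, m])
    }
    Push(newPaths, newPath)
}

// Return a new M-aware polyline in the feature's spatial reference
return Polyline({
    'hasM': true,
    'paths': newPaths,
    'spatialReference': {'wkid': geo.spatialReference.wkid}
})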

 

Database Triggers:

Pros:

  • Don't need to worry about users locking data -- which seems like it would be a problem with calculation attribute rules.
  • Lots of functionality available in the DB: There are more St_Geometry functions than there are Arcade functions. And SQL has a lot more functionality in general than Arcade.
  • Oracle trigger functionality is very unlikely to ever be deprecated, whereas there is a chance ArcGIS functionality like attribute rules/Arcade will be deprecated at some point. It's hard to imagine that happening in the foreseeable future, but deprecation does happen in Esri applications, so it's not beyond the realm of possibility.
  • Fast performance.

Cons:

  • Only SQL developers (with SQL client software) can manage triggers. 
  • We consider triggers to be customization...we try to avoid customization if we can.
  • If the triggers use St_Geometry functions, then that means we are married to St_Geometry.
    (We've always dreamt of moving to SDO_GEOMETRY, since it has a lot more functionality, including linear referencing functions. And Oracle Spatial is now free. So if we wanted to switch to SDO_Geometry, we would need to consider how much it would impact any existing St_Geometry-based triggers.)
  • No access to true curve information, since it's hidden in the CAD BLOB. (Although Arcade doesn't seem to be much better.)
  • Accessing all of a feature's vertices is difficult.


Calculation Attribute Rules:

Pros:

  • Anyone who's a data owner can make attribute rules right in the application; no need to be a SQL developer. Basic scripting skills are still required, though (Arcade is JavaScript-like).
  • I wouldn't consider attribute rules to be customization, which is good. (There is a benefit to putting logic in the application, where it is visible and accessible to everyone.)
  • Not specific to a certain database vendor or spatial data type.
  • Easily prevent a field from being edited.
  • I imagine Esri would be quick to provide support for a problem with attribute rules. And reluctant to provide support for DB triggers, especially on versioned data.
  • Editing related FCs is relatively straightforward.
  • "Rules can be targeted for specific subtypes in the rule definition. This is helpful when you want a rule to only apply to a subset of features in a dataset." Link
  • We can use attribute rules to enforce domains. (A rough sketch follows this list.)
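
For example, a constraint attribute rule along these lines (an untested sketch; the field name is made up and it assumes a coded value domain) could reject any value that isn't in the field's domain:

// Constraint attribute rule sketch (untested): reject values that are not in the
// coded value domain assigned to a hypothetical PIPE_MATERIAL field.
var d = Domain($feature, 'PIPE_MATERIAL')
if (IsEmpty(d) || IsEmpty($feature.PIPE_MATERIAL)) {
    return true  // no domain assigned or no value entered -- nothing to check
}
for (var i in d.codedValues) {
    if (d.codedValues[i].code == $feature.PIPE_MATERIAL) {
        return true
    }
}
return {'errorMessage': 'PIPE_MATERIAL must be a value from its coded value domain.'}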

Cons:

  • Can't manage attribute rules if data is locked. I imagine a table would be locked if even a single user has the table in their map.
    I'm a data owner, not a geodatabase administrator, so I think that could be a pain.
  • Arcade has limited functionality -- for example, geometry has limited functions and properties. 
  • There is a chance attribute rules/Arcade could be deprecated at some point. We saw that happen with Python add-ins; people lost a lot of work when that happened.
  • Performance might be relatively slow -- for batch scenarios.
  • Note: Attribute rules require a GlobalID field. That's not really a problem, just worth noting.
  • No direct access to true curves information.
  • Polylines get automatically densified (similar to ST_Geometry).
  • Arcade doesn’t handle nulls properly.
  • We can't develop scripts in a proper IDE -- and test them right there in the IDE.
  • Batch calculation attribute rules seem clunky. They have lots of very specific requirements: the FC must be in a feature service, the feature service must have validation capability, and the FC must be branch versioned.
    For my notes: Batch calculation rules are not to be confused with a scheduled job that calculates an entire field. Batch calculation rules are more about deferring slow edits until later.

What do you think? What did I miss?

 

 

9 Replies
JohannesLindner
MVP Frequent Contributor

Disclaimer: I have never worked with SQL triggers.

 

SQL triggers need additional software / SQL developers in the IT department. In my organization, server management software is not available for non-IT staff. So I would have to coordinate with IT staff to get what I want. AFAIK, nobody in our IT department knows enough about SQL to write the triggers. I need lots of automatic edits in my database, and during development, my requirements frequently change. In that case it's much easier to learn about Arcade and implement Attribute Rules myself than wait forever for IT to find the time.

With Attribute Rules, you can view, create, and edit the data structures, triggers, and data in the same application.

You can use Arcade to do other stuff (e.g. popups, labels, visualization, dashboards), so it's actually quite useful to learn.

With SQL triggers, you lose all geodatabase functionality. Things like Editor Tracking, Versioning, Archiving are all done automatically with Attribute Rules, while you have to implement that yourself with SQL Triggers. For example, if you edit a versioned table with SQL Triggers and want to emulate the versioning, you have to edit at least the two delta tables, maybe also some geodatabase system tables.

Attribute Rules tell you when something is wrong. If you made obvious errors, it won't even let you save the Rule. If you made less obvious errors that only emerge at runtime, you get an error message telling you where the error occurred, which is of course extremely helpful for development. I don't know if that is the case with SQL Triggers.

Constraint Attribute Rules are a helpful tool. For example, you can reject edits that change a value to be out of some arbitrary bounds, or you can reject edits if the geometry falls inside/outside of a specific area. You can probably do something like that with SQL Triggers, too, but with Constraint Attribute Rules you can define a custom error message to tell the user what is wrong.
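
A rough, untested example (the layer and field names are made up): a constraint rule like this rejects out-of-range values or geometries outside a reference polygon and returns a custom message to the editor.

// Constraint attribute rule sketch (untested). Layer and field names are made up.
var bounds = FeatureSetByName($datastore, 'ProjectBoundary', ['*'], true)

// Reject values outside an arbitrary range
if ($feature.SPEED_LIMIT < 0 || $feature.SPEED_LIMIT > 130) {
    return {'errorMessage': 'SPEED_LIMIT must be between 0 and 130.'}
}

// Reject geometries that do not intersect the reference polygon
if (IsEmpty(First(Intersects(bounds, Geometry($feature))))) {
    return {'errorMessage': 'The feature must fall inside the project boundary.'}
}

return true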

Validation Attribute Rules let you define some rules that are checked on demand, so that you can do manual quality assurance.

Deprecation: Of course this is always a possibility. But Arcade is a new feature that is under constant development and is meant as a language for the whole ArcGIS environment, so I doubt it is going away any time soon. Attribute Rules are a feature of the geodatabase, whereas Python add-ins were a feature of the application; that probably also makes a (favorable) difference with regard to deprecation.

Locking: Yes, that can suck. In my case, I have only one coworker with write access, all others have only read access. And they don't see (and therefore don't lock) the actual tables, but database views. Another way is to disallow connections as geodatabase admin and then ask the database/server admin to close all existing connections.

Limited functionality: I think all the basic stuff is there, and they add new functionality quite frequently. Of course, it isn't as extensive as older languages (and probably never will be, as it's developed for very specific purposes).

Performance: Rules that only affect the edited feature are quite fast. If you do huge chunks of edits at once, you will notice it; if you do single manual edits, probably not. More complex rules (filter tables, intersection, edit other tables) have a higher performance impact. Manual edits on some of my feature classes can take up to 3 seconds to evaluate the rules, but those rules filter 2-4 tables, perform multiple intersections, and send edit requests to multiple tables that in turn trigger rules on those tables.

Accessing vertices: Polylines and Polygons are modelled as arrays of arrays of points (multipart features). You can access vertices like this:

 

// polyline
var geo = Geometry($feature)
var multipart_path = geo.paths // [ [Point] ]
var singlepart_path = geo.paths[0] // [Point]

// polygon
var geo = Geometry($feature)
var multipart_path = geo.rings // [ [Point] ]
var singlepart_path = geo.rings[0] // [Point]


for(var i in singlepart_path) {
    var p = singlepart_path[i]
    Console(p.X + "/" + p.Y)
}

 

 

All in all, I'm a big fan of Attribute Rules and Arcade. They have enabled me to do stuff that I couldn't do with Attribute Assistant and Python add-ins. My IT department would hate me if I came to them asking to change SQL triggers all the time, and I can use the Arcade knowledge I gained to create good-looking popups and do some light label and symbol customization. If you don't do constant automated edits of huge chunks of data, I'd go for Attribute Rules.


Have a great day!
Johannes
jcarlson
MVP Esteemed Contributor

Excellent, thorough post!

I'll add my two cents that, if I recall correctly, Attribute Rules are slated to come to AGOL / Portal hosted layers someday, which is an environment where users don't have access to the underlying DB.

Also, Attribute Rules give you access to other functions like FeatureSetByPortalItem, which let your expression access features and layers outside of the database. Even outside of your portal, if it's a layer hosted by some other org. Not that you always want your expression to rely on outside layers staying put, but having that ability is nice.
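
For example, something along these lines (untested; the portal URL, item ID, and field names are placeholders, and availability may depend on the Arcade profile) pulls a FeatureSet from a layer hosted under another org's portal item:

// Untested sketch: read a layer from a portal item hosted elsewhere.
var portal = Portal('https://www.arcgis.com')
var parcels = FeatureSetByPortalItem(portal, '<item id>', 0, ['PIN', 'OWNER'], false)
return Count(parcels)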

- Josh Carlson
Kendall County GIS
Tiff
Frequent Contributor

Hi @jcarlson, this is two years late, but I am curious about the FeatureSetByPortalItem access in attribute rules. I tested my first spatial calculation attribute rule in the database this week and tried referencing data as you would in AGOL using FeatureSetByPortalItem -- it said the attribute rule profile does not include that function, only FeatureSetByName within the map or datastore. Is this something that used to be possible but no longer is? Thank you!

Bud
Esteemed Contributor

Edit: Disregard

JohannesLindner
MVP Frequent Contributor

Attribute Rules do trigger Editor Tracking; the respective tracking fields are set. Depending on your setup, you may not get the time of the actual insert/update, but rather the time the changes were saved.


Have a great day!
Johannes
Bud
Esteemed Contributor

One thing to add to the list:

I've learned that a significant con to attribute rules is: We can't develop scripts in a proper IDE -- and test them right there in the IDE.

  • I've installed VS Code so that I can lint scripts -- using the JavaScript linter.
  • And I've installed Node.js so that I can run the scripts. But that's only marginally useful, because there are quite a few differences between Arcade and Node.js scripts, so I have to make lots of modifications to my Arcade to get it to run under Node.js.
  • So it ends up being pretty horrible jumping back and forth between ArcGIS Pro and VS Code. 

We don't have that problem with SQL triggers: we can develop in an IDE like Oracle SQL Developer, Toad, etc., and it all works as expected.


That's not a total deal breaker. Lots of people use calculation attribute rules successfully -- even without an IDE. I just thought I'd add it to the list, since it's something that's causing me trouble at the moment.

 

Bud
Esteemed Contributor

@JohannesLindner 

Good points.

Regarding versioning:
I suspect that versioning with the option to move edits to base could solve some of the issues you mentioned.

Regarding error handling: 
I believe it's possible to put error handling in Oracle PL/SQL triggers...and then that error message gets returned to the ArcGIS application (such as ArcGIS Pro). 

    IF v_upd_row = 0 THEN
        raise_application_error(-20001, 'a custom message that pops up in ArcGIS Pro');
    END IF;

That error message would be displayed in a popup message in the application. It'd be the same kind of message that you'd get for any other database-related error.

Source: Prevent deletes in a versioned table at the database level


By the way, if you're a data owner, you likely have CREATE TRIGGER privileges. So in a pinch, you could probably create triggers if you needed to (though it sounds like you don't).
You could install an SQL client, or you could execute SQL statements using the ArcSDESQLExecute ArcPy class.

I just thought I'd mention that...for anyone else who might be reading this post.

JohannesLindner
MVP Frequent Contributor

Or you could execute SQL statements using the ArcSDESQLExecute ArcPy class

That's a good point. I find this class quite useful for

  • reading user tables, because it's faster than arcpy.da.SearchCursor
  • reading system tables (e.g. listing all tables, listing fields, listing all states), because it's either faster than the arcpy equivalents or not possible in arcpy

I tried using it for writing user tables (again, because using SQL is faster than the arcpy cursors), but it didn't trigger Editor Tracking and, far worse for me, it didn't trigger Attribute Rules. I didn't have versioning in place then, so I don't know how it interacts with that.

 

The documentation page you linked raises some interesting points which seem to apply to SQL in general, too:

  • Do not alter enterprise geodatabase system tables using SQL. Corruption can occur if these system tables are edited directly using SQL.
  • Edits on traditional versioned data performed using SQL should only be made through versioned views. Do not edit branch versioned data using SQL.
  • [...]
  • [...] Be aware, though, that SQL access to the geodatabase bypasses geodatabase functionality, such as topology, networks, terrains, or other class or workspace extensions and bypasses triggers and stored procedures used to maintain the relationships between tables needed for certain geodatabase functionality. Circumventing geodatabase functionality can corrupt the relationships between data in your geodatabase.
  • Before attempting to access or modify any enterprise geodatabase objects, read all enterprise geodatabase documentation about using SQL against geodatabase objects in the DBMS.

Have a great day!
Johannes