So I am looking for assistance working with triggers on a point feature class in a 10.7.1 Enterprise GDB on SQL Server 2016. The feature class is archive-enabled and the data is non-versioned.
We are migrating an old ArcGIS Mobile project to an ArcGIS Runtime for .NET app for field use; data is taken offline and edits are synced via a feature service on ArcGIS Server 10.7.1. The core of this feature class is Crossing_ID, our own unique ID that binds a larger solution together. We have always managed the Crossing_ID with triggers on the business table that increment it (i.e. MAX + 1) when a new feature is inserted. The feature class and triggers have been in place and working well for many years, but since we enabled archiving a few weeks back, development and testing of the new solution has gone nowhere.
The trigger itself works fine on the business table: it is an AFTER INSERT trigger, so it increments the Crossing_ID as expected when new features are created. The issue lies with the device running the ArcGIS Runtime app that created the feature. The new feature leaves the mobile device without a Crossing_ID, the edit is sent to ArcGIS Server / the database, and an ID is assigned there, but that new Crossing_ID is never synced back to the device that created it... unless that feature is edited elsewhere. It appears that assigning the Crossing_ID in the trigger is not a 'normal' edit, which I assume would update GDB_ARCHIVE_OID, GDB_FROM_DATE, and GDB_TO_DATE; as a result the device does not see the new Crossing_ID as a change that needs to be pulled back to its Runtime geodatabase. Worse, if that same device then makes an attribute change to the feature it created, its local Crossing_ID is still NULL, so syncing the new edit wipes out the Crossing_ID in the database, setting it to NULL and effectively breaking our entire solution.
So, is it possible to continue using our own triggers for tasks like this (maintaining our own unique ID) with an archive-enabled feature class? If it is, how should this be handled in the database / feature class to force the new Crossing_ID back to the device that created it? Thanks in advance. Our current trigger is below:
CREATE TRIGGER [Schema].[RCCS_RoadCrossings_Inserted]
ON Database.Schema.ROADCROSSINGS
AFTER INSERT
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Check for a new crossing (row inserted without a Crossing ID).
    -- Note: like the original, this assumes single-row inserts;
    -- a multi-row insert would hand the same ID to every row.
    IF EXISTS (SELECT 1 FROM inserted WHERE CROSSING_ID IS NULL)
    BEGIN
        -- Get the largest existing Crossing ID
        DECLARE @myMAX int;
        SELECT @myMAX = MAX(CROSSING_ID) FROM Database.Schema.ROADCROSSINGS;

        -- Update the table, assigning MAX + 1 to the inserted row(s)
        UPDATE OriginalRC
        SET CROSSING_ID = @myMAX + 1
        FROM Database.Schema.ROADCROSSINGS AS OriginalRC
        INNER JOIN inserted AS InsertedRC
            ON OriginalRC.OBJECTID = InsertedRC.OBJECTID
        WHERE OriginalRC.CROSSING_ID IS NULL;
    END
END
Since I never got any replies we must be operating on the fringe; in case somebody finds this, I am updating it with our solution.
We had to abandon the database triggers that had worked fine for years with ArcGIS Mobile; it appeared they were not fast enough for the new "Sync" process, which meant the Crossing_ID was not being sent back to the field user as described. The solution was to use ArcGIS Pro to implement a database sequence and an attribute rule to manage the Crossing_ID; this ensured Crossing_ID was properly managed between the mobile devices and the enterprise GDB.
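For anyone taking the same route, the calculation attribute rule we ended up with looks roughly like the sketch below. It is an immediate calculation rule on the CROSSING_ID field, triggered on Insert; the sequence name `seq_crossing_id` is illustrative (ours differs), and the sequence itself has to be created through ArcGIS Pro / arcpy so the geodatabase knows about it:

```arcade
// Arcade calculation rule on CROSSING_ID, triggering event: Insert.
// Only assign an ID when the field arrives empty (e.g. from a
// Runtime device), otherwise keep whatever value was supplied.
if (IsEmpty($feature.CROSSING_ID)) {
    return NextSequenceValue("seq_crossing_id");
}
return $feature.CROSSING_ID;
```

Because the sequence hands out the value inside the geodatabase edit, the assignment is a normal edit as far as archiving is concerned, so it syncs back to the originating device.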
While this fixed our issue and allowed our upgrade to continue, it had far-reaching effects on other processes and integrations with other systems that used the data. Because of the DB sequence and attribute rules, ArcMap could no longer view and use the data, which was a problem because we have not completely transitioned to Pro. FME, which we had used to process data updates from another vendor's mobile solution, also did not support the attribute rule, and as of the time I am writing this update still does not (major issue). We had to modify the FME process to write to a staging file GDB instead of directly to the enterprise GDB, and from there implement a Python script to handle the enterprise GDB update. That also proved problematic, since insert and update cursors did not work well with the attribute rule either, causing duplicate Crossing_IDs and other issues. The solution to that was to have Python use ArcGIS Pro geoprocessing functions like Append to handle the same tasks. After a lot of trial and error and many workarounds to a process that had been running issue-free for years, we have a solution in place that seems to be working.
Lesson learned, and this applies to very recent initiatives: attribute rules in Pro work great, but only if Pro is the sole tool being used to access the data. If you have Python scripts, FME workbenches, ArcMap still in use, or even edits to that data via web services from ArcGIS Server, the decision to implement attribute rules needs to be very well thought out and thoroughly tested.