
Updating a feature class from a view daily using Python

10-14-2024 08:04 AM
vijaybadugu
Frequent Contributor

We have created a view to get data from multiple tables. I tried publishing the same view as a sync-enabled feature service; however, it did not work due to limitations on views. So I created a feature class with the same fields and published it with sync (archiving) enabled. I need to update this feature class daily so that the latest data shows in Field Maps.

I already have a Python script (job) that updates this feature class daily; it deletes all the data from the feature class and copies it back in from the view.
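
Roughly, the nightly job does something like this (the paths below are placeholders, not the actual data sources):

```python
import arcpy

# Placeholder paths -- the real job points at our database view and the published feature class.
source_view = r"C:\connections\gis.sde\DBO.ASSET_VIEW"
target_fc = r"C:\connections\gis.sde\DBO.ASSET_FC"

# Current approach: wipe the target, then reload everything from the view.
arcpy.management.DeleteRows(target_fc)
arcpy.management.Append(
    inputs=source_view,
    target=target_fc,
    schema_type="NO_TEST",  # the feature class was created with the same fields as the view
)
```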

Is there a way to do this process following Esri best practices?

10 Replies
RobertKrisher
Esri Regular Contributor (Accepted Solution)

Instead of deleting all the features every day, push inserts/updates/deletes to the layer. If you're only ever adding or updating values, you can use the Append tool to do upserts by specifying a match key. If you need to handle deletes, you can run the Append and then delete any rows that are no longer present in the source data.

Remember that when you edit a table with archiving enabled, every edit is tracked in your database. This means that if you delete the contents and reinsert them every day, your database records all of those edits. If your table is large, you should consider periodically trimming its archive history.
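
For example, a minimal ArcPy sketch of that pattern, assuming a shared key field (ASSET_ID is a placeholder), that the paths are placeholders, and that the Append tool's match fields upsert option (ArcGIS Pro 2.9 or later) takes target/input field pairs:

```python
import arcpy

# Placeholder paths and key field -- adjust to your data.
source_view = r"C:\connections\gis.sde\DBO.ASSET_VIEW"
target_fc = r"C:\connections\gis.sde\DBO.ASSET_FC"
key_field = "ASSET_ID"

# 1. Upsert: input rows whose key matches an existing target row update that row;
#    everything else is inserted. Match fields are given as [target field, input field] pairs.
arcpy.management.Append(
    inputs=source_view,
    target=target_fc,
    schema_type="NO_TEST",
    match_fields=[[key_field, key_field]],
)

# 2. Deletes: remove target rows whose key no longer exists in the source view.
source_keys = {row[0] for row in arcpy.da.SearchCursor(source_view, [key_field])}
with arcpy.da.UpdateCursor(target_fc, [key_field]) as cursor:
    for (key,) in cursor:
        if key not in source_keys:
            cursor.deleteRow()
```

Collecting the source keys into a set keeps the delete pass to a single cursor over the target.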

vijaybadugu
Frequent Contributor

Thanks Robert.  

vijaybadugu
Frequent Contributor

How do I preserve GlobalIDs and ObjectIDs during this process?

RobertKrisher
Esri Regular Contributor

If you're just updating rows in the table every night, GlobalIDs and ObjectIDs will be preserved, because you're no longer deleting features every night.

vijaybadugu
Frequent Contributor

I checked the Append tool. It did not update the existing record based on the matching field and always inserts a new record. I am using ArcGIS Pro 3.3.

RobertKrisher
Esri Regular Contributor

Double-check that the data types and formats of the input/target match fields are identical. As an example, if the field is a long in one database but a double in another, this could cause an issue (this can happen with ObjectID fields if you create them using views).
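
For example, a quick way to compare the match field on both sides (the dataset paths and field name are placeholders):

```python
import arcpy

def describe_field(dataset, field_name):
    """Print the type and length of one field so both sides can be compared."""
    for field in arcpy.ListFields(dataset, field_name):
        print(f"{dataset}: {field.name} type={field.type} length={field.length}")

# Compare the match field in the view and in the target feature class.
describe_field(r"C:\connections\gis.sde\DBO.ASSET_VIEW", "ASSET_ID")
describe_field(r"C:\connections\gis.sde\DBO.ASSET_FC", "ASSET_ID")
```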

vijaybadugu
Frequent Contributor

 

I have assigned GlobalID as the matching field for the input dataset and the target dataset in the Update option.

The results are duplicated.

(screenshot attached: vijaybadugu_0-1729171832505.png)

RobertKrisher
Esri Regular Contributor

Field names for the screenshot would be appreciated, so I'm going to have to make some assumptions: I'll call your first column a GUID, your second column an ObjectID, and your third column a GlobalID.

GlobalIDs are a special, unique identifier for a table, so your input dataset (the one with changes you want to push) needs to reference the GlobalID of each row in the target dataset (the one being updated). Your input dataset should therefore have a foreign key column containing the values from the GlobalID (third column) of the target table. This allows the Append tool to link the rows from the two tables.

In your screenshot, it looks like both tables have their own unique GlobalIDs, which means they'll never match; that's why your rows are being appended rather than updated.
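
For example, a sketch of one way to build that foreign key, assuming a shared business key (ASSET_ID, a placeholder) links the two datasets, the target's GlobalID field is named GlobalID, the paths are placeholders, and the source is first copied to a writable staging feature class:

```python
import arcpy

source_view = r"C:\connections\gis.sde\DBO.ASSET_VIEW"
target_fc = r"C:\connections\gis.sde\DBO.ASSET_FC"
staging_fc = r"memory\asset_staging"

# Copy the view into a writable staging feature class and add a GUID
# foreign-key field that will hold the target row's GlobalID.
arcpy.management.CopyFeatures(source_view, staging_fc)
arcpy.management.AddField(staging_fc, "TARGET_GLOBALID", "GUID")

# Look up each target GlobalID by the shared business key.
target_ids = {
    key: gid for key, gid in arcpy.da.SearchCursor(target_fc, ["ASSET_ID", "GlobalID"])
}

# Write the target GlobalIDs into the staging data; rows with no match stay
# null and will be inserted as new features.
with arcpy.da.UpdateCursor(staging_fc, ["ASSET_ID", "TARGET_GLOBALID"]) as cursor:
    for key, _ in cursor:
        cursor.updateRow([key, target_ids.get(key)])

# Match on the foreign key, paired with the target's GlobalID field
# ([target field, input field] pairs), instead of the input's own GlobalID.
arcpy.management.Append(
    inputs=staging_fc,
    target=target_fc,
    schema_type="NO_TEST",
    match_fields=[["GlobalID", "TARGET_GLOBALID"]],
)
```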

vijaybadugu
Frequent Contributor

In the logs below, 737 records were inserted and 737 were updated. How is it possible to have two DML operations at the same time using Append?

(screenshot attached: vijaybadugu_0-1729190918445.png)
