
Advice sought: best way to regularly import data from PostgreSQL to ArcGIS Online

02-04-2026 04:10 AM
MappyIan
Frequent Contributor

We have some data in a PostgreSQL database that is continuously updated by another team.  The DB admin has created a read-only view of this data that I can access.  What I would like to do is regularly (every 15 minutes) get the latest data from PostgreSQL and import it into ArcGIS Online, replacing (not appending to) the previous data that was in AGOL.

What I'd like to know is: what is the best way of achieving this?

I'm familiar with ArcGIS Notebooks and I reckon I could script something that would accomplish what I'm after, and I can set the Notebook to run as a scheduled task every 15 minutes.  But the DB admin would need to make the PostgreSQL view publicly accessible for ArcGIS Notebooks to reach it.  They're reluctant to do this, and given the database is hosted on-premises, opening it up does seem unnecessary.
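For what it's worth, the Notebook side of that idea could look something like the sketch below. This is only a hedged outline, not a working solution: the connection string, the item ID, the view name `readonly_view`, and the `(id, name, lon, lat)` column layout are all invented placeholders, and it assumes `psycopg2` and the ArcGIS API for Python are available in the Notebook runtime (and that the database is reachable, which is exactly the sticking point above).

```python
# Hypothetical sketch of a scheduled refresh job: read the Postgres view,
# then truncate-and-append into a hosted feature layer so the previous
# data is fully replaced rather than appended to.

def rows_to_features(rows):
    """Turn (id, name, lon, lat) tuples from the view into ArcGIS feature dicts.

    The column layout here is a placeholder; a real view would have its
    own schema and the mapping would need adjusting to match.
    """
    return [
        {
            "attributes": {"id": rid, "name": name},
            "geometry": {"x": lon, "y": lat, "spatialReference": {"wkid": 4326}},
        }
        for rid, name, lon, lat in rows
    ]

def refresh_layer(pg_dsn, item_id):
    # Third-party imports live inside the function so the pure helper above
    # can be used even where psycopg2 / arcgis are not installed.
    import psycopg2
    from arcgis.gis import GIS

    with psycopg2.connect(pg_dsn) as conn, conn.cursor() as cur:
        cur.execute("SELECT id, name, lon, lat FROM readonly_view")  # placeholder view
        features = rows_to_features(cur.fetchall())

    gis = GIS("home")  # the Notebook's built-in ArcGIS Online connection
    layer = gis.content.get(item_id).layers[0]
    layer.manager.truncate()            # replace, don't append
    layer.edit_features(adds=features)
```

For large views the `edit_features` call would likely need to be batched into chunks of a few thousand features per request.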

What other options are there?  We obviously want to keep cost/effort to a minimum.  I'd love to hear your thoughts.

1 Reply
David_McRitchie
Esri Regular Contributor

Hey MappyIan,

ArcGIS Notebooks sounds like a great way to do this. It's a bit of work at first, but you can kick your feet up after implementing it. That's the reward of automation.

If you have access to ArcGIS Pro, I'd check whether you can load the view into it. If so, consider taking an export of the data and using that with your Notebook, if reading directly from the database is not feasible.
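If the export route is taken, one option is to overwrite the hosted item from the exported file on each run. A minimal sketch with the ArcGIS API for Python's `FeatureLayerCollection.manager.overwrite()`, assuming a hypothetical item ID and folder, and assuming the layer was originally published from a file of the same schema:

```python
import os

def export_target(folder, original_name):
    """overwrite() expects the replacement file to carry the same name as the
    file the layer was originally published from, so reuse that name each run."""
    return os.path.join(folder, original_name)

def overwrite_from_export(item_id, zip_path):
    # overwrite() replaces all features but keeps the item ID, URL and sharing,
    # so web maps pointing at the layer keep working.
    from arcgis.gis import GIS
    from arcgis.features import FeatureLayerCollection

    gis = GIS("home")  # the Notebook's built-in ArcGIS Online connection
    flc = FeatureLayerCollection.fromitem(gis.content.get(item_id))
    flc.manager.overwrite(zip_path)
```

The same-file-name requirement is easy to trip over, which is why the little `export_target` helper (a made-up name) pins it down.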

I would also caution you on a few common pitfalls. First, do you really need this to kick off every 15 minutes? Updates this frequent can require a premium datastore, depending on the data size and the total datastore utilisation for your ArcGIS Online subscription. If you can consolidate this into less frequent, smaller updates, it will help keep things in order. The key thing when testing is to keep an eye on the datastore performance graph for your ArcGIS Online organisation, and make sure it has enough overhead to deal with other heavy processes (either background ones, or other activities such as reindexes or peak user usage).

Secondly, I would keep a close eye on your feature service settings, in particular Sync and Change Tracking. For a workflow such as this I would recommend keeping these disabled if possible, or scheduling a frequent tidy-up of the service by trimming the Change Tracking tables.
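Checking and flipping that setting can be scripted too. A hedged sketch with the ArcGIS API for Python, using a hypothetical item ID; `update_definition` accepts the same JSON as the REST `updateDefinition` operation:

```python
# Definition fragment that turns sync off for a hosted feature service.
SYNC_OFF = {"syncEnabled": False}

def disable_sync(item_id):
    from arcgis.gis import GIS
    from arcgis.features import FeatureLayerCollection

    gis = GIS("home")  # the Notebook's built-in ArcGIS Online connection
    flc = FeatureLayerCollection.fromitem(gis.content.get(item_id))
    if flc.properties.syncEnabled:          # only touch the service if needed
        flc.manager.update_definition(SYNC_OFF)
```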

If change tracking were enabled in this workflow, you would get backend tables recording and maintaining every single change. On a 15-minute cycle that could cause the service to bloat rapidly in size (and run up a high AGOL credit bill).

I previously ran a similar notebook that wrote out the service size at intervals, along with the response time to return a 1=1 query. While for the most part this works, when the data scales up it can quickly run into niche issues, such as automation failures or very poor performance. The issues encountered tend to be sudden rather than gradual.
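That kind of probe is simple to reproduce. A small sketch, where `layer` stands for any hosted feature layer object from the ArcGIS API for Python:

```python
import time

def probe(layer):
    """Return (feature_count, seconds) for a where=1=1 count query: a crude
    health metric worth logging at each scheduled run."""
    t0 = time.monotonic()
    count = layer.query(where="1=1", return_count_only=True)
    return count, time.monotonic() - t0
```

Logging those two numbers over time makes the sudden failures described above much easier to spot before they bite.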

Just something to consider. Hope it helps,

David

Esri UK - Technical Support Analyst