I have created a hosted table in ArcGIS Online (AGOL) that is updated from a CSV file on my local machine. I run this process every 15 minutes using the ArcGIS API for Python (code attached).
The general process is:
1. connect to AGOL and "get content" from the hosted table using its alphanumeric item ID
2. create a feature layer collection from the table item
3. overwrite the table collection with the local CSV file (from my PC)
The process works correctly for a couple of days, but then fails. We suspect a network connection problem (the process usually takes 30 seconds, but sometimes takes 5 minutes). Ideally we would like to update the hosted table from a CSV stored in AGOL instead of the local file, but we can't get that to work.
The ask:
1. How can we make this process work consistently with the local CSV file?
2. How can we update the hosted table from the CSV stored in AGOL?
Note: we have tried deleting, truncating, and upserting, but our hosted table does not have those capabilities.
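For reference, our understanding of the pattern for question 2 is roughly the sketch below: push the fresh local file into the CSV item that already lives in AGOL, then republish that item over the hosted table it was originally published from. The item ID is a placeholder, and this is the part we can't get to work:
from arcgis.gis import GIS
gis = GIS("https://www.arcgis.com", "login", "pwd")
# Placeholder ID of the CSV item already uploaded to AGOL
csv_item = gis.content.get('yyyyyyyyyyyyyyyy')
# Replace the AGOL CSV item's data with the fresh local file
csv_item.update(data=r'my_local_file.csv')
# Republish the CSV over the hosted table originally published from it
csv_item.publish(overwrite=True)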
Have you logged the error that it gives when failing? Or do you have an example of the script?
I have a similar automation running on a quarterly basis that randomly failed recently, throwing an error that the referenced service did not exist (something that was uploaded to AGOL and immediately referenced). Wondering if something like this is happening on your end as well.
I attached the script, but am pasting it below as well. I didn't get an error message; the process simply stops running.
EDIT: I DO get an error message from my Python scripts, but the process is currently working, so the errors have scrolled off. I'll post the error the next time it breaks. I believe it mentioned multiple instances running.
Code:
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

# Connect to ArcGIS Online
gis = GIS("https://www.arcgis.com", "login", "pwd")
# Get the hosted table item by its alphanumeric ID
my_online_hosted_Table = gis.content.get('xxxxxxxxxxxxxxxx')
# Build a feature layer collection from the item
my_table_collection = FeatureLayerCollection.fromitem(my_online_hosted_Table)
# Overwrite the hosted table with the local CSV file
my_table_collection.manager.overwrite(r'my_local_file.csv')
# Log out from ArcGIS Online
gis._con.logout()
It would be a great idea to add logging so that you can more easily go back and examine errors. Without being able to call truncate (or a similar method) on the table, overwrite is really your only option. Updates like this are always extremely fickle in AGOL, though.
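As a rough sketch (the log file path is just a placeholder), wrapping the script you pasted in Python's logging module would capture the full traceback even after it scrolls off the console:
import logging
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

# Placeholder log file location
logging.basicConfig(filename=r'C:\logs\agol_overwrite.log',
                    level=logging.INFO,
                    format='%(asctime)s %(levelname)s %(message)s')

try:
    logging.info('Starting overwrite')
    gis = GIS("https://www.arcgis.com", "login", "pwd")
    item = gis.content.get('xxxxxxxxxxxxxxxx')
    flc = FeatureLayerCollection.fromitem(item)
    result = flc.manager.overwrite(r'my_local_file.csv')
    logging.info('Overwrite result: %s', result)
except Exception:
    # logging.exception records the message plus the full traceback
    logging.exception('Overwrite failed')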
Thanks for the suggestion. I added logging and received the following error message today. Our process name is "work3".
Execution of job "work3 (trigger: cron[minute='2,17,32,47'], next run at: 2025-08-27 13:02:00 UTC)" skipped: maximum number of running instances reached (1)
Thanks for any insights.
That error seems more related to the way the task is being scheduled. What method are you using to trigger this?
I'm using APScheduler to call a .bat program every 15 minutes, which creates the .csv files and then updates the AGOL hosted table with the new .csv file. I have pasted the code below. Note that I increased the sleep time to 3 seconds as a proposed fix for the multiple-instances error, but that didn't work.
Here's the apscheduler code:
import os
import time
from apscheduler.schedulers.background import BackgroundScheduler

# Path to the batch file that builds the CSVs and runs the overwrite script
cmd3 = r"C:\Users\path\myfile.bat"

def work3():
    returned_value = os.system(cmd3)

sched = BackgroundScheduler(daemon=True, timezone="UTC")
sched.add_job(work3, 'cron', minute='02,17,32,47')
sched.start()
# Keep the main thread alive so the background scheduler keeps firing
while True:
    time.sleep(3)
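For reference, APScheduler's add_job also accepts max_instances, coalesce, and misfire_grace_time options that control what happens when a run is still in progress (or fires late) at the next trigger. A sketch of how they could be applied here; the specific values are untested guesses:
sched = BackgroundScheduler(daemon=True, timezone="UTC")
sched.add_job(work3, 'cron', minute='02,17,32,47',
              max_instances=2,         # allow a second run if the previous one is still going
              coalesce=True,           # collapse a backlog of missed runs into a single run
              misfire_grace_time=300)  # still fire a job that is up to 5 minutes late
sched.start()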