This has been going on for years, but I am finally posting since we are totally stuck this year.
We collect a lot of photos. Every year, just after the service hits 10 GB, the Export to GDB starts to fail.
It will sit there saying "Exporting" for hours, then the Export button goes back to being un-greyed out and clickable again.
The resulting file is way smaller than it should be.
I've been trying for weeks.
This is also happening in my nightly backup script. It fails on these lines every night.
fsLink = gis.content.get(itemId)  # gis is an authenticated arcgis.gis.GIS connection
result = fsLink.export("tempOut" + HFSname, "File Geodatabase")
raise Exception("Could not export item: %s" % self.itemid)
Exception: Could not export item: 713e3aaef9674e34555555333b618
It fails in arcgis.gis here
if wait == True:
    status = "partial"
    while status != "completed":
        status = export_item.status(job_id=res["jobId"], job_type="export")
        if status["status"] == "failed":
            raise Exception("Could not export item: %s" % self.itemid)
And again it leaves an export file in AGOL that is way too small.
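In case it helps anyone scripting around this, wrapping the export call in a simple retry at least keeps the nightly job from dying outright, though it obviously does not fix the underlying export failure. This is only a sketch; the function name, attempt count, and pause length are placeholders of mine:

import time
from arcgis.gis import GIS

gis = GIS("https://www.arcgis.com", "username", "password")  # placeholder login

def export_with_retry(item_id, out_name, attempts=3, pause=300):
    # Retry the export a few times before giving up for the night
    item = gis.content.get(item_id)
    for attempt in range(1, attempts + 1):
        try:
            return item.export(out_name, "File Geodatabase")
        except Exception as err:
            print("Export attempt %s failed: %s" % (attempt, err))
            if attempt == attempts:
                raise
            time.sleep(pause)  # give AGOL a break before trying again

result = export_with_retry(itemId, "tempOut" + HFSname)  # same names as in the script above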
Last year I could get some exports over 10 GB to download, but that was pretty rare. I also tried using REST in the past, but that also fails once the file gets to around 10 GB.
Next I took out the layers with photos, and that usually works, but photos are important to back up.
Most of the time now, to get an export I have to make a replica and download that. Not a great option.
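For anyone who wants to script that route, this is roughly what the replica workaround looks like against the REST endpoint. It is only a sketch: the service URL, layer ids, replica name, and token handling are placeholders, and it assumes the hosted layer has Sync/Extract enabled so createReplica is allowed.

import time
import requests

SERVICE_URL = "https://services.arcgis.com/<org>/arcgis/rest/services/<service>/FeatureServer"  # placeholder
TOKEN = "<token>"  # however you normally generate one

# Ask AGOL to build a file geodatabase replica asynchronously
params = {
    "f": "json",
    "replicaName": "nightly_backup",
    "layers": "0,1,2",                    # layer ids to include
    "returnAttachments": "true",          # include the photo attachments
    "returnAttachmentsDataByUrl": "true",
    "syncModel": "none",
    "dataFormat": "filegdb",
    "async": "true",
    "token": TOKEN,
}
job = requests.post(SERVICE_URL + "/createReplica", data=params).json()

# Poll the status URL until the replica job finishes
status = {}
while status.get("status") not in ("Completed", "Failed", "CompletedWithErrors"):
    time.sleep(30)
    status = requests.get(job["statusUrl"], params={"f": "json", "token": TOKEN}).json()

# Download the zipped file geodatabase
if status.get("status") == "Completed":
    zipped = requests.get(status["resultUrl"], params={"token": TOKEN})
    with open("nightly_backup.zip", "wb") as out:
        out.write(zipped.content)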
Why is it failing at 10 GB? Does it have something to do with an AGOL limit? I think photos used to be capped at 10 GB.
We really need a nightly backup, and strangely, since none is built into AGOL, this is the only option.
Appreciate any input. Thanks!
This article suggests exporting in chunks:
https://support.esri.com/en/technical-article/000014156
However, I am also having problems exporting... I think it has to do with the size of the chunks, which makes scripting this a little more challenging.
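This is roughly how I have been trying to script the chunking. It is only a sketch: the item id, layer id, chunk size, and OBJECTID range are examples, and I am assuming the export honors a per-layer where filter the way the article describes.

from arcgis.gis import GIS

gis = GIS("https://www.arcgis.com", "username", "password")  # placeholder login
item = gis.content.get("<hosted feature layer item id>")     # placeholder item id

chunk = 50000     # OBJECTIDs per export; shrink this if chunks still fail
max_oid = 250000  # highest OBJECTID in the layer (query the layer for the real value)

for start in range(0, max_oid, chunk):
    where = "OBJECTID > %d AND OBJECTID <= %d" % (start, start + chunk)
    params = {"layers": [{"id": 0, "where": where}]}  # layer 0 only in this sketch
    result = item.export("chunk_%d" % start, "File Geodatabase", parameters=params)
    result.download(save_path=r"C:\temp\backups")  # pull the chunk down locally
    result.delete()  # remove the export item so they do not pile up in AGOL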
I'm not quite that large yet, but I have had some issues with incomplete exports or errors when exporting my hosted feature layer.
So far, though, using arcpy.management.Append to append the hosted feature data to a local geodatabase has been working fine. It takes a long time, but it seems to work better than my export attempts.
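Something like this is what I mean. It is just a sketch: the service URL, geodatabase path, and feature class name are placeholders, and it assumes you are signed in to the portal in the same Python session so arcpy can read the hosted layer.

import arcpy

# Placeholders for the hosted layer and the local backup target
hosted_layer = "https://services.arcgis.com/<org>/arcgis/rest/services/<service>/FeatureServer/0"
target_fc = r"C:\backups\nightly.gdb\inspections"

if not arcpy.Exists(target_fc):
    # First run: copy the layer down so the local schema exists
    arcpy.management.CopyFeatures(hosted_layer, target_fc)
else:
    # Later runs: empty the local copy, then append the current service data
    arcpy.management.TruncateTable(target_fc)
    arcpy.management.Append(inputs=hosted_layer, target=target_fc, schema_type="NO_TEST")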
R_
Yes, we are doing some exports now using Feature Class to Feature Class, or even copy/paste in Pro.
In most cases, though, we have 30-40 layers and tables plus 30-some relationship classes, so it falls apart.
I do have some running up to 18 GB, so I am not sure why only some have an issue.
I have found that cleaning up any old replicas helps. Has anyone else seen things like this help?
Hi Doug - did you end up figuring out a programmatic solution for this? We have 42 clients, each with between 10 and 200 feature services we need to back up, and our Python scripts all fail in the same way. We are considering a script that uses the REST API to create replicas and then downloads them, but we are not sure if that would hit the same pitfalls as the Python API.
Thanks!
Aidan
My bug was marked as fixed, and it has been better. I have found it works better late at night. For now we took out any layers with photos, and it's a much smaller download, so we have been OK. Then I just do the photos once a week.