I'm trying to improve the performance of a script we run to download all images from selected records in a hosted feature layer and save them to our network server with a specific naming format. Until now we've done this by exporting the hosted feature layer to a local geodatabase and running the rest of the process against that export, but the processing time keeps ballooning as the intermediate steps read from and write to the local geodatabase.
I'd like these intermediate steps to run entirely in memory, with only the final photos saved out to our network server. I can easily get the input layer into the memory workspace and added to the map, but it doesn't come with any of its attachments, and I can't find a way to carry the attachments table over as well. Any thoughts?
p = arcpy.mp.ArcGISProject("CURRENT")
m = p.listMaps()[0]

# Copy the selected features into the memory workspace
tempExportFeatureClass = arcpy.conversion.ExportFeatures(<input_layer>, r"memory\TempExport")

# Wrap the memory feature class in a layer object and add it to the map
tempExportFeatureClass_object = arcpy.management.MakeFeatureLayer(tempExportFeatureClass, "TempExport_lyr")
TempExport_lyr = tempExportFeatureClass_object.getOutput(0)
m.addLayer(TempExport_lyr)
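To illustrate the step I'm trying to move off the local geodatabase, here's a simplified sketch of reading the exported attachment table and writing the images out. The paths and naming scheme are just placeholders, not our actual formatting:

import os
import arcpy

# Default attachment table fields: DATA (blob), ATT_NAME, REL_OBJECTID
attach_table = r"C:\Temp\Export.gdb\TempExport__ATTACH"  # placeholder path
out_dir = r"\\server\photos"                             # placeholder path

with arcpy.da.SearchCursor(attach_table, ["DATA", "ATT_NAME", "REL_OBJECTID"]) as cursor:
    for data, name, rel_oid in cursor:
        # Placeholder naming scheme: <parent OID>_<stored file name>
        out_name = f"{rel_oid}_{name}"
        with open(os.path.join(out_dir, out_name), "wb") as f:
            f.write(data.tobytes())

This is the part I'd like to point at a memory workspace instead of the exported geodatabase.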
Have you tried the new Export Attachments tool? You could pretty easily recreate what the attachment table gives you by building the directory structure and file naming from the attachment ID field.
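If you're on a version of Pro that includes Export Attachments (Data Management), the minimal call is just the dataset and an output folder. A rough sketch with placeholder paths (check the tool documentation for the optional subfolder and naming parameters):

import arcpy

in_features = r"C:\Temp\Export.gdb\TempExport"  # placeholder; a dataset with attachments enabled
out_folder = r"\\server\photos"                 # placeholder output location

# Writes each record's attached files into the output folder
arcpy.management.ExportAttachments(in_features, out_folder)

You'd then rename or reorganize the exported files however you need.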
In-memory workspaces aren't geodatabases; they don't support attachments (among many other geodatabase features).
They may not support attachments, but in theory they should at least be able to hold a table, e.g. an attachment table.
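For example, writing a plain standalone table to the memory workspace works fine; it's the attachment machinery itself that isn't supported. A quick sketch, with made-up field names:

import arcpy

# A plain table in the memory workspace is no problem...
tbl = arcpy.management.CreateTable("memory", "AttachLookup")
arcpy.management.AddField(tbl, "REL_OBJECTID", "LONG")
arcpy.management.AddField(tbl, "ATT_NAME", "TEXT", field_length=255)

# ...but per the reply above, enabling attachments on a memory feature class
# is where things fall over, e.g.:
# arcpy.management.EnableAttachments(r"memory\TempExport")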
I think the larger issue here is that it's really hard to get at the attachment table of a hosted feature service with arcpy, to the point where I'm not sure how best to do it without downloading to a local geodatabase. It's a serious deficiency.
You could look at the ArcGIS API for Python (Layer Attachments | ArcGIS API for Python), but quite frankly, I find it confusing.
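That said, a rough, untested sketch of pulling attachments straight off the hosted layer with the Python API looks something like this (the URL, credentials, where clause, and output folder are all placeholders):

import os
from arcgis.gis import GIS
from arcgis.features import FeatureLayer

gis = GIS("https://www.arcgis.com", "username", "password")  # placeholder credentials
lyr = FeatureLayer("https://services.arcgis.com/<org>/arcgis/rest/services/<service>/FeatureServer/0", gis)
out_dir = r"\\server\photos"  # placeholder output folder

# Object IDs of the records you care about (placeholder where clause)
oids = lyr.query(where="STATUS = 'SELECTED'", return_ids_only=True)["objectIds"]

for oid in oids:
    for att in lyr.attachments.get_list(oid):
        # Saves each attachment under out_dir; rename afterwards if you
        # need your own naming format
        lyr.attachments.download(oid=oid, attachment_id=att["id"], save_path=out_dir)

That skips the geodatabase export entirely, which might also help with the performance problem.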
Still, I think @HaydenWelch's suggestion of the Export Attachments tool is going to be the most straightforward option for you.