ArcEnterprise 10.8 Service Definition File Size Limit

07-09-2020 10:32 AM
Frequent Contributor

Hello - I don't think that this section from the Enterprise help

Manage hosted feature layers—Portal for ArcGIS | Documentation for ArcGIS Enterprise 

at 10.8 is entirely accurate:

"Allow attachments

You can allow editors to attach images and other files to individual features in a layer in a hosted feature layer. This is useful, as it allows you to associate documentation or photos to specific, relevant features. For example, a code compliance officer might attach a photo of the code violation for a specific address point, or a building inspector might include a PDF of a permit for a building feature.

Each hosted feature layer view inherits the attachment setting of the hosted feature layer from which it was created. By default, all view users can see the attachments inherited from the hosted feature layer. To control who has access to these attachments, you can hide the attachments in the hosted feature layer view.

Each file you attach to a feature can be a maximum size of 2 GB. To attach files larger than 2 GB, you can use the Upload Part and Add Attachment operations from ArcGIS REST API to do a multipart upload. …"

What I have found is that when sharing or overwriting a hosted feature layer to Enterprise 10.8 via the ArcGIS Pro 2.5.x sharing module, the maximum service definition file size appears to be 2 GB.

In my case, I have a point layer with some 3,600 records and PDF attachments. The largest attachment is about 9 MB. However, the total size of the service definition file exceeds 2 GB, and the sharing module overwrite now fails with a 99999 error.

Specifically, this call:

arcpy.UploadServiceDefinition_server('C:/ArcProProjects/PortalUpdates/Environmental/' + lyr.name + '.sd',
                                     'My Hosted Services', "", "", "", "", "",
                                     "OVERRIDE_DEFINITION", "SHARE_ONLINE", "PUBLIC")

fails, I believe, because the entire .sd file size cannot exceed 2 GB.

Has anyone encountered this?

13 Replies
Occasional Contributor

Good day David Coley - try uploading in a Google Chrome incognito window.



Telecom Argentina

Frequent Contributor

Thanks Anibal - that's possible. But then I would have to use the API...

And I didn't really want to do that, because the file does publish to AGOL using the Pro sharing module, and I didn't want to rewrite this part of my workflow unless necessary.

Esri Contributor

Hi David,

I'm posting my response here as well to keep this post up to date. There is no file size limit for SD files. If your attachments are below the 2 GB file size limit and you still see an error, there is likely a different issue here. We would need to see your Python script to look into the problem.



Frequent Contributor

Hi Jonah - yes, thanks. That's what I thought as well re: SD files; I'd never read anywhere about a size limit for those. The largest attachment (in this case, PDFs) is about 9.5 MB. I like to work in IDLE for Pro - here is a snippet of my code:

import arcpy

def stageSDDraft(mp, lyrName, summary, lyrs, descript, folder, tags, sdfolder):
    sddraft_out = "C:/ArcProProjects/PortalUpdates/" + sdfolder + "/" + lyrName + ".sddraft"
    draft = mp.getWebLayerSharingDraft("HOSTING_SERVER", "FEATURE", lyrName, lyrs)
    draft.portalFolder = folder
    draft.tags = tags
    ##draft.allowExporting = True
    draft.summary = summary
    draft.description = descript
    draft.overwriteExistingService = True
    draft.exportToSDDraft(sddraft_out)  # write the .sddraft to disk for staging

message = ""
portal = arcpy.SignInToPortal("", "userName_portal", "userNamePw")  # portal URL redacted
aprx = arcpy.mp.ArcGISProject('C:/ArcProProjects/PortalUpdates/PortalUpdates.aprx')
for m in aprx.listMaps("FloodAttachments"):
    print("Map: " + m.name)
    for lyr in m.listLayers():
        desc = arcpy.Describe(lyr)
        if lyr.name == 'ElevationCertificate':
            lyrList = []
            stageSDDraft(m, lyr.name,
                         "The Elevation Certificate point layer contains the addresses of homes and businesses that have a documented finished floor elevation that is above the Effective FEMA designated base flood elevation.",
                         lyrList,
                         "The Elevation Certificate point layer contains the addresses of homes and businesses that have documented finished floor elevation that is above the Effective FEMA designated base flood elevation. The certificate is accessed as a layer attachment.",
                         "PortalFolder", "tags", "Environmental")  # portal folder and tag values elided in the original post
            print(lyr.name + " Draft Created")
            arcpy.StageService_server('C:/ArcProProjects/PortalUpdates/Environmental/' + lyr.name + '.sddraft',
                                      'C:/ArcProProjects/PortalUpdates/Environmental/' + lyr.name + '.sd')
            print(lyr.name + " Service Staged")
            arcpy.UploadServiceDefinition_server('C:/ArcProProjects/PortalUpdates/Environmental/' + lyr.name + '.sd',
                                                 'My Hosted Services', "", "", "", "", "",
                                                 "OVERRIDE_DEFINITION", "SHARE_ONLINE", "PUBLIC")
            print(lyr.name + " Service Uploaded")
            message = message + "\n" + "1. " + lyr.name + " Service Uploaded"

It's pretty straightforward. However, you are probably right about there being another issue. I have been overwriting this feature layer weekly for a couple of years because of how the attachments are constructed.

The attachments are pulled in from another system of record, and then we use a geoprocessing (GP) tool to dump them into another directory before attaching them to the points via a match table.

I am going to try setting this up with a new layer, new SD file, etc., and see if it works.

Esri Contributor

Hi David,

Could you try the following as well? This might help us identify the issue.

  1. Try the same workflow from the Pro UI instead of Python.
  2. Try uploading the SD directly to the portal from the portal home page.
  3. Check the server logs while running the Python script and share them, if possible.



Frequent Contributor

Hi Shilpi - I did try using the Pro UI, with the same result: a 99999 fail code at the upload stage. I will try uploading the SD directly later today and post an update.

Frequent Contributor

This is the portal error I receive:

Failed to add or update item '1cab88c8bef743baab32716aa17e9724'. Error writing to file: D:\arcgisportal\content\items\1cab88c8bef743baab32716aa17e9724\

Note this is an entirely new layer (but the same data and attachments), copied to my file geodatabase and added to my Pro project's map, so in this case it is not an overwrite but a new write.

I think I've got either a corrupt attachment (or attachments), or there are communication errors between my Portal, Server, and Data Store (although all components test correctly). So, as Jonah Lay suggested earlier, there are probably other issues going on with my environment or the layer.

Unfortunately, my org is really abusing the attachment capability.  I don't think attachments should replace a document search system or repository.

Frequent Contributor

So it turns out that this is a result of invalid characters in the attachment file names and some invalid attachment types. The Postgres Data Store at 10.8 does not like certain characters in the name - brackets, braces, dashes, and periods can all cause issues.

In this case, some invalid email attachment types (.msg) and some extra period (.) characters in the file names caused an 'Object Store Exception' error - the Data Store misread those periods as part of the file name.

Amazingly, ArcGIS Online hosted this layer with no issue - I guess its data store can handle these things.
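For anyone hitting the same thing, this is roughly the cleanup rule I ended up applying before attaching - a sketch only; the extension whitelist is my org's assumption, not an Esri list:

```python
import os
import re

ALLOWED_EXTS = {".pdf", ".jpg", ".jpeg", ".png"}  # assumed whitelist, not an Esri list
BAD_CHARS = re.compile(r"[\[\]{}\-]")  # brackets, braces, dashes caused issues

def sanitize_attachment_name(fname):
    """Return a cleaned attachment file name, or None to reject the file.

    Rejects non-whitelisted types (e.g. .msg), strips the characters the
    10.8 data store choked on, and replaces interior periods so only the
    extension separator remains.
    """
    stem, ext = os.path.splitext(fname)
    if ext.lower() not in ALLOWED_EXTS:
        return None
    stem = BAD_CHARS.sub("", stem)
    stem = stem.replace(".", "_")
    return stem + ext.lower()
```

Running the source directory through this before building the match table is what cleared the 'Object Store Exception' for us.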

Frequent Contributor

Interesting development, Jonah Lay. In this case, after removing the bad naming conventions and invalid attachment types, the SD file will still not complete an overwrite with either the Pro sharing module or the API. The Pro module fails with the same 99999 code, while the API's featureLayerCollection.manager.overwrite(data_file=sdFile) method fails with

Expecting value: line 1 column 1 (char 0) after a certain amount of time (~6 minutes on Portal, ~21 minutes to AGOL).
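As an aside, that message is just what Python's json module raises when handed an empty or non-JSON body - so my read (an assumption, not confirmed) is that the client got something like an empty or HTML timeout response back instead of a JSON publishing result:

```python
import json

# "Expecting value: line 1 column 1 (char 0)" is json's complaint about an
# empty (or non-JSON) response body -- consistent with a timeout page being
# returned instead of a real publishing result.
try:
    json.loads("")
except json.JSONDecodeError as err:
    print(err)
```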

This suggests a bad attachment name or type, but if that were the case, I would think the layer would fail its .add and .publish when added as a new layer. On the other hand, there are known issues with size and hosted feature layers:

BUG-000127663: Exporting a hosted feature layer containing large attachments fails if the size of the exported file geodatabase exceeds 2 GB.

To be clear, other service definition files/layers with attachments (with standard attachment names and types) that are less than 2 GB in size don't have any add, publish, overwrite, or Pro sharing module issues.
