I am using the addItem and addPart operations from the ArcGIS REST API to upload large files (e.g. >1 GB) to an ArcGIS Portal.
During my testing I have found that the disk on the server keeps filling up. Even though this is only a test server and the disk is only 100 GB, I don't think this should be happening: my test uploads are up to 30 GB in size, but I have always deleted the content from Portal before the next test.
My theory is that whenever an upload fails, because a user aborts the operation or there is a network failure, the "parts" already uploaded to the server are left behind, since no one ever calls the commit API to finalize the upload. Because no one is keeping track of failed uploads, a whole bunch of zombie parts end up filling the hard disk.
Does anyone know how long a non-committed part will remain on the disk? Anyone got any hints on how to manage these non-committed parts?
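For reference, here is a stripped-down version of the workflow I'm using (a minimal sketch with the requests library; the portal URL, user, token handling, part size, and file path are all placeholders), plus the only mitigation I've come up with so far: calling the item's delete operation when an upload fails, on the assumption that deleting a never-committed item also removes its uploaded parts.

import requests

PORTAL = "https://portal.example.com/arcgis"   # placeholder portal URL
USER = "uploader"                              # placeholder user
TOKEN = "<token>"                              # assume a valid token was already generated
PART_SIZE = 50 * 1024 * 1024                   # 50 MB parts (my choice, not a requirement)

base = f"{PORTAL}/sharing/rest/content/users/{USER}"

def upload_large_file(path, title, item_type):
    # 1. addItem with multipart=true registers the item and returns its id
    r = requests.post(f"{base}/addItem",
                      data={"f": "json", "token": TOKEN, "title": title,
                            "type": item_type, "multipart": "true",
                            "filename": path.split("/")[-1]})
    item_id = r.json()["id"]
    item_url = f"{base}/items/{item_id}"
    try:
        # 2. addPart once per chunk; partNumber starts at 1
        with open(path, "rb") as f:
            part_no = 1
            while True:
                chunk = f.read(PART_SIZE)
                if not chunk:
                    break
                requests.post(f"{item_url}/addPart",
                              data={"f": "json", "token": TOKEN,
                                    "partNumber": part_no},
                              files={"file": (f"part{part_no}", chunk)})
                part_no += 1
        # 3. commit assembles the uploaded parts into the final item
        requests.post(f"{item_url}/commit",
                      data={"f": "json", "token": TOKEN, "type": item_type})
    except Exception:
        # On abort/failure, delete the never-committed item so its parts
        # don't sit on the server's disk (assumption: delete also cleans
        # up parts that were already uploaded).
        requests.post(f"{item_url}/delete", data={"f": "json", "token": TOKEN})
        raise

upload_large_file("/data/big_raster.tif", "Big raster test", "Image")

Even with the delete call, a hard network drop or a killed process still leaves parts behind, which is what prompted the question above.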
Why not simply use ArcCatalog?
We just finished setting up Portal and loaded it completely from ArcCatalog.
In all fairness to the OP, the ArcGIS API for Python | ArcGIS for Developers "provides simple and efficient tools for sophisticated vector and raster analysis, geocoding, map making, routing and directions, as well as for organizing and managing a GIS with users, groups and information items." As much as ArcCatalog may work for people, falling back to ArcCatalog because the ArcGIS API for Python doesn't work isn't a great solution and lets Esri off the hook.
I think this would be determined by your current cleanup schedule settings: About server directories—ArcGIS Server Administration (Linux) | ArcGIS Enterprise
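If the cleanup schedule is the relevant knob, something like this will show what it is currently set to (a sketch against the ArcGIS Server Admin API; the admin URL, token, and the assumption that the uploads directory is named arcgisuploads are mine, so check the directory list on your own site):

import requests

ADMIN = "https://server.example.com:6443/arcgis/admin"  # placeholder admin URL
TOKEN = "<admin token>"                                 # assume an admin token is available

# Query the uploads server directory; cleanupMode and maxFileAge control
# when ArcGIS Server sweeps old files out of it.
r = requests.get(f"{ADMIN}/system/directories/arcgisuploads",
                 params={"f": "json", "token": TOKEN})
info = r.json()
print(info.get("cleanupMode"), info.get("maxFileAge"))  # maxFileAge is in minutes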
If I understand your problem correctly, you may want to take a look at your uploads folder to see what's in there after running your tests. I've had similar issues when testing uploads of large raster datasets to an Image Service.
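To see whether leftover parts are what's actually eating the disk, a quick scan like this (the path is a placeholder for wherever your uploads folder lives on the server) lists the biggest files left behind after a test run:

import os

UPLOADS = "/arcgis/server/usr/directories/arcgisuploads"  # placeholder path

sizes = []
for root, _dirs, files in os.walk(UPLOADS):
    for name in files:
        full = os.path.join(root, name)
        sizes.append((os.path.getsize(full), full))

# Print the ten largest leftovers, biggest first.
for size, full in sorted(sizes, reverse=True)[:10]:
    print(f"{size / 1024**2:8.1f} MB  {full}")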