Hello,
I'll try to keep this short. I represent a large government organisation in Sweden. I am a GIS coordinator for the user side of the organisation, and one of my daily tasks is to recommend best practices for keeping performance up within the typical large-organisation constraints (central SDE, network drives).
I have a benchmark script that does this (a simplified sketch of the equivalent steps follows the list):
1 - creates two file gdbs at the same location (arcpy.CreateFileGDB_management x 2)
2 - creates two empty featureclasses in one of them (arcpy.CreateFeatureclass_management x 2)
3 - exports both featureclasses with one command (arcpy.FeatureClassToGeodatabase_conversion(fcs))
4 - then exports them both again, this time one by one
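A simplified sketch of those steps; the folder paths, geodatabase and feature class names, and the POINT geometry are placeholders, not the values from the real script:

import os
import time
import arcpy

# Allow the script to be re-run against the same folder without "already exists" errors.
arcpy.env.overwriteOutput = True

def run_benchmark(folder):
    """Time the four steps against one folder (local disk or network share)."""
    start = time.time()

    # 1 - create two file geodatabases at the same location
    arcpy.CreateFileGDB_management(folder, "bench_a.gdb")
    arcpy.CreateFileGDB_management(folder, "bench_b.gdb")
    gdb_a = os.path.join(folder, "bench_a.gdb")
    gdb_b = os.path.join(folder, "bench_b.gdb")

    # 2 - create two empty feature classes in the first geodatabase
    arcpy.CreateFeatureclass_management(gdb_a, "fc1", "POINT")
    arcpy.CreateFeatureclass_management(gdb_a, "fc2", "POINT")
    fcs = [os.path.join(gdb_a, "fc1"), os.path.join(gdb_a, "fc2")]

    # 3 - export both feature classes to the second geodatabase with one call
    arcpy.FeatureClassToGeodatabase_conversion(fcs, gdb_b)

    # 4 - remove step 3's copies, then export again, one call per feature class
    arcpy.Delete_management(os.path.join(gdb_b, "fc1"))
    arcpy.Delete_management(os.path.join(gdb_b, "fc2"))
    arcpy.FeatureClassToGeodatabase_conversion([fcs[0]], gdb_b)
    arcpy.FeatureClassToGeodatabase_conversion([fcs[1]], gdb_b)

    print("{}: {:.1f} s".format(folder, time.time() - start))

# Both target folders must already exist.
run_benchmark(r"C:\temp\gdb_bench")            # local hard drive
run_benchmark(r"\\fileserver\gis\gdb_bench")   # network drive / share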
When I run this on my local hard drive, i.e. on my own computer, the whole script takes around 10 seconds to execute.
When I run the same script on a network drive, it takes somewhere between 8 and 10 minutes (!).
Where do I start? Are there any good resources I should read? I am not a Windows network specialist and I have no idea why this happens. Any feedback is greatly appreciated, thank you.
Windows shares will always be slower than a local drive, and your network determines how much slower. You've already benchmarked performance, but if you want to take ArcMap/Esri products out of the equation, you can use plain Python to create and delete files on the share (a minimal example is sketched at the end of this answer), or use IOzone, which provides benchmarking statistics for read/write throughput against a directory:
IOzone Filesystem Benchmark
You'll likely need to involve your organization's IT staff.
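For example, a minimal sketch of the pure-Python check (no Esri libraries involved; the directory paths, file count and file size are arbitrary placeholders):

import os
import time

def time_file_io(directory, file_count=100, size_bytes=64 * 1024):
    """Write, read back and delete file_count files of size_bytes in directory."""
    payload = os.urandom(size_bytes)
    start = time.time()
    for i in range(file_count):
        path = os.path.join(directory, "bench_{}.tmp".format(i))
        with open(path, "wb") as f:
            f.write(payload)
        with open(path, "rb") as f:
            f.read()
        os.remove(path)
    elapsed = time.time() - start
    print("{}: {} files of {} KB in {:.1f} s".format(
        directory, file_count, size_bytes // 1024, elapsed))

time_file_io(r"C:\temp")                # local drive
time_file_io(r"\\fileserver\gis\temp")  # network share

If the share is dramatically slower even for plain file I/O like this, the bottleneck is in the network/storage layer rather than in ArcGIS, which is useful evidence to bring to IT.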