I'm calling arcpy.conversion.FeatureClassToGeodatabase to download a couple of feature classes into a file geodatabase. The data lives in ArcGIS Enterprise, and both layers are published as feature services:
1) Roads: ~600k features, takes about 22 minutes to download.
2) Addresses: ~3.2 million features, never finishes; after 6 hours of runtime there are only 200k features in the gdb.
I'm wondering if someone could help me out with running this basic command. I expected the addresses to take longer since there are more features, but I don't understand why there is such a major discrepancy in runtime vs. number of features downloaded between the two datasets.
Is there a better command to call in Python 3 to achieve the same goal?
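For reference, this is roughly the call in question: a minimal sketch, with placeholder service URLs and output path, since the question doesn't give the actual ones.

```python
# Placeholder inputs -- the real URLs and output gdb path are assumptions.
in_features = [
    "https://example.com/arcgis/rest/services/Roads/FeatureServer/0",      # ~600k features
    "https://example.com/arcgis/rest/services/Addresses/FeatureServer/0",  # ~3.2M features
]
out_gdb = r"C:\data\download.gdb"

try:
    import arcpy
    # Copies each input into its own feature class inside the target file GDB.
    arcpy.conversion.FeatureClassToGeodatabase(in_features, out_gdb)
except ImportError:
    # arcpy is only available inside an ArcGIS (Pro) Python environment.
    pass
```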
That all makes sense. Unfortunately the data is from an external source; we don't have any control over it other than having been given access to the REST endpoints.
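Since only the REST endpoints are available, one workaround worth mentioning is to page through the layer's query endpoint directly rather than going through a geoprocessing tool, using resultOffset / resultRecordCount (the service must report supportsPagination). A minimal sketch, with a placeholder URL:

```python
import urllib.parse

def query_page_url(layer_url, offset, count):
    """Build one page of a feature-service REST query using
    resultOffset / resultRecordCount pagination."""
    params = urllib.parse.urlencode({
        "f": "json",               # Esri JSON; some services also support f=geojson
        "where": "1=1",            # pull everything
        "outFields": "*",
        "resultOffset": offset,
        "resultRecordCount": count,
    })
    return f"{layer_url.rstrip('/')}/query?{params}"

# Loop sketch: request successive pages (offset 0, 1000, 2000, ...) until the
# server returns fewer than `count` features or stops setting
# exceededTransferLimit in the response, then write the features locally.
url = query_page_url(
    "https://example.com/arcgis/rest/services/Addresses/FeatureServer/0", 0, 1000
)
```

Note that the server-side maxRecordCount caps how many features one request can return, so the page size may need to be lowered to match it.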
Copy Features gives the same result, for what it's worth. Another option depends on the capabilities of the source feature service: you could look into creating a replica (with file geodatabase as the export format) and downloading that.
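The replica route maps to the feature service's createReplica REST operation (available only when the service exposes the Extract/Sync capability). A hedged sketch of building that request; the service URL and replica name are placeholders, and the actual POST/polling is left as comments:

```python
import urllib.parse

def build_replica_request(service_url, layer_ids):
    """Build the URL and form body for a createReplica call that asks the
    server to package the layers as a downloadable file geodatabase."""
    params = {
        "f": "json",
        "replicaName": "bulk_download",   # arbitrary name (assumption)
        "layers": ",".join(str(i) for i in layer_ids),
        "returnAttachments": "false",
        "syncModel": "none",              # one-way extract, no sync registration
        "dataFormat": "filegdb",          # server packages a zipped file GDB
        "async": "true",                  # poll a status URL instead of blocking
    }
    return service_url.rstrip("/") + "/createReplica", urllib.parse.urlencode(params)

url, body = build_replica_request(
    "https://example.com/arcgis/rest/services/Addresses/FeatureServer", [0]
)
# POST `body` to `url`, poll the statusUrl from the JSON response until it
# reports a resultUrl, then download the zipped file geodatabase from there.
```

Because the server does the packaging, this tends to scale much better for multi-million-feature layers than pulling features one request at a time.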