Running into a bit of an expectation-management quandary. I have a multi-machine deployment of ArcGIS Enterprise (dedicated web adaptor (x1), portal (x1), hosting site (x2), data store (relational, x1), etc.). I'm trying to push 350K records into the REST endpoint of a hosted feature layer. In ArcGIS Online (and I am aware that a lot of tuning went into it) I can push ~46K records per minute, which loads the data set in a reasonable 8 minutes or so. In my Enterprise portal, the best I am getting is ~3.5K per minute, making this a several-hour endeavor. I have optimized the process as much as I can, e.g. by using `in_memory` workspaces (`memory`, since I am using the Pro flavor of the Python API to do the heavy lifting here).
I first truncate the service and then use append to push the new data in. In an ideal world this would be done using deltas, but this data set does not natively provide them, so a dump-and-reload seems the most logical approach at the moment.
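For context, the truncate-then-append workflow looks roughly like the sketch below. The batching helper is plain Python; the portal connection and layer calls are shown as comments because the item ID, URL, and batch size (2,000 here) are assumptions about your setup, not something I can verify. Smaller batches sometimes help Enterprise avoid per-request timeouts, at the cost of more round trips.

```python
# Sketch of a chunked upload, assuming the truncate/append workflow
# described above with the ArcGIS API for Python. The layer calls are
# left as comments (hypothetical item IDs/credentials); the batching
# helper itself is plain, runnable Python.

def chunk(records, size):
    """Yield successive batches of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# Hypothetical usage (names/URLs are placeholders):
#
#   from arcgis.gis import GIS
#   gis = GIS("https://portal.example.com/portal", "user", "pass")
#   layer = gis.content.get(item_id).layers[0]
#   layer.manager.truncate()                # clear existing rows
#   for batch in chunk(features, 2000):
#       layer.edit_features(adds=batch)     # smaller batches can dodge
#                                           # server-side timeouts

if __name__ == "__main__":
    # 350K records in batches of 2,000 -> 175 requests
    batches = list(chunk(list(range(350_000)), 2000))
    print(len(batches))  # 175
```

Whether per-request `edit_features` batches or a single `append` from an uploaded file geodatabase is faster depends on the deployment; `append` usually wins for bulk loads, which matches what you are already doing.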
Everything I am doing is on premises and within the same data center, and I have shut off IT management tools (security scanning, etc.). Is what I am seeing actually reasonable, or should it be faster? Or does the speed enhancement only come in if you have GeoAnalytics Server?
Thank you all for your insight ahead of time!