I am fairly new to working with SDE, and I am having a lot of trouble running any arcpy cursor or arcpy tool on tables with more than 10k records. These tables were created by a stored procedure in SQL Server and then registered with SDE; they are not versioned, and an OBJECTID field and indexes were added as part of the process. Accessing the database in SQL Server Management Studio and creating the tables works as expected. Peers have told me this may be normal with SDE, that it is only intended for editing versioned tables, and that performance is compromised for everything else. I am not ready to give up yet. Is SDE not designed for these kinds of processes? Could it be an issue with the original install and configuration?
Cursors exist in nearly every database access layer. arcpy's cursors are wrappers around the underlying database API, and they are a primary means of performing CRUD operations on feature data, not just a tool for editing versioned tables. Some databases are faster than others, but many factors outside the cursor itself come into play.
Are you using the arcpy.da.&lt;Type&gt;Cursor cursors, or the older arcpy.&lt;Type&gt;Cursor? Try the .da. cursors; they are documented as being the faster of the two.
Are you accessing the data through VPN, firewall, proxies, etc?
What functions/ data manipulation are you doing within the cursor?
Is this SQL Express, or full SQL server? How much RAM is available to the SQL instance?
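To make the first question concrete, here is a minimal sketch of the same field update written both ways. The table path and field name are placeholders, not your actual data; this needs an ArcGIS Python environment to run.

```python
import arcpy

table = r"C:\connections\prod.sde\dbo.MyLargeTable"  # hypothetical path

# Old-style cursor: every row is a full Row object and all fields are fetched.
rows = arcpy.UpdateCursor(table)
for row in rows:
    row.setValue("STATUS", "DONE")
    rows.updateRow(row)
del row, rows

# arcpy.da cursor: rows are plain tuples, only the listed fields are
# fetched, and the with-block releases locks when it exits.
with arcpy.da.UpdateCursor(table, ["STATUS"]) as cursor:
    for r in cursor:
        r[0] = "DONE"
        cursor.updateRow(r)
```

Restricting the field list in the .da. cursor matters as much as the cursor type itself: fewer columns means less data crossing the wire per row.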
Thanks for your feedback.
My responses to your questions:
I am using arcpy.da type of cursors.
Yes, accessing the data through VPN
Mostly update cursors, but also other tools such as Add Field (arcpy.AddField_management) and Summary Statistics (arcpy.Statistics_analysis). None are currently working on large tables.
full SQL server
My experience with the VPN we use, is the same as @RhettZufelt's. I gave up on it and just remote in to a local pc to do anything gis. Try, if you can, to run the script on the hosting server or on a pc that is not going through a VPN. You can copy the fc to a local file geodatabase (or local test sql express db) and see if the performance is any different as a test.
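The local-copy test can be sketched like this. The SDE connection path and dataset names are placeholders, and this assumes an ArcGIS Python environment:

```python
import arcpy

sde_fc = r"C:\connections\prod.sde\dbo.MyLargeTable"  # hypothetical source
out_folder = r"C:\temp"
local_gdb = r"C:\temp\vpn_test.gdb"

# Create a scratch file geodatabase if it does not exist yet.
if not arcpy.Exists(local_gdb):
    arcpy.management.CreateFileGDB(out_folder, "vpn_test.gdb")

# Pull the data local once; use CopyRows instead if it is a plain table.
arcpy.management.CopyFeatures(sde_fc, local_gdb + r"\MyLargeTable_local")
```

Then point your update cursor or geoprocessing tool at the local copy. If the same script goes from minutes to seconds, the bottleneck is the network path, not SDE.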
Suspect you are running into a VPN issue.
You might try a local copy and see if it is better. In my experience, SDE/SQL Server operations are dreadfully slow when accessed through a VPN.
Thank you both. I do have one follow-up: I only experience issues when connecting to the SDE and running tools or update cursors via ArcMap/ArcGIS Pro. I have no issues at all when accessing the database via SQL, and I can create complex stored procedures that run normally regardless of table size. Do you still think it is a VPN issue?
Give a local database a try; that should give you a good idea whether it is. Opening the connection really just reads the list of tables within the db, and Pro likes to cache things locally. Mine is slow as well when it connects to the database and reads the datasets: 2-4 minutes of spinning and thinking, so each click has to be very calculated and deliberate.
The stored procedures are computed on the server, so the client only sends a signal to run them and receives the results. Accessing the data through a cursor (or the other tools) can involve many send/receive round trips: rows come down to your PC's processor and memory, updated data goes back to the server, and so on, so a lot of traffic gets shoved through that VPN tunnel. If you have antivirus/anti-malware software watching the VPN activity, each request gets inspected and slowed down even further.
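A back-of-envelope model makes the difference between per-row round trips and a single server-side call obvious. The latency figures below are illustrative assumptions, not measurements of any particular VPN:

```python
def total_seconds(rows, round_trip_latency_s, rows_per_round_trip):
    """Time spent purely on network latency for a cursor-style workload."""
    round_trips = -(-rows // rows_per_round_trip)  # ceiling division
    return round_trips * round_trip_latency_s

ROWS = 10_000
LAN_LATENCY = 0.001  # ~1 ms round trip on a local network (assumed)
VPN_LATENCY = 0.050  # ~50 ms round trip through a VPN tunnel (assumed)

# One round trip per row, i.e. chatty cursor-style traffic:
lan = total_seconds(ROWS, LAN_LATENCY, 1)   # seconds of pure latency on a LAN
vpn = total_seconds(ROWS, VPN_LATENCY, 1)   # minutes of pure latency over VPN

# A server-side stored procedure is effectively one round trip:
proc = total_seconds(ROWS, VPN_LATENCY, ROWS)

print(lan, vpn, proc)
```

Under these assumed numbers, 10k per-row round trips cost minutes over the VPN before any actual database work happens, while the stored procedure pays the latency once. That is consistent with what you are seeing: SQL Server itself is fine, and the procedures feel fast, but the cursor traffic is throttled by the tunnel.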