import pyodbc, arcpy

cnxn = pyodbc.connect(driver='{SQL Server}',
                      server='abcserver',
                      database='TEST',
                      trusted_connection='yes')
cursor = cnxn.cursor()
try:
    query = "SELECT surflat, surflong FROM table1"
    cursor.execute(query)
    rows = cursor.fetchall()  # a commit is not needed for a SELECT
except pyodbc.Error as e:
    print("error:", e)
After you get your data from your source, I think the easiest next step would be to use an insert cursor, either writing directly into your SDE feature class or into a temporary in-memory feature class that you then load into SDE.
See the last example on this help page for inserting geometry. It is a little easier than depicted there, since you are only inserting point geometry.
http://resources.arcgis.com/en/help/main/10.1/index.html#//018w0000000t000000
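A minimal sketch of that insert-cursor approach, assuming your rows come back from pyodbc as (surflat, surflong) pairs. The feature class path and the helper names here are hypothetical; match them to your own SDE connection:

```python
def sql_rows_to_points(rows):
    """Convert (lat, lon) rows from SQL Server into (x, y) tuples.

    SHAPE@XY expects x (longitude) first, y (latitude) second.
    """
    return [(lon, lat) for lat, lon in rows]

def load_points(fc, rows):
    """Insert (lat, lon) rows into a point feature class with an insert cursor."""
    import arcpy  # arcpy is only available inside an ArcGIS Python environment
    # SHAPE@XY is a shorthand token for point geometry (ArcGIS 10.1+).
    with arcpy.da.InsertCursor(fc, ["SHAPE@XY"]) as icur:
        for xy in sql_rows_to_points(rows):
            icur.insertRow([xy])

# Hypothetical usage -- substitute your own SDE feature class path:
# load_points(r"Database Connections\mySDE.sde\TEST.DBO.SurfacePoints",
#             cursor.fetchall())
```

Note the swap from (lat, lon) to (x, y): getting that order wrong is the most common mistake when inserting point geometry from a coordinate table.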
Hello. Your workflow should work just fine. One thing I would recommend is setting up an OLE DB connection in ArcCatalog for your SQL Server connections. Define the provider as SQL Server, then select the server name, credentials, and database you are reading data from in your script.
For your workflow you should be able to simply run Make XY Event Layer on your SQL table (e.g. name the layer "SQL_XY") and then use the Copy Features GP tool. The input features for the tool would be your new "SQL_XY" layer, and the output feature class would be the SDE feature class you want to export the data to (just ensure that you have also set up a spatial database connection in Catalog to the SDE database). See attached code and ModelBuilder workflow for an example. Good luck!
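The Make XY Event Layer / Copy Features sequence above can be sketched roughly like this. The table path, field names, and WGS 84 spatial reference are assumptions; adjust them to your own data:

```python
def export_sql_points(sql_table, out_fc, sr_wkid=4326):
    """Make an XY event layer from a coordinate table and copy it to SDE.

    sql_table -- table reached through your OLE DB connection (assumed path)
    out_fc    -- target SDE feature class (assumed path)
    sr_wkid   -- WKID of the coordinates; 4326 (WGS 84) is an assumption
    """
    import arcpy  # only available inside an ArcGIS Python environment
    sr = arcpy.SpatialReference(sr_wkid)
    # x field (longitude) comes first, then the y field (latitude).
    arcpy.MakeXYEventLayer_management(sql_table, "surflong", "surflat",
                                      "SQL_XY", sr)
    # The event layer is read-only; Copy Features writes it out as real features.
    arcpy.CopyFeatures_management("SQL_XY", out_fc)

# Hypothetical usage:
# export_sql_points(r"Database Connections\myOLEDB.odc\TEST.dbo.table1",
#                   r"Database Connections\mySDE.sde\TEST.DBO.SurfacePoints")
```

One design note: the event layer is just an in-memory view of the table, so Copy Features (or Feature Class to Feature Class) is what actually materializes the geometry in SDE.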
Tara