This question comes from a couple places.
There are several threads about automating SDE compression, and good Python examples on how to get started, too.
To my understanding, the ListWorkspaces function can find SDEs in a workspace (in this case, Database Connections). However, what I want is something similar to the da.Walk function that discovers workspaces, except I want it to discover all possible database connections associated with a server instance (assuming I could deal with the username/password concerns). I know I could create database connections if I had the SDE names, but I am not sure I have found anything that enables you to list them in Python. Is this something that requires ArcObjects? Is there just some function or step I am missing related to the DB connections?
Thanks in advance Data Management Community.
It appears you are talking about crawling file systems, and not connections from layers in map documents or similar. The arcpy.da.Walk function will see/find SDE connection files if followlinks is True, but I always hesitate to enable that because I usually don't want it following links to GIS servers and whatnot.
If you use the regular Python os.walk and search for SDE connection files (*.sde), you can use arcpy.Describe to check the connection properties of each connection file for the server instance of interest.
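A minimal sketch of that os.walk approach might look like the following. The file-search part is plain Python; the arcpy.Describe check is shown as a comment because it needs an ArcGIS install, and the exact connection property names (e.g. instance) can vary by connection type, so treat that part as an assumption to verify against the Describe documentation:

```python
import fnmatch
import os

def find_sde_files(root):
    """Walk a directory tree and collect full paths to *.sde connection files."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in fnmatch.filter(filenames, "*.sde"):
            hits.append(os.path.join(dirpath, name))
    return hits

# With arcpy available, each hit could then be filtered by server instance
# (hypothetical target_instance; property names are an assumption):
# for conn_file in find_sde_files(r"C:\GISData"):
#     props = arcpy.Describe(conn_file).connectionProperties
#     if props.instance == target_instance:
#         ...  # this connection file points at the server of interest
```

This only finds connection files that already exist on disk, which matters for the question below about whether the files must exist first.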
I guess I do want the files, or specifically the SDE names, but I want to spontaneously create the SDE connection files based on the server instance. I think we were interested in one server in particular, but I understand it is a risky proposition generally. I will give walk a try and report back.
I guess my question is: do the SDE connection files have to exist before you look for them (i.e., do you have to have connected to the SDE before)? My assumption is the script will only know the server instance to crawl.
It might be that what I want already exists, so I will give the walk function more scrutiny for this example.
Honestly, I am more confused than ever as to what you are trying to do. Yes, an SDE connection file has to exist in order to see the properties within the file. If you want to "spontaneously create the SDE connection files," why do you need to search for them?
If you could try elaborating again, I am willing to take another stab at offering a suggestion.
Sorry for the confusion. As some background, I am familiar with working with ArcGIS Server in the desktop environment, but not necessarily in Python directly. I am imagining that in order to connect to an SDE that is using SQL Server, I need to use the Create Database Connection tool (using 10.3 or later, correct?). The issue is I am assuming I only know the instance name, but not the name of the SDE, which is also required by the tool.
The goal is essentially to run the Compress tool on every SDE encountered in the instance.
A script I am thinking about would look like this:
# Name: CreateDatabaseConnections.py
# Description: Will use a server instance and authentication to derive SDE
# connections that will then have the Compress tool run on them as part of
# ongoing server maintenance.

# Import system modules
import arcpy

def find_sdes_from_server_instance(server_instance):
    """Function will look at all SDEs on a server instance and return their names as a list"""
    return ["Roads.sde", "Bikes.sde", "BuiltEnv.sde", "SomeOther.sde"]  # Example return

sde_name_list = find_sdes_from_server_instance("MYSQL_SERVER")
for sde in sde_name_list:
    arcpy.CreateDatabaseConnection_management("Database Connections", sde,
                                              "SQL_SERVER", "utah",
                                              "DATABASE_AUTH", "gdb", "gdb",
                                              "SAVE_USERNAME", "garfield", "#",
                                              "TRANSACTIONAL", "sde.DEFAULT")
    # .... more stuff
    arcpy.Compress_management(connection_file)
    # clean up....
OK, now I think I get what you are after.
The first question that comes to mind is, how often do you expect to have geodatabases within this/these SQL Server database instances created and removed? In my history of working with ArcGIS, creating and deleting geodatabases happens much, much less frequently than creating or deleting feature classes and tables. If geodatabases won't be created and deleted very often, I think the simplest solution is to create an SDE connection file ahead of time for each geodatabase and then loop through the SDE connection files on the file system for the compressions.
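The "create the connection files ahead of time and loop" idea can be sketched like this. The listing part is plain Python; the compress loop is shown as a comment since it requires arcpy at run time, and the folder path is a hypothetical placeholder:

```python
import glob
import os

def list_connection_files(conn_folder):
    """Return the *.sde connection files sitting directly in a folder, sorted."""
    return sorted(glob.glob(os.path.join(conn_folder, "*.sde")))

# With arcpy available, the maintenance pass is then just
# (r"C:\SDEConnections" is an assumed example path):
# import arcpy
# for conn_file in list_connection_files(r"C:\SDEConnections"):
#     arcpy.Compress_management(conn_file)
```

Because the connection files already encode the credentials, this avoids having the script discover usernames and passwords at run time, which speaks to the security question below.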
If you really don't know what geodatabases exist in a given SQL Server instance, how do you know how the security is set up, or what username and password to create the connection file with?
I will have to think a bit about listing geodatabases within a SQL Server instance. It may be that one has to use arcpy.ArcSDESQLExecute to query certain system tables associated with SDE.
Not sure if there is a simpler way, but I found a fairly straightforward approach that works against a couple of SQL Server instances that I tested.
import arcpy
import os

inst = # SQL Server instance

sql = ("SELECT name "
       "FROM sys.databases "
       "WHERE CASE "
           "WHEN state_desc = 'ONLINE' THEN "
               "OBJECT_ID(QUOTENAME(name) + '..[SDE_Version]', 'U') "
       "END IS NOT NULL")

conn_path = # Output folder for database connection file
conn_name = # Output name of database connection file, including .sde extension
conn_file = os.path.join(conn_path, conn_name)

arcpy.CreateDatabaseConnection_management(conn_path, conn_name, "SQL_SERVER",
                                          inst, "OPERATING_SYSTEM_AUTH")

sde_conn = arcpy.ArcSDESQLExecute(conn_file)
sde_res = sde_conn.execute(sql)
print sde_res
The code above looks for a specific SDE system table (SDE_Version) to determine whether a SQL Server database is a geodatabase. I don't know whether I picked the best system table; maybe there is a reason the one I picked isn't a good choice.
The code should print a list of geodatabases within a SQL Server instance (note, if no geodatabases are found, a Boolean True will be returned instead of an empty list or string). Once you have the list, you can create a database connection file for each geodatabase and do whatever other work you planned.
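Since execute() can hand back True (no matches), a single scalar (one match), or a list of rows (several matches), a small normalizing helper keeps the downstream connection-file loop simple. This is a sketch based on the return behavior described above, not documented arcpy API:

```python
def normalize_sde_result(res):
    """Normalize an ArcSDESQLExecute.execute() result to a list of names.

    Assumed behavior (per the discussion above): True when no rows match,
    a bare scalar for a single value, otherwise a list of rows where each
    row holds the selected column values.
    """
    if res is True:          # no geodatabases found
        return []
    if isinstance(res, (str, int, float)):  # single-value result
        return [str(res)]
    # list of rows; take the first (only) column from each row
    return [str(row[0]) if isinstance(row, (list, tuple)) else str(row)
            for row in res]
```

With that in place, the loop becomes `for gdb_name in normalize_sde_result(sde_res): ...`, creating one connection file per geodatabase.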
The SQL code was adapted from "Display all the names of databases containing particular table."
UPDATE: Code updated to address offline databases, thanks Rebecca Strauch, GISP for pointing out the problem with offline databases.
I was trying the code this morning; a couple of comments. This is the line that fails:
sde_res = sde_conn.execute(sql)
The error I got:
Traceback (most recent call last):
Python Shell, prompt 24, line 1
File "C:\Program Files (x86)\ArcGIS\Desktop10.3\arcpy\arcpy\arcobjects\arcobjects.py", line 27, in execute
AttributeError: ArcSDESQLExecute: StreamExecute ArcSDE Extended error 942 [Microsoft][SQL Server Native Client 11.0][SQL Server]Database 'sde_imagery_pre2000' cannot be opened because it is offline.
For me, that is the only one out of about 15 that is offline. Any thoughts on getting it to skip over offline databases? Not critical for my needs, but thought it might be a nice utility script to have for future.