Geodatabase

Esri's 39th User Conference is around the corner! As you make travel plans and schedule your time, consider the sessions offered by the Geodatabase team. We've organized them into several learning paths that you can explore in this story map: Data Management Learning Paths at UC2019

For Enterprise-related workshops, check out their blog: ArcGIS Enterprise at 2019 UC

For Utility users, visit Electric Gas and Telecom Industry Activities UC2019

Ran into something interesting that might be worth sharing with the group. One of the developers on staff ran into an issue where he couldn't pull lat/long from the database because the shape fields are stored as geometry (not geography) and we didn't store lat/long within the tables. For those who weren't aware, you can use a SQL query to generate X,Y values for you from the shape fields present in our databases. The syntax will differ a bit based on database version (not SDE version). For MS SQL, coming out of a geometry column, you would use something like Example 1.
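A minimal sketch of that Example 1 query, assuming a point feature class with a geometry column named SHAPE (table and column names are placeholders, not the original example):

-- Sketch: read X/Y straight from a SQL Server geometry column.
-- Table and column names are placeholders.
SELECT OBJECTID,
       SHAPE.STX AS X_Coord,
       SHAPE.STY AS Y_Coord
FROM   dbo.MyFeatureClass;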

 

So how do you pull that information out for an application to use/reference? Here is what I came up with for him.

 

Option 1 Using Arcade:

For those using ArcGIS Pro and/or ArcGIS Online, an increasingly useful capability is the Arcade scripting language. In this case, build a function. As a proof of concept, I looped it into a URL that could be easily parsed by Google Maps. They can change the URL to match Collector, Survey123, or another application at will. See Examples 2A and 2B (accidentally labeled 2A again to keep everyone on their toes).
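A minimal Arcade sketch of such an expression (illustrative, not the original Examples 2A/2B; it assumes the layer's geometry is in geographic lat/long coordinates):

// Build a Google Maps URL from the feature's location.
// Assumes the geometry is already in geographic coordinates.
var g = Geometry($feature);
return "https://www.google.com/maps?q=" + Text(g.y) + "," + Text(g.x);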

 

 

Option 2 Using SQL:

With ArcGIS you can create query layers, but when published via ArcGIS Server, which this developer was doing, there is a performance hit on the server each time it pulls one of these query layers. Sure, it is more of a pain, but your Database Administrator can create and adjust views just as fast, and views put the burden of rendering on the database server, which usually has capacity to spare. That said, you can add columns in queries and even do math along the way. See Example 3. Keep in mind the syntax of Example 3 is for MS SQL. As most of you are using Oracle, it may be slightly different for some of the operators like PI, though that is a bad example as PI is called the same way in Oracle. The moral of the story is to double-check that all the operators (functions) are correct.

 

-- Example 3 (MS SQL syntax). Assumes the geometry column stores
-- Web Mercator coordinates (meters); the math below converts those
-- X/Y values back to geographic latitude/longitude.
declare @shift float  -- half the Web Mercator world width, in meters

set @shift = 2.0 * PI() * 6378137.0 / 2.0

SELECT structure.OBJECTID
,structure.FACILITYID
,structure.DSTYPE AS [Type]
,structure.shape.STY as Y_Coord
,structure.shape.STX as X_Coord
-- inverse Mercator: recover latitude from the projected Y value
,(
180.0 / PI() * (2.0 * Atan( Exp( ((structure.shape.STY / @shift) * 180.0) * PI() / 180.0)) - PI() / 2.0)
) as [Latitude]
-- longitude is a simple linear rescale of the projected X value
,(structure.shape.STX / @shift) * 180.0 as [Longitude]
FROM WebGIS.UTIL.swDrainageStructure as structure

This blog post describes the ability to register database views with the geodatabase at 10.5.

 

ArcGIS allows access to database views and tables that are not registered with the geodatabase via a query layer. An unregistered view within a geodatabase can be a view created using DBMS client tools or via the Create Database View tool.
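For instance, a view created with DBMS client tools might look like this hypothetical sketch (table, column, and view names are invented for illustration):

-- Hypothetical example: a view joining a feature class to a business
-- table, created with SQL Server client tools before registration.
CREATE VIEW dbo.PARCELS_OWNERS_VW AS
SELECT p.OBJECTID, p.SHAPE, p.PARCELID, o.OWNERNAME
FROM dbo.PARCELS AS p
INNER JOIN dbo.OWNERS AS o ON o.PARCELID = p.PARCELID;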

 

Many users who work with enterprise geodatabases and database views have requested the ability to register views with the geodatabase. Registration stores information about the database view within the geodatabase system tables, so that the user experience of viewing and querying views is similar to that of geodatabase tables and feature classes.

 

At the 10.5 release, the Register with Geodatabase tool has been enhanced with additional parameters and now allows the registration of views with the geodatabase. You will also notice that the right-click Manage > Register with Geodatabase context menu for unregistered objects now opens the Register with Geodatabase tool.

 

The process of creating a registered view:
  1. Create a view using DBMS client tools or the Create Database View geoprocessing tool
  2. Register it with the geodatabase using the Register with Geodatabase geoprocessing tool
 
The registration process requires a view to exist, created via the database client tools or by using the Create Database View tool. Once the view exists, it can be registered using the Register with Geodatabase tool. The right-click context menu on an existing database view will also open the Register with Geodatabase tool with some of the initial parameters populated; depending on the column types and whether the view contains existing records, parameters are pre-populated when the tool is opened this way. Not all tables or views are good candidates to register with the geodatabase, so this two-step process gives users the flexibility to interact with unregistered database views via a query layer and optionally register them with the geodatabase.
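In script form, a sketch of the two steps might look like this (the connection file path, view name, and view definition are assumptions; the Object ID field parameter follows the 10.5 tool):

import arcpy

# Step 1: create the view in the database (names are hypothetical)
sde = r"C:\connections\gisdb.sde"
arcpy.CreateDatabaseView_management(sde, "parcels_vw",
    "SELECT OBJECTID, SHAPE, PARCELID FROM dbo.PARCELS")

# Step 2: register it, supplying the unique integer column to use
# as the row ID
arcpy.RegisterWithGeodatabase_management(sde + "\\parcels_vw",
    in_object_id_field="OBJECTID")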

 

Registered views:

After the database view is registered with the geodatabase, it will appear the same as a geodatabase table or feature class, depending on whether the view definition contains a spatial column. Views with a spatial column are displayed with a feature class icon corresponding to the geometry type chosen during registration.

 

Context menu for unregistered database views within the geodatabase connection. 

 Enhanced Register with Geodatabase tool for 10.5.

Benefits:

  • Performance improvements for viewing and querying, due to table metadata (row ID, extent, etc.) being stored within the system tables
  • No prompt to specify a unique identifier when adding database views in ArcMap/Pro clients
  • Ability to create and store geodatabase metadata on the registered view

 

 

User experience:

  • Registered views are read-only geodatabase objects. This means that operations such as altering the schema or the underlying data will be prevented; an error message 'Not supported on a view' alerts you when an operation is not supported against a registered view.
    • Registered views can be renamed, but the schema itself is read-only in ArcGIS
    • To prevent participation in geodatabase functionality whose workflows involve editing, registered views cannot be added to feature datasets and cannot participate in geodatabase relationship classes. Additionally, other geodatabase functionality that implies editing (such as editor tracking, attachments, etc.) is blocked.
    • Registered views can be published via map services or query-only feature services.
  • Schema modifications to the underlying tables a view is based on should be handled by refreshing the view definition using the RDBMS client tools; reconnecting from the ArcGIS client then picks up the changes, as sketched below.
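For example, a hypothetical refresh of a view definition after the base table gained a column (view and column names invented for illustration, matching the sketch above):

-- Refresh the view definition with SQL Server client tools, then
-- reconnect from the ArcGIS client to pick up the change.
ALTER VIEW dbo.PARCELS_OWNERS_VW AS
SELECT p.OBJECTID, p.SHAPE, p.PARCELID, p.ZONING, o.OWNERNAME
FROM dbo.PARCELS AS p
INNER JOIN dbo.OWNERS AS o ON o.PARCELID = p.PARCELID;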

  

 

Requirements:  

  • The Object ID Field parameter is required for database view registration. This requires a unique integer column that can represent the row ID for the registered view.
  • For views with a spatial column:
    • A single spatial column is allowed
    • The spatial column must contain a single entity type
    • If the view is empty but has a spatial column, it is still required to input the Shape Field, Geometry Type, and Coordinate System parameters during registration.

 

 

I ran into an issue with connecting to a geodatabase in MS SQL after restoring the database from a backup. The restore process created several orphaned users. I was able to identify the orphaned users using the following query in Microsoft Management Studio.

 

SELECT dp.type_desc, dp.SID, dp.name AS user_name 
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
     ON dp.SID = sp.SID
WHERE sp.SID IS NULL
     AND authentication_type_desc = 'INSTANCE';

This generated the list of users based on missing SQL Server authenticated logins.

 

Next, I needed to map the orphaned user to the existing login with the following statement in Microsoft Management Studio.

 

ALTER USER <user_name> WITH Login = <login_name>;
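For example, if a hypothetical SQL-authenticated user and its matching login are both named gisowner:

ALTER USER gisowner WITH LOGIN = gisowner;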

 

The source of the solution came from: https://msdn.microsoft.com/en-us/library/ms175475.aspx

 


Changing the order of fields in an existing geodatabase feature class or table has been challenging since the early days of ArcGIS (ArcMap and ArcCatalog). Without bothering to dig up all of the historical references, blog posts, issue reports, bug reports, great idea postings, workarounds, previous methods and tools, etc., I'll point at one simple solution to this 16-year-old problem (it's old enough to drive in most US states).

 

  1. Purchase and install Sparx Enterprise Architect (EA) version 12.x or higher, Professional edition or higher, floating or single use license.
  2. ArcMap:  export your disorderly geodatabase as a Workspace XML
  3. EA:
    • Tools > Options > Objects > uncheck "Sort Features Alphabetically" > Close
    • File > New Project > name it > Model Wizard > click "Geospatial" > check the "ArcGIS Workspace" box > OK
    • Extensions > ArcGIS > Import ArcGIS Workspace XML
    • double-click the "Workspace" object in the project browser to open a graphic view of your geodatabase schema
    • find the disorderly table or feature class graphic
    • left-click any of the fields in the table or feature class graphic to select it
    • right-click the selected field > View Properties...
    • right-click the field that you want to be in a different order and choose "Move up" or "Move down" as pleases you
    • Close the field properties viewer
    • Extensions > ArcGIS > Export ArcGIS Workspace XML
  4. ArcMap:
    • import your orderly schema from your Workspace XML into your geodatabase
    • load your data into your new schema
    • breathe, sigh, rejoice...  move along

 

That's it.  That's all I got.

 

Happy Friday,

tim


SQL/SDE Health Check

Posted by tpcolson Champion Feb 22, 2016

Another Monday....another bleary-eyed white-knuckle drive to work, both hands wrapped around a coffee mug...which I forgot to fill....phone buzzes....**** I'm late again...nope...it's work: "The geodatabase isn't working. Did you know that?".

 

If only I had known. I could have, at my leisure, not on a day when I have 234 conference calls and 2156 meetings, fired up the VPN from home for a quick server reboot.

 

There are a few "service" health checks out there that test a GIS REST service and can fire an email when something is amiss, but that doesn't tell me whether the problem is in the database or the application server, and all it tells me is that the service is alive.

 

I'm probably the ONLY GIS SERVER ADMINISTRATOR IN THE ENTIRE WORLD that experiences this, but I often find that Windows Update or a weekend party in the data center leaves some wonky residue in SQL, and "something" ends up "hung up" somewhere, so the maps work, GIS server is running, but can't edit anything. I only hear about it when I least have time to deal with it.


The following, run as a SQL Agent Job, will perform a test edit against a feature class (versioned or not), delete the test feature if everything is working, or send an email if it's not. You can run it hourly (as I do) and modify it to only send one email (what if you don't care that it's not working?). Note how the emails appear to come from Captain Kirk, thus increasing the chances of being paid attention to. I'm sure there are more elegant ways of doing this, but I like to keep things low-tech.

 

BEGIN TRY
DECLARE @id as integer
EXEC dbo.next_rowid 'dbo', 'TRAILSIGN', @id OUTPUT;
SET QUOTED_IDENTIFIER ON 
INSERT INTO [dbo].[TRAILSIGN]
(OBJECTID,
LOC_NAME,
shape)
VALUES(@id, 'ZZZ_HEALTH_CHECK_TEST_DELETE', geography::STPointFromText('POINT(-83.499752 35.687597)', 4269))
DELETE FROM TRAILSIGN WHERE LOC_NAME = 'ZZZ_HEALTH_CHECK_TEST_DELETE'
END TRY
BEGIN CATCH
    DECLARE @ErrorMessage NVARCHAR(4000);
    DECLARE @ErrorSeverity INT;
    DECLARE @ErrorState INT;
    SELECT 
        @ErrorMessage = ERROR_MESSAGE(),
        @ErrorSeverity = ERROR_SEVERITY(),
        @ErrorState = ERROR_STATE();
    DECLARE @ErrorText NVARCHAR(4000);
  DECLARE @ErrorDisclaimer NVARCHAR(4000);
  SET @ErrorText = 'Dammit Jim! The Engines can''t take this! Something''s wrong with the IandM Veg Database! ' 
  SET @ErrorDisclaimer = 'The IandM Veg Database is having some sort of problem. Editing is disabled and read-access may not be working.  ' 
  SET @ErrorMessage = @ErrorText + @ErrorMessage + @ErrorDisclaimer
EXEC msdb.dbo.sp_send_dbmail 
    @profile_name='Captain Kirk',
    @recipients='sasquatch@bigfoot.com',
    @subject='Error in the IandM Veg Database',
    @body = @ErrorMessage;
END CATCH;

 

Combined with the python ArcGIS health checks, you can get a pretty robust error reporting matrix. Warp Speed!
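If you need a starting point for the hourly SQL Agent job itself, here is a sketch using the msdb job procedures (job, step, and database names are hypothetical; paste the TRY/CATCH block above into @command):

-- Sketch: schedule the health check hourly as a SQL Agent job
EXEC msdb.dbo.sp_add_job @job_name = N'SDE Health Check';
EXEC msdb.dbo.sp_add_jobstep @job_name = N'SDE Health Check',
    @step_name = N'Test edit', @subsystem = N'TSQL',
    @database_name = N'IandMVeg', @command = N'/* TRY/CATCH block from above */';
EXEC msdb.dbo.sp_add_jobschedule @job_name = N'SDE Health Check',
    @name = N'Hourly', @freq_type = 4, @freq_interval = 1,
    @freq_subday_type = 8, @freq_subday_interval = 1;
EXEC msdb.dbo.sp_add_jobserver @job_name = N'SDE Health Check';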

In response to Witch Magic, Snake Oil Medicine, and Spatial Index Tuning, which adds some bling and zing to your SDE database, I've gotten a few comments about doing the same type of tuning with the geography storage type. Easy cheesy: just note that this alteration simply replaces geometry with geography and removes the bounding box from the index creation (geography grid indexes don't take one). Happy tuning!

 

--ORIGINAL SQL CODE FROM  
--http://stackoverflow.com/users/2250424/greengeo  
--MODIFIED TO INCLUDE SPATIAL INDEX HELP OUTPUT  
--IN TABLE  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --LOOK FOR THE EXCLAMATION POINTS IN THIS CODE  
  --THERE ARE SEVERAL USER-SUPPLIED VARIABLES THAT MUST  
  --BE INPUT FOR THIS SP TO WORK  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  
  
USE   [GPS_COLLAR]
GO  
  
  
  
ALTER PROCEDURE dbo.sp_tune_spatial_index  
(  
  @tabnm VARCHAR(MAX), -- This parameter stores the name of the spatial table for which you are tuning the index  
  @idxnm VARCHAR(MAX), -- This parameter stores the name of the spatial index of the named table  
  @min_cells_per_obj INT, -- Minimum Cells Per Object to test on. Suggested to start at 2.  
  @max_cells_per_obj INT, -- Maximum Cells Per Object to test on.  
  
  /* The test requires two geography instances to use in test query 1 and 2.  
  The first one should cover the area of default extent. The second should  
  cover an area roughly the size of the area shown when zoomed in, panning  
  around. It is required that the variable store a string that will create  
  the geography instance since this will be done within the procedure and   
  cannot be a variable of type: GEOGRAPHY. The SRID of these instances must  
  match that of the table you are testing. */  
  @testgeom1 VARCHAR(MAX), -- This parameter stores the first geography instance creation string that will be used in the test  
  @testgeom2 VARCHAR(MAX) -- This parameter stores the second geography instance creation string that will be used in the test  
  
  
  
)  
  
AS  
  
SET NOCOUNT ON;  
  
/* Prior to running this procedure, two tables are required. These tables are   
  created here to prepare for running the procedure. */  
  
PRINT 'Checking for required tables...'  
IF EXISTS(SELECT 1 FROM sysobjects WHERE name IN ('cell_opt_perm', 'spat_idx_test_result'))  
  BEGIN  
  PRINT '... The "cell_opt_perm" and "spat_idx_test_result" tables exist.'  
  END  
ELSE  
BEGIN  
  PRINT '... Creating "cell_opt_perm" and "spat_idx_test_result" tables.'  
  CREATE TABLE cell_opt_perm(  
  [perm_id] [smallint] NOT NULL,  
  [permutation] [nvarchar](4) NOT NULL,  
  [level1] [nvarchar](6) NOT NULL,  
  [level2] [nvarchar](6) NOT NULL,  
  [level3] [nvarchar](6) NOT NULL,  
  [level4] [nvarchar](6) NOT NULL  
  )  
  
  INSERT INTO cell_opt_perm ([perm_id], [permutation], [level1], [level2], [level3], [level4])  
  VALUES (1,'LLLL','LOW','LOW','LOW','LOW'),  
  (2,'LLLM','LOW','LOW','LOW','MEDIUM'),  
  (3,'LLLH','LOW','LOW','LOW','HIGH'),  
  (4,'LLML','LOW','LOW','MEDIUM','LOW'),  
  (5,'LLMM','LOW','LOW','MEDIUM','MEDIUM'),  
  (6,'LLMH','LOW','LOW','MEDIUM','HIGH'),  
  (7,'LLHL','LOW','LOW','HIGH','LOW'),  
  (8,'LLHM','LOW','LOW','HIGH','MEDIUM'),  
  (9,'LLHH','LOW','LOW','HIGH','HIGH'),  
  (10,'LMLL','LOW','MEDIUM','LOW','LOW'),  
  (11,'LMLM','LOW','MEDIUM','LOW','MEDIUM'),  
  (12,'LMLH','LOW','MEDIUM','LOW','HIGH'),  
  (13,'LMML','LOW','MEDIUM','MEDIUM','LOW'),  
  (14,'LMMM','LOW','MEDIUM','MEDIUM','MEDIUM'),  
  (15,'LMMH','LOW','MEDIUM','MEDIUM','HIGH'),  
  (16,'LMHL','LOW','MEDIUM','HIGH','LOW'),  
  (17,'LMHM','LOW','MEDIUM','HIGH','MEDIUM'),  
  (18,'LMHH','LOW','MEDIUM','HIGH','HIGH'),  
  (19,'LHLL','LOW','HIGH','LOW','LOW'),  
  (20,'LHLM','LOW','HIGH','LOW','MEDIUM'),  
  (21,'LHLH','LOW','HIGH','LOW','HIGH'),  
  (22,'LHML','LOW','HIGH','MEDIUM','LOW'),  
  (23,'LHMM','LOW','HIGH','MEDIUM','MEDIUM'),  
  (24,'LHMH','LOW','HIGH','MEDIUM','HIGH'),  
  (25,'LHHL','LOW','HIGH','HIGH','LOW'),  
  (26,'LHHM','LOW','HIGH','HIGH','MEDIUM'),  
  (27,'LHHH','LOW','HIGH','HIGH','HIGH'),  
  (28,'MLLL','MEDIUM','LOW','LOW','LOW'),  
  (29,'MLLM','MEDIUM','LOW','LOW','MEDIUM'),  
  (30,'MLLH','MEDIUM','LOW','LOW','HIGH'),  
  (31,'MLML','MEDIUM','LOW','MEDIUM','LOW'),  
  (32,'MLMM','MEDIUM','LOW','MEDIUM','MEDIUM'),  
  (33,'MLMH','MEDIUM','LOW','MEDIUM','HIGH'),  
  (34,'MLHL','MEDIUM','LOW','HIGH','LOW'),  
  (35,'MLHM','MEDIUM','LOW','HIGH','MEDIUM'),  
  (36,'MLHH','MEDIUM','LOW','HIGH','HIGH'),  
  (37,'MMLL','MEDIUM','MEDIUM','LOW','LOW'),  
  (38,'MMLM','MEDIUM','MEDIUM','LOW','MEDIUM'),  
  (39,'MMLH','MEDIUM','MEDIUM','LOW','HIGH'),  
  (40,'MMML','MEDIUM','MEDIUM','MEDIUM','LOW'),  
  (41,'MMMM','MEDIUM','MEDIUM','MEDIUM','MEDIUM'),  
  (42,'MMMH','MEDIUM','MEDIUM','MEDIUM','HIGH'),  
  (43,'MMHL','MEDIUM','MEDIUM','HIGH','LOW'),  
  (44,'MMHM','MEDIUM','MEDIUM','HIGH','MEDIUM'),  
  (45,'MMHH','MEDIUM','MEDIUM','HIGH','HIGH'),  
  (46,'MHLL','MEDIUM','HIGH','LOW','LOW'),  
  (47,'MHLM','MEDIUM','HIGH','LOW','MEDIUM'),  
  (48,'MHLH','MEDIUM','HIGH','LOW','HIGH'),  
  (49,'MHML','MEDIUM','HIGH','MEDIUM','LOW'),  
  (50,'MHMM','MEDIUM','HIGH','MEDIUM','MEDIUM'),  
  (51,'MHMH','MEDIUM','HIGH','MEDIUM','HIGH'),  
  (52,'MHHL','MEDIUM','HIGH','HIGH','LOW'),  
  (53,'MHHM','MEDIUM','HIGH','HIGH','MEDIUM'),  
  (54,'MHHH','MEDIUM','HIGH','HIGH','HIGH'),  
  (55,'HLLL','HIGH','LOW','LOW','LOW'),  
  (56,'HLLM','HIGH','LOW','LOW','MEDIUM'),  
  (57,'HLLH','HIGH','LOW','LOW','HIGH'),  
  (58,'HLML','HIGH','LOW','MEDIUM','LOW'),  
  (59,'HLMM','HIGH','LOW','MEDIUM','MEDIUM'),  
  (60,'HLMH','HIGH','LOW','MEDIUM','HIGH'),  
  (61,'HLHL','HIGH','LOW','HIGH','LOW'),  
  (62,'HLHM','HIGH','LOW','HIGH','MEDIUM'),  
  (63,'HLHH','HIGH','LOW','HIGH','HIGH'),  
  (64,'HMLL','HIGH','MEDIUM','LOW','LOW'),  
  (65,'HMLM','HIGH','MEDIUM','LOW','MEDIUM'),  
  (66,'HMLH','HIGH','MEDIUM','LOW','HIGH'),  
  (67,'HMML','HIGH','MEDIUM','MEDIUM','LOW'),  
  (68,'HMMM','HIGH','MEDIUM','MEDIUM','MEDIUM'),  
  (69,'HMMH','HIGH','MEDIUM','MEDIUM','HIGH'),  
  (70,'HMHL','HIGH','MEDIUM','HIGH','LOW'),  
  (71,'HMHM','HIGH','MEDIUM','HIGH','MEDIUM'),  
  (72,'HMHH','HIGH','MEDIUM','HIGH','HIGH'),  
  (73,'HHLL','HIGH','HIGH','LOW','LOW'),  
  (74,'HHLM','HIGH','HIGH','LOW','MEDIUM'),  
  (75,'HHLH','HIGH','HIGH','LOW','HIGH'),  
  (76,'HHML','HIGH','HIGH','MEDIUM','LOW'),  
  (77,'HHMM','HIGH','HIGH','MEDIUM','MEDIUM'),  
  (78,'HHMH','HIGH','HIGH','MEDIUM','HIGH'),  
  (79,'HHHL','HIGH','HIGH','HIGH','LOW'),  
  (80,'HHHM','HIGH','HIGH','HIGH','MEDIUM'),  
  (81,'HHHH','HIGH','HIGH','HIGH','HIGH')  
  
  CREATE TABLE spat_idx_test_result(  
  [perm_id] [int] NOT NULL,  
  [num_cells] [int] NOT NULL,  
  [permut] [nvarchar](4) NOT NULL,  
  [g1t1] [bigint] NULL,  
  [g1t2] [bigint] NULL,  
  [g1t3] [bigint] NULL,  
  [g1t4] [bigint] NULL,  
  [g2t1] [bigint] NULL,  
  [g2t2] [bigint] NULL,  
  [g2t3] [bigint] NULL,  
  [g2t4] [bigint] NULL,  
  [PF_EFF][float] NULL,  
  [IF_EFF][float] NULL,  
  [GRIDL1] [int] NULL,  
  [GRIDL2] [int] NULL,  
  [GRIDL3] [int] NULL,  
  [GRIDL4] [int] NULL,  
  [TPIR] [bigint] NULL,  
  [TPIP] [bigint] NULL,  
  [ANOIRPBR] [bigint] NULL,  
  [TNOOCILFQ] [bigint] NULL,  
  [TNOOCIL3FQ] [bigint] NULL,  
  [TNOOCIL4FQ] [bigint] NULL,  
  [TNOOCIL0II] [bigint] NULL,  
  [TNOOCIL4II] [bigint] NULL,  
  [TNOIOIL3FQ] [bigint] NULL,  
  [TNOIOIL4FQ] [bigint] NULL,  
  [ITTCNTLGP] [float] NULL,  
  [INTTTCNTLGP] [float] NULL,  
  [BTTCNTLGP] [float] NULL,  
  [ACPONTLGP] [float] NULL,  
  [AOPG] [float] NULL,  
  [NORSBPF] [bigint] NULL,  
  [NORSBIF] [bigint] NULL,  
  [NOTSFIC] [bigint] NULL,  
  [NORO] [bigint] NULL,  
  [PORNBPF] [float] NULL,  
  [POPFRSBIF] [float] NULL  
  
  
  )  
  
  INSERT INTO dbo.spat_idx_test_result  
  VALUES (0,16,0,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL)  
END  
  
  
/* delete all rows from "spat_idx_test_result" table. This makes it ready to stuff in new results.  
  !!!WARNING!!! if your test was interrupted, the table will be cleared out and the test will  
  begin from the beginning. You could try to modify this to start where you left off but  
  I didn't have time and this worked well enough for me. */  
DELETE FROM spat_idx_test_result  
  WHERE perm_id != 0  
  
/* set up counters */  
DECLARE @a1 INT  
DECLARE @a2 INT  
DECLARE @a3 INT  
DECLARE @a4 INT  
  
/* set up variables to hold high/medium/low values and permutation to use in rebuilding  
   the spatial index and recording stats */  
DECLARE @lev1 VARCHAR(6)  
DECLARE @lev2 VARCHAR(6)  
DECLARE @lev3 VARCHAR(6)  
DECLARE @lev4 VARCHAR(6)  
DECLARE @permut VARCHAR(6)  
DECLARE @num_cell VARCHAR(4)  
DECLARE @time_str VARCHAR(20)  
DECLARE @perm_id VARCHAR(20)  
DECLARE @x xml  
DECLARE @pf_eff FLOAT  
  
  
  
/* create variables to hold timestamps for beginning and ending of test queries */  
DECLARE @start_t DATETIME  
DECLARE @end_t DATETIME  
DECLARE @elapse_t INT  
  
  
  
/* begin looping through cell option permutations */  
SET @a1 = @min_cells_per_obj  
WHILE @a1 <= @max_cells_per_obj  
  BEGIN  
  SET @a2 = 1  
  PRINT 'Started Testing for ' +CAST(@a1 AS VARCHAR(10)) +' cells per object'  
  WHILE @a2 < 82  
  BEGIN  
  SELECT @lev1 = level1, @lev2 = level2, @lev3 = level3, @lev4 = level4 FROM cell_opt_perm WHERE perm_id = @a2  
  SET @permut = '''' +(SELECT permutation FROM cell_opt_perm WHERE perm_id = @a2) +''''  
  
  
  EXEC  
  ('  
  CREATE SPATIAL INDEX ' +@idxnm +' ON ' +@tabnm +'   
  (  
  [SHAPE]  
  )  
  USING  GEOGRAPHY_GRID   
  WITH  
  (  
  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU EDIT THE BOUNDING BOX TO BE EXACTLY EQUAL TO THE BOUNDING  
  --BOX OF YOUR SPATIAL TABLE   
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  
  GRIDS =(LEVEL_1 = ' +@lev1 +' ,LEVEL_2 = ' +@lev2 +' ,LEVEL_3 = ' +@lev3 +' ,LEVEL_4 = ' +@lev4 +' ),   
  CELLS_PER_OBJECT = ' +@a1 +' ,  
  PAD_INDEX  = OFF,  
  SORT_IN_TEMPDB = OFF,  
  DROP_EXISTING = ON,  
  ALLOW_ROW_LOCKS  = ON,  
  ALLOW_PAGE_LOCKS  = ON,  
  FILLFACTOR = 100  
  )  
  ON [PRIMARY]  
  '  
  )  
  
  
  PRINT 'Re-built index to ' +@permut  
  SET @a3 = 1  
  SET @a4 = 1  
  WHILE @a3 < 5  
  BEGIN  
  SET @start_t = GETDATE()  
  EXEC  
  (  
  'CREATE TABLE #tmp_tab (shp GEOGRAPHY)  
  DECLARE @g1 GEOGRAPHY  
  SET @g1 = ' +@testgeom1 +'  
  INSERT #tmp_tab (shp)  
  SELECT  
  r.Shape AS shp  
  FROM  
  ' +@tabnm +' r  
  WHERE  
  r.SHAPE.STIntersects(@g1) = 1  
  DROP TABLE #tmp_tab'  
  )  
  SET @end_t = GETDATE()  
  SET @elapse_t = (SELECT DATEDIFF(MS, @start_t, @end_t))  
  SET @num_cell = CAST(@a1 AS VARCHAR(6))  
  SET @time_str = CAST(@elapse_t AS VARCHAR(20))  
  IF @a3 = 1  
  BEGIN  
  IF (SELECT TOP 1 perm_id FROM spat_idx_test_result) IS NULL  
  BEGIN  
  SET @perm_id = 1  
  
  
  END  
  ELSE  
  BEGIN  
  SET @perm_id = CAST((SELECT MAX(perm_id+1) FROM spat_idx_test_result) AS VARCHAR(20))  
  END  
  
  
  EXEC  
  
  
  (  
  '  
  
  
  INSERT INTO spat_idx_test_result (perm_id, num_cells, permut, g1t' +@a3 +')  
  VALUES (' +@perm_id +', ' +@num_cell +', ' +@permut +', ' +@time_str +')'  
  )  
  END  
  ELSE  
  
  
  EXEC  
  (  
  '  
  
  UPDATE spat_idx_test_result  
  SET  
  num_cells = ' +@num_cell +',  
  permut = ' +@permut +',  
  g1t' +@a3 +' = ' +@time_str +'  
  WHERE perm_id = ' +@perm_id   
  
  
  )  
  SET @a3 = @a3 + 1  
  END  
  WHILE @a4 < 5  
  BEGIN  
  
  SET @start_t = GETDATE()  
  EXEC  
  (  
  'CREATE TABLE #tmp_tab (shp GEOGRAPHY)   
  DECLARE @g2 GEOGRAPHY  
  SET @g2 = ' +@testgeom2 +'  
  INSERT #tmp_tab (shp)  
  SELECT  
  r.Shape AS shp  
  FROM  
  ' +@tabnm +' r  
  WHERE  
  r.SHAPE.STIntersects(@g2) = 1  
  DROP TABLE #tmp_tab'  
  )  
  SET @end_t = GETDATE()  
  SET @elapse_t = (SELECT DATEDIFF(MS, @start_t, @end_t))  
  SET @num_cell = CAST(@a1 AS VARCHAR(6))  
  SET @time_str = CAST(@elapse_t AS VARCHAR(20))  
  EXEC  
  (  
  '  
  DECLARE @geom geography  
  DECLARE @x xml  
  DECLARE @PFVALUE float  
  DECLARE @IFVALUE float  
  DECLARE @GRIDL1VALUE int  
  DECLARE @GRIDL2VALUE int  
  DECLARE @GRIDL3VALUE int  
  DECLARE @GRIDL4VALUE int  
  DECLARE @TPIRVALUE bigint  
  DECLARE @TPIPVALUE bigint  
  DECLARE @ANOIRPBRVALUE bigint  
  DECLARE @TNOOCILFQVALUE bigint  
  DECLARE @TNOOCIL0IIVALUE bigint  
  DECLARE @TNOOCIL4IIVALUE bigint  
  DECLARE @TNOOCIL3FQVALUE bigint  
  DECLARE @TNOOCIL4FQVALUE bigint  
  DECLARE @TNOIOIL3FQVALUE bigint  
  DECLARE @TNOIOIL4FQVALUE bigint  
  DECLARE @ITTCNTLGPVALUE float  
  DECLARE @INTTTCNTLGPVALUE float  
  DECLARE @BTTCNTLGPVALUE float  
  DECLARE @ACPONTLGPVALUE float  
  DECLARE @AOPGVALUE float  
  DECLARE @NORSBPFVALUE bigint  
  DECLARE @NORSBIFVALUE bigint  
  DECLARE @NOTSFICVALUE bigint  
  DECLARE @NOROVALUE bigint  
  DECLARE @PORNBPFVALUE float  
  DECLARE @POPFRSBIFVALUE float  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU EDIT GEOGRAPHY VARIABLE BELOW TO REPRESENT A POLYGON  
  --THAT IS WITHIN YOUR BOUNDING BOX  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  SET @geom = geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU SPECIFY THE NAME OF YOUR SPATIAL TABLE  
  --AND THE NAME OF THE SPATIAL INDEX  
  -- IN THE sp_help_spatial_geography_index_xml VARIABLES  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  exec sp_help_spatial_geography_index_xml GRSM_TRAILS , GRSM_TRAILS_IDX , 1, @geom, @x output  
  SET @PFVALUE =  @x.value(''(/Primary_Filter_Efficiency/text())[1]'', ''float'')  
  SET @IFVALUE =  @x.value(''(/Internal_Filter_Efficiency/text())[1]'', ''float'')  
  SET @GRIDL1VALUE =  @x.value(''(/Grid_Size_Level_1/text())[1]'', ''int'')  
  SET @GRIDL2VALUE =  @x.value(''(/Grid_Size_Level_2/text())[1]'', ''int'')  
  SET @GRIDL3VALUE =  @x.value(''(/Grid_Size_Level_3/text())[1]'', ''int'')  
  SET @GRIDL4VALUE =  @x.value(''(/Grid_Size_Level_4/text())[1]'', ''int'')  
  SET @TPIRVALUE =  @x.value(''(/Total_Primary_Index_Rows/text())[1]'', ''bigint'')  
  SET @TPIPVALUE =  @x.value(''(/Total_Primary_Index_Pages/text())[1]'', ''bigint'')  
  SET @ANOIRPBRVALUE =  @x.value(''(/Average_Number_Of_Index_Rows_Per_Base_Row/text())[1]'', ''bigint'')  
  SET @TNOOCILFQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level0_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOOCIL0IIVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level0_In_Index/text())[1]'', ''bigint'')  
  SET @TNOOCIL4IIVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level4_In_Index/text())[1]'', ''bigint'')  
  SET @TNOOCIL3FQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level3_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOOCIL4FQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level4_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOIOIL3FQVALUE =  @x.value(''(/Total_Number_Of_Interior_ObjectCells_In_Level3_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOIOIL4FQVALUE =  @x.value(''(/Total_Number_Of_Interior_ObjectCells_In_Level4_For_QuerySample/text())[1]'', ''bigint'')  
  SET @ITTCNTLGPVALUE =  @x.value(''(/Interior_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @INTTTCNTLGPVALUE =  @x.value(''(/Intersecting_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @BTTCNTLGPVALUE =  @x.value(''(/Border_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @ACPONTLGPVALUE =  @x.value(''(/Average_Cells_Per_Object_Normalized_To_Leaf_Grid/text())[1]'', ''float'')  
  SET @AOPGVALUE =  @x.value(''(/Average_Objects_PerLeaf_GridCell/text())[1]'', ''float'')  
  SET @NORSBPFVALUE =  @x.value(''(/Number_Of_Rows_Selected_By_Primary_Filter/text())[1]'', ''bigint'')  
  SET @NORSBIFVALUE =  @x.value(''(/Number_Of_Rows_Selected_By_Internal_Filter/text())[1]'', ''bigint'')  
  SET @NOTSFICVALUE =  @x.value(''(/Number_Of_Times_Secondary_Filter_Is_Called/text())[1]'', ''bigint'')  
  SET @NOROVALUE =  @x.value(''(/Number_Of_Rows_Output/text())[1]'', ''bigint'')  
  SET @PORNBPFVALUE =  @x.value(''(/Percentage_Of_Rows_NotSelected_By_Primary_Filter/text())[1]'', ''float'')  
  SET @POPFRSBIFVALUE =  @x.value(''(/Percentage_Of_Primary_Filter_Rows_Selected_By_Internal_Filter/text())[1]'', ''float'')  
  
  
  UPDATE spat_idx_test_result  
  SET  
  num_cells = ' +@num_cell +',  
  permut = ' +@permut +',  
  g2t' +@a4 +' = ' +@time_str +',  
  PF_EFF = @PFVALUE,  
  IF_EFF = @IFVALUE,  
  GRIDL1 = @GRIDL1VALUE,  
  GRIDL2 = @GRIDL2VALUE,  
  GRIDL3 = @GRIDL3VALUE,  
  GRIDL4 = @GRIDL4VALUE,  
  TPIR = @TPIRVALUE,  
  TPIP = @TPIPVALUE,  
  ANOIRPBR = @ANOIRPBRVALUE,  
  TNOOCILFQ = @TNOOCILFQVALUE,  
  TNOOCIL0II = @TNOOCIL0IIVALUE,  
  TNOOCIL4II = @TNOOCIL4IIVALUE,  
  TNOOCIL3FQ =  @TNOOCIL3FQVALUE,  
  TNOOCIL4FQ =  @TNOOCIL4FQVALUE,  
  TNOIOIL3FQ = @TNOIOIL3FQVALUE,  
  TNOIOIL4FQ = @TNOIOIL4FQVALUE,  
  ITTCNTLGP = @ITTCNTLGPVALUE,  
  INTTTCNTLGP = @INTTTCNTLGPVALUE,  
  BTTCNTLGP = @BTTCNTLGPVALUE,  
  ACPONTLGP = @ACPONTLGPVALUE,  
  AOPG = @AOPGVALUE,  
  NORSBPF = @NORSBPFVALUE,  
  NORSBIF = @NORSBIFVALUE,  
  NOTSFIC = @NOTSFICVALUE,  
  NORO = @NOROVALUE,  
  PORNBPF = @PORNBPFVALUE,  
  POPFRSBIF = @POPFRSBIFVALUE  
  WHERE perm_id = ' +@perm_id  
  )  
  SET @a4 = @a4 + 1  
  END  
  SET @a2 = @a2 + 1  
  END  
  SET @a1 = @a1 + 1  
  
  END  
PRINT 'Testing of ' +@tabnm +' spatial index: ' +@idxnm +' is complete!'  
GO 


And execute like so:

 

DECLARE @BOUNDING VARCHAR(MAX)   
SET @BOUNDING = 'geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)'  
  
DECLARE @QUERY VARCHAR(MAX)   
SET @QUERY = 'geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)'    
  
EXEC sp_tune_spatial_index 'GRSM_TRAILS', 'GRSM_TRAILS_IDX', 4096, 4096, @BOUNDING, @QUERY   
GO

Hello everyone,

 

The following source code shows how to copy features from one feature class (Source) to another feature class (Destination) when both feature classes are versioned, and then remove the copied features from the Source.

 

The steps:

  • Create a new SDE version, a WorkingVersion
  • Select the Source feature class records to be copied and make a feature layer
  • Select the Destination feature class and make a feature layer
  • Change the version of both feature layers to the WorkingVersion
  • Append the features to the Destination feature class
  • Delete the appended features from the Source feature class
import arcpy

# Assumed to be set earlier in the full script: database (an .sde
# connection), ParentVersion, WorkingVersion, SourceFeatureClass,
# EditableFeatureClass, SourceLayer, AppendedLayer, scratchFolder

versions = arcpy.ListVersions(database)

# Print the versions available to the user, and drop a leftover
# WorkingVersion from a previous run
for version in versions:
    print(version)
    if version == WorkingVersion:
        # Execute DeleteVersion
        arcpy.DeleteVersion_management(database, WorkingVersion)

# Execute CreateVersion
arcpy.CreateVersion_management(database, ParentVersion, WorkingVersion, "PUBLIC")

# Select records with STATUS = 'Completed' and historical records
arcpy.MakeFeatureLayer_management(SourceFeatureClass, SourceLayer, "STATUS = 'Completed'", "", " .....")
# Process: Make Feature Layer (2)
arcpy.MakeFeatureLayer_management(EditableFeatureClass, AppendedLayer, "", "", "......")

# Change both layers, SourceLayer and AppendedLayer, to the
# OWNER.WORKINGVERSION version
arcpy.ChangeVersion_management(SourceLayer, 'TRANSACTIONAL', WorkingVersion, '')
arcpy.ChangeVersion_management(AppendedLayer, 'TRANSACTIONAL', WorkingVersion, '')

# Process: Append...
arcpy.Append_management(SourceLayer, AppendedLayer, "TEST", "", "")

# Process: Delete Features...
arcpy.DeleteFeatures_management(SourceLayer)

# Reconcile and post the working version back to its parent
arcpy.ReconcileVersions_management(database,
                                   "ALL_VERSIONS",
                                   "OWNER.ParentVersion",
                                   "OWNER.WORKINGVERSION",
                                   "LOCK_ACQUIRED",
                                   "NO_ABORT",
                                   "BY_OBJECT",
                                   "FAVOR_TARGET_VERSION",
                                   "POST",
                                   "KEEP_VERSION",
                                   scratchFolder + "RecLog.txt")

With ArcGIS, to administer an enterprise geodatabase you need to use the Catalog window or the many geoprocessing tools. They operate through a database connection file (.sde file). The connection file can also be shared and used for scripting. This works great and is easy to manage if you do all your administration through ArcCatalog/ArcMap or the geoprocessing window.

 

However, what if you have many databases to manage, want to use Linux, want to script tasks, or just don't want to deal with creating connection files all the time? With the old sde command line this was very easy: no .sde files to manage, just pass in the connection parameters and away you go.

 

Well, here's some good news! You can easily mimic the same behavior using some simple Python concepts along with the powerful arcpy geoprocessing scripting environment. The sample code is available here if you want to dig in immediately!

 

The example below works on Linux and Windows and uses the concepts detailed in the ArcGIS geoprocessing tools help to create a database connection file and then loads data from a specified directory to the enterprise geodatabase. This can easily be modified to run other tools or to do other tasks. I'll highlight the key pieces of the script.

 

First, the usage.

 

You pass in the connection properties to a database and a location to where the data is you would like to load:

 

Examples:
   
(Linux)  
/tmp>loaddata_sde_params.py --DBMS ORACLE -i myserver/orcl -u user1 -p user1 --dataloc /net/sharedata/location/data  
  
(Windows)  
c:\tmp>loaddata_sde_params.py --DBMS ORACLE -i myserver/orcl -u user1 -p user1 --dataloc \\sharedata\location\data  
    
>loaddata_sde_params.py --help  
  
Usage: loaddata_sde_params.py [Options]  
  
Options:  
  --version             show program's version number and exit  
  -h, --help            show this help message and exit  
  --DBMS=DATABASE_TYPE  Type of enterprise DBMS: SQLSERVER, ORACLE, or  
                        POSTGRESQL.  
  -i INSTANCE           DBMS instance name  
  --auth=ACCOUNT_AUTHENTICATION  
                        Authentication type options (case-sensitive):  
                        DATABASE_AUTH, OPERATING_SYSTEM_AUTH.  
                        Default=DATABASE_AUTH  
  -u USER               user name  
  -p PASSWORD           password  
  --dataloc=DATALOCATION  
                        Path to the data (either a geodatabase or a directory  
                        with shapefiles)  
  -D DATABASE           Database name (Not required for Oracle)  


 

Let's break it down a bit more...

 

The core component is the ability to accept parameters similar to the old command line, parse them correctly, and then automate the creation of the database connection file (It uses some code from the Upgrade Geodatabase (Data Management) help documentation).

 

The parser used is in the Python module "optparse". The script first defines the program and then all the options:

 

# Define usage and version
parser = optparse.OptionParser(usage = "usage: %prog [Options]", version="%prog 2.0; valid for 10.1+ only")

#Define help and options
parser.add_option ("--DBMS", dest="Database_type", type="choice", choices=['SQLSERVER', 'ORACLE', 'POSTGRESQL', 'DB2','INFORMIX','DB2ZOS',''], default="", help="Type of enterprise DBMS: SQLSERVER, ORACLE, or POSTGRESQL.")
parser.add_option ("-i", dest="Instance", type="string", default="", help="DBMS instance name")
parser.add_option ("--auth", dest="account_authentication", type ="choice", choices=['DATABASE_AUTH', 'OPERATING_SYSTEM_AUTH'], default='DATABASE_AUTH', help="Authentication type options (case-sensitive):  DATABASE_AUTH, OPERATING_SYSTEM_AUTH.  Default=DATABASE_AUTH")
parser.add_option ("-u", dest="User", type="string", default="", help="user name")
parser.add_option ("-p", dest="Password", type="string", default="", help="password")
parser.add_option ("--dataloc", dest="DataLocation", type="string", default="", help="Path to the data (either a geodatabase or a directory with shapefiles")
parser.add_option ("-D", dest="Database", type="string", default="none", help="Database name (Not required for Oracle)")

 

Then we need to assign values to local variables from the options that get passed in:

 

#Usage parameters for spatial database connection.
# Parse the command line into the options object defined above
(options, args) = parser.parse_args()

account_authentication = options.account_authentication.upper()
username = options.User.lower()
password = options.Password
dataloc = options.DataLocation
database = options.Database.lower()
database_type = options.Database_type.upper()
instance = options.Instance

 

Next, we need to do some basic checks and find a location to put the .sde connection file. On Windows, the path to files uses backslashes, whereas on Linux, forward slashes are needed.

 

# Local variables
instance_temp = instance.replace("\\","_")
instance_temp = instance_temp.replace("/","_")
instance_temp = instance_temp.replace(":","_")
Conn_File_NameT = instance_temp + "_" + database + "_" + username

if os.environ.get("TEMP") == None:
    temp = "c:\\temp"
else:
    temp = os.environ.get("TEMP")
if os.environ.get("TMP") == None:
    temp = "/usr/tmp"
else:
    temp = os.environ.get("TMP")

Connection_File_Name = Conn_File_NameT + ".sde"
Connection_File_Name_full_path = temp + os.sep + Conn_File_NameT + ".sde"

 

Create the connection file:

 

print "\nCreating Database Connection File...\n"
# Process: Create Database Connection File...
# Usage:  out_file_location, out_file_name, DBMS_TYPE, instance, database, account_authentication, username, password,  save_username_password(must be true)
outFile = arcpy.CreateDatabaseConnection_management(out_folder_path=temp, out_name=Connection_File_Name, database_platform=database_type, instance=instance, database=database, account_authentication=account_authentication, username=username, password=password, save_user_pass="TRUE")

 

Then the connection file can be used to run any other task or set of tasks. Here it loads data using the FeatureClassToGeodatabase tool:

 

# Set environment settings
# Set the workspace to the specified data location
env.workspace = dataloc

# Get all the feature classes in the environment
# list of fc's should be similar to this: ["accident.shp", "veg.shp"]
inFeatures = arcpy.ListFeatureClasses()

# Process: Load Data (Use the "Feature Class To Geodatabase" Tool...)
# The output location is the connection file created above
outLocation = Connection_File_Name_full_path
try:
    # Execute FeatureClassToGeodatabase
    print "Loading data...\n"
    arcpy.FeatureClassToGeodatabase_conversion(inFeatures, outLocation)

except:
    for i in range(arcpy.GetMessageCount()):
        arcpy.AddReturnMessage(i)

 

 

Full code is available here.

 

Enjoy!

In Using SQL to Auto-Populate XY Values I described how to use SQL geography in SDE to automate the population of lat and lon values. In some applications, my users are also looking for degrees, minutes, and seconds, depending on the application, mission, and the end user. In my line of business, we deal with a lot of folks who are still looking at GPS read-outs on their mobile or GPS device in that format.

With the automation of data production, I'd like end users to be able to "click" on a feature (in Portal or AGOL) and see coordinate pairs in numerous formats, without having to "push more buttons". Real handy when a helo pilot who learned how to fly in 'Nam with a topo map strapped to his knee radios in for the location of the drop point....

 

By adding some text attributes to a feature class and spinning some SQL magic on the back-end, you can automate the population of Degrees, Minutes, and Seconds, with proper annotation, on data creation. Now downstream users can get coordinate pairs in whatever format they need!

 

Here's the business end:

SELECT
[SHAPE].[Lat] as [DecimalLatitude],
-- floor of the absolute value, signed back, gives whole degrees
floor(ABS([SHAPE].[Lat]))*(CASE WHEN [SHAPE].[Lat] < 0 then -1 ELSE 1 END) as [LatDegrees],
convert(int,ABS([SHAPE].[Lat])*60) % 60 as [LatMinutes],
convert(decimal(4,2),convert(decimal(17,10),ABS([SHAPE].[Lat])*3600) % 60) as [LatSeconds],

[SHAPE].[Long] as [DecimalLongitude],
floor(ABS([SHAPE].[Long]))*(CASE WHEN [SHAPE].[Long] < 0 then -1 ELSE 1 END) as [LongDegrees],
convert(int,ABS([SHAPE].[Long])*60) % 60 as [LongMinutes],
convert(decimal(4,2),convert(decimal(17,10),ABS([SHAPE].[Long])*3600) % 60) as [LongSeconds]

FROM [SOME TABLE IN GEOGRAPHY FORMAT]

 

which returns a bunch of columns:

 

DecimalLatitude     LatDegrees  LatMinutes  LatSeconds  DecimalLongitude    LongDegrees  LongMinutes  LongSeconds
35.6855385005474    35          41          7.94        -83.5370928999037   -83          32           13.53
35.6871696002781    35          41          13.81       -83.5376158999279   -83          32           15.42
35.6880251001567    35          41          16.89       -83.5350660998374   -83          32           6.24

 

Mixing things up, we can concatenate the pieces into annotated DMS strings:

 

SELECT
cast(convert(int,(floor(ABS([SHAPE].[Lat]))))as varchar) +NCHAR(176)+' '+ cast( convert(int,[SHAPE].[Lat]*60) % 60 as varchar) + ''' '+
cast( convert(decimal(17,3),[SHAPE].[Lat]*3600) % 60 as varchar) + '"',


'-'+cast(convert(int,(floor(ABS([SHAPE].[Long]))))as varchar) +NCHAR(176)+' '+ cast( convert(int,ABS([SHAPE].[Long])*60) % 60 as varchar) + ''' '+
cast( convert(decimal(17,3),ABS([SHAPE].[Long])*3600) % 60 as varchar) + '"'
FROM [SOME TABLE IN GEOGRAPHY FORMAT]

Note the addition of some ABS calls to keep negative symbols where they belong, and NCHAR(176), which gives us the degree symbol.

 

(No column name)    (No column name)
35° 41' 7.939"      -83° 32' 13.534"
35° 41' 13.811"     -83° 32' 15.417"
35° 41' 16.890"     -83° 32' 6.238"
35° 41' 16.673"     -83° 32' 17.458"

 

And here's how it works in a trigger:

 

CREATE TRIGGER [dbo].[YAY_GEOMETRY]
ON [dbo].[YAY]
after INSERT,UPDATE NOT FOR REPLICATION
AS   
BEGIN    
  SET NOCOUNT ON;  

  UPDATE p SET  
         SHAPE = CASE WHEN i.SHAPE IS NOT NULL   
        THEN p.SHAPE ELSE Geography::STPointFromText('POINT('  
          + CAST(p.LON AS VARCHAR(20)) + ' '   
          + CAST(p.LAT AS VARCHAR(20)) + ')', 4269) END,  
      LON = CASE WHEN p.SHAPE IS NULL THEN p.LON ELSE p.SHAPE.Long END,  
      LAT = CASE WHEN p.SHAPE IS NULL THEN p.LAT ELSE p.SHAPE.Lat END,
  DMSLAT = cast(convert(int,(floor(ABS(p.SHAPE.Lat))))as varchar) +NCHAR(176)+' '+ cast( convert(int,p.SHAPE.Lat*60) % 60 as varchar) + ''' '+
 cast( convert(decimal(17,3),p.SHAPE.Lat*3600) % 60 as varchar) + '"' ,
  DMSLON = '-'+cast(convert(int,(floor(ABS(p.SHAPE.Long))))as varchar) +NCHAR(176)+' '+ cast( convert(int,ABS(p.SHAPE.Long)*60) % 60 as varchar) + ''' '+
 cast( convert(decimal(17,3),ABS(p.SHAPE.Long)*3600) % 60 as varchar) + '"'
  FROM  YAY AS p
  INNER JOIN
    inserted AS i
    ON i.globalid = p.globalid
;
END

For Degrees Decimal-minutes you could use:

 

DDMLAT = cast(convert(int,(floor(ABS(p.SHAPE.Lat))))as varchar) +NCHAR(176)+' '+cast( floor(convert(decimal(17,5),p.SHAPE.Lat*60) % 60) as varchar) + '.'+cast( parsename(convert(decimal(17,7),p.SHAPE.Lat*60) % 60,1) as varchar) + ''' ',

DDMLON =   '-'+cast(convert(int,(floor(ABS(p.SHAPE.Long))))as varchar) +NCHAR(176)+' '+cast( floor(convert(decimal(17,5),abs(p.SHAPE.Long*60)) % 60) as varchar) + '.'+cast( parsename(convert(decimal(17,7),p.SHAPE.Long*60) % 60,1) as varchar) + ''' ',

I have an incredible amount of point data in hundreds of feature classes for which one user demand is constant: they want to "see" the "GPS Coordinates" of the point for a variety of reasons, including proprietary (non-GIS) applications which report the GIS coordinate.

 

I could perhaps build a data-editing workflow that allows data editors to perform an XY calculation task, or even trick out some custom Python or an ArcObjects toolbar, but those solutions require custom code, and, frankly, I dropped out of computer programming (C++) after the first class.

 

SQL Server is ideally suited to handle this type of repetitive data maintenance task when you're using the Microsoft Spatial Storage types.

 

The following assumes you're working with a point feature class using the "Geography" storage type, GCS_North_American_1983 as the coordinate system (SRID 4269), and a GLOBALID unique identifier.

 

One way to accomplish this is with a trigger, in this case one that fires when the edit or update hits the base table. Let's write a trigger:

 

 

CREATE TRIGGER [dbo].[TEST_GEOGRAPHY]
ON [dbo].[TEST]
/****** fire on inserts and updates  ******/
/****** disable trigger when SQL replication or mirroring is enabled  ******/
after INSERT,UPDATE NOT FOR REPLICATION
AS   
BEGIN    
  SET NOCOUNT ON;  
   UPDATE p SET  
/****** hypothetically we could enter the lat/lon as text and create a geography object  ******/
         SHAPE = CASE WHEN i.SHAPE IS NOT NULL   
        THEN p.SHAPE ELSE Geography::STPointFromText('POINT('  
          + CAST(p.LON AS VARCHAR(20)) + ' '   
          + CAST(p.LAT AS VARCHAR(20)) + ')', 4269) END,  
/****** usual case point is created with ARC and casts the LAT/LON as text  ******/
/****** from the geography object  ******/
      LON = CASE WHEN p.SHAPE IS NULL THEN p.LON ELSE p.SHAPE.Long END,  
      LAT = CASE WHEN p.SHAPE IS NULL THEN p.LAT ELSE p.SHAPE.Lat END
  FROM  TEST
     AS p 
/****** allow update of lat/lon on update ******/
  INNER JOIN  
    inserted AS i 
    ON i.globalid = p.globalid 
  ; 
END
GO

 

How this works:

 

  1. Versioned without moving edits to base checked: The LAT and LON column will be populated with the WKT (Well-known Text) value of the geographic coordinates upon data entry and update AFTER your edits have been compressed to the default version. Personally I'm not a big fan of this versioning scenario;
  2. Versioned with moving edits to base checked: The LAT and LON column will be populated with the WKT (Well-known Text) value of the geographic coordinates upon data entry and update as soon as the edit is saved;
  3. Not versioned: The LAT and LON column will be populated with the WKT (Well-known Text) value of the geographic coordinates upon data entry and update as soon as the edit is made.

If you are using the Geometry storage type, the SET clause of the trigger becomes:

 

  UPDATE p SET  
    X_Coord = CASE WHEN i.shape.STDimension() = 2  
      THEN i.shape.STCentroid().STX  
      ELSE i.shape.STEnvelope().STCentroid().STX  
    END, 
    Y_Coord = CASE WHEN i.shape.STDimension() = 2  
      THEN i.shape.STCentroid().STY  
      ELSE i.shape.STEnvelope().STCentroid().STY  
    END,  

 

 

Or, for a polygon in geography storage (EnvelopeCenter() derives a label point):

 

   UPDATE p SET
   SHAPE = CASE WHEN p.SHAPE IS NOT NULL   
        THEN p.SHAPE ELSE Geography::STPointFromText('POINT('
          + CAST(p.LON AS VARCHAR(20)) + ' '   
          + CAST(p.LAT AS VARCHAR(20)) + ')', 4269) END,  
      LON = CASE WHEN p.SHAPE IS NULL THEN p.LON ELSE p.SHAPE.EnvelopeCenter().Long  END,  
      LAT = CASE WHEN p.SHAPE IS NULL THEN p.LAT ELSE p.SHAPE.EnvelopeCenter().Lat  END,

 

 

This is a personal blog and does not recommend, endorse, or support the methods described above. Alteration of data using SQL outside of the Esri software stack is, of course, not supported and should not be applied to a production database without a thorough understanding and a disaster recovery plan.