Data Management Blog - Page 2


Latest Activity

(77 Posts)
mdonnelly
Esri Contributor

Hello,

Problem

After upgrading a SQL Server database, should you leave the compatibility level at its previous, lower level?

 

Solution

 

No. After upgrading a SQL Server database, you should also raise the database compatibility level to match the new version.

 

Reference:

https://bousteadgis.sharepoint.com/teams/au-ps/SupportCenter/Knowledge%20Library/Administering%20SQL...

"A Note About Compatibility Level"

 

For example, if upgrading the database to SQL Server 2014 change the following setting on each database:

 

SQL Server Management Studio > Databases > right-click your database > Properties > Options > Compatibility Level > SQL Server 2014 (120)

 

Reference:

https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-database-transact-sql-compatibility-leve...

Mark

AnnieJames
Esri Contributor

Esri's 39th User Conference is around the corner! As you make travel plans and schedule your time, consider the sessions offered by the Geodatabase team. We've organized them into several learning paths that you can explore in this story map: Data Management Learning Paths at UC2019

For Enterprise-related workshops, check out their blog: ArcGIS Enterprise at 2019 UC

For utility users, visit Electric Gas and Telecom Industry Activities UC2019

John_Spence
Frequent Contributor

Ran into something interesting that might be worth sharing with the group. One of the developers on staff hit an issue where he couldn't pull lat/long from the database, because the shape fields are stored as Geometry (not Geography) and we didn't store lat/long within the tables. For those who weren't aware, you can use a SQL query to generate X,Y for you from the shape fields present in our databases. The syntax will differ a bit based on the database version (not the SDE version). For MS SQL, reading from a Geometry column, you would use something like Example 1.

So how do you pull that information out for an application to use/reference? Here is what I came up with for him.

Option 1 Using Arcade:

For those using ArcGIS Pro and/or ArcGIS Online, an increasingly useful option is the Arcade scripting language. In this case, build a function. As a proof of concept, I looped it into a URL that could be easily parsed via Google Maps. They can change the URL to match a Collector, Survey123, or other application at will. See Examples 2A and 2B (accidentally labeled 2A again, to keep everyone on their toes).

Option 2 Using SQL:

With ArcGIS you can create query layers, but when they are published via ArcGIS Server, as this developer was doing, there is a performance hit on the server each time it pulls one of these query layers. Sure, it is more of a pain, but your database administrator can create and adjust views just as fast, and views put the burden of rendering on the database server, which usually has capacity to spare. That said, you can add columns in queries and even do math along the way. See Example 3. Keep in mind the syntax of Example 3 is for MS SQL. For those of you using Oracle, it may be slightly different for some of the operators, like PI (though that is a bad example, as PI is called the same way in Oracle). The moral of the story is to double-check that all the operators (functions) are correct.

declare @shift numeric
declare @x numeric
declare @y numeric

-- Half the equatorial circumference of the WGS84 spheroid:
-- the X/Y extent of Web Mercator coordinates.
set @shift = 2.0 * PI() * 6378137.0 / 2.0

SELECT structure.OBJECTID
,structure.FACILITYID
,structure.DSTYPE AS [Type]
,structure.shape.STY as Y_Coord
,structure.shape.STX as X_Coord
, (
180.0 / PI() * (2.0 * Atan( Exp( (((select structure2.shape.STY FROM WebGIS.UTIL.swDrainageStructure as structure2 where structure2.objectid = structure.OBJECTID) / @shift) * 180.0) * PI() / 180.0)) - PI() / 2.0)
) as [Latitude]
,((select structure1.shape.STX FROM WebGIS.UTIL.swDrainageStructure as structure1 where structure1.OBJECTID = structure.objectid) / @shift) * 180.0 as [Longitude]
FROM WebGIS.UTIL.swDrainageStructure as structure
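Example 3's math assumes the geometry column stores Web Mercator (EPSG:3857) coordinates. If you want to sanity-check the numbers outside the database, the same inverse-Mercator conversion can be sketched in Python (a rough illustration, not part of the original query):

```python
import math

# Half the equatorial circumference of the WGS84 spheroid,
# mirroring @shift in the SQL above.
SHIFT = 2.0 * math.pi * 6378137.0 / 2.0

def web_mercator_to_wgs84(x, y):
    """Convert Web Mercator (EPSG:3857) metres to WGS84 lon/lat degrees,
    using the same formulas as the SQL view."""
    lon = (x / SHIFT) * 180.0
    lat = 180.0 / math.pi * (
        2.0 * math.atan(math.exp(((y / SHIFT) * 180.0) * math.pi / 180.0))
        - math.pi / 2.0)
    return lon, lat
```

Feeding in the corner of the Web Mercator extent (x = y = SHIFT) returns roughly (180, 85.05), the familiar latitude cutoff of Web Mercator maps, which is a quick way to confirm the formulas match.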

MelissaJarman
Esri Contributor

This blog describes the ability, added at the 10.5 release, to register database views with the geodatabase.

 

ArcGIS allows access to database views and tables that are not registered with the geodatabase via a query layer. An unregistered view within a geodatabase can be a view that is created using DBMS client tools or via the Create Database View tool.

 

Many users who work with enterprise geodatabases and database views have requested the ability to register views with the geodatabase. Registration stores information about the database view within the geodatabase system tables, so that viewing and querying a view feels similar to working with geodatabase tables and feature classes.

 

At the 10.5 release, the Register with Geodatabase tool has been enhanced with additional parameters and now allows the registration of views with the geodatabase. You will also notice that the right-click Manage > Register with Geodatabase context menu for unregistered objects now opens the Register with Geodatabase tool.

The process of creating a registered view:
  1. Create a view using DBMS client tools or the Create Database View geoprocessing tool
  2. Register it using the Register with Geodatabase geoprocessing tool
 
The registration process requires the view to exist first, created via the database client tools or the Create Database View tool. Once the view exists, it can be registered using the Register with Geodatabase tool. The right-click context menu on an existing database view will also open the Register with Geodatabase tool with some of the initial parameters populated; depending on the column types and whether the view contains records, parameters are pre-populated when the tool is opened this way. Not all tables or views are good candidates for registration, so this two-step process gives users the flexibility to interact with unregistered database views via a query layer and optionally register them with the geodatabase.

 

Registered views:

After the database view is registered with the geodatabase, it appears the same as a geodatabase table or feature class, depending on whether the view definition contains a spatial column. Views with a spatial column are displayed with a feature class icon corresponding to the geometry type chosen during registration.

Context menu for unregistered database views within the geodatabase connection. 

 Enhanced Register with Geodatabase tool for 10.5.

Benefits:

  • Performance improvements for viewing and querying due to table metadata (rowid, extent, etc) being stored within the system tables 
  • No prompt to specify a unique identifier when adding database views to ArcMap/Pro clients
  • Ability to create and store geodatabase metadata on the registered view

 

 

User experience:

  • Registered views are read-only geodatabase objects. This means that operations such as altering the schema or the underlying data are prevented. An error message, 'Not supported on a view', alerts you when an operation is not supported against a registered view.
    • Registered views can be renamed, but the schema itself is read-only in ArcGIS
    • To prevent participation in geodatabase functionality whose workflows involve editing, registered views cannot be added to feature datasets and cannot participate in geodatabase relationship classes. Other geodatabase functionality that implies editing (such as editor tracking, attachments, etc.) is also blocked.
    • Registered views can be published via map services or query-only feature services.
  • Schema modifications to the underlying tables that the view is based on should be handled by refreshing the view definition using the RDBMS client tools; making a new connection from the ArcGIS client then picks up the changes.

  

Requirements:  

  • The Object ID Field parameter is required for database view registration. It requires a unique integer column that can represent the row ID for the registered view.
  • For tables with a spatial column:
    • A single spatial column
    • The spatial column must contain a single entity type
    • If the table is empty but has a spatial column, you must still supply the Shape Field, Geometry Type, and Coordinate System parameters during registration.

 

SamQuon
Occasional Contributor

I ran into an issue with connecting to a geodatabase in MS SQL after restoring the database from a backup. The restore process created several orphaned users. I was able to identify them using the following query in SQL Server Management Studio.

-- Orphaned users: database principals whose SID has no matching server login
SELECT dp.type_desc, dp.SID, dp.name AS user_name 
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp 
     ON dp.SID = sp.SID
WHERE sp.SID IS NULL
     AND dp.authentication_type_desc = 'INSTANCE'; 

This generated the list of users based on missing SQL Server authenticated logins.

Next, I needed to map each orphaned user to its existing login with the following query in SQL Server Management Studio.

ALTER USER <user_name> WITH Login = <login_name>;

The source of the solution came from: https://msdn.microsoft.com/en-us/library/ms175475.aspx

TimMinter
Frequent Contributor


Changing the order of fields in an existing geodatabase feature class or table has been challenging since the early days of ArcGIS (ArcMap and ArcCatalog). Without bothering to dig up all of the historical references, blog posts, issue reports, bug reports, great idea postings, workarounds, previous methods and tools, etc., I point at one simple solution to this 16-year-old problem (it's old enough to drive in most US states).

  1. Purchase and install Sparx Enterprise Architect (EA) version 12.x or higher, Professional edition or higher, floating or single use license.
  2. ArcMap:  export your disorderly geodatabase as a Workspace XML
  3. EA:
    • Tools > Options > Objects > uncheck "Sort Features Alphabetically" > Close
    • File > New Project > name it > Model Wizard > click "Geospatial" > check the "ArcGIS Workspace" box > OK
    • Extensions > ArcGIS > Import ArcGIS Workspace XML
    • double-click the "Workspace" object in the project browser to open a graphic view of your geodatabase schema
    • find the disorderly table or feature class graphic
    • left-click any of the fields in the table or feature class graphic to select it
    • right-click the selected field > View Properties...
    • right-click the field that you want to be in a different order and choose "Move up" or "Move down" as pleases you
    • Close the field properties viewer
    • Extensions > ArcGIS > Export ArcGIS Workspace XML
  4. ArcMap:
    • import your orderly schema from your Workspace XML into your geodatabase
    • load your data into your new schema
    • breathe, sigh, rejoice...  move along

That's it.  That's all I got.
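If a Sparx EA license is not in the cards, the same idea (reorder the field elements inside the Workspace XML, then re-import it) can be scripted. Below is a deliberately simplified Python sketch; a real Esri Workspace XML nests fields under namespaced DataElement/Fields elements, so treat the element names here as hypothetical placeholders you would adjust:

```python
import xml.etree.ElementTree as ET

def reorder_fields(xml_text, ordered_names):
    """Reorder <Field> children of each <FieldArray> to follow ordered_names.

    Simplified, hypothetical element names: a real Workspace XML uses
    namespaced, deeper paths, and required fields like OBJECTID should
    stay first in the list you pass in.
    """
    root = ET.fromstring(xml_text)
    for array in root.iter('FieldArray'):
        by_name = {f.findtext('Name'): f for f in array.findall('Field')}
        for field in list(array):          # detach existing children
            array.remove(field)
        for name in ordered_names:         # re-attach in the desired order
            array.append(by_name[name])
    return ET.tostring(root, encoding='unicode')
```

The payoff is the same as the EA route: an orderly schema XML you can import back into the geodatabase and then load your data into.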

Happy Friday,

tim

curtvprice
MVP Alum

The Rules

  1. Avoid spaces or any special characters in pathnames. Including spaces or other special characters in folder names may work, but it is best to avoid them (see 7, below).
  2. Avoid long pathnames. Try to "flatten" your folder structures so that your system pathnames are under 128 characters. It is likely you can go higher, but not that much: the absolute Windows pathname limit (MAX_PATH) is 260 characters. (This actually makes sense if 256 is the real ancestral limit: 1+2+256+1, or [drive][:\][path][null] = 260.)
  3. Start with a letter. Never begin an ArcGIS object name (table, field, relationship class, mxd file, folder, etc.) with a number or any other non-alpha character. 
  4. No reserved words. Avoid names that may conflict with SQL or other languages' reserved words, for example: OBJECTID, VALUE, COUNT, NOT, OR, ON, IN, OVER, SELECT. This is to avoid confusing both people and SQL.
  5. Be brief. All field names should contain 10 characters or fewer. This is a dBase limit, so it applies to .dbf files AND shapefiles. For that reason it's best practice in case you ever need to go "through" a dbf-based format. Coverage and grid names have a limit of 13 characters, but 10 is safer. If you really want a longer name, or one with special characters, use the alias.
  6. Raster names can be particularly problematic. Raster dataset names MUST start with a letter, it is best to keep the filename under 14 characters to support the Esri grid format, and special characters in the path (space, &, -) may cause raster exports and tools to fail with cryptic error messages (like 999999).
  7. Use code in scripting to check names. The arcpy methods ValidateFieldName and ValidateTableName can be used at parameter validation time, or in your code, to prevent users of your tool from inserting invalid output names. The CreateScratchName method, if provided the proper arguments, will generate valid dataset names.
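As a quick illustration of rules 1, 3, 4, and 5 (this is not arcpy's own validation; ValidateFieldName and ValidateTableName remain the authoritative tools), a naive checker might look like:

```python
import re

# A handful of the reserved words from rule 4; extend as needed.
RESERVED = {'OBJECTID', 'VALUE', 'COUNT', 'NOT', 'OR', 'ON',
            'IN', 'OVER', 'SELECT'}

def name_ok(name, max_len=10):
    """Naive check of rules 1, 3, 4, and 5: no spaces or special
    characters, starts with a letter, not a reserved word, and at
    most max_len characters (10 for dBase/shapefile field names)."""
    return (len(name) <= max_len
            and name.upper() not in RESERVED
            and re.fullmatch(r'[A-Za-z][A-Za-z0-9_]*', name) is not None)
```

A check like this is handy in tool validation code, where rejecting a bad name early beats a cryptic failure downstream.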

You have been warned. Violating these suggestions may work, but one thing I have learned in my long career is that one should avoid tempting fate.

References

ThomasColson
MVP Alum

Another Monday....another bleary-eyed white-knuckle drive to work, both hands wrapped around a coffee mug...which I forgot to fill....phone buzzes....crap I'm late again...nope...it's work: "The geodatabase isn't working. Did you know that?".

If only I had known. I could have, at my leisure, not on a day when I have 234 conference calls and 2156 meetings, fired up the VPN from home for a quick server reboot.

There are a few "service" health checks out there that test a GIS REST service and can fire off an email when something is amiss, but that doesn't tell me whether the problem is in the database or the application server, and all it does is tell me that the service is alive.

I'm probably the ONLY GIS SERVER ADMINISTRATOR IN THE ENTIRE WORLD that experiences this, but I often find that Windows Update or a weekend party in the data center leaves some wonky residue in SQL, and "something" ends up "hung up" somewhere, so the maps work, GIS server is running, but can't edit anything. I only hear about it when I least have time to deal with it.


The following, run as a SQL Agent Job, will perform a test edit against a feature class (versioned or not), delete it if everything is working, or send an email if it's not. You can run it hourly (as I do) and modify accordingly to only get one email (what if you don't care it's not working?). Note how the emails appear to come from Captain Kirk, thus increasing chances of being paid attention to. I'm sure there are more elegant ways of doing this, but I like to keep things low-tech.

BEGIN TRY
DECLARE @id as integer
EXEC dbo.next_rowid 'dbo', 'TRAILSIGN', @id OUTPUT;
SET QUOTED_IDENTIFIER ON 
INSERT INTO [dbo].[TRAILSIGN]
(OBJECTID,
LOC_NAME,
shape)
VALUES(@id, 'ZZZ_HEALTH_CHECK_TEST_DELETE', geography::STPointFromText('POINT(-83.499752 35.687597)', 4269))
DELETE FROM TRAILSIGN WHERE LOC_NAME = 'ZZZ_HEALTH_CHECK_TEST_DELETE'
END TRY
BEGIN CATCH
    DECLARE @ErrorMessage NVARCHAR(4000);
    DECLARE @ErrorSeverity INT;
    DECLARE @ErrorState INT;
    SELECT 
        @ErrorMessage = ERROR_MESSAGE(),
        @ErrorSeverity = ERROR_SEVERITY(),
        @ErrorState = ERROR_STATE();
    DECLARE @ErrorText NVARCHAR(4000);
  DECLARE @ErrorDisclaimer NVARCHAR(4000);
  SET @ErrorText = 'Dammit Jim! The Engines can''t take this! Something''s wrong with the IandM Veg Database! ' 
  SET @ErrorDisclaimer = 'The IandM Veg Database is having some sort of problem. Editing is disabled and read-access may not be working.  ' 
  SET @ErrorMessage = @ErrorText + @ErrorMessage + @ErrorDisclaimer
EXEC msdb.dbo.sp_send_dbmail 
    @profile_name='Captain Kirk',
    @recipients='sasquatch@bigfoot.com',
    @subject='Error in the IandM Veg Database',
    @body = @ErrorMessage;
END CATCH;

Combined with the python ArcGIS health checks, you can get a pretty robust error reporting matrix. Warp Speed!

curtvprice
MVP Alum

This function maps the field types reported by Describe or Field to the keyword used by the Add Field tool.

import arcpy

def AddFieldType(tbl, field_name):
    """Determines the keyword needed to create a field with the AddField
    tool of the same type as the specified field.

    Integer -> "LONG", Single -> "FLOAT", etc.

    Example:

    AddFieldType("test.dbf", "Shape_Leng")
    "DOUBLE"
    arcpy.AddField_management("test.dbf", "Shape_Len2",
                              AddFieldType("test.dbf", "Shape_Leng"))
    """
    dtypes = {'OID': 'LONG', 'SmallInteger': 'SHORT', 'Integer': 'LONG',
              'Double': 'DOUBLE', 'Single': 'FLOAT', 'String': 'TEXT',
              'Date': 'DATE'}
    fld_list = arcpy.Describe(tbl).Fields
    ftype = [f.type for f in fld_list
             if f.name.upper() == field_name.upper()]
    if not ftype:
        raise Exception("Field {} not found".format(field_name))
    ftype = ftype[0]
    try:
        return dtypes[ftype]
    except KeyError:
        msg = "{}: field type {} not supported".format(field_name, ftype)
        raise Exception(msg)

ThomasColson
MVP Alum

In response to https://community.esri.com/migration-blogpost/54807

which adds some bling and zing to your SDE database, I've gotten a few comments about doing the same type of tuning with the Geography storage type. Easy cheesy: just note that this alteration simply replaces geometry with geography and removes the bounding box from the index creation. Happy tuning!

--ORIGINAL SQL CODE FROM  
--http://stackoverflow.com/users/2250424/greengeo  
--MODIFIED TO INCLUDE SPATIAL INDEX HELP OUTPUT  
--IN TABLE  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --LOOK FOR THE EXCLAMATION POINTS IN THIS CODE  
  --THERE ARE SEVERAL USER-SUPPLIED VARIABLES THAT MUST  
  --BE INPUT FOR THIS SP TO WORK  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  
  
USE   [GPS_COLLAR]
GO  
  
  
  
ALTER PROCEDURE dbo.sp_tune_spatial_index  
(  
  @tabnm VARCHAR(MAX), -- This parameter stores the name of the spatial table for which you are tuning the index  
  @idxnm VARCHAR(MAX), -- This parameter stores the name of the spatial index of the named table  
  @min_cells_per_obj INT, -- Minimum Cells Per Object to test on. Suggested to start at 2.  
  @max_cells_per_obj INT, -- Maximum Cells Per Object to test on.  
  
  /* The test requires two geography instances to use in test query 1 and 2.  
  The first one should cover the area of default extent. The second should  
  cover an area roughly the size of the area shown when zoomed in, panning  
  around. It is required that the variable store a string that will create  
  the geography instance since this will be done within the procedure and   
  cannot be a variable of type: GEOGRAPHY. The SRID of these instances must  
  match that of the table you are testing. */  
  @testgeom1 VARCHAR(MAX), -- This parameter stores the first geography instance creation string that will be used in the test  
  @testgeom2 VARCHAR(MAX) -- This parameter stores the second geography instance creation string that will be used in the test  
  
  
  
)  
  
AS  
  
SET NOCOUNT ON;  
  
/* Prior to running this procedure, two tables are required. These tables are   
  created here to prepare for running the procedure. */  
  
PRINT 'Checking for required tables...'  
IF EXISTS(SELECT 1 FROM sysobjects WHERE name IN ('cell_opt_perm', 'spat_idx_test_result'))  
  BEGIN  
  PRINT '... The "cell_opt_perm" and "spat_idx_test_result" tables exist.'  
  END  
ELSE  
BEGIN  
  PRINT '... Creating "cell_opt_perm" and "spat_idx_test_result" tables.'  
  CREATE TABLE cell_opt_perm(  
  [perm_id] [smallint] NOT NULL,  
  [permutation] [nvarchar](4) NOT NULL,  
  [level1] [nvarchar](6) NOT NULL,  
  [level2] [nvarchar](6) NOT NULL,  
  [level3] [nvarchar](6) NOT NULL,  
  [level4] [nvarchar](6) NOT NULL  
  )  
  
  INSERT INTO cell_opt_perm ([perm_id], [permutation], [level1], [level2], [level3], [level4])  
  VALUES (1,'LLLL','LOW','LOW','LOW','LOW'),  
  (2,'LLLM','LOW','LOW','LOW','MEDIUM'),  
  (3,'LLLH','LOW','LOW','LOW','HIGH'),  
  (4,'LLML','LOW','LOW','MEDIUM','LOW'),  
  (5,'LLMM','LOW','LOW','MEDIUM','MEDIUM'),  
  (6,'LLMH','LOW','LOW','MEDIUM','HIGH'),  
  (7,'LLHL','LOW','LOW','HIGH','LOW'),  
  (8,'LLHM','LOW','LOW','HIGH','MEDIUM'),  
  (9,'LLHH','LOW','LOW','HIGH','HIGH'),  
  (10,'LMLL','LOW','MEDIUM','LOW','LOW'),  
  (11,'LMLM','LOW','MEDIUM','LOW','MEDIUM'),  
  (12,'LMLH','LOW','MEDIUM','LOW','HIGH'),  
  (13,'LMML','LOW','MEDIUM','MEDIUM','LOW'),  
  (14,'LMMM','LOW','MEDIUM','MEDIUM','MEDIUM'),  
  (15,'LMMH','LOW','MEDIUM','MEDIUM','HIGH'),  
  (16,'LMHL','LOW','MEDIUM','HIGH','LOW'),  
  (17,'LMHM','LOW','MEDIUM','HIGH','MEDIUM'),  
  (18,'LMHH','LOW','MEDIUM','HIGH','HIGH'),  
  (19,'LHLL','LOW','HIGH','LOW','LOW'),  
  (20,'LHLM','LOW','HIGH','LOW','MEDIUM'),  
  (21,'LHLH','LOW','HIGH','LOW','HIGH'),  
  (22,'LHML','LOW','HIGH','MEDIUM','LOW'),  
  (23,'LHMM','LOW','HIGH','MEDIUM','MEDIUM'),  
  (24,'LHMH','LOW','HIGH','MEDIUM','HIGH'),  
  (25,'LHHL','LOW','HIGH','HIGH','LOW'),  
  (26,'LHHM','LOW','HIGH','HIGH','MEDIUM'),  
  (27,'LHHH','LOW','HIGH','HIGH','HIGH'),  
  (28,'MLLL','MEDIUM','LOW','LOW','LOW'),  
  (29,'MLLM','MEDIUM','LOW','LOW','MEDIUM'),  
  (30,'MLLH','MEDIUM','LOW','LOW','HIGH'),  
  (31,'MLML','MEDIUM','LOW','MEDIUM','LOW'),  
  (32,'MLMM','MEDIUM','LOW','MEDIUM','MEDIUM'),  
  (33,'MLMH','MEDIUM','LOW','MEDIUM','HIGH'),  
  (34,'MLHL','MEDIUM','LOW','HIGH','LOW'),  
  (35,'MLHM','MEDIUM','LOW','HIGH','MEDIUM'),  
  (36,'MLHH','MEDIUM','LOW','HIGH','HIGH'),  
  (37,'MMLL','MEDIUM','MEDIUM','LOW','LOW'),  
  (38,'MMLM','MEDIUM','MEDIUM','LOW','MEDIUM'),  
  (39,'MMLH','MEDIUM','MEDIUM','LOW','HIGH'),  
  (40,'MMML','MEDIUM','MEDIUM','MEDIUM','LOW'),  
  (41,'MMMM','MEDIUM','MEDIUM','MEDIUM','MEDIUM'),  
  (42,'MMMH','MEDIUM','MEDIUM','MEDIUM','HIGH'),  
  (43,'MMHL','MEDIUM','MEDIUM','HIGH','LOW'),  
  (44,'MMHM','MEDIUM','MEDIUM','HIGH','MEDIUM'),  
  (45,'MMHH','MEDIUM','MEDIUM','HIGH','HIGH'),  
  (46,'MHLL','MEDIUM','HIGH','LOW','LOW'),  
  (47,'MHLM','MEDIUM','HIGH','LOW','MEDIUM'),  
  (48,'MHLH','MEDIUM','HIGH','LOW','HIGH'),  
  (49,'MHML','MEDIUM','HIGH','MEDIUM','LOW'),  
  (50,'MHMM','MEDIUM','HIGH','MEDIUM','MEDIUM'),  
  (51,'MHMH','MEDIUM','HIGH','MEDIUM','HIGH'),  
  (52,'MHHL','MEDIUM','HIGH','HIGH','LOW'),  
  (53,'MHHM','MEDIUM','HIGH','HIGH','MEDIUM'),  
  (54,'MHHH','MEDIUM','HIGH','HIGH','HIGH'),  
  (55,'HLLL','HIGH','LOW','LOW','LOW'),  
  (56,'HLLM','HIGH','LOW','LOW','MEDIUM'),  
  (57,'HLLH','HIGH','LOW','LOW','HIGH'),  
  (58,'HLML','HIGH','LOW','MEDIUM','LOW'),  
  (59,'HLMM','HIGH','LOW','MEDIUM','MEDIUM'),  
  (60,'HLMH','HIGH','LOW','MEDIUM','HIGH'),  
  (61,'HLHL','HIGH','LOW','HIGH','LOW'),  
  (62,'HLHM','HIGH','LOW','HIGH','MEDIUM'),  
  (63,'HLHH','HIGH','LOW','HIGH','HIGH'),  
  (64,'HMLL','HIGH','MEDIUM','LOW','LOW'),  
  (65,'HMLM','HIGH','MEDIUM','LOW','MEDIUM'),  
  (66,'HMLH','HIGH','MEDIUM','LOW','HIGH'),  
  (67,'HMML','HIGH','MEDIUM','MEDIUM','LOW'),  
  (68,'HMMM','HIGH','MEDIUM','MEDIUM','MEDIUM'),  
  (69,'HMMH','HIGH','MEDIUM','MEDIUM','HIGH'),  
  (70,'HMHL','HIGH','MEDIUM','HIGH','LOW'),  
  (71,'HMHM','HIGH','MEDIUM','HIGH','MEDIUM'),  
  (72,'HMHH','HIGH','MEDIUM','HIGH','HIGH'),  
  (73,'HHLL','HIGH','HIGH','LOW','LOW'),  
  (74,'HHLM','HIGH','HIGH','LOW','MEDIUM'),  
  (75,'HHLH','HIGH','HIGH','LOW','HIGH'),  
  (76,'HHML','HIGH','HIGH','MEDIUM','LOW'),  
  (77,'HHMM','HIGH','HIGH','MEDIUM','MEDIUM'),  
  (78,'HHMH','HIGH','HIGH','MEDIUM','HIGH'),  
  (79,'HHHL','HIGH','HIGH','HIGH','LOW'),  
  (80,'HHHM','HIGH','HIGH','HIGH','MEDIUM'),  
  (81,'HHHH','HIGH','HIGH','HIGH','HIGH')  
  
  CREATE TABLE spat_idx_test_result(  
  [perm_id] [int] NOT NULL,  
  [num_cells] [int] NOT NULL,  
  [permut] [nvarchar](4) NOT NULL,  
  [g1t1] [bigint] NULL,  
  [g1t2] [bigint] NULL,  
  [g1t3] [bigint] NULL,  
  [g1t4] [bigint] NULL,  
  [g2t1] [bigint] NULL,  
  [g2t2] [bigint] NULL,  
  [g2t3] [bigint] NULL,  
  [g2t4] [bigint] NULL,  
  [PF_EFF][float] NULL,  
  [IF_EFF][float] NULL,  
  [GRIDL1] [int] NULL,  
  [GRIDL2] [int] NULL,  
  [GRIDL3] [int] NULL,  
  [GRIDL4] [int] NULL,  
  [TPIR] [bigint] NULL,  
  [TPIP] [bigint] NULL,  
  [ANOIRPBR] [bigint] NULL,  
  [TNOOCILFQ] [bigint] NULL,  
  [TNOOCIL3FQ] [bigint] NULL,  
  [TNOOCIL4FQ] [bigint] NULL,  
  [TNOOCIL0II] [bigint] NULL,  
  [TNOOCIL4II] [bigint] NULL,  
  [TNOIOIL3FQ] [bigint] NULL,  
  [TNOIOIL4FQ] [bigint] NULL,  
  [ITTCNTLGP] [float] NULL,  
  [INTTTCNTLGP] [float] NULL,  
  [BTTCNTLGP] [float] NULL,  
  [ACPONTLGP] [float] NULL,  
  [AOPG] [float] NULL,  
  [NORSBPF] [bigint] NULL,  
  [NORSBIF] [bigint] NULL,  
  [NOTSFIC] [bigint] NULL,  
  [NORO] [bigint] NULL,  
  [PORNBPF] [float] NULL,  
  [POPFRSBIF] [float] NULL  
  
  
  )  
  
  INSERT INTO dbo.spat_idx_test_result  
  VALUES (0,16,0,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL)  
END  
  
  
/* delete all rows from "spat_idx_test_result" table. This makes it ready to stuff in new results.  
  !!!WARNING!!! if your test was interrupted, the table will be cleared out and the test will  
  begin from the beginning. You could try to modify this to start where you left off but  
  I didn't have time and this worked well enough for me. */  
DELETE FROM spat_idx_test_result  
  WHERE perm_id != 0  
  
/* set up counters */  
DECLARE @a1 INT  
DECLARE @a2 INT  
DECLARE @a3 INT  
DECLARE @a4 INT  
  
/* set up variables to hold high/medium/low values and permutation to use in rebuilding  
   the spatial index and recording stats */  
DECLARE @lev1 VARCHAR(6)  
DECLARE @lev2 VARCHAR(6)  
DECLARE @lev3 VARCHAR(6)  
DECLARE @lev4 VARCHAR(6)  
DECLARE @permut VARCHAR(6)  
DECLARE @num_cell VARCHAR(4)  
DECLARE @time_str VARCHAR(20)  
DECLARE @perm_id VARCHAR(20)  
DECLARE @x xml  
DECLARE @pf_eff FLOAT  
  
  
  
/* create variables to hold timestamps for beginning and ending of test queries */  
DECLARE @start_t DATETIME  
DECLARE @end_t DATETIME  
DECLARE @elapse_t INT  
  
  
  
/* begin looping through cell option permutations */  
SET @a1 = @min_cells_per_obj  
WHILE @a1 <= @max_cells_per_obj  
  BEGIN  
  SET @a2 = 1  
  PRINT 'Started Testing for ' +CAST(@a1 AS VARCHAR(10)) +' cells per object'  
  WHILE @a2 < 82  
  BEGIN  
  SELECT @lev1 = level1, @lev2 = level2, @lev3 = level3, @lev4 = level4 FROM cell_opt_perm WHERE perm_id = @a2  
  SET @permut = '''' +(SELECT permutation FROM cell_opt_perm WHERE perm_id = @a2) +''''  
  
  
  EXEC  
  ('  
  CREATE SPATIAL INDEX ' +@idxnm +' ON ' +@tabnm +'   
  (  
  [SHAPE]  
  )  
  USING  GEOGRAPHY_GRID   
  WITH  
  (  
  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU EDIT THE BOUNDING BOX TO BE EXACTLY EQUAL TO THE BOUNDING  
  --BOX OF YOUR SPATIAL TABLE   
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  
  GRIDS =(LEVEL_1 = ' +@lev1 +' ,LEVEL_2 = ' +@lev2 +' ,LEVEL_3 = ' +@lev3 +' ,LEVEL_4 = ' +@lev4 +' ),   
  CELLS_PER_OBJECT = ' +@a1 +' ,  
  PAD_INDEX  = OFF,  
  SORT_IN_TEMPDB = OFF,  
  DROP_EXISTING = ON,  
  ALLOW_ROW_LOCKS  = ON,  
  ALLOW_PAGE_LOCKS  = ON,  
  FILLFACTOR = 100  
  )  
  ON [PRIMARY]  
  '  
  )  
  
  
  PRINT 'Re-built index to ' +@permut  
  SET @a3 = 1  
  SET @a4 = 1  
  WHILE @a3 < 5  
  BEGIN  
  SET @start_t = GETDATE()  
  EXEC  
  (  
  'CREATE TABLE #tmp_tab (shp GEOGRAPHY)  
  DECLARE @g1 GEOGRAPHY  
  SET @g1 = ' +@testgeom1 +'  
  INSERT #tmp_tab (shp)  
  SELECT  
  r.Shape AS shp  
  FROM  
  ' +@tabnm +' r  
  WHERE  
  r.SHAPE.STIntersects(@g1) = 1  
  DROP TABLE #tmp_tab'  
  )  
  SET @end_t = GETDATE()  
  SET @elapse_t = (SELECT DATEDIFF(MS, @start_t, @end_t))  
  SET @num_cell = CAST(@a1 AS VARCHAR(6))  
  SET @time_str = CAST(@elapse_t AS VARCHAR(20))  
  IF @a3 = 1  
  BEGIN  
  IF (SELECT TOP 1 perm_id FROM spat_idx_test_result) IS NULL  
  BEGIN  
  SET @perm_id = 1  
  
  
  END  
  ELSE  
  BEGIN  
  SET @perm_id = CAST((SELECT MAX(perm_id+1) FROM spat_idx_test_result) AS VARCHAR(20))  
  END  
  
  
  EXEC  
  
  
  (  
  '  
  
  
  INSERT INTO spat_idx_test_result (perm_id, num_cells, permut, g1t' +@a3 +')  
  VALUES (' +@perm_id +', ' +@num_cell +', ' +@permut +', ' +@time_str +')'  
  )  
  END  
  ELSE  
  
  
  EXEC  
  (  
  '  
  
  UPDATE spat_idx_test_result  
  SET  
  num_cells = ' +@num_cell +',  
  permut = ' +@permut +',  
  g1t' +@a3 +' = ' +@time_str +'  
  WHERE perm_id = ' +@perm_id   
  
  
  )  
  SET @a3 = @a3 + 1  
  END  
  WHILE @a4 < 5  
  BEGIN  
  
  SET @start_t = GETDATE()  
  EXEC  
  (  
  'CREATE TABLE #tmp_tab (shp GEOGRAPHY)   
  DECLARE @g2 GEOGRAPHY  
  SET @g2 = ' +@testgeom2 +'  
  INSERT #tmp_tab (shp)  
  SELECT  
  r.Shape AS shp  
  FROM  
  ' +@tabnm +' r  
  WHERE  
  r.SHAPE.STIntersects(@g2) = 1  
  DROP TABLE #tmp_tab'  
  )  
  SET @end_t = GETDATE()  
  SET @elapse_t = (SELECT DATEDIFF(MS, @start_t, @end_t))  
  SET @num_cell = CAST(@a1 AS VARCHAR(6))  
  SET @time_str = CAST(@elapse_t AS VARCHAR(20))  
  EXEC  
  (  
  '  
  DECLARE @geom geography  
  DECLARE @x xml  
  DECLARE @PFVALUE float  
  DECLARE @IFVALUE float  
  DECLARE @GRIDL1VALUE int  
  DECLARE @GRIDL2VALUE int  
  DECLARE @GRIDL3VALUE int  
  DECLARE @GRIDL4VALUE int  
  DECLARE @TPIRVALUE bigint  
  DECLARE @TPIPVALUE bigint  
  DECLARE @ANOIRPBRVALUE bigint  
  DECLARE @TNOOCILFQVALUE bigint  
  DECLARE @TNOOCIL0IIVALUE bigint  
  DECLARE @TNOOCIL4IIVALUE bigint  
  DECLARE @TNOOCIL3FQVALUE bigint  
  DECLARE @TNOOCIL4FQVALUE bigint  
  DECLARE @TNOIOIL3FQVALUE bigint  
  DECLARE @TNOIOIL4FQVALUE bigint  
  DECLARE @ITTCNTLGPVALUE float  
  DECLARE @INTTTCNTLGPVALUE float  
  DECLARE @BTTCNTLGPVALUE float  
  DECLARE @ACPONTLGPVALUE float  
  DECLARE @AOPGVALUE float  
  DECLARE @NORSBPFVALUE bigint  
  DECLARE @NORSBIFVALUE bigint  
  DECLARE @NOTSFICVALUE bigint  
  DECLARE @NOROVALUE bigint  
  DECLARE @PORNBPFVALUE float  
  DECLARE @POPFRSBIFVALUE float  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU EDIT GEOGRAPHY VARIABLE BELOW TO REPRESENT A POLYGON  
  --THAT IS WITHIN YOUR BOUNDING BOX  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  SET @geom = geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  --MAKE SURE YOU SPECIFY THE NAME OF YOUR SPATIAL TABLE  
  --AND THE NAME OF THE SPATIAL INDEX  
  -- IN THE sp_help_spatial_geography_index_xml VARIABLES  
  --!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!  
  exec sp_help_spatial_geography_index_xml GRSM_TRAILS , GRSM_TRAILS_IDX , 1, @geom, @x output  
  SET @PFVALUE =  @x.value(''(/Primary_Filter_Efficiency/text())[1]'', ''float'')  
  SET @IFVALUE =  @x.value(''(/Internal_Filter_Efficiency/text())[1]'', ''float'')  
  SET @GRIDL1VALUE =  @x.value(''(/Grid_Size_Level_1/text())[1]'', ''int'')  
  SET @GRIDL2VALUE =  @x.value(''(/Grid_Size_Level_2/text())[1]'', ''int'')  
  SET @GRIDL3VALUE =  @x.value(''(/Grid_Size_Level_3/text())[1]'', ''int'')  
  SET @GRIDL4VALUE =  @x.value(''(/Grid_Size_Level_4/text())[1]'', ''int'')  
  SET @TPIRVALUE =  @x.value(''(/Total_Primary_Index_Rows/text())[1]'', ''bigint'')  
  SET @TPIPVALUE =  @x.value(''(/Total_Primary_Index_Pages/text())[1]'', ''bigint'')  
  SET @ANOIRPBRVALUE =  @x.value(''(/Average_Number_Of_Index_Rows_Per_Base_Row/text())[1]'', ''bigint'')  
  SET @TNOOCILFQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level0_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOOCIL0IIVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level0_In_Index/text())[1]'', ''bigint'')  
  SET @TNOOCIL4IIVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level4_In_Index/text())[1]'', ''bigint'')  
  SET @TNOOCIL3FQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level3_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOOCIL4FQVALUE =  @x.value(''(/Total_Number_Of_ObjectCells_In_Level4_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOIOIL3FQVALUE =  @x.value(''(/Total_Number_Of_Interior_ObjectCells_In_Level3_For_QuerySample/text())[1]'', ''bigint'')  
  SET @TNOIOIL4FQVALUE =  @x.value(''(/Total_Number_Of_Interior_ObjectCells_In_Level4_For_QuerySample/text())[1]'', ''bigint'')  
  SET @ITTCNTLGPVALUE =  @x.value(''(/Interior_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @INTTTCNTLGPVALUE =  @x.value(''(/Intersecting_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @BTTCNTLGPVALUE =  @x.value(''(/Border_To_Total_Cells_Normalized_To_Leaf_Grid_Percentage/text())[1]'', ''float'')  
  SET @ACPONTLGPVALUE =  @x.value(''(/Average_Cells_Per_Object_Normalized_To_Leaf_Grid/text())[1]'', ''float'')  
  SET @AOPGVALUE =  @x.value(''(/Average_Objects_PerLeaf_GridCell/text())[1]'', ''float'')  
  SET @NORSBPFVALUE =  @x.value(''(/Number_Of_Rows_Selected_By_Primary_Filter/text())[1]'', ''bigint'')  
  SET @NORSBIFVALUE =  @x.value(''(/Number_Of_Rows_Selected_By_Internal_Filter/text())[1]'', ''bigint'')  
  SET @NOTSFICVALUE =  @x.value(''(/Number_Of_Times_Secondary_Filter_Is_Called/text())[1]'', ''bigint'')  
  SET @NOROVALUE =  @x.value(''(/Number_Of_Rows_Output/text())[1]'', ''bigint'')  
  SET @PORNBPFVALUE =  @x.value(''(/Percentage_Of_Rows_NotSelected_By_Primary_Filter/text())[1]'', ''float'')  
  SET @POPFRSBIFVALUE =  @x.value(''(/Percentage_Of_Primary_Filter_Rows_Selected_By_Internal_Filter/text())[1]'', ''float'')  
  
  
  UPDATE spat_idx_test_result  
  SET  
  num_cells = ' +@num_cell +',  
  permut = ' +@permut +',  
  g2t' +@a4 +' = ' +@time_str +',  
  PF_EFF = @PFVALUE,  
  IF_EFF = @IFVALUE,  
  GRIDL1 = @GRIDL1VALUE,  
  GRIDL2 = @GRIDL2VALUE,  
  GRIDL3 = @GRIDL3VALUE,  
  GRIDL4 = @GRIDL4VALUE,  
  TPIR = @TPIRVALUE,  
  TPIP = @TPIPVALUE,  
  ANOIRPBR = @ANOIRPBRVALUE,  
  TNOOCILFQ = @TNOOCILFQVALUE,  
  TNOOCIL0II = @TNOOCIL0IIVALUE,  
  TNOOCIL4II = @TNOOCIL4IIVALUE,  
  TNOOCIL3FQ =  @TNOOCIL3FQVALUE,  
  TNOOCIL4FQ =  @TNOOCIL4FQVALUE,  
  TNOIOIL3FQ = @TNOIOIL3FQVALUE,  
  TNOIOIL4FQ = @TNOIOIL4FQVALUE,  
  ITTCNTLGP = @ITTCNTLGPVALUE,  
  INTTTCNTLGP = @INTTTCNTLGPVALUE,  
  BTTCNTLGP = @BTTCNTLGPVALUE,  
  ACPONTLGP = @ACPONTLGPVALUE,  
  AOPG = @AOPGVALUE,  
  NORSBPF = @NORSBPFVALUE,  
  NORSBIF = @NORSBIFVALUE,  
  NOTSFIC = @NOTSFICVALUE,  
  NORO = @NOROVALUE,  
  PORNBPF = @PORNBPFVALUE,  
  POPFRSBIF = @POPFRSBIFVALUE  
  WHERE perm_id = ' +@perm_id  
  )  
  SET @a4 = @a4 + 1  
  END  
  SET @a2 = @a2 + 1  
  END  
  SET @a1 = @a1 + 1  
  
  END  
PRINT 'Testing of ' +@tabnm +' spatial index: ' +@idxnm +' is complete!'  
GO 

And execute like so:

DECLARE @BOUNDING VARCHAR(MAX)   
SET @BOUNDING = 'geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)'  
  
DECLARE @QUERY VARCHAR(MAX)   
SET @QUERY = 'geography::STPolyFromText(''POLYGON ((-83.883490 35.440092, -83.896426 35.715280, -83.121432 35.787017, -83.124960 35.510653, -83.883490 35.440092))'', 4269)'    
  
EXEC sp_tune_spatial_index 'GRSM_TRAILS', 'GRSM_TRAILS_IDX', 4096, 4096, @BOUNDING, @QUERY   
GO
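For reference, the 81 LOW/MEDIUM/HIGH rows that the procedure hard-codes into cell_opt_perm are just every permutation of three settings across four grid levels. If you ever want to regenerate or extend that list, a short Python sketch produces it in the same order as the INSERT above:

```python
from itertools import product

# Each of the four grid levels is LOW, MEDIUM, or HIGH: 3 ** 4 = 81
# permutations, with the last level varying fastest, matching the
# ordering of the cell_opt_perm INSERT (LLLL, LLLM, LLLH, LLML, ...).
LEVELS = [('L', 'LOW'), ('M', 'MEDIUM'), ('H', 'HIGH')]

perms = [(i + 1,
          ''.join(code for code, _ in combo),
          [word for _, word in combo])
         for i, combo in enumerate(product(LEVELS, repeat=4))]
```

Generating the rows this way (and emitting VALUES clauses from them) beats hand-typing 81 lines if you ever adapt the procedure.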
