Offline map registration timeouts fixed by a Python script that retires old service replica IDs

02-01-2021 05:45 PM
ReedHunter
Symptoms
Users of our offline, sync-enabled mobile geodatabases were reporting timeouts when trying to register their offline files.  
 
We have around a thousand ArcGIS Runtime SDK for iOS apps supporting disconnected editing.  Nightly automation pre-generates and zips an offline .geodatabase file with the latest state of the geodatabase, and our users download this file after initial deployment or on the rare occasion of recovering from some error state.  After the zipped file is downloaded and expanded, the client app registers it with the feature service for sync support.  This "preplanned workflow" creates a replica in the geodatabase.  Unfortunately, having multiple people download and reuse the same file had a side effect: if anyone unregistered the replica generated from that file, anyone else who had also downloaded that file could no longer sync.  For this reason we could never unregister offline files on the fly, and had to accept an accumulation of replica IDs over time. 
 
Over time we created new map services and retired old ones on top of the same geodatabase.  Unknown to us, the time it took to register an offline file gradually lengthened as the replica count mounted.  Slow networks also cause these registration timeouts, so we were slow to realize that our users were hitting them ever more often on fast networks as well.  Even on fast networks, several registration attempts were often needed before one succeeded. 
 
Diagnosis
It turns out we had too many old replicas in our enterprise geodatabase, slowing down the registration of new replicas enough that we experienced blocking timeouts.  
 
Solution
Once the replica count was reduced from around 13,000 to around 4,000, the timeouts almost completely stopped.  I worked out the SQL to find old replica IDs in the enterprise geodatabase's GDB_ITEMS table, and then learned how to tell which belonged to old, retired feature services versus current production services.  The SQL extracts replica IDs only from retired feature services, and those IDs are then fed to the ArcGIS Pro Unregister Replica command.  Out of caution over possible performance and functional side effects, the script includes a limit on how many replicas to process before exiting.
 
Software Used
  • The Python script was run on a Windows computer with ArcGIS Pro 2.6.2 installed.
  • 64-bit Python 3.6.10 was used for development and execution of the script. 
  • The geodatabase was on SQL Server.
 
 
The SQL:
 
Note that all queries against the GDB_ITEMS table are read-only.  All updates to replicas go through Esri's application tier.  Be aware that unregistering a replica removes it from the GDB_ITEMS table.  
 
  • To find the count of all replicas in GDB_ITEMS, regardless of the map service of origin, this SQL was used:
 
SELECT COUNT(*) 
FROM [TheDatabaseName].[dbo].[GDB_ITEMS] 
WHERE [Type] = '5B966567-FB87-4DDE-938B-B4B37423539D' 
 
  • To acquire the count for a specific map service, include this clause:
AND [DatasetInfo1] = 'http://thegisserverhostname/arcgis/rest/services/theMapService/FeatureServer'
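Put together, the count for a single feature service looks like this (same placeholder database and service names as above):

SELECT COUNT(*) 
FROM [TheDatabaseName].[dbo].[GDB_ITEMS] 
WHERE [Type] = '5B966567-FB87-4DDE-938B-B4B37423539D' 
AND [DatasetInfo1] = 'http://thegisserverhostname/arcgis/rest/services/theMapService/FeatureServer'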
 
Why query for replica counts?  Getting the replica count down is what solves the problem, and because the script below unregisters only a limited number of replicas per batch, the counts tell you how much work remains between runs.  
 

If you need to confirm for yourself what the GUID '5B966567-FB87-4DDE-938B-B4B37423539D' means and where it comes from, you will find it in the GDB_ITEMTYPES table.  Filtering on this GUID lets your queries ignore all the other geodatabase item types, such as feature classes.
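For example, a lookup along these lines (column names as they appear in our SQL Server geodatabase; verify them against your own instance) should show the GUID resolving to the Replica item type:

SELECT [UUID], [Name] 
FROM [TheDatabaseName].[dbo].[GDB_ITEMTYPES] 
WHERE [UUID] = '5B966567-FB87-4DDE-938B-B4B37423539D'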

  • For a list of the feature services for which replicas have been created, run the following SQL against the enterprise geodatabase.  Be careful to pick only feature services from these results that are no longer in use; otherwise you will prevent someone from being able to check in their edits or to pull down the recent edits of others through a sync.  
SELECT DISTINCT [DatasetInfo1] from [TheDatabaseName].[dbo].[GDB_ITEMS]
WHERE [Type] = '5B966567-FB87-4DDE-938B-B4B37423539D' 
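Once a retired service url has been picked from those results, the same kind of query can pull the replica names to unregister.  As a rough illustration of that step (and not the actual script), a pyodbc query along the following lines returns a capped batch of replica names for a single retired service url.  The connection string, the assumption that the [Name] column holds the replica name the unregister tool accepts, and the TOP limit are sketch choices; verify them against your own geodatabase before relying on them.

# Minimal sketch, not the production script: pull up to max_count replica
# names registered against one retired feature service url.
import pyodbc

REPLICA_TYPE_GUID = '5B966567-FB87-4DDE-938B-B4B37423539D'

def get_replica_names(connection_string, service_url, max_count):
    """Return up to max_count replica item names tied to service_url."""
    sql = ("SELECT TOP (?) [Name] "
           "FROM [dbo].[GDB_ITEMS] "
           "WHERE [Type] = ? AND [DatasetInfo1] = ?")
    with pyodbc.connect(connection_string) as conn:
        cursor = conn.cursor()
        cursor.execute(sql, max_count, REPLICA_TYPE_GUID, service_url)
        return [row.Name for row in cursor.fetchall()]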
 
 
Setting up the script to run
 
Update the ini file:
 
  • Once the url and the count to unregister are acquired, update the serviceUrl and replicaCount properties of UnregisterGdbReplicas.ini.  
  • For ease of switching between testing and production, the ini file is separated into QA and Prod settings.
  • A QA and/or Production .sde connection file is required to establish access to the geodatabase.  A sample ini layout is sketched below.   
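A sample layout might look something like the following.  The serviceUrl and replicaCount property names are the ones the script actually reads; the section names and the sdeConnection property are just illustrative guesses at how the QA vs Prod separation could be laid out.

[QA]
serviceUrl = http://thegisserverhostname/arcgis/rest/services/theMapService/FeatureServer
replicaCount = 100
sdeConnection = C:\connections\QA_TheDatabase.sde

[Prod]
serviceUrl = http://thegisserverhostname/arcgis/rest/services/theMapService/FeatureServer
replicaCount = 100
sdeConnection = C:\connections\Prod_TheDatabase.sde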
 
Setting up Python
 
  • To make this script work, you need a Python environment from ArcGIS Pro 2.6.2 or later so that the Unregister Replica command is available.  Open ArcGIS Pro and go to "Settings" -> "Python" to clone the default Python environment, if you haven't already, so you have write access to the environment.  After that you can add the configparser, sqlalchemy, and pyodbc libraries.  
  • You will also likely need to add the path to that cloned Python environment's folder to the user's PATH environment variable.
  • You can either run the script in the debugger inside Visual Studio Code or from a command prompt.  To run it in Visual Studio Code, first set the interpreter path to the cloned Python environment's python.exe.  
  • When executing from the command prompt, it worked better to change directory to the location of the script first, and then run:
<path to python env>\python.exe .\UnregisterGdbReplicas.py
 
To confirm that the script is running successfully, it can help to set a breakpoint before and 
after the unregister replica call, and then run SQL against the GDB_ITEMS table at each breakpoint to check 
the row count for that replica's url.  It was also useful to have code at the end of main that outputs the list of replica IDs that were processed.
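For that last point, something as small as this at the end of main does the job (processed_ids is a hypothetical name for whatever list the script accumulates while unregistering):

# Hypothetical end-of-main summary; 'processed_ids' stands in for whatever
# list of unregistered replica names/IDs the script builds up.
def report_processed(processed_ids):
    print(f"Processed {len(processed_ids)} replicas:")
    for replica_id in processed_ids:
        print(f"  {replica_id}")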
 
 
Some Disclaimers
 
The code below is provided as-is for reference purposes.  If you use it, expect to rewrite 
at least part of it for your environment and business needs.  Even in our own environment, data could be lost, 
or sync access for field crews could be lost, if this script were run against the wrong urls.
  
This code is being shared partly for future reference, in case this task needs to be performed again in a year or two, 
and partly as a proof of concept for anyone else with a similar problem.  Notice that these instructions read more like reminder notes on how to ramp up quickly the next time this is needed.  If a bug is found and a help request is posted about this script, it is unlikely to even be noticed until the map registration timeout problem resurfaces and the script has to be dusted off again.  
 
The Code:
 
Pasting code into this UI resulted in all Python indentation being lost, so I instead posted it to GitHub at
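For readers who just want the shape of it, here is a rough sketch of the flow the script follows, pieced together from the description above rather than copied from the real script.  The ini property names beyond serviceUrl and replicaCount, the SQL, and the helper structure are all illustrative assumptions; the arcpy call is the standard Unregister Replica geoprocessing tool.

# Rough sketch of the overall flow only -- NOT the script posted to GitHub.
# Assumes an ini with serviceUrl / replicaCount properties plus illustrative
# sdeConnection / odbcConnection properties, an .sde connection file for
# arcpy, and a pyodbc connection string for the read-only GDB_ITEMS query.
import configparser
import arcpy
import pyodbc

REPLICA_TYPE_GUID = '5B966567-FB87-4DDE-938B-B4B37423539D'

def main():
    config = configparser.ConfigParser()
    config.read('UnregisterGdbReplicas.ini')
    section = config['QA']                    # or 'Prod'; section names are illustrative
    service_url = section['serviceUrl']       # retired feature service only!
    max_count = int(section['replicaCount'])  # safety cap per run
    sde_file = section['sdeConnection']       # illustrative property name
    odbc_conn = section['odbcConnection']     # illustrative property name

    # Read-only query against GDB_ITEMS for replicas tied to the retired service.
    sql = ("SELECT TOP (?) [Name] FROM [dbo].[GDB_ITEMS] "
           "WHERE [Type] = ? AND [DatasetInfo1] = ?")
    with pyodbc.connect(odbc_conn) as conn:
        cursor = conn.cursor()
        cursor.execute(sql, max_count, REPLICA_TYPE_GUID, service_url)
        replica_names = [row.Name for row in cursor.fetchall()]

    # Unregister through Esri's application tier, one replica at a time.
    processed_ids = []
    for name in replica_names:
        arcpy.management.UnregisterReplica(sde_file, name)
        processed_ids.append(name)

    print(f"Unregistered {len(processed_ids)} replicas from {service_url}")

if __name__ == '__main__':
    main()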
 
 