POST
Gotcha. Are you using versioning in your environment, or are all edits being made directly to DEFAULT in both replicas?
Posted 09-06-2017 11:35 AM

POST
Did you get this to work? I was thinking of doing a similar workflow. Also, it looks like editing in Collector should work just fine as long as the basemap used in Collector is in the same coordinate system as your feature service?
Posted 09-06-2017 10:49 AM

POST
Hi Larry! You will need to install the pyodbc Python module for the portion of the error handling that writes to a SQL log table. pyodbc basically allows you to access your ODBC database through Python. If you don't care about this portion of the script, you can remove the import statement for pyodbc and all other referenced code. Home · mkleehammer/pyodbc Wiki · GitHub

Are you wondering how to build the logging table in SQL Server, or something else?
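Since pyodbc comes up a few times in this thread, here is a minimal sketch of the logging pattern the script uses. The table and column names follow the PYTHON_SCRIPT_TRACKER table shared elsewhere in this thread; the helper only builds a parameterized INSERT, and the actual pyodbc calls are shown commented out since they need a live SQL Server.

```python
# Sketch of the error-logging pattern (names are illustrative).
# The SQL text and parameters are kept separate so the statement stays
# parameterized -- pyodbc substitutes each ? placeholder safely.

LOG_TABLE = "[dbo].[PYTHON_SCRIPT_TRACKER]"  # assumed table name

def build_log_insert(script_name, status, error_code):
    """Return (sql, params) suitable for cursor.execute(sql, params)."""
    sql = ("INSERT INTO " + LOG_TABLE +
           " (ScriptName, RunDate, Status, ErrorCode)"
           " VALUES (?, GETDATE(), ?, ?)")
    return sql, (script_name, status, error_code)

# With pyodbc installed, usage would look roughly like:
# import pyodbc
# conn = pyodbc.connect('Trusted_Connection=yes', driver='{SQL Server}',
#                       server='gis', database='CityworksGIS')
# sql, params = build_log_insert('MyScript', 'Failure', 'ERROR 000732 ...')
# conn.cursor().execute(sql, params)
# conn.commit()
```

Keeping the statement parameterized (rather than string-formatting the error text into the SQL) matters here because geoprocessing error messages can contain quotes.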
Posted 06-27-2017 01:46 PM

POST
I'm on SQL Server 2008. I would think this query should work on later versions too...

USE [YOUR DB NAME HERE]
GO
/****** Object: Table [dbo].[PYTHON_SCRIPT_TRACKER] Script Date: 06/05/2017 13:46:51 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[PYTHON_SCRIPT_TRACKER](
[ScriptName] [varchar](255) NULL,
[RunDate] [date] NULL,
[Status] [varchar](255) NULL,
[ErrorCode] [varchar](255) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
Posted 06-05-2017 01:54 PM

POST
Yep, you have it exactly right; I should have mentioned that. You don't have to unregister, delete, and re-create. I do it to get a clean compress back to state 0, and because we were having some weird versioning issues at one time and ESRI support recommended that we implement this full process to eliminate any issues.
Posted 06-05-2017 01:31 PM

POST
Rebecca Strauch, GISP & Larry Adgate: There are a few variables at the beginning that you will need to tailor to your environment:
editDB = (your database connection)
sqlsvrname = (name of your SQL Server instance)
sqldbname = (name of your SQL Server db within your instance)
There are several other functions and cursors that will need to have my parameters replaced with your own. It should be pretty clear where those are, though.

__author__ = 'wilson'
# //////////////////////////////////////////////////////////////////////////////////////////////////////
# Script Title: Rec_Post_GEONET
# Created by: Cort Wilson, dcortwilson@gmail.com
# Created Date: 6/1/17
# Description: Blocks connections, disconnects users, reconcile/post versions, delete versions, unregisters tables and feature classes as versioned,
# compresses database, registers tables and features classes as versioned, creates versions,
# rebuild indexes, analyze datasets, set the database to allow connections.
#
# pyodbc is used to log captured errors or success message into a sql table.
#
#
# ***VERSION TREE DIAGRAM***
#
# DEFAULT
# |
# |
# QAQC
# / | \
# / | \
# / | \
# MICHAEL COLLECTOR KAREN
#
#
#
#
# Requirements
# pyodbc for SQL connection and cursor to SQL log table
#
#//////////////////////////////////////////////////////////////////////////////////////////////////////
import arcpy
import sys
import os
import time
import pyodbc
import traceback
# \\\\Constant Vars\\\\
# Database Connection
editDB ="Database Connections/DBO@CityworksGIS.sde"
# Current Day
Day = time.strftime("%m-%d-%Y", time.localtime())
# Current Time
Time = time.strftime("%I:%M:%S %p", time.localtime())
# \\\\pyodbc.connect vars\\\\
sqlsvrname = 'gis'
sqldbname = 'CityworksGIS'
# Set workspace
workspace = editDB
# Set the workspace environment
arcpy.env.workspace = workspace
arcpy.env.overwriteOutput = True
try:
    # Start Time
    print 'Process Started at ' + str(Day) + " " + str(Time)
    # Block new connections to the working and prod database.
    print "Blocking Connections..."
    arcpy.AcceptConnections(editDB, False)
    # Disconnect all users from the working and prod database.
    print "Disconnecting Users..."
    arcpy.DisconnectUser(editDB, "ALL")
    # Get a list of all versions besides QAQC and DEFAULT to pass into the ReconcileVersions tool.
    ver1List = [ver1.name for ver1 in arcpy.da.ListVersions(editDB) if ver1.name != 'DBO.QAQC' and ver1.name != 'dbo.DEFAULT']
    # Execute the ReconcileVersions tool with QAQC as the target version, then delete all versions besides QAQC.
    print "Reconcile/post versions to QAQC...."
    arcpy.ReconcileVersions_management(editDB, "ALL_VERSIONS", "DBO.QAQC", ver1List, "LOCK_ACQUIRED", "NO_ABORT", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "DELETE_VERSION")
    # Extract the QAQC version from the list of versions to pass to the ReconcileVersions tool.
    ver2List = [ver2.name for ver2 in arcpy.da.ListVersions(editDB) if ver2.name == 'DBO.QAQC']
    # Execute the ReconcileVersions tool with DEFAULT as the target version and delete the QAQC version.
    print "Reconcile/post QAQC to DEFAULT..."
    arcpy.ReconcileVersions_management(editDB, "ALL_VERSIONS", "dbo.DEFAULT", ver2List, "LOCK_ACQUIRED", "NO_ABORT", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "DELETE_VERSION")
    print("==============================")
    print("Unregister Tables Started")
    print("==============================")
    # **** wildcards for tables I don't want to include ****
    for tbs in [tb for tb in arcpy.ListTables() if '_log' not in tb and 'GenerateId' not in tb and 'PYTHON_SCRIPT' not in tb]:
        print tbs
        arcpy.UnregisterAsVersioned_management(tbs, "NO_KEEP_EDIT", "COMPRESS_DEFAULT")
    # If you want to include all tables in the database use this instead:
    # for tbs in arcpy.ListTables():
    #     print tbs
    #     arcpy.UnregisterAsVersioned_management(tbs, "NO_KEEP_EDIT", "COMPRESS_DEFAULT")
    print("==============================")
    print("Unregister Feature Classes Started")
    print("==============================")
    datasets = arcpy.ListDatasets("*", "Feature")
    for dataset in datasets:
        fcList = arcpy.ListFeatureClasses("*", "", dataset)
        for fc in fcList:
            print fc
            arcpy.UnregisterAsVersioned_management(fc, "NO_KEEP_EDIT", "COMPRESS_DEFAULT")
    # Run the compress tool.
    print "Compressing database..."
    arcpy.Compress_management(editDB)
# \\\\ Script initiation, Rec/Post process error handling \\\\
except:
    print 'An error occurred'
    failMsg = '\nSCRIPT FAILURE IN SCRIPT INITIATION OR RECONCILE-POST PROCESS\n'
    failMsg += 'Most recent GP messages below.\n'
    failMsg += arcpy.GetMessages() + '\n'
    failMsg += '\nTraceback messages below.\n'
    failMsg += traceback.format_exc().splitlines()[-1]
    print failMsg
    # Write error log info.
    print 'Logging error to database....'
    # THIS SECTION FOR LOGGING ERRORS TO A DB TABLE
    conn = pyodbc.connect('Trusted_Connection=yes', driver='{SQL Server}', server=sqlsvrname, database=sqldbname)
    cursor = conn.cursor()
    # You will need to create a table in SQL to log to, add the fields you want to log, and pass them as params to this cursor.
    cursor.execute("""INSERT INTO [dbo].[PYTHON_SCRIPT_TRACKER] (ScriptName,RunDate,Status,ErrorCode) VALUES ('ReconcilePost_AssetIDs_QAQC',GetDate(),'Failure',(?));""", failMsg)
    conn.commit()
    conn.close()
    sys.exit()
# ////////////////////////////////////// REGISTER AS VERSIONED AND RE-CREATE VERSIONS /////////////////////////////////
try:
    print("==============================")
    print("Register Tables Started")
    print("==============================")
    # **** wildcards for tables I don't want to version ****
    for tbs in [tb for tb in arcpy.ListTables() if '_log' not in tb and 'GenerateId' not in tb and 'PYTHON_SCRIPT' not in tb]:
        print tbs
        arcpy.RegisterAsVersioned_management(tbs, "NO_EDITS_TO_BASE")
    # If you want to include all tables in the database use this:
    # for tbs in arcpy.ListTables():
    #     print tbs
    #     arcpy.RegisterAsVersioned_management(tbs, "NO_EDITS_TO_BASE")
    print("==============================")
    print("Register Feature Classes Started")
    print("==============================")
    datasets = arcpy.ListDatasets("*", "Feature")
    for dataset in datasets:
        fcList = arcpy.ListFeatureClasses("*", "", dataset)
        for fc in fcList:
            print fc
            arcpy.RegisterAsVersioned_management(fc, "NO_EDITS_TO_BASE")
    # Create versions.
    print "Creating Versions..."
    arcpy.CreateVersion_management(editDB, 'dbo.DEFAULT', 'QAQC', 'PRIVATE')
    arcpy.CreateVersion_management(editDB, 'DBO.QAQC', 'MICHAEL', 'PUBLIC')
    arcpy.CreateVersion_management(editDB, 'DBO.QAQC', 'KAREN', 'PUBLIC')
    arcpy.CreateVersion_management(editDB, 'DBO.QAQC', 'COLLECTOR', 'PUBLIC')
    # /////////////////////////////////// ANALYZE DATASETS AND CALC STATISTICS /////////////////////////////////////
    # NOTE: Rebuild Indexes can accept a Python list of datasets.
    # Get a list of all the datasets the user has access to.
    # First, get all the stand-alone tables, feature classes and rasters.
    dataList = arcpy.ListTables() + arcpy.ListFeatureClasses() + arcpy.ListRasters()
    # Next, for feature datasets get all of the datasets and feature classes
    # from the list and add them to the master list.
    for dataset in arcpy.ListDatasets("*", "Feature"):
        arcpy.env.workspace = os.path.join(workspace, dataset)
        dataList += arcpy.ListFeatureClasses() + arcpy.ListDatasets()
    # Reset the workspace.
    arcpy.env.workspace = workspace
    # Concatenate all datasets into a list.
    datasetList = [ds for ds in dataList]
    print "rebuilding indexes"
    # Execute rebuild indexes.
    # Note: to use the "SYSTEM" option the workspace user must be an administrator.
    arcpy.RebuildIndexes_management(workspace, "SYSTEM", datasetList, "ALL")
    print('Rebuild Complete')
    print "analyzing datasets"
    arcpy.AnalyzeDatasets_management(workspace, "SYSTEM", datasetList, "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
    print "analysis complete"
    # Allow the database to begin accepting connections again.
    print "Set databases to allow connections..."
    arcpy.AcceptConnections(editDB, True)
    # \\\\ If the script is successful, log into the database \\\\
    print 'Data has been synced successfully and logged in the tracking database'
    conn = pyodbc.connect('Trusted_Connection=yes', driver='{SQL Server}', server=sqlsvrname, database=sqldbname)
    cursor = conn.cursor()
    # You will need to create a table in SQL to log to, add the fields you want to log, and pass them as params to this cursor.
    cursor.execute("""INSERT INTO [dbo].[PYTHON_SCRIPT_TRACKER] (ScriptName,RunDate,Status,ErrorCode) VALUES ('ReconcilePost_AssetIDs_QAQC',GetDate(),'Success','All Steps Completed Without Error');""")
    conn.commit()
    del cursor
    conn.close()
    print 'Script is finished'
# ///////////////////////////////// Register as Versioned error handling area ////////////////////////////////////////////
except:
    print 'An error occurred'
    failMsg = '\nSCRIPT FAILURE IN VERSIONING PROCESS\n'
    failMsg += 'Most recent GP messages below.\n'
    failMsg += arcpy.GetMessages() + '\n'
    failMsg += '\nTraceback messages below.\n'
    failMsg += traceback.format_exc().splitlines()[-1]
    print failMsg
    # Write error log info.
    print 'Logging error to database....'
    # THIS SECTION FOR LOGGING ERRORS TO A DB TABLE
    conn = pyodbc.connect('Trusted_Connection=yes', driver='{SQL Server}', server=sqlsvrname, database=sqldbname)
    cursor = conn.cursor()
    # You will need to create a table in SQL to log to, add the fields you want to log, and pass them as params to this cursor.
    cursor.execute("""INSERT INTO [dbo].[PYTHON_SCRIPT_TRACKER] (ScriptName,RunDate,Status,ErrorCode) VALUES ('ReconcilePost_AssetIDs_QAQC',GetDate(),'Failure',(?));""", failMsg)
    conn.commit()
    conn.close()
    sys.exit()

Feel free to let me know if you have any questions or find room for improvement! My email is under the author portion of the script if you need it.
Posted 06-02-2017 01:55 PM

POST
LarryAdgate & rastrauch: What's the best way to share this script? I can't seem to figure out how to attach it to this post. I could email it if that works?
Posted 06-02-2017 10:50 AM

POST
Kim, did you find a solution for this? I need to do something similar with our water distribution system. Were you using the Snap geoprocessing tool?
Posted 06-01-2017 02:40 PM

POST
LarryAdgate & rastrauch Sure! Give me a bit to strip out any sensitive information from the script and add in some more descriptive comments. If you don't hear from me within a few days give me a shout.
Posted 05-31-2017 03:21 PM

POST
A little late to your post, but figured I'd put in my $0.02. Hopefully you've found a solution that works for you!

We are a small-ish water utility: 27,000 service connections on over 600 miles of water main. I am using Python to automate the reconcile/post process. It works well for us because only 3 of us are editing at one time and there are rarely any conflicts. I perform QAQC edits in the DEFAULT version and favor all conflicts toward that version, and that's good enough for me.

If the automated conflict resolution is a concern, you can set a parameter in the script to abort if any conflicts are found, review the conflicts, then re-run the script. You might want to research the implications of aborting the process when a conflict is found. One problem, of course, is that you would usually run the script after hours or on the weekend, which possibly leaves you stuck reviewing conflicts after hours.

I built error handling throughout the script and have it write the specific error (or a success message) to a SQL Server logging table. A SQL Server trigger on the logging table, or a Python function that sends an email when an error is raised, would take it to the next level; I have yet to do that.

In general, I have found automating the process to be a big help. Many errors can be introduced when doing this process manually, and hard-coding all the steps in the right order has saved me a lot of headaches.
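The abort-on-conflicts option mentioned above maps to the ReconcileVersions tool's abort-if-conflicts and with-delete keywords. A hedged sketch follows: the keyword strings are the tool's documented options, but the helper function, the version names, and the commented-out call are illustrative, not the exact script.

```python
# Sketch: pick the ReconcileVersions conflict-handling keywords based on
# whether conflicts should abort the run for manual review.
# (Hypothetical helper -- the keyword strings are the tool's options.)

def reconcile_args(abort_on_conflicts):
    """Return (abort_keyword, delete_keyword) for ReconcileVersions."""
    if abort_on_conflicts:
        # Stop the reconcile when a conflict is detected and keep the
        # edit version so the conflicts can be reviewed interactively.
        return ("ABORT_CONFLICTS", "KEEP_VERSION")
    # Otherwise resolve automatically in favor of the target version
    # and delete the edit version once it has been posted.
    return ("NO_ABORT", "DELETE_VERSION")

abort_opt, delete_opt = reconcile_args(abort_on_conflicts=True)
# The call would then look like (arcpy required; names illustrative):
# arcpy.ReconcileVersions_management(
#     editDB, "ALL_VERSIONS", "DBO.QAQC", version_list,
#     "LOCK_ACQUIRED", abort_opt, "BY_OBJECT",
#     "FAVOR_TARGET_VERSION", "POST", delete_opt)
```

Flipping a single flag like this makes it easy to run the same script in "attended" mode after a conflict is reported and in fully automatic mode overnight.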
Posted 05-31-2017 01:39 PM

POST
I want to design a bulletproof method to deal with schema changes in a two-way replication environment. The setup is pretty simple: a parent replica where most of the editing and any schema changes will take place, replicated to a child where attribute editing will take place but no schema changes will be made.

The GP tools offered by ESRI for comparing/importing schema XML have limitations, and I want the synchronization process to be automated and to handle ANY and ALL schema changes made in the parent database. In order to do this I am thinking the workflow would be:

1. Synchronize replicas
2. Unregister replicas
3. Delete contents from the child database
4. Run the Create Replica tool to re-create the replicas and copy the replicated datasets to the child database

Am I correct to think this would handle all schema changes? I get that it would be expensive and add a good amount of run time to an automated script, but I don't want to deal with any schema errors. I want the process to be automated, and arcpy doesn't have a tool to unregister replicas, so I would have to use ArcObjects within a Python script to unregister them. Anyone have any experience/success in doing that? Thanks!
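The steps above can be sketched as an ordered checklist. SynchronizeChanges and Create Replica are real arcpy tools; the unregister step has no arcpy tool (as the post notes), so it is only marked as an ArcObjects placeholder here, and all paths/names in the comments are hypothetical.

```python
# Hedged sketch of the full-rebuild workflow for handling any schema change
# in a two-way replica. Order matters: sync first so no edits are lost,
# rebuild last so the child gets the parent's current schema.

REBUILD_STEPS = [
    "synchronize",    # arcpy.SynchronizeChanges_management(parent_sde, replica_name, child_sde, "BOTH_DIRECTIONS", ...)
    "unregister",     # no arcpy tool; ArcObjects IWorkspaceReplicasAdmin.UnregisterReplica (or similar) is needed
    "delete_child",   # drop the replicated datasets from the child geodatabase
    "create_replica", # arcpy.CreateReplica_management(datasets, "TWO_WAY_REPLICA", child_sde, replica_name, ...)
]
```

Because the child is rebuilt from the parent's current state, every schema change is carried over by construction, at the cost of copying all replicated data on every run.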
Posted 05-18-2017 01:38 PM

POST
What do you do if the permission changes won't take effect? I tried multiple times to make the changes and they wouldn't take. I called ESRI support and they said to reboot; only after the reboot did the changes take effect.
Posted 08-10-2016 03:58 PM

POST
Step 1: Grant read/write
Using SQL Server Management Studio, expand the Security tab, then the Logins tab. Open the properties for the AGS server account login (by default the username is "arcgis", and it could be a domain or local user), then click User Mapping. In the top pane, select the db in question; in the bottom pane, check the boxes next to the datareader and datawriter roles, then click OK.

Step 2: Remove the AGS account from the sysadmin server role
Expand the Security tab, then the Server Roles tab. Open the properties for the sysadmin role, select your AGS account user, and click Remove.

Step 3: Reboot the server. I had our IT department do this for me.
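For what it's worth, the GUI clicks above have rough T-SQL equivalents that could be scripted. A hedged sketch that only builds the statements: the "arcgis" login and database name are example assumptions, and sp_addrolemember/sp_dropsrvrolemember are the SQL Server 2008-era procedures (superseded by ALTER ROLE / ALTER SERVER ROLE in later versions).

```python
# Hypothetical helper: build T-SQL matching steps 1 and 2 above.
# Login and database names are examples only -- substitute your own.

def grant_statements(login, database):
    """Return T-SQL granting read/write and dropping sysadmin membership."""
    return [
        "USE [%s];" % database,
        # Step 1: add the AGS login to the data reader/writer roles.
        "EXEC sp_addrolemember 'db_datareader', '%s';" % login,
        "EXEC sp_addrolemember 'db_datawriter', '%s';" % login,
        # Step 2: remove the login from the sysadmin server role.
        "EXEC sp_dropsrvrolemember '%s', 'sysadmin';" % login,
    ]

stmts = grant_statements("arcgis", "CityworksGIS")
# With pyodbc these could then be run one at a time:
# for s in stmts:
#     cursor.execute(s)
```

Scripting it makes the change repeatable across the multiple databases the AGS account is mapped to, rather than clicking through each one.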
Posted 08-10-2016 12:03 PM

POST
Thank you. I mapped the ArcGIS Server account to all of its associated databases with read/write privileges, then removed it from the sysadmin server role, and then had to reboot the server for the changes to take effect.
Posted 08-10-2016 10:25 AM