POST
Ooh! I apologize for deleting the post here. I thought the moderator did not want it posted here, so I deleted it. I can publish it again, along with how I got the issue resolved, if anyone is interested. Please let me know if I am allowed to post it here.
09-29-2017 06:47 AM

POST
I created a Python script to reconcile all edits on the users' versions, post the reconciled versions to a QA/QC version, and then post QA/QC to the sde (dbo) DEFAULT version. After this, the script compresses the sde database and analyzes and rebuilds the indexes on its tables. The script is scheduled in Windows Task Scheduler to run silently at 2:00 AM on Thursday and Saturday mornings, when users are offline. If any error is encountered, the script should print the errors and also generate a log file containing them, stored with a date and time stamp in a specified drive location.

Because the script runs behind the scenes under Task Scheduler, it was difficult for me to see errors when they happened. This was a big issue because users' edits stopped being reflected in our DEFAULT database after a while due to an error encountered by the script; this is what necessitated a functional log file that would let me identify the errors. The initial problem I had was that the script executed well and printed the errors on the screen, but it generated an empty log file. This was corrected by adding three lines at the end of the script to write the printed errors into the log file and save it with a time and date stamp.

The initial script - executes but generates an empty log file:

# This script will reconcile and post all edits made in child versions of the Natural Resources geodatabase to QA/QC and then to the SDE DEFAULT version; it will then compress the database and analyze and rebuild the indexes on the database tables.
# Script developed by Irene F. Egbulefu - GIS Analyst ( TNR Dept)
import arcpy
import sys
import os
import time
import datetime
import traceback

# Database connection
editDB = "Database Connections\\Frogmouth_Natural_ResourcesTC.sde"

# Current day and time
Day = time.strftime("%m-%d-%Y", time.localtime())
Time = time.strftime("%I:%M:%S %p", time.localtime())

# Set the workspace environment
workspace = editDB
arcpy.env.workspace = workspace
arcpy.env.overwriteOutput = True

try:
    # Start time
    print 'Process Started at ' + str(Day) + " " + str(Time)

    # Block new connections to the geodatabase on the Frogmouth server.
    print "Blocking Connections..."
    arcpy.AcceptConnections(editDB, False)

    # Disconnect all users from the geodatabase.
    print "Disconnecting Users..."
    arcpy.DisconnectUser(editDB, "ALL")

    # Get a list of all child versions besides QA/QC and DEFAULT to pass into the ReconcileVersions tool.
    ver1List = [ver1.name for ver1 in arcpy.da.ListVersions(editDB)
                if ver1.name != 'TC_USER.QA/QC' and ver1.name != 'sde.DEFAULT']

    # Reconcile/post all child versions to the QA/QC target version; do not delete the child versions.
    print "Reconcile/post versions to QAQC...."
    arcpy.ReconcileVersions_management(editDB, "ALL_VERSIONS", "TC_USER.QA/QC", ver1List,
                                       "LOCK_ACQUIRED", "ABORT_CONFLICTS", "BY_OBJECT",
                                       "FAVOR_TARGET_VERSION", "POST", "KEEP_VERSION")

    # Extract the QA/QC version from the list of versions to pass to the ReconcileVersions tool.
    ver2List = [ver2.name for ver2 in arcpy.da.ListVersions(editDB)
                if ver2.name == 'TC_USER.QA/QC']

    # Reconcile/post QA/QC to the DEFAULT target version; do not delete the QA/QC version.
    print "Reconcile/post QAQC to DEFAULT..."
    arcpy.ReconcileVersions_management(editDB, "ALL_VERSIONS", "sde.DEFAULT", ver2List,
                                       "LOCK_ACQUIRED", "ABORT_CONFLICTS", "BY_OBJECT",
                                       "FAVOR_TARGET_VERSION", "POST", "KEEP_VERSION")

    # Run the Compress tool.
    print "Compressing database..."
    arcpy.Compress_management(editDB)

    # ////////////////// ANALYZE DATASETS AND CALC STATISTICS //////////////////
    # NOTE: Rebuild Indexes can accept a Python list of datasets.
    # First, get all the stand-alone tables, feature classes and rasters the user has access to.
    dataList = arcpy.ListTables() + arcpy.ListFeatureClasses() + arcpy.ListRasters()

    # Next, walk the feature datasets and add their contents to the master list.
    for dataset in arcpy.ListDatasets("*", "Feature"):
        arcpy.env.workspace = os.path.join(workspace, dataset)
        dataList += arcpy.ListFeatureClasses() + arcpy.ListDatasets()

    # Reset the workspace.
    arcpy.env.workspace = workspace
    datasetList = [ds for ds in dataList]

    # Rebuild the indexes. Note: the "SYSTEM" option requires the workspace user to be an administrator.
    print "rebuilding indexes"
    arcpy.RebuildIndexes_management(workspace, "NO_SYSTEM", datasetList, "ALL")
    print "Rebuild Complete"

    print "analyzing datasets"
    arcpy.AnalyzeDatasets_management(workspace, "NO_SYSTEM", datasetList,
                                     "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
    print "analysis complete"

    # Allow the database to begin accepting connections again.
    print "Set databases to allow connections..."
    arcpy.AcceptConnections(editDB, True)

# \\\\\\\\\\\\ script initiation, reconcile/post error handling \\\\\\\\\\\\
except:
    print 'An error occurred'
    failMsg = '\nSCRIPT FAILURE IN SCRIPT INITIATION OR RECONCILE-POST PROCESS,\n'
    failMsg += 'Most recent GP messages below.\n'
    failMsg += arcpy.GetMessages() + '\n'
    failMsg += '\nTraceback messages below.\n'
    failMsg += traceback.format_exc().splitlines()[-1]
    print failMsg

    # \\\\\\\\\\\\ write error log info \\\\\\\\\\\\
    # Move to the log directory.
    os.chdir(u'Y:\\TOOLS\\Logs\\')

    # Build a timestamped log file name.
    def timeStamped(filename, fmt='%m-%d-%y-%H.%M.%S-{filename}'):
        return datetime.datetime.now().strftime(fmt).format(filename=filename)

    filename = timeStamped('Natural_ResourcesTC_toQC_Log.txt')

    # Create the log file -- note that nothing is ever written to it here.
    open(filename, "w")
The final corrected script - executes, generates the log file, prints the errors on screen, and also writes the error information into the log file. Its body is identical to the initial script above; the only change is at the end of the except block, where the log file is actually written:

    # Create the log file, write the captured error messages into it, and close it.
    f = open(filename, "w")
    f.write(failMsg)
    f.close()
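As a side note, the same error log can be produced with Python's standard logging module, which handles flushing and closing the file and can capture the traceback in one call. A minimal sketch, not part of the original script; the directory and simulated failure below are placeholders (a temp folder stands in for the real Y:\TOOLS\Logs path):

```python
import datetime
import logging
import os
import tempfile

# Hypothetical log directory -- substitute the real network/drive path.
log_dir = tempfile.gettempdir()

# Timestamped file name, in the same spirit as the timeStamped() helper above.
stamp = datetime.datetime.now().strftime('%m-%d-%y-%H.%M.%S')
log_path = os.path.join(log_dir, stamp + '-Natural_ResourcesTC_toQC_Log.txt')

# A dedicated logger with a file handler.
logger = logging.getLogger('reconcile_post')
handler = logging.FileHandler(log_path)
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

try:
    # Stand-in for the arcpy reconcile/post/compress calls.
    raise RuntimeError('simulated reconcile failure')
except Exception:
    # exception() records the message plus the full traceback in one call.
    logger.exception('SCRIPT FAILURE IN RECONCILE-POST PROCESS')

handler.close()
```

With this approach there is no way to end up with an empty log file on failure, because the write happens inside the same call that reports the error.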
09-28-2017 01:31 PM

POST
Hi All,

I have some datasets: Parks, Park Facilities, and Park Trails. These have a many-to-many relationship; i.e., a park can have multiple facilities, and there may be multiple park trails in a single park. I would like to publish this data to ArcGIS Server and issue queries on it, so that I can get a single result for a single park. Currently, when mapping say a park, I get all the other parks and facilities as well as the park lines, which is confusing and a lot of information to digest when the result is drawn by a text search.

Does anyone have an idea how I can create a spatial view that will generate a unique ObjectID field for repeated geometries, to enable me to publish a park layer with uniquely identified facilities and park lines? I am using Microsoft SQL Server 2012 and ArcGIS 10.4, and I would prefer not to tamper with my data storage type.

I have tried creating the view from the SQL Server side, but on bringing it across to ArcCatalog or ArcMap I keep getting the ObjectID (OID) error and some other errors, as shown in the screenshot attached here. Also, at some point the view displays in ArcMap as just one feature (either the polygon data or the line data), but when I try to view the attribute table it gives me another error (screenshot 2).
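Whatever shape the spatial view ends up taking, the "single result for a single park" part can also be handled on the client side: collapse the repeated park/facility/trail rows of the many-to-many join into one record per park. A pure-Python sketch with hypothetical field values (the real rows would come from the service query or a cursor):

```python
from collections import OrderedDict

# Hypothetical flattened rows, as a join of parks, facilities and trails would return them.
rows = [
    ('Park A', 'Restroom', 'Trail 1'),
    ('Park A', 'Restroom', 'Trail 2'),
    ('Park A', 'Playground', 'Trail 1'),
    ('Park B', 'Boat Ramp', 'Trail 9'),
]

# Collapse to one record per park, with de-duplicated facility and trail sets.
parks = OrderedDict()
for park, facility, trail in rows:
    rec = parks.setdefault(park, {'facilities': set(), 'trails': set()})
    rec['facilities'].add(facility)
    rec['trails'].add(trail)
```

After this loop, a text search for one park name looks up a single entry instead of wading through every joined row.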
08-07-2017 07:25 AM

POST
Thank you very much, George! Your advice did work. I rebuilt the geodatabase indexes and then ran my Python script, which reconciles and posts versions to QA/QC and then to the SDE DEFAULT version, after which it compresses, rebuilds and analyzes the datasets. Everything worked out perfectly. See the result below:

Python 2.7.10 (default, May 23 2015, 09:40:32) [MSC v.1500 32 bit (Intel)] on win32
Type "copyright", "credits" or "license()" for more information.
>>> ================================ RESTART ================================
>>>
Process Started at 07-11-2017 01:18:25 PM
Reconcile/post versions to QAQC....
Reconcile/post QAQC to DEFAULT...
Compressing database...
rebuilding indexes
Rebuild Complete
analyzing datasets
analysis complete
Set databases to allow connections...
>>>
07-11-2017 01:20 PM

POST
Sure, I will do that once my users are off the server and see if it would help! Thanks and I will get back to you on the outcome
07-11-2017 09:34 AM

POST
I am getting this error when trying to compress one of my geodatabases. I have about 5 different geodatabase schemas, all in the same Microsoft SQL Server 2012 database on Windows Server 2012. I am able to reconcile, post, compress and analyze the other 4 geodatabases successfully, but each time I run my model or script against this particular geodatabase I get the error:

"Failure to access the DBMS server [[Microsoft][SQL Server Native Client 11.0]Communication link failure] [sde.DEFAULT][STATE_ID = 251] Underlying DBMS error [HY000:[Microsoft][SQL Server Native Client 11.0]Unspecified error occurred on SQL Server. Connection may have been terminated by the server.] [sde.DEFAULT]"

I am able to reconcile and post new versions to the DEFAULT version of this geodatabase, but when it gets to the compression stage it errors out with the message above. I came across some of the recommendations given on this forum and have tried them all, such as running a check on the database with DBCC CHECKDB; it ran successfully without error, which suggests the geodatabase itself is OK and eliminates possible issues from that end. I have also uninstalled my existing SQL Server Native Client and replaced it with a new SQL Server Native Client, version 11.3.6538.0 (64-bit), and I am still getting the same error. Does anyone have new ideas on what else I can do to resolve this issue?
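Since the communication-link failure described above is intermittent in character, one stopgap while the root cause is investigated is to retry the failing step a few times before giving up. A generic retry sketch; flaky_compress here is a made-up stand-in for the real arcpy.Compress_management call, not an arcpy function:

```python
import time

def retry(operation, attempts=3, delay_seconds=0):
    """Run operation(); on failure, retry up to `attempts` times total."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except Exception as err:
            last_error = err
            time.sleep(delay_seconds)  # back off before the next try
    raise last_error

# Stand-in for the compress call: fails twice, then succeeds.
calls = {'n': 0}
def flaky_compress():
    calls['n'] += 1
    if calls['n'] < 3:
        raise IOError('Communication link failure')
    return 'compressed'

result = retry(flaky_compress, attempts=5)
```

In the scheduled script, the wrapper would go around the compress step only, so a transient dropped connection does not abort the whole nightly run.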
07-11-2017 08:08 AM

POST
CAN'T OPEN INSTANCE: sde:sqlserver. Spatial Engine Connection Failed (-409). Cannot Get Access to Instance sde:sqlserver
03-02-2017 01:58 PM

POST
I am trying to compress one of my geodatabases on SDE 10.4, but I keep getting the error below. My Native Client is version 11 (64-bit), and this error is peculiar to 2 of the 7 geodatabases on my server:

ERROR 999999: Error executing function.
Failure to access the DBMS server [[Microsoft][SQL Server Native Client 11.0]Communication link failure] [sde.DEFAULT][STATE_ID = 15]
Underlying DBMS error [HY000:[Microsoft][SQL Server Native Client 11.0]Unspecified error occurred on SQL Server. Connection may have been terminated by the server.] [sde.DEFAULT]
Failed to execute (Compress).
03-02-2017 01:15 PM

POST
inputFolderPath line 691: \\Hamipool\E$\arcgisserver\directories\arcgissystem\arcgisinput\Public\Bike_Safety_Committee_Project.MapServer
inputFolderPath line 693: \\Hamipool\E$\arcgisserver\directories\arcgissystem\arcgisinput\Public\Bike_Safety_Committee_Project.MapServer\extracted\v101
Step 1: Creating Service Definition Draft (.sddraft)
Step 2: Analyzing Service Definition Draft.
The following information was returned during analysis of the MXD:
MESSAGES:
- Layer draws at all scale ranges (CODE 30003) applies to:
  1: Commissioner Precincts
  2: Public Schools
  3: Bike Safety Committee Recommendations 2013
  4: Existing Bike Routes
  5: CoA's OCG Trail (Built)
  6: Travis County Park Trails
  7: County Maintained Roads
  8: County Rds by Treatment Types
  9: County Rds by Condition
  10: Corridors_TNRs
  11: CAMPO 2035 Plan
  12: Travis_County_Parks
  13: LCRA_Parks_All
  14: COA_Parks_All
  15: Other_Juris_Parkland_All
  16: State_Parks_All
  17: City Limits
  18: ETJs
  19: County_Boundary_Line
Service could not be published because errors were found during analysis.

Is this supposed to be a critical problem?

In some cases it doesn't even return any message:

Step 1: Creating Service Definition Draft (.sddraft)
Step 2: Analyzing Service Definition Draft.
The following information was returned during analysis of the MXD:
MESSAGES:
Service could not be published because errors were found during analysis.
02-24-2017 01:36 PM

POST
Jonathan Quinn wrote:

It looks like it's writing the errors to a text file:

    ...
    # if the sddraft analysis contained errors
    else:
        # Service Definition Draft could not be analyzed
        if analyseDraft['errors'] == "NO":
            arcpy.AddWarning(" Service Definition Draft could not be analyzed.")
            content = "\n " + formatDate() + "\n Service Definition Draft could not be analyzed. \n - " + finalServiceName + "\n"
            serviceFailureNumber = serviceFailureNumber + 1
            writeTxtFile(False, content, serviceFailureNumber, content1, workspace)
            #writeInDatabase(fromServerName, toServerName, serviceType, finalServiceName, service, user, sources, 'Service Definition Draft could not be analyzed.')
        else:
            arcpy.AddWarning(" Service could not be published because errors were found during analysis. ")
            content = "\n " + formatDate() + "\n Service could not be published because errors were found during analysis. \n " + str(analyseDraft['errors']) + "\n - " + finalServiceName + "\n"
            serviceFailureNumber = serviceFailureNumber + 1
            writeTxtFile(False, content, serviceFailureNumber, content1, workspace)

You can add what's stored in the content variable to the AddWarning message:

    ...
    # if the sddraft analysis contained errors
    else:
        # Service Definition Draft could not be analyzed
        if analyseDraft['errors'] == "NO":
            arcpy.AddWarning(" Service Definition Draft could not be analyzed.")
            content = "\n " + formatDate() + "\n Service Definition Draft could not be analyzed. \n - " + finalServiceName + "\n"
            serviceFailureNumber = serviceFailureNumber + 1
            writeTxtFile(False, content, serviceFailureNumber, content1, workspace)
            #writeInDatabase(fromServerName, toServerName, serviceType, finalServiceName, service, user, sources, 'Service Definition Draft could not be analyzed.')
        else:
            # Include the results of the analysis within the message
            messageContent = "Service could not be published because errors were found during analysis. \n " + str(analyseDraft['errors']) + "\n - " + finalServiceName + "\n"
            # Add the message to the tool
            arcpy.AddWarning(messageContent)
            content = "\n " + formatDate() + "\n Service could not be published because errors were found during analysis. \n " + str(analyseDraft['errors']) + "\n - " + finalServiceName + "\n"
            serviceFailureNumber = serviceFailureNumber + 1
            writeTxtFile(False, content, serviceFailureNumber, content1, workspace)

This way, it prints the message but also writes to the file created by the createTxtFile function (whatever the "workspace" variable is defined to, which appears to be the sysTemp folder path).

Hi Jonathan, I have tried to include your suggestion in my script, but it is not making any difference to the output. I'm wondering how I can instruct the script to ignore the non-critical errors in the analysis stage and proceed to publish the service. Some of the issues include a discrepancy in the map projection, "layer draws at all scale ranges", etc. Is there a way to tell the script to bypass some of these non-critical errors and publish the service?

inputFolderPath line 691: \\Hamiltonpool\E$\arcgisserver\directories\arcgissystem\arcgisinput\NaturalResources\MS4_Viewer_RegBoundaries.MapServer
inputFolderPath line 693: \\Hamiltonpool\E$\arcgisserver\directories\arcgissystem\arcgisinput\NaturalResources\MS4_Viewer_RegBoundaries.MapServer\extracted\v101
Step 1: Creating Service Definition Draft (.sddraft)
Step 2: Analyzing Service Definition Draft.
The following information was returned during analysis of the MXD:
MESSAGES:
Service could not be published because errors were found during analysis
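On the "bypass non-critical errors" question: in the arcpy publishing workflow, the analysis result (e.g. the dictionary returned by arcpy.mapping.CreateMapSDDraft or AnalyzeForSD) has separate 'errors', 'warnings' and 'messages' keys, and only the 'errors' entries actually block staging. So rather than bypassing errors, a script can gate on the 'errors' key alone and let warnings through. The gate itself is plain dictionary logic, sketched here with made-up analysis results in that shape:

```python
def can_publish(analysis):
    """Publish only when the analysis found no hard errors; warnings are allowed."""
    return len(analysis.get('errors', {})) == 0

# Made-up analysis results, shaped like arcpy's analysis dictionary
# ((message, code) tuples mapping to lists of affected layers).
warnings_only = {'errors': {},
                 'warnings': {('Layer draws at all scale ranges', 30003): []}}
hard_error = {'errors': {('Missing data source', 1): []},
              'warnings': {}}
```

A service with only scale-range or projection warnings would pass this gate and proceed to staging, while a genuine error (such as a missing data source) would still stop it.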
02-23-2017 07:30 AM

POST
So far I have tried to copy about 59 services at a time, but it was only able to transfer 24 successfully, complaining that the remaining 23 services could not be transferred because errors were found during analysis. When I analyzed some of the individual services it was having issues with, I noticed some warnings (not critical errors), such as projection issues. These are just warnings, not errors; I wonder why it won't ignore them and go ahead and publish the maps, just as ArcMap would.
02-23-2017 05:05 AM

POST
Thank you so much, Rebecca. I made some modifications following the steps you described above. When I ran the script, it was able to copy over some of the map services but not others. It reported the error below, which I believe could be specific to the individual map services: for those services, it is not able to analyze the service definition drafts, and I am currently looking into the issue. Did you get a similar error in yours?

Service could not be published because errors were found during analysis

Thank you very much.
02-22-2017 12:22 PM

POST
Rebecca Strauch, Emily Keefer-Cowles: Hello Rebecca, I am happy to note that you actually got this tool to work at your end. I am currently trying to use it to copy a bunch of map services from my existing ArcGIS Server 10.4 to a new ArcGIS Server 10.4, but I keep getting this error:

Directory not copied. Error: [Error 3] The system cannot find the path specified: u'E\\arcgisserver\\directories\\arcgissystem\\arcgisinput\\NaturalResources\\MS4_Map_SupportingLayers_Public_Viewer_Natural_Resources.MapServer\\*.*'
Service MXD not found.

I don't understand why it is displaying the E drive without the colon. Is this right? These are the paths I specified: for the sysTemp field, E:\arcgisserver\directories\arcgissystem\arcgisinput\sysTemp; for backupTemp, E:\arcgisserver\directories\arcgissystem\arcgisinput\backup. Both the 'sysTemp' and 'backup' folders were created on my new ArcGIS 10.4 Server, which is the destination server, and I didn't fill anything in the 'Destination_Folder' field.

I have tried to follow all the steps you stipulated above, but unfortunately I'm still not able to get it to run successfully. Can you explain in more detail exactly what you did to get the tool to work for you? I have over 280 map services I need to copy and would really be happy if this tool could work for me. That would make life much easier. Thanks.
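For what it's worth, a drive letter appearing without its colon (u'E\\arcgisserver\\...') is a classic symptom of converting an administrative share path like \\server\E$\... to a local path and dropping the colon along the way. This is only a guess at what the tool is doing internally; a hypothetical conversion helper that keeps the colon might look like:

```python
def admin_share_to_local(unc_path):
    """Convert an admin share path \\\\server\\E$\\dir to a local path E:\\dir."""
    parts = unc_path.lstrip('\\').split('\\')
    share = parts[1]  # e.g. 'E$'
    if not share.endswith('$'):
        raise ValueError('not an administrative share path: ' + unc_path)
    drive = share[:-1] + ':'  # 'E$' -> 'E:' (the colon is the easy piece to lose)
    return '\\'.join([drive] + parts[2:])

local = admin_share_to_local('\\\\Hamipool\\E$\\arcgisserver\\directories')
```

If the tool's equivalent of this step omits the colon, every downstream copy would fail with exactly the "path not found" error shown above.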
02-22-2017 05:20 AM