POST
We set up an ArcGIS Online web map for a local metering company to read and inspect meters and risers. The schema is the same for the FGDB and SDE. The feature point data will never change, but they're adding new tables for each read and inspection. Is there a workflow to append those new tables back into our enterprise GDB?
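Not an official workflow, but one common approach is to list table names in both workspaces, diff them, and copy only the new ones into the enterprise GDB. A minimal sketch under stated assumptions: the table names and paths are illustrative, and the `arcpy` calls (`ListTables`, `Copy_management`) are only attempted when ArcGIS is actually installed, so the name-diff logic can be checked anywhere.

```python
import os

try:
    import arcpy  # only available inside an ArcGIS Python environment
except ImportError:
    arcpy = None

def new_tables(sde_tables, gdb_tables):
    """Return SDE table names not yet present in the file GDB.
    SDE names are fully qualified (owner.name), so compare base names."""
    existing = {name.lower() for name in gdb_tables}
    return [t for t in sde_tables if t.split(".")[-1].lower() not in existing]

def copy_new_tables(sde_conn, target_gdb):
    """Copy tables that exist in the SDE connection but not in the target GDB.
    Requires ArcGIS; paths here are placeholders."""
    arcpy.env.workspace = sde_conn
    sde = arcpy.ListTables()
    arcpy.env.workspace = target_gdb
    gdb = arcpy.ListTables()
    for table in new_tables(sde, gdb):
        target = os.path.join(target_gdb, table.split(".")[-1])
        arcpy.Copy_management(os.path.join(sde_conn, table), target)

# The name-diff logic runs without ArcGIS (hypothetical table names):
print(new_tables(["DBO.Read_2018_08", "DBO.Meters"], ["Meters"]))
# -> ['DBO.Read_2018_08']
```

If the target tables already exist and only need new rows, `arcpy.Append_management` would be the call instead of `Copy_management`.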
08-09-2018 11:26 AM
POST
I've got it running on another computer right now. I'll check it in a few to make sure it's running properly and report back here. Additional information: the second PC running the script has the same version. If it fails, I will try the script on another PC with 10.5. The SDE geodatabase is hosted in SQL Server 2012.
08-08-2018 07:18 AM
POST
When I run my Python script I get an "Error: Out of Memory" on the last task, arcpy.ImportXMLWorkspaceDocument_management. We're using Esri's 10.6 software. Does anyone know if it's related to how I wrote the script, or possibly some other problem?

# Name: EnterpriseGDB.py
# Description: Export the contents of my enterprise geodatabase to an XML
# workspace document, then import it into a local file geodatabase.

# Import system modules
import arcpy
import os
import time

xmlDocument = r"C:\EnterpriseGDB.xml"
if arcpy.Exists(xmlDocument):
    print("Removing xml document")
    os.remove(xmlDocument)
    print("successful")

# Set overwrite output
arcpy.env.overwriteOutput = True

# Set local variables for the export
in_data = r"Database Connections\SDE.sde"
out_file = r"C:\EIS\EISMASTER\eis shape\ArcMap\EnterpriseGDB.xml"
export_option = "DATA"
storage_type = "BINARY"
export_metadata = "METADATA"

# Set local variables for the import
target_gdb = r"C:\EIS\EISMASTER\eis shape\ArcMap\UPDM.gdb"
in_file = r"C:\EIS\EISMASTER\eis shape\ArcMap\EnterpriseGDB.xml"
import_type = "DATA"
config_keyword = "DEFAULTS"

print("Exporting SDE to XML Document")
# Execute ExportXMLWorkspaceDocument
arcpy.ExportXMLWorkspaceDocument_management(in_data, out_file, export_option,
                                            storage_type, export_metadata)
print("Finished Exporting")
time.sleep(120)

print("Importing NEW XML Document to Local GDB")
# Execute ImportXMLWorkspaceDocument
arcpy.ImportXMLWorkspaceDocument_management(target_gdb, in_file, import_type,
                                            config_keyword)
print("Finished Importing!")
08-08-2018 05:13 AM
POST
Recently, we've been having issues getting our Python script to copy our enterprise geodatabase to a fresh file geodatabase. When it's copying one of the feature datasets to the local file geodatabase, it generates an error saying a table already exists, skips over that entire dataset (which holds 90% of our data), and moves on to the next feature dataset. Does anyone have a working Python script that exports an entire enterprise geodatabase?

import time, os, datetime, sys, logging, shutil
import arcpy

########################## user defined functions ##############################
def getDatabaseItemCount(workspace):
    """Return the list of items in the provided database and its count."""
    log = logging.getLogger("script_log")
    arcpy.env.workspace = workspace
    feature_classes = []
    log.info("Compiling a list of items in {0} and getting count.".format(workspace))
    for dirpath, dirnames, filenames in arcpy.da.Walk(workspace, datatype="Any", type="Any"):
        for filename in filenames:
            feature_classes.append(os.path.join(dirpath, filename))
    log.info("There are a total of {0} items in the database".format(len(feature_classes)))
    return feature_classes, len(feature_classes)

def replicateDatabase(dbConnection, targetGDB):
    log = logging.getLogger("script_log")
    if arcpy.Exists(dbConnection):
        featSDE, cntSDE = getDatabaseItemCount(dbConnection)
        log.info("Geodatabase being copied: %s -- Feature Count: %s" % (dbConnection, cntSDE))
        if arcpy.Exists(targetGDB):
            featGDB, cntGDB = getDatabaseItemCount(targetGDB)
            log.info("Old Target Geodatabase: %s -- Feature Count: %s" % (targetGDB, cntGDB))
            try:
                shutil.rmtree(targetGDB)
                log.info("Deleted Old %s" % (os.path.split(targetGDB)[-1]))
            except Exception as e:
                log.info(e)
        GDB_Path, GDB_Name = os.path.split(targetGDB)
        log.info("Now Creating New %s" % (GDB_Name))
        arcpy.CreateFileGDB_management(GDB_Path, GDB_Name)
        arcpy.env.workspace = dbConnection
        try:
            datasetList = [arcpy.Describe(a).name for a in arcpy.ListDatasets()]
        except Exception as e:
            datasetList = []
            log.info(e)
        try:
            featureClasses = [arcpy.Describe(a).name for a in arcpy.ListFeatureClasses()]
        except Exception as e:
            featureClasses = []
            log.info(e)
        try:
            tables = [arcpy.Describe(a).name for a in arcpy.ListTables()]
        except Exception as e:
            tables = []
            log.info(e)
        # Compile the previous three lists into one to iterate over
        allDbData = datasetList + featureClasses + tables
        for sourcePath in allDbData:
            targetName = sourcePath.split('.')[-1]
            targetPath = os.path.join(targetGDB, targetName)
            if not arcpy.Exists(targetPath):
                try:
                    log.info("Attempting to Copy %s to %s" % (targetName, targetPath))
                    arcpy.Copy_management(sourcePath, targetPath)
                    log.info("Finished copying %s to %s" % (targetName, targetPath))
                except Exception as e:
                    log.info("Unable to copy %s to %s" % (targetName, targetPath))
                    log.info(e)
            else:
                log.info("%s already exists....skipping....." % (targetName))
        featGDB, cntGDB = getDatabaseItemCount(targetGDB)
        log.info("Completed replication of %s -- Feature Count: %s" % (dbConnection, cntGDB))
    else:
        log.info("{0} does not exist or is not supported! "
                 "Please check the database path and try again.".format(dbConnection))

#####################################################################################
def formatTime(x):
    """Format elapsed seconds as HH:MM:SS."""
    minutes, seconds_rem = divmod(x, 60)
    if minutes >= 60:
        hours, minutes_rem = divmod(minutes, 60)
        return "%02d:%02d:%02d" % (hours, minutes_rem, seconds_rem)
    else:
        return "00:%02d:%02d" % (minutes, seconds_rem)

if __name__ == "__main__":
    startTime = time.time()
    now = datetime.datetime.now()
    ############################### user variables #################################
    '''change these variables to the location of the database being copied, the
    target database location and where you want the log to be stored'''
    logPath = "C:/temp/Log"
    databaseConnection = r"C:\Users\jnmiller\AppData\Roaming\ESRI\Desktop10.6\ArcCatalog\SDE.sde"
    targetGDB = r"C:\temp\Replica.gdb"  # placeholder -- set to your target file GDB

    ############################### logging items ##################################
    # Make a global logging object.
    logName = os.path.join(logPath, (now.strftime("%Y-%m-%d_%H-%M.log")))
    log = logging.getLogger("script_log")
    log.setLevel(logging.INFO)
    h1 = logging.FileHandler(logName)
    h2 = logging.StreamHandler()
    f = logging.Formatter("[%(levelname)s] [%(asctime)s] [%(lineno)d] - %(message)s",
                          '%m/%d/%Y %I:%M:%S %p')
    h1.setFormatter(f)
    h2.setFormatter(f)
    h1.setLevel(logging.INFO)
    h2.setLevel(logging.INFO)
    log.addHandler(h1)
    log.addHandler(h2)
    log.info('Script: {0}'.format(os.path.basename(sys.argv[0])))
    try:
        ########################## function calls ##################################
        replicateDatabase(databaseConnection, targetGDB)
    except Exception as e:
        log.exception(e)
    totalTime = formatTime((time.time() - startTime))
    log.info('--------------------------------------------------')
    log.info("Script Completed After: {0}".format(totalTime))
    log.info('--------------------------------------------------')
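The formatTime helper in the script is plain Python and can be sanity-checked outside of ArcGIS. A standalone simplified copy (renamed format_time; it always carries minutes into hours, which yields the same strings as the two-branch version):

```python
def format_time(seconds):
    """Format an elapsed-seconds value as HH:MM:SS."""
    minutes, seconds_rem = divmod(int(seconds), 60)
    hours, minutes_rem = divmod(minutes, 60)
    return "%02d:%02d:%02d" % (hours, minutes_rem, seconds_rem)

print(format_time(3725))  # -> 01:02:05
print(format_time(59))    # -> 00:00:59
```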
08-01-2018 12:33 PM
POST
I am trying to get the Inbox filter to query data based on DISPATCHEDTO and CALLTYPE = '40.61522 _-81.537074', but I can't get the SQL query to actually work. I don't get any errors when I re-download the survey and open the Inbox. Does it not like the string? Alternatively, is it possible to write a SQL expression where CALLTYPE IS NOT 'Y' AND 'N', so it selects everything else, which would be this string: '40.615229 _-81.537074'? Tested and not working query: DISPATCHEDTO = ${username} AND CALLTYPE = '40.615229 _-81.537074'
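For the "everything except Y and N" idea, standard SQL would phrase it with NOT IN rather than IS NOT. A sketch, assuming the Inbox filter accepts a plain SQL where clause and that the field names from the post are correct (whether ${username} substitution works in this context would need to be verified):

```sql
-- Hypothetical: select records dispatched to the current user
-- whose CALLTYPE is anything other than 'Y' or 'N'
DISPATCHEDTO = '${username}' AND CALLTYPE NOT IN ('Y', 'N')
```

Note the quotes around the string values; an unquoted string literal is a common reason a where clause silently matches nothing.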
08-01-2018 06:57 AM
BLOG
You would think that for an Enterprise system that costs the most money, they would dedicate more time to building things people actually need. We always have to change our workflows because Esri can't figure out how to include versioned layers.
07-19-2018 11:19 AM
POST
If anyone has problems creating a survey from a feature class due to the error below, here is the solution we used. Hope this helps someone.

Error: The feature service does not meet the minimum requirement for a survey. The property supportsApplyEditsWithGlobalIds must be true.

Solution: We saw this error when we tried to create a survey from our ArcGIS Server feature service in the Survey123 Connect desktop application. At the service's REST endpoint, the Supports ApplyEdits With Global IDs property was set to false. To resolve the issue, we added a unique attribute index on the GlobalID field: right-click the feature class in ArcCatalog > Properties > Indexes > Add Attribute Index > select the GlobalID field > check the box next to 'Unique' > OK > Apply > OK. Then we overwrote the service, confirmed that the supportsApplyEditsWithGlobalIds property in REST changed to true, and were able to successfully create a survey from it using Survey123 Connect.
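The manual ArcCatalog steps above can also be scripted with arcpy.AddIndex_management. A hedged sketch: the feature class path and index name are illustrative, and the geoprocessing module is passed in as a parameter so the call can be exercised with a stub where ArcGIS isn't installed.

```python
def add_unique_globalid_index(gp, feature_class):
    """Add a unique, ascending attribute index on GlobalID so the
    republished service reports supportsApplyEditsWithGlobalIds = true."""
    gp.AddIndex_management(feature_class, ["GlobalID"],
                           "GlobalID_idx",  # illustrative index name
                           "UNIQUE", "ASCENDING")

# With ArcGIS installed this would be:
#   import arcpy
#   add_unique_globalid_index(arcpy, r"C:\data\UPDM.gdb\Meters")  # path is hypothetical
# For a quick check without ArcGIS, a stub records the call instead:
class _StubGP:
    def __init__(self):
        self.calls = []
    def AddIndex_management(self, *args):
        self.calls.append(args)

stub = _StubGP()
add_unique_globalid_index(stub, "Meters")
print(stub.calls[0][3])  # -> UNIQUE
```

After adding the index, the service still has to be overwritten (republished) before the REST property flips to true, as described above.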
07-19-2018 05:35 AM
BLOG
Do you know when we'll see a search function in the enterprise dashboard?
07-02-2018 09:12 AM
POST
Thanks for the reply, Jonathan. I just finished reading the article and you're correct: these are completely separate workflows, and they can't be compared for effectiveness since they serve completely different objectives. For us it would be in our best interest to stay with the current workflow: the enterprise geodatabase in SQL Server, with features hosted through the server.

Our only problem is performance. Currently SQL Server 2012 R2 is still hosted on the old server; we removed all the GIS software from it, and the performance is really bad. Bad enough that we can't edit in child versions without the ArcMap session immediately crashing. Do you think it would be in our best interest to create another VM on the new server and host SQL Server 2012 R2 there? Our server should have the resources and processor power to handle it. Joe Borgione - is this the workflow you're talking about, where each component is isolated on its own virtual machine?

*Edit - I talked with our IT admin this morning, and it appears they moved the SQL Server virtual machine to the new server last week. I guess this puts us back at square one for resolving this issue. If it's a network-related issue, I wonder if there is a way to troubleshoot it without having to spend thousands of dollars to possibly solve it.

New Server:
VM 1 - ArcGIS Server / Portal / web adaptors
VM 2 - ArcGIS Data Store (likely not to be used)
VM 3 (proposed) - host SQL Server 2012 R2
06-27-2018 07:33 PM
POST
Our company recently purchased a new server to replace our older one, which had all the GIS and SQL Server 2012 software loaded on it. After talking with an Esri rep, it's my understanding that ArcGIS Data Store can essentially replace SQL Server. Since we've been given the opportunity, should we migrate our data from SQL Server 2012 to Data Store? How would performance of the Esri products improve? Note: we've always had performance issues since we started using ArcMap with SQL Server 2012, which is why we decided to buy a new server and split the resource-hungry SQL Server 2012 and the GIS software onto their own separate servers. This post relates to an earlier post I made a few months back: Slow ArcGIS Server Environment Issues. We're still setting up the new server, so I won't be able to see yet whether our issues improve.
06-27-2018 02:50 PM
POST
Hi Deon, thanks for the response. We were hoping there would be a way for the surveys to automatically update themselves every day or once a week. If the app updated the downloaded surveys, we could prevent our operators from using old forms. Even though we tell our operators to update the surveys once a week, there are usually a few who don't, or who simply forget. It's very time-consuming for me and our operators to check 30+ phones.
06-27-2018 04:16 AM
POST
Is there any way we can make the forms update automatically on the phones? It's such a headache having to update forms on 30+ phones. We're still so new to building our forms that we're constantly having to update everyone's phone whenever we make any changes.
06-25-2018 08:20 AM
POST
Is there a reason why the multiline appearance is not wrapping text? What's the purpose of having a wider and longer box if it can't wrap text in a textbox?
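For anyone checking their setup against this, the multiline appearance is assigned in the XLSForm survey sheet. A sketch with a hypothetical field name (this only confirms the appearance is applied; it doesn't explain the wrapping behavior asked about above):

```
type   name       label       appearance
text   comments   Comments    multiline
```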
06-15-2018 10:56 AM
POST
When we make changes to a form and republish it back to Portal, will the previous forms downloaded by users still work? We've had issues with the Inbox not working correctly: when we update a form and publish it, users can't pull data until they re-download the form, and the Inbox loading icon spins nonstop. Hopefully this makes sense.
06-12-2018 08:15 AM
POST
Thanks! The more I work with Survey123, the more I enjoy it. Great form builder.
06-08-2018 09:11 AM
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 05-11-2018 06:04 AM |
| | 1 | 04-11-2018 06:24 AM |
| | 1 | 04-04-2019 05:59 AM |
| | 1 | 05-06-2019 08:45 AM |
| | 2 | 06-06-2019 06:32 PM |
Online Status: Offline
Date Last Visited: 01-13-2021 02:32 PM