When running a GP service on the server, the scratch workspace is created under a job directory. For example:

http://[server]/arcgis/rest/directories/arcgisjobs/lotexport_gpserver/j52e4e3ec1ff741e9aad19b8263615232/scratch/WebMap_12345.pdf

(the j52e4e3ec1ff741e9aad19b8263615232 segment is the job ID)
Your script may be overriding this and attempting to use the same scratch workspace for simultaneous users - leading to the schema lock.
The code that produced the URL result above just uses the scratchFolder as the directory in which to create the PDF file:

fn = 'WebMap_{0}.pdf'.format(str(id))
out_fn = os.path.join(arcpy.env.scratchFolder, fn)
If you create your temporary data in arcpy.env.scratchFolder, each job gets its own directory, so the datasets are unique and you avoid any schema locks.
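For instance, a minimal sketch of building temporary outputs inside the job's scratch folder (the file names here are illustrative, not from the original script):

import os
import arcpy

# arcpy.env.scratchFolder resolves to the job's own scratch directory when the
# tool runs as a GP service, so these paths are different for every request.
scratch = arcpy.env.scratchFolder
tmp_table = os.path.join(scratch, "inspections_copy.dbf")   # hypothetical temporary table
out_pdf = os.path.join(scratch, "WebMap_export.pdf")        # hypothetical PDF output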
Owen
# 1. Import arcpy module
import arcpy, os, sys, datetime
from arcpy import env

# 2. Set geoprocessing environment
scratch = arcpy.env.scratchWorkspace
arcpy.env.scratchWorkspace = r"E:\geoprocessing\Building\Outputworkspace"
arcpy.env.overwriteOutput = True

# 3. Specify Date and Time
now = datetime.datetime.now()

# 4. Input Parameters
Initials = arcpy.GetParameterAsText(0)
InspDate = arcpy.GetParameterAsText(1)
if InspDate == "":
    InspDate = now.strftime("%m-%d-%y")  # This will return today's date.
arcpy.AddMessage("Searching for scheduled inspections...")
Xpression = """ "INSPECTOR" = \'%s\' AND "SCHEDULED_DATE" = \'%s\' """ % (Initials, InspDate)

# 5. Local variables:
sde_sdegis_VCOE_CRWINSPECTIONLOG = r"e:\connections\AGSGIS_TO_SDEGIS_COE.sde\\sde.sdegis.VCOE_CRWINSPECTIONLOG"
CRWInspectionLog = "CRWInspectionLog"
CRWInspectionLogCopyRows = os.path.join(scratch, "CRWInspectionLogCopyRows.dbf")
AddressesWithBldgno = r"e:\locators\\AddressesWithBldgNo"
ScheduledInspections = os.path.join(scratch, "ScheduledInspections.shp")  # original had os.path.join(ScheduledInspections.shp); assuming the scratch workspace was intended

# 6. Make Query Table and Copy Rows if Result is greater than 0
arcpy.AddMessage("Selecting scheduled inspections...")
arcpy.MakeQueryTable_management(sde_sdegis_VCOE_CRWINSPECTIONLOG, CRWInspectionLog, "NO_KEY_FIELD", "", "", Xpression)
result = int(arcpy.GetCount_management(CRWInspectionLog).getOutput(0))
arcpy.AddMessage("Inspections Found: " + str(result))
if result == 0:
    arcpy.AddWarning("Process terminated.")
    sys.exit()
else:
    arcpy.CopyRows_management(CRWInspectionLog, CRWInspectionLogCopyRows, "")
del CRWInspectionLog

# 7. Geocode Inspections
arcpy.AddMessage("Geocoding inspections, please wait...")
arcpy.GeocodeAddresses_geocoding(CRWInspectionLogCopyRows, AddressesWithBldgno, "Street SITE_ADDR VISIBLE NONE;City <None> VISIBLE NONE", ScheduledInspections, "STATIC")
result1 = int(arcpy.GetCount_management(ScheduledInspections).getOutput(0))
arcpy.AddMessage("Inspections Geocoded: " + str(result1))
if result1 < result:
    arcpy.AddWarning("Not all inspections were mapped.")
else:
    arcpy.AddMessage("All inspections mapped.")

# 8. Scheduled Inspections Output
arcpy.SetParameterAsText(2, ScheduledInspections)
I don't think the issue is with the locator; it looks like the schema lock happens when the second process attempts to copy rows to the DBF file:

CRWInspectionLogCopyRows = os.path.join(scratch, "CRWInspectionLogCopyRows.dbf")
...
if result == 0:
    arcpy.AddWarning("Process terminated.")
    sys.exit()
else:
    # The next line is probably the cause of the schema lock:
    arcpy.CopyRows_management(CRWInspectionLog, CRWInspectionLogCopyRows, "")
The DBF file will be locked by the first process and the second process will not be able to write to the same file while it is open. In your script this file would be the same for all users:
E:\geoprocessing\Building\Outputworkspace\CRWInspectionLogCopyRows.dbf
To avoid this on the server you should remove the line that specifically sets the scratchWorkspace:

# 2. Set geoprocessing environment
scratch = arcpy.env.scratchWorkspace
arcpy.env.scratchWorkspace = r"E:\geoprocessing\Building\Outputworkspace"   # <- REMOVE this line
arcpy.env.overwriteOutput = True
Change to:

# 2. Set geoprocessing environment
scratch = arcpy.env.scratchFolder
arcpy.env.overwriteOutput = True
Then, when simultaneous users run the script, they each get a system-generated job/scratch directory and the DBF files can be created without any locking issues.
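In other words (an assumed adaptation of the script above, not code from the thread), the same local variable now resolves inside the per-job directory instead of the shared folder:

scratch = arcpy.env.scratchFolder
# Resolves to something like ...\arcgisjobs\<service>\<job id>\scratch\CRWInspectionLogCopyRows.dbf
CRWInspectionLogCopyRows = os.path.join(scratch, "CRWInspectionLogCopyRows.dbf")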
Alternatively, you could use a random temp file name for each DBF if you want to keep your specific scratchWorkspace. That way each process writes to a different DBF file.
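A sketch of that approach, assuming the shared workspace is kept (names are illustrative):

import os
import uuid
import arcpy

workspace = r"E:\geoprocessing\Building\Outputworkspace"

# Option 1: build a name that is effectively unique per run.
unique_dbf = os.path.join(workspace, "insp_{0}.dbf".format(uuid.uuid4().hex[:8]))

# Option 2: let arcpy pick a free name in the workspace (it appends a number
# until the name does not already exist there).
unique_dbf = arcpy.CreateUniqueName("CRWInspectionLogCopyRows.dbf", workspace)

With unique names the overwriteOutput setting no longer cleans anything up, so you would probably want to delete the old DBF files yourself at some point.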
Hope this helps.
Owen
www.spatialxp.com.au