POST
|
I have been able to successfully publish a locator to our portal. It works great in a web map and in ArcGIS Pro, both in the locator pane and the batch geocoding tools. I used the Create Locator tool in ArcGIS Pro 2.7.2 and published to Enterprise 10.6.1. I am now trying to update the locator with new addresses. I have successfully used the Rebuild Address Locator tool and tested the locator locally in Pro to confirm that the new addresses are being found. However, I am getting ERROR 001369: Failed to create the service whenever I try to overwrite or update the locator and geocoding service. I have tried the Overwrite Locator tool in Pro and also the Updating Geocoding Services script. I have also now downgraded to ArcGIS Pro 2.6.3 and rebuilt the locator from scratch, but I am still getting the same error when overwriting.

# Execute StageService to convert the sddraft file to a service definition (sd) file
arcpy.server.StageService(sddraft_file, sd_file)

# Execute UploadServiceDefinition to publish the service definition file as a service
arcpy.server.UploadServiceDefinition(sd_file, ags_connection)
arcpy.AddMessage("The geocode service was successfully published")

StageService completes fine, but the script errors on the UploadServiceDefinition line. Here is the full error message from the script output:

An error occurred ERROR 001369: Failed to create the service.
Failed to execute (Publish Service Definition).
Failed.
Failed to execute (Publish Service Definition).
Failed.
Failed to execute (UploadServiceDefinition).
ERROR 001369: Failed to create the service.
Failed to execute (Publish Service Definition).
Failed.
Failed to execute (Publish Service Definition).
Failed.
Failed to execute (UploadServiceDefinition).
Error thrown
Traceback (most recent call last):
File "Update.py", line 235, in main
arcpy.server.UploadServiceDefinition(sd_file, ags_connection)
File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\server.py", line 1263, in UploadServiceDefinition
raise e
File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\server.py", line 1260, in UploadServiceDefinition
retval = convertArcObjectToPythonObject(gp.UploadServiceDefinition_server(*gp_fixargs((in_sd_file, in_server, in_service_name, in_cluster, in_folder_type, in_folder, in_startupType, in_override, in_my_contents, in_public, in_organization, in_groups), True)))
File "C:\Program Files\ArcGIS\Pro\Resources\ArcPy\arcpy\geoprocessing\_base.py", line 511, in <lambda>
return lambda *args: val(*gp_fixargs(args, True))
arcgisscripting.ExecuteError: ERROR 001369: Failed to create the service.
Failed to execute (Publish Service Definition).
Failed.
Failed to execute (Publish Service Definition).
Failed.

Is anyone able to help with this? Thanks.
Posted 03-26-2021 03:52 AM
POST
|
Are you trying to find whether they match exactly to one of the other layers (i.e. the vertices are the same), or whether they are close enough within a tolerance? It would be good to see a screenshot of the data as well. I would start with the following:

1. Polygon To Line tool - convert Polygon A to lines. Use the IDENTIFY_NEIGHBORS option so that you don't get multiple overlapping lines.
2. Split Line At Vertices tool - convert the lines from above into individual segments.
3. Add a field to hold the results from the next section (Text, length 2, for example).

For each polygon boundary layer:

1. Convert it to lines with the Polygon To Line tool.
2. Buffer the lines by your tolerance distance.
3. Use the Select By Location tool to select all the individual Polygon A segments that are COMPLETELY_WITHIN the buffer.
4. Use the Calculate Field tool to set the result field for all the selected records to 'OK'.

When you have finished, all the individual segments that were not within the buffers (and therefore did not overlap with any other polygon boundaries) should have NULL in the result field - select them with <field> IS NULL.
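The "within tolerance" test that the buffer + COMPLETELY_WITHIN selection performs can be illustrated outside ArcGIS in plain Python. This is only a minimal geometric sketch (assuming coordinates in the same projected units, and probing just the endpoints and midpoint of each segment), not a replacement for the tools above:

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:                # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping the parameter to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy      # closest point on the segment
    return math.hypot(px - cx, py - cy)

def segment_within_tolerance(seg, boundary, tol):
    """True if the endpoints and midpoint of seg all lie within tol of
    the other boundary - a crude stand-in for COMPLETELY_WITHIN a buffer."""
    (x1, y1), (x2, y2) = seg
    probes = [(x1, y1), (x2, y2), ((x1 + x2) / 2, (y1 + y2) / 2)]
    return all(
        any(point_segment_distance(pt, a, b) <= tol for a, b in boundary)
        for pt in probes
    )

# Another polygon's boundary as a list of segments ((x1, y1), (x2, y2))
boundary = [((0, 0), (10, 0)), ((10, 0), (10, 10))]
print(segment_within_tolerance(((0, 0.4), (10, 0.4)), boundary, tol=0.5))  # True
print(segment_within_tolerance(((0, 5), (10, 5)), boundary, tol=0.5))      # False
```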
Posted 03-25-2021 04:15 PM
POST
|
I am using Pro 2.7.2 with the Deep Learning Framework installed using the Esri installer, and I was able to complete that road surface investigation sample without any problems, so it could be a problem with your input data. The prepare data function expects a folder with two subfolders called images and labels:

- In the images folder should be all your images, e.g. Adachi_20170906093835.jpg
- In the labels folder, you should have a corresponding XML label file for each image, with the same name, e.g. Adachi_20170906093835.xml

Each XML file should contain something like the following:

<annotation>
<folder>Adachi</folder>
<filename>Adachi_20170906093835.jpg</filename>
<size>
<width>600</width>
<height>600</height>
</size>
<segmented>0</segmented>
<object>
<name>D20</name>
<bndbox>
<xmin>87</xmin>
<ymin>281</ymin>
<xmax>226</xmax>
<ymax>432</ymax>
</bndbox>
</object>
</annotation>

Are your XML files in the same format as this? What format and width/height are your images?
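One quick way to check your labels before running prepare_data is to parse each XML file and confirm the expected PASCAL VOC elements are present. A small sketch using only Python's standard library (the sample string mirrors the annotation above; the function and its checks are my own, not part of the Esri API):

```python
import xml.etree.ElementTree as ET

PASCAL_VOC_SAMPLE = """<annotation>
  <folder>Adachi</folder>
  <filename>Adachi_20170906093835.jpg</filename>
  <size><width>600</width><height>600</height></size>
  <segmented>0</segmented>
  <object>
    <name>D20</name>
    <bndbox>
      <xmin>87</xmin><ymin>281</ymin><xmax>226</xmax><ymax>432</ymax>
    </bndbox>
  </object>
</annotation>"""

def check_label(xml_text):
    """Return (filename, [(class_name, (xmin, ymin, xmax, ymax)), ...]),
    raising ValueError if the label is not in the expected shape."""
    root = ET.fromstring(xml_text)
    if root.tag != "annotation":
        raise ValueError("root element must be <annotation>")
    filename = root.findtext("filename")
    boxes = []
    for obj in root.iter("object"):
        name = obj.findtext("name")
        bb = obj.find("bndbox")
        coords = tuple(int(bb.findtext(t)) for t in ("xmin", "ymin", "xmax", "ymax"))
        if not (coords[0] < coords[2] and coords[1] < coords[3]):
            raise ValueError(f"invalid box {coords} in {filename}")
        boxes.append((name, coords))
    return filename, boxes

fn, boxes = check_label(PASCAL_VOC_SAMPLE)
print(fn, boxes)  # Adachi_20170906093835.jpg [('D20', (87, 281, 226, 432))]
```

Running this over every file in your labels folder (and checking each has a matching .jpg in images) should quickly surface any malformed labels.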
Posted 03-25-2021 03:20 PM
POST
|
That's interesting - some undocumented functionality? The JSON response should give you the item id of the saved Excel file:

{
"type":<type of the exported item>,
"size":<size of the exported item>,
"jobId":<jobId for the export job>,
"exportItemId":<id of the exported item that is created>,
"serviceItemId": <id of the hosted feature service item that was exported>,
"exportFormat": <exportFormat>
}

I think it gets saved into the same user folder?
Posted 03-25-2021 02:31 PM
POST
|
It's not the export parameters - the problem is that your URL points to the feature service, which does not have a REST export capability. If you paste just your URL string into the browser, you get the same error: "The requested layer (layerId: export) was not found."

If you look at the example at the bottom of the following page, the export item REST functionality needs to work from a URL pointing to the content of the owner (user) of the item: https://developers.arcgis.com/rest/users-groups-and-items/export-item.htm

Also note the limitation at the top of that page: "Exports a service item (POST only) to the specified output format. Available only to users with an organizational subscription. Invokable only by the service item owner or an administrator."
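To make the URL shape concrete, here is a hedged sketch of assembling an Export Item request against the owner's user content rather than the feature service endpoint. The org URL, username, item id, and title are placeholders; the parameter names follow the Export Item page linked above:

```python
def build_export_request(org_url, username, item_id, token, export_format="Excel"):
    """Assemble the Export Item request: POST to the item under the
    OWNER's user content, not to the feature service REST endpoint."""
    url = f"{org_url}/sharing/rest/content/users/{username}/items/{item_id}/export"
    params = {
        "title": "my export",           # placeholder output item title
        "exportFormat": export_format,  # e.g. Excel, CSV, Shapefile
        "f": "json",
        "token": token,
    }
    return url, params

# Placeholder values - substitute your own org, user, item id, and token
url, params = build_export_request(
    "https://myorg.maps.arcgis.com", "jdoe", "abc123", "TOKEN")
print(url)
```

You would then POST `params` to `url` (e.g. with the requests library) and read exportItemId out of the JSON response.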
Posted 03-25-2021 02:06 PM
POST
|
Not that I know of - you may have to script it in your language of choice, or maybe FME if you have it? I think that long-running REST processes against large attachment layers are just problematic; we have had lots of issues with replication of these types of layers as well. Scripting small batch extracts, with lots of error checking and retrying on failure, seems like the best option to me.
Posted 03-24-2021 06:49 PM
POST
|
This article explains how to export from the REST endpoint using smaller batches of records, which should have a much better chance of working: https://support.esri.com/en/technical-article/000014156
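The batching in that article amounts to paging through the layer's query endpoint with resultOffset / resultRecordCount. A small sketch of generating those page parameters (the total count would come from a returnCountOnly query first):

```python
def page_params(total_records, page_size):
    """Yield (resultOffset, resultRecordCount) pairs that cover all records,
    for use with a layer's query endpoint alongside where=1=1 and f=json."""
    for offset in range(0, total_records, page_size):
        yield offset, min(page_size, total_records - offset)

pages = list(page_params(total_records=2500, page_size=1000))
print(pages)  # [(0, 1000), (1000, 1000), (2000, 500)]
```

Each (offset, count) pair becomes one small request, so a failure only costs you that page rather than the whole export.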
Posted 03-24-2021 01:28 PM
POST
|
I have had something strange happening that sounds sort of similar, also editing polygons in Pro. Sometimes the feature I just edited disappears when clicking outside or clearing the selection. The row is still in the attribute table, but no matter how I zoom, pan, or refresh the view, it will not display. If I save my edits, remove the layer, and add the layer back in, the feature displays again. This has happened 5 or 6 times now, but I cannot force it to happen. I am using Pro 2.7.2.

I have also once had the opposite: the feature remained on the screen even after I removed the layer. It did not disappear, even after zooming/panning, and I had to delete the map to get rid of it. I imagine this sort of problem would be quite hard to track down - maybe a memory or cache corruption issue? Anyone else experiencing strange things?
Posted 03-24-2021 05:28 AM
POST
|
Well, it's a slightly different error this time. My only suggestion now would be to go back to using the default Python environment (arcgispro-py3). What did you mean when you said it was "unable to read the dlpk"? I am also on Pro 2.7.2, and I installed the deep learning framework from the installer and use the default environment.
Posted 03-24-2021 05:20 AM
POST
|
By default the Landsat service will show you a 3-band combination, but the deep learning model expects 7 bands. In the properties of the Landsat service, under Processing Templates, you need to set the Processing Template to None. It worked for me once I did that. You should probably set your required extent in the geoprocessing environment settings as well.
Posted 03-23-2021 09:45 PM
POST
|
Can you provide some more information to assist with diagnosing the problem:

- ArcGIS Pro version and how you installed the deep learning components
- A screenshot of the Messages in the geoprocessing output (it's down the bottom)
- Information on the input raster you are using - type, bands, bits per pixel, columns, rows, coordinate system, etc.
Posted 03-23-2021 07:54 PM
POST
|
In Pro, it's important to set your map to the required coordinate system first, so I chose WGS84 for my empty map. Then go to the Insert tab and choose Connections -> New WMS Server. Paste the URL (the same as in your first post) and click OK. Then, in the Catalog pane, expand Servers, find the Ortofoto layer, and add it to your map.

I can also choose any of the WMS's listed coordinate systems and set my map to those, and the WMS will come in ok. For example, I chose ETRS 1989 UTM Zone 36N and the WMS came in as shown in the screenshot.

The default basemaps in AGOL are Web Mercator (EPSG:3857); that is why they won't overlap. You may have to find an actual WGS84 basemap. Hopefully someone from Esri who understands the ins and outs of AGOL will be able to help with that side of things.
Posted 03-22-2021 03:53 PM
POST
|
Just some further thoughts. You have already recognized that disaggregating data using population/building footprints has issues, because not all buildings are residential. Using land use to assist is good, but what happens when you get mixed land-use areas (which are really common), such as commercial/residential? One factor you may not have considered is that each footprint is not necessarily one family - what about high-density multi-story buildings with hundreds of people in a single footprint?

Your hexagons are very small relative to the overall area. The smaller you make them, the greater the chance of errors during the disaggregation. It wouldn't be hard to pick out some individual hexagons that are not correct and cast doubt on the entire map.

It all comes down to the purpose of your mapping - if it is being used for policing, healthcare, policy making, strategic planning, etc., then it might be quite important to ensure the results are accurate.
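To make the high-density building point concrete: a simple area-weighted disaggregation (a common baseline, sketched here with made-up numbers) silently assumes every square metre of footprint houses the same number of people:

```python
def area_weighted(population, footprints):
    """Split a zone's population across building footprints by area alone."""
    total = sum(footprints.values())
    return {bid: population * area / total for bid, area in footprints.items()}

# Made-up zone: 1000 people, a tower block and a bungalow with equal footprints
footprints = {"tower_20_storeys": 400.0, "bungalow": 400.0}
print(area_weighted(1000, footprints))
# Both get 500 people, even though the tower plausibly holds far more -
# footprint area alone cannot see building height or dwelling density.
```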
Posted 03-22-2021 03:04 PM
POST
|
The most common way of displaying raw data without normalizing is to use proportional symbols.
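Proportional symbols scale the symbol's area, not its radius, with the data value; otherwise large values look disproportionately huge. A sketch of the usual square-root scaling (the reference value and maximum size here are arbitrary):

```python
import math

def proportional_radius(value, max_value, max_radius=20.0):
    """Radius chosen so that symbol AREA is proportional to the data value."""
    return max_radius * math.sqrt(value / max_value)

for v in (25, 100, 400):
    print(v, round(proportional_radius(v, max_value=400), 1))
# 25 -> 5.0, 100 -> 10.0, 400 -> 20.0: quadrupling the value doubles the radius
```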
Posted 03-22-2021 02:46 PM
POST
|
The Zonal Statistics tool should work for this. The inputs would be your 100 m buffers and the DEM; choose MEAN as the statistic. It will return the average of the DEM values within each buffer, which you can then quite easily compare to the original DEM value to see whether the points are generally above or below their surroundings.
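The comparison step itself is simple once you have the zonal mean; a pure-Python sketch of the idea (the elevation values are invented for illustration):

```python
def classify_point(point_elev, buffer_elevs):
    """Compare a point's elevation to the mean DEM value within its buffer."""
    mean = sum(buffer_elevs) / len(buffer_elevs)
    if point_elev > mean:
        return "above surroundings"
    if point_elev < mean:
        return "below surroundings"
    return "level with surroundings"

# Invented example: point at 105 m, buffer cells averaging 100 m
print(classify_point(105.0, [98.0, 100.0, 102.0]))  # above surroundings
```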
Posted 03-21-2021 01:50 PM