POST
I was having the same issue: trying to schedule overwriting of a service on ArcGIS Online, with the script triggered from a machine running ArcGIS Pro 2.3. The following works for me. Make sure you include the in_override parameter:

arcpy.UploadServiceDefinition_server(sd_output_filename, "My Hosted Services", in_override="OVERRIDE_DEFINITION", in_public="PUBLIC")
02-01-2019 07:44 PM | 1 | 2 | 1031
POST
Having similar-ish issues. I have a script that downloads a NetCDF containing new time-slices of data (a weekly forecast). This overwrites the previous NetCDF in a folder. A Mosaic Dataset points to this NetCDF file, and the script then runs a Synchronize Mosaic Dataset. Somehow it still has a reference to the older forecast data, despite removing overviews as part of the sync. Any ideas?

arcpy.SynchronizeMosaicDataset_management(
    in_mosaic_dataset=r"D:\Work\BOM\workings\BOM_FTP.gdb\T_SFC_ArcMap",
    where_clause="",
    new_items="UPDATE_WITH_NEW_ITEMS",
    sync_only_stale="SYNC_STALE",
    update_cellsize_ranges="UPDATE_CELL_SIZES",
    update_boundary="UPDATE_BOUNDARY",
    update_overviews="UPDATE_OVERVIEWS",
    build_pyramids="NO_PYRAMIDS",
    calculate_statistics="CALCULATE_STATISTICS",
    build_thumbnails="NO_THUMBNAILS",
    build_item_cache="NO_ITEM_CACHE",
    rebuild_raster="REBUILD_RASTER",
    update_fields="UPDATE_FIELDS",
    fields_to_update="CenterX;CenterY;Dimensions;GroupName;ProductName;Raster;Shape;StdTime;Tag;Variable;ZOrder",
    existing_items="UPDATE_EXISTING_ITEMS",
    broken_items="REMOVE_BROKEN_ITEMS",
    skip_existing_items="SKIP_EXISTING_ITEMS",
    refresh_aggregate_info="NO_REFRESH_INFO",
    estimate_statistics="NO_STATISTICS")
print('Finished syncing Mosaic Dataset')

if __name__ == '__main__':
    main()
02-01-2019 01:42 AM | 0 | 0 | 839
POST
I handled it by deleting the zipped FGDB item after the update. You can actually bypass the above approach of uploading to the portal as an item (still a very valid approach for appending) and upload your fGDB directly against a hosted feature service. The benefit is that this data is cleaned up automatically by the back-end server, with no reliance on you having to delete it. Append definitely supports this via the appendUploadId parameter on the REST endpoint. But with the Python API, I can't spot anything that would let me do the upload in the first place: the append method does not have an equivalent of appendUploadId, just a portal item ID (item_id). I am OK working around this by using the requests module, but it might be worth including in a future release?
01-30-2019 02:22 AM | 0 | 1 | 895
POST
Hey again Simo. No, it's not so much the actual appending; it's the uploading of a fGDB against the hosted feature service itself (as opposed to as an item in the portal) that I can't spot an easy way of doing without using the requests library and working directly with the REST endpoint. Will see if I can share some code. Here is a JavaScript equivalent.
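In case it helps anyone following along, here is a rough sketch of the requests-based workaround. The uploads/upload and append endpoint paths and parameter names are from my reading of the ArcGIS REST API docs for Upload and Append (Feature Service), so treat all of it as an assumption to verify against your own service; the helper names and URLs are made up.

```python
import json

def build_upload_url(service_url):
    # e.g. https://services.arcgis.com/<org>/arcgis/rest/services/<svc>/FeatureServer
    return service_url.rstrip("/") + "/uploads/upload"

def build_append_payload(upload_id, token, upsert=True):
    # The REST endpoint expects lowercase true/false, hence json.dumps
    return {
        "f": "json",
        "token": token,
        "appendUploadId": upload_id,
        "appendUploadFormat": "filegdb",
        "upsert": json.dumps(upsert),
    }

# Usage sketch (network calls left commented out):
# import requests
# with open("data.gdb.zip", "rb") as f:
#     r = requests.post(build_upload_url(svc_url), files={"file": f},
#                       data={"f": "json", "token": token})
# upload_id = r.json()["item"]["itemID"]
# requests.post(layer_url + "/append", data=build_append_payload(upload_id, token))
```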
01-29-2019 06:09 PM | 0 | 3 | 895
POST
I have a NetCDF of temperature data, of which I get a fresh copy every morning. It is stored under a Mosaic Dataset created in ArcMap 10.6.1 (nothing else is in this mosaic dataset). I can serve this up as an image service using a stretched min/max renderer fine. I also want to serve it up with a classified renderer, ranging from -5 to 50. I can create a classification on a dummy raster that has min/max values of -5 to 50 (I have to use a dummy raster with the anticipated min/max values, as you can't edit the min/max range when setting up a classified renderer). I can then save this out as a raster function template. If I then import this against the original mosaic dataset, the classification looks like it is available as a processing template, but the legend appears as RGB (as shown below). I have tried playing around with adding in a colourmap function, but it still shows up as RGB. As an end web GIS user, I want to see the classification intervals and not the RGB values. How can I do this?
01-29-2019 05:20 AM | 0 | 3 | 3328
POST
I have a polygon feature class in a local file geodatabase. I want to append-upsert this against an existing hosted feature service in AGO (same schema). I know that you can upload the fGDB against the hosted feature service and then use this in an append with appendUploadId. I might be missing it, but the ArcGIS Python API seems to be missing this capability: performing an upload against an existing hosted feature service. I know there are other options: upload the fGDB to the portal as an item and then use this in an append (I'd prefer to avoid this approach), or convert the layer to a feature collection and pass this in directly with the edits param on append. I'm not entirely sure how to convert my feature class to a feature collection in order to do this, but I'm happy to attempt that approach, as then I can skip out on any uploading. Thanks for any pointers.
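On the feature-collection route: a feature collection is essentially Esri FeatureSet JSON, so one way to sidestep uploading entirely is to build that JSON yourself. A minimal sketch follows; the field name, polygon layout, and spatial reference are all made up, and whether append's edits param accepts exactly this shape is something to verify.

```python
import json

def rows_to_featureset(rows, wkid=4326):
    """Build Esri FeatureSet JSON from (attributes, rings) pairs.

    Illustrative only: match your own layer's schema and geometry type.
    """
    return {
        "geometryType": "esriGeometryPolygon",
        "spatialReference": {"wkid": wkid},
        "features": [
            {"attributes": attrs, "geometry": {"rings": rings}}
            for attrs, rings in rows
        ],
    }

fs = rows_to_featureset([({"name": "A"}, [[[0, 0], [0, 1], [1, 1], [0, 0]]])])
print(len(fs["features"]))  # → 1
```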
01-24-2019 02:36 AM | 0 | 5 | 1105
POST
Hey Simo, hope things are good with you. Too easy! Works a charm. Ashamed I didn't work that one out. Now onto tweaking the expression to filter using dates. Thanks.

replica = waze_flc.replicas.create(replica_name = 'Waze_Download',
layer_queries = {"0":{"queryOption": "useFilter", "where": "type = 'JAM'", "useGeometry": "false"}},
layers = '0',
data_format = 'filegdb',
out_path = './download')
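For the date filtering mentioned above, hosted feature layers accept TIMESTAMP 'YYYY-MM-DD HH:MM:SS' literals in WHERE clauses, with stored times treated as UTC. A small sketch for a rolling 24-hour window; the field name here is a placeholder, not one from the Waze service.

```python
from datetime import datetime, timedelta

def last_24h_where(field="pub_date", now=None):
    # Builds e.g.: pub_date >= TIMESTAMP '2019-01-08 01:13:00'
    # 'pub_date' is a placeholder field name; times assumed UTC.
    now = now or datetime.utcnow()
    cutoff = (now - timedelta(hours=24)).strftime("%Y-%m-%d %H:%M:%S")
    return "{} >= TIMESTAMP '{}'".format(field, cutoff)

print(last_24h_where("pub_date", datetime(2019, 1, 9, 1, 13)))
# → pub_date >= TIMESTAMP '2019-01-08 01:13:00'
```

The resulting string can be dropped straight into the "where" value of layer_queries above.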
01-09-2019 01:13 AM | 0 | 0 | 951
POST
Hi Dan. Love your work. Big thanks for taking the time to brain-dump a lot of your wisdom into GeoNet; it has saved me many a time. "try a simpler install path": for ArcGIS Pro, you mean? I see in a lot of your screenshots that you don't have it in Program Files, but directly under C:/. I think when 2.3 comes along I will do a clean wipe of Pro/Python/Anaconda and take your approach.
01-06-2019 02:12 PM | 1 | 6 | 4042
POST
Resolved it with the following:

conda clean --all --yes
conda update --all
conda install -c anaconda spyder
01-06-2019 02:22 AM | 1 | 0 | 4042
POST
Running ArcGIS Pro 2.2 on my own home PC (full admin access). I am having trouble installing Spyder.

(arcgispro-py3) C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3>conda install spyder
Fetching package metadata ...............
Solving package specifications: .
Package plan for installation in environment C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3:
The following NEW packages will be INSTALLED:
alabaster: 0.7.12-py36_0
astroid: 2.1.0-py36_0
atomicwrites: 1.2.1-py36_0
babel: 2.6.0-py36_0
cloudpickle: 0.6.1-py36_0
cryptography-vectors: 2.3.1-py36_0
docutils: 0.14-py36h6012d8f_0
fastcache: 1.0.2-py36hfa6e2cd_2
icu: 58.2-ha66f8fd_1
imagesize: 1.1.0-py36_0
isort: 4.3.4-py36_0
lazy-object-proxy: 1.3.1-py36hfa6e2cd_2
libsodium: 1.0.16-h9d3ae62_0
m2w64-gcc-libgfortran: 5.3.0-6
m2w64-gcc-libs: 5.3.0-7
m2w64-gcc-libs-core: 5.3.0-7
m2w64-gmp: 6.1.0-2
m2w64-libwinpthread-git: 5.0.0.4634.697f757-2
mccabe: 0.6.1-py36_1
msys2-conda-epoch: 20160418-1
numpy-base: 1.14.3-py36h555522e_1
numpydoc: 0.8.0-py36_0
packaging: 18.0-py36_0
pandoc: 2.2.3.2-0
prometheus_client: 0.5.0-py36_0
psutil: 5.4.7-py36hfa6e2cd_0
pycodestyle: 2.4.0-py36_0
pyflakes: 2.0.0-py36_0
pylint: 2.2.2-py36_0
pyqt: 5.9.2-py36h6538335_2
pywin32: 223-py36hfa6e2cd_1
qt: 5.9.6-vc14h62aca36_0
qtawesome: 0.5.3-py36_0
qtconsole: 4.4.3-py36_0
qtpy: 1.5.2-py36_0
rope: 0.11.0-py36_0
sip: 4.19.8-py36h6538335_0
snowballstemmer: 1.2.1-py36h763602f_0
sphinx: 1.8.2-py36_0
sphinxcontrib: 1.0-py36_1
sphinxcontrib-websupport: 1.1.0-py36_1
spyder: 3.3.2-py36_0
spyder-kernels: 0.3.0-py36_0
sqlite: 3.25.2-hfa6e2cd_0
typed-ast: 1.1.0-py36hfa6e2cd_0
wrapt: 1.10.11-py36hfa6e2cd_2
zeromq: 4.2.5-he025d50_1
The following packages will be UPDATED:
attrs: 17.4.0-py36_0 --> 18.2.0-py36h28b3542_0
bleach: 2.1.3-py36_0 --> 3.0.2-py36_0
certifi: 2018.1.18-py36_0 --> 2018.11.29-py36_0
cffi: 1.11.5-py36h945400d_0 --> 1.11.5-py36h74b6da3_1
chardet: 3.0.4-py36h420ce6e_1 --> 3.0.4-py36_1
colorama: 0.3.9-py36h029ae33_0 --> 0.4.1-py36_0
cryptography: 2.2.2-py36hfa6e2cd_0 --> 2.3-py36h74b6da3_0
decorator: 4.2.1-py36_0 --> 4.3.0-py36_0
entrypoints: 0.2.3-py36hfd66bb0_2 --> 0.2.3-py36_2
future: 0.16.0-py36_1 esri --> 0.17.1-py36_0
html5lib: 1.0.1-py36h047fa9f_0 --> 1.0.1-py36_0
idna: 2.6-py36h148d497_1 --> 2.8-py36_0
intel-openmp: 2018.0.0-arcgispro_0 esri [arcgispro] --> 2018.0.3-arcgispro_0 esri [arcgispro]
ipykernel: 4.8.2-py36_0 --> 5.1.0-py36h39e3cac_0
ipython: 6.3.1-py36_0 --> 7.2.0-py36h39e3cac_0
ipywidgets: 7.2.1-py36_0 --> 7.4.2-py36_0
jedi: 0.11.1-py36_0 esri --> 0.13.2-py36_0
jinja2: 2.10-py36h292fed1_0 --> 2.10-py36_0
jupyter_client: 5.2.3-py36_0 --> 5.2.4-py36_0
jupyter_core: 4.4.0-py36h56e9d50_0 --> 4.4.0-py36_0
keyring: 11.0.0-py36_0 esri --> 17.0.0-py36_0
kiwisolver: 1.0.1-py36h12c3424_0 --> 1.0.1-py36h6538335_0
markupsafe: 1.0-py36h0e26971_1 --> 1.0-py36hfa6e2cd_1
mistune: 0.8.3-py36_0 --> 0.8.3-py36hfa6e2cd_1
more-itertools: 4.1.0-py36_0 --> 4.3.0-py36_0
mpmath: 1.0.0-py36hacc8adf_2 --> 1.1.0-py36_0
nose: 1.3.7-py36h1c3779e_2 --> 1.3.7-py36_2
notebook: 5.4.1-py36_0 --> 5.7.4-py36_0
numpy: 1.14.2-py36h5c71026_1 --> 1.14.3-py36h9fa60d3_1
openpyxl: 2.5.2-py36_0 --> 2.5.12-py36_0
pandas: 0.22.0-py36h6538335_0 --> 0.23.4-py36h830ac7b_0
pandocfilters: 1.4.2-py36h3ef6317_1 --> 1.4.2-py36_1
parso: 0.1.1-py36hae3edee_0 --> 0.3.1-py36_0
pickleshare: 0.7.4-py36h9de030f_0 --> 0.7.5-py36_0
pip: 9.0.3-py36_0 --> 18.1-py36_0
pluggy: 0.6.0-py36hc7daf1e_0 --> 0.8.0-py36_0
prompt_toolkit: 1.0.15-py36h60b8f86_0 --> 2.0.7-py36_0
py: 1.5.3-py36_0 --> 1.7.0-py36_0
pycparser: 2.18-py36hd053e01_1 --> 2.19-py36_0
pygments: 2.2.0-py36hb010967_0 --> 2.3.1-py36_0
pyopenssl: 17.5.0-py36h5b7d817_0 --> 18.0.0-py36_0
pyparsing: 2.2.0-py36h785a196_1 --> 2.3.0-py36_0
pytest: 3.5.0-py36_0 --> 4.0.2-py36_0
python-dateutil: 2.7.2-py36_0 --> 2.7.5-py36_0
pytz: 2018.3-py36_0 esri --> 2018.7-py36_0
pywin32-ctypes: 0.1.2-py36_0 esri --> 0.2.0-py36he119be9_0 esri
pywinpty: 0.5-py36_0 esri --> 0.5.5-py36_1000
pyzmq: 17.0.0-py36hfa6e2cd_0 --> 17.1.2-py36hfa6e2cd_0
requests: 2.18.4-py36h4371aae_1 --> 2.21.0-py36_0
scipy: 1.0.1-py36hce232c7_0 --> 1.1.0-py36h672f292_0
setuptools: 39.0.1-py36_0 --> 40.6.3-py36_0
six: 1.11.0-py36h4db2310_1 --> 1.12.0-py36_0
sympy: 1.1.1-py36_0 esri --> 1.3-py36_0
testpath: 0.3.1-py36h2698cfe_0 --> 0.4.2-py36_0
tornado: 5.0.2-py36_0 --> 5.1.1-py36hfa6e2cd_0
urllib3: 1.22-py36h276f60a_0 --> 1.24.1-py36_0
vs2015_runtime: 14.0.25420-0 esri --> 14.15.26706-h3a45250_0
webencodings: 0.5.1-py36h67c50ae_1 --> 0.5.1-py36_1
wheel: 0.31.0-py36_0 --> 0.32.3-py36_0
widgetsnbextension: 3.2.1-py36_0 --> 3.4.2-py36_0
win_inet_pton: 1.0.1-py36he67d7fd_1 --> 1.0.1-py36_1
winkerberos: 0.7.0-py36_0 --> 0.7.0-py36_1
xlrd: 1.1.0-py36h1cb58dc_1 --> 1.2.0-py36_0
The following packages will be SUPERSEDED by a higher-priority channel:
nbconvert: 5.3.1-py36_1 esri --> 5.3.1-py36_0
send2trash: 1.5.0-py_0 conda-forge --> 1.5.0-py36_0
Proceed ([y]/n)? y
DEBUG menuinst_win32:__init__(199): Menu: name: 'Anaconda${PY_VER} ${PLATFORM}', prefix: 'C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3', env_name: 'arcgispro-py3', mode: 'None', used_mode: 'system'
DEBUG menuinst_win32:create(323): Shortcut cmd is C:\Users\Simon\AppData\Local\Temp\_MEI59642\python.exe, args are ['C:\\Users\\Simon\\AppData\\Local\\Temp\\_MEI59642\\cwp.py', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3"', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3\\python.exe"', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3\\Scripts\\jupyter-notebook-script.py"', '%USERPROFILE%']
ERROR conda.core.link:_execute_actions(337): An error occurred while installing package 'defaults::fastcache-1.0.2-py36hfa6e2cd_2'.
CondaError: Cannot link a source that does not exist. C:\Program Files\ArcGIS\Pro\bin\Python\pkgs\fastcache-1.0.2-py36hfa6e2cd_2\Lib\site-packages\fastcache\__pycache__\__init__.cpython-36.pyc
Attempting to roll back.
DEBUG menuinst_win32:__init__(199): Menu: name: 'Anaconda${PY_VER} ${PLATFORM}', prefix: 'C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3', env_name: 'arcgispro-py3', mode: 'None', used_mode: 'system'
DEBUG menuinst_win32:create(323): Shortcut cmd is C:\Users\Simon\AppData\Local\Temp\_MEI59642\python.exe, args are ['C:\\Users\\Simon\\AppData\\Local\\Temp\\_MEI59642\\cwp.py', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3"', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3\\python.exe"', '"C:\\Program Files\\ArcGIS\\Pro\\bin\\Python\\envs\\arcgispro-py3\\Scripts\\jupyter-notebook-script.py"', '%USERPROFILE%']
CondaError: Cannot link a source that does not exist. C:\Program Files\ArcGIS\Pro\bin\Python\pkgs\fastcache-1.0.2-py36hfa6e2cd_2\Lib\site-packages\fastcache\__pycache__\__init__.cpython-36.pyc
(arcgispro-py3) C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3>
01-05-2019 11:17 PM | 0 | 9 | 5013
POST
I have a public hosted feature service in ArcGIS Online (test data) which has SYNC enabled. Using the Python API I want to download a file geodatabase, but also include a WHERE clause to download a subset of the data. I am having a problem with the WHERE clause. I can perform a standard REST query to get some records from an individual layer: WHERE type='JAM'. I can also successfully download a file geodatabase using the createReplica endpoint via REST. However, if I want to apply a WHERE clause, I am having problems getting the layerQueries parameter to work via REST:

{"0":{"queryOption": "useFilter", "where": "type = JAM", "useGeometry": false}}
{"0":{"where": "type = JAM"}}

Both of the above result in: Exporting data for layer 0 failed. Can anyone point out what is wrong with my layerQueries param? I am sure it must be something simple. I intend on using this via Python as below, but I figure I need to be able to get it working directly against the REST API before troubleshooting in Python.

replica = waze_flc.replicas.create(replica_name = 'Waze_Download',
    layer_queries = {"0":{"queryOption": "useFilter", "where": "type = JAM", "useGeometry": false}},
    layers = '0',
    data_format = 'filegdb',
    out_path = './download')

The field I actually need to perform the WHERE clause on is a timestamp, to get the last 24 hours of data.
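For reference, the variant that eventually worked (per the reply further up this page) quotes the string literal: the WHERE value needs its own single quotes around JAM inside the JSON. Building layerQueries as a dict and serializing with json.dumps makes the quoting explicit:

```python
import json

# The failing attempts above leave JAM unquoted; a string literal in the
# WHERE clause needs its own single quotes inside the JSON value.
layer_queries = {
    "0": {
        "queryOption": "useFilter",
        "where": "type = 'JAM'",
        "useGeometry": "false",
    }
}
print(json.dumps(layer_queries))
```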
01-04-2019 07:49 PM | 0 | 2 | 1182
POST
Using createReplica gets around the credits issue, but the service needs SYNC enabled (which has more overhead for the publisher compared to Extract).

replica = waze_flc.replicas.create(replica_name = 'Waze_Download',
layers = '0',
data_format = 'filegdb',
out_path = './download')
01-04-2019 06:43 PM | 0 | 0 | 959
POST
We have a need to build a very lightweight Web Application Template that will be reusable by both ArcGIS Online and ArcGIS Enterprise portal users. This will be for 2D Web Maps, not 3D Web Scenes.
Has anyone built a configurable Web Application template and leveraged esri-leaflet?
We will most likely use the Esri JSAPI and try to create as lightweight an app template as we can. We have not investigated this much, but anticipate there are limitations: some functionality that a portal user can save to a Web Map may not flow through to esri-leaflet, e.g. Arcade expressions, charts in popups, hosted feature layer views, etc.
Would love to know if we are wrong though!
Create configurable app templates—ArcGIS Online Help | ArcGIS
https://github.com/Esri/application-base-js
11-25-2018 02:57 AM | 0 | 0 | 631
POST
Hey Robert. Encountering this issue as well: ArcGIS Online (HTTPS enforced), public services/Web Map, zip from Dev-WAB uploaded to S3 (which does not have HTTPS enabled via a domain name in front of it), ConfigManager.js error. I assume I need to either get a cert for my domain name or consider enabling HTTP on my org account (for testing). I tried changing the config portalURL to use http, but same problem.
11-08-2018 01:16 AM | 0 | 1 | 1388
POST
At the moment I am investigating doing this with a view. That way the crunching can be done server-side, and it looks like it won't incur any credits. Set hosted feature layer view definition—ArcGIS Online Help | ArcGIS. Will report back my findings...
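As a sketch of the view approach: a definition query can be pushed onto a hosted feature layer view with update_definition from the ArcGIS API for Python's layer manager. The layer index, field name, and query value below are assumptions, and the portal calls are left commented out.

```python
# Server-side filtering via a view's definition query; the filter is
# applied by the service, so clients never see excluded rows.
payload = {"viewDefinitionQuery": "type = 'JAM'"}  # hypothetical query

# view_layer = view_item.layers[0]               # hosted feature layer view
# view_layer.manager.update_definition(payload)  # apply the filter
print(payload["viewDefinitionQuery"])
```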
10-30-2018 05:53 PM | 1 | 0 | 991