POST
We're currently using ArcMap 10.4.1 to run a process we built in ModelBuilder. As it stands, the model takes an extremely long time to run, and Task Manager shows that only one of the four cores on our machine is being used. Is there a way to tell ArcMap to run a geoprocessing job across all four cores? The only information I've found so far covers background processing, which lets a job run while you continue drawing and editing the map itself, and how to enable it in ArcGIS Pro or when running Python scripts. If anyone has other tips for speeding up geoprocessing jobs, we're open to those too.
07-25-2018 01:08 PM | 0 | 2 | 575
POST
I have a table of addresses that I'm hoping to geocode so that I can perform some zonal statistics on them. I don't want to build my own address locator, so I was hoping to use the ArcGIS Online World Geocoding Service. When you use this geocoding service, does your table get sent out to their servers and live online, or does the service geocode on the local machine/environment? The addresses I'm using are PHI, so they shouldn't be shared or made accessible through the general web. Additionally, if ArcGIS Online does save this information, or is not protective of it, what address locator service could I use that's stored on my local server instead?
06-25-2018 10:03 AM | 0 | 1 | 450
POST
I've attached the buffer shapefile as well as a clip of our results (both from the run where the tool worked and from the run where it did not). I can't include the raster data, as that would give away protected information about the participants in our study; it's just an NDVI map of a county in Wisconsin. The area calculated for each buffer should be about 198,000. Some are expected to be a little less than that because we masked all water out of the NDVI layer, and on the run that worked this held true. However, when we ran it again, some of the calculated areas were as low as 15,300 because of the amount of overlap among some of our buffers. I used the exact same buffer shapefile both times and the exact same Landsat data to create the NDVI layer. The only thing I can see being different is the workspace, and I didn't change any geoprocessing environment settings. Let me know if there's any other information I could give that might be useful!
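As a quick sanity check on that expected value (assuming the 250 m buffer radius from our related zonal statistics post, with areas presumably in square meters), the planar area of a full circular buffer comes out in the right ballpark of ~198,000, so values near 15,300 clearly reflect lost area rather than measurement noise:

```python
import math

radius_m = 250  # buffer radius (m), from the related zonal statistics post
exact_area = math.pi * radius_m ** 2  # planar area of an unclipped buffer
print(round(exact_area))  # ~196,350 m^2, close to the ~198,000 observed
```

The small gap between ~196,350 and ~198,000 is plausibly explained by areas being tallied from whole raster cells rather than the exact circle.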
06-12-2018 06:57 AM | 0 | 0 | 585
POST
We've successfully run Zonal Statistics As Table 2 (from the Spatial Analyst Supplemental Tools) on one of our map documents to extract a mean NDVI value from 250 m buffers, some of which overlap each other. That map document is rather large and holds a lot of data, since we've used it to test different processing in the past. Because of this, we decided to start over and import and run only the steps necessary to extract the data we want. Both documents started with the exact same TIFFs and shapefiles before further processing. Running the tool on this new map document no longer works: it behaves as if it were the original Zonal Statistics tool. I've double-checked that the geoprocessing environment is the same. Looking at the script, it appears the tool tries to run the code that allows for overlapping polygons and, if it can't, runs the normal version; the script is a little dense, so I'm not 100% sure that's true. I've attached the Python script for reference. Does anyone have any idea why this new document runs the exception/normal zonal statistics path while my older/larger document runs correctly?
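To illustrate what I think the script is doing (the function names here are hypothetical stand-ins, not the actual supplemental tool code), it appears to follow a try/except fallback pattern like this, which would explain why a failure in the overlap-aware branch silently produces ordinary zonal statistics output:

```python
def zonal_stats_overlapping(zones, raster):
    # Stand-in for the supplemental tool's overlap-aware branch.
    # Simulate that branch failing, as it seems to in the new document.
    raise RuntimeError("overlap-aware path unavailable")

def zonal_stats_standard(zones, raster):
    # Stand-in for the ordinary Zonal Statistics As Table fallback.
    return "standard"

def run(zones, raster):
    # Attempt the overlap-aware computation; fall back silently on failure.
    try:
        return zonal_stats_overlapping(zones, raster)
    except Exception:
        return zonal_stats_standard(zones, raster)

print(run("buffers.shp", "ndvi.tif"))
```

If the real script is structured this way, a broad `except` would hide whatever error (a licensing, path, or workspace difference, perhaps) is tripping the first branch in the new document.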
06-07-2018 12:11 PM | 0 | 2 | 860
Online Status: Offline
Date Last Visited: 11-11-2020 02:25 AM