POST
Not a spatial problem per se, but the SciPy library does a pretty good job with problems like this. It will ship with the 10.3 release, and can be downloaded and installed separately for 10.x. With SciPy in place, convert your raster to a NumPy array, then use the statistics built into SciPy to do the calculation:
import arcpy
import scipy.stats

rast_path = 'C:/my_input_raster.tif'
raster_as_numpy_array = arcpy.RasterToNumPyArray(rast_path)
raster_geometric_mean = scipy.stats.gmean(raster_as_numpy_array, axis=None)
For a raster of 10M values, this takes a few seconds to run on my machine. Hope that helps, Shaun Edit: Michael August, I've updated my answer to include 'axis=None', which computes the geometric mean for the whole matrix (instead of along one axis). With this change, it should work without any further steps.
Posted 11-06-2014 02:59 PM

POST
The responses in this thread will be useful. You can't directly modify the currently open MXD, but if you're OK with opening a new ArcMap session, you can do something like:
import os
os.startfile(mxd_path)
If you go this route, you don't need the arcpy.mapping call at all; just point os.startfile at the actual path of your MXD file.
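A minimal end-to-end sketch (the MXD path below is hypothetical):
import os

# Hypothetical path to the map document to open in a new ArcMap session
mxd_path = r"C:\projects\my_map.mxd"

if os.path.exists(mxd_path):
    # Opens the file with its associated application (ArcMap for .mxd files)
    os.startfile(mxd_path)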
Posted 08-27-2014 07:49 AM

POST
Updating Excel spreadsheets can be tricky, in part due to the particulars of the Excel file format. That said, since the 10.2 release you can use the xlrd and xlwt Python modules (which ship with ArcGIS) to read and write Excel spreadsheets. You can combine these two modules with an additional module, xlutils, as shown in this example: python - writing to existing workbook using xlwt - Stack Overflow. Depending on what you're doing, you may have an easier time just writing a new spreadsheet with xlwt alone and using some semantic naming to track which spreadsheet is the current iteration.
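As a minimal sketch of the write-a-new-workbook approach (the file name, sheet name, and sample values below are hypothetical):
import xlwt

# Create a new workbook and worksheet
workbook = xlwt.Workbook()
sheet = workbook.add_sheet('results')

# Write a header row, then a few data rows
sheet.write(0, 0, 'feature_id')
sheet.write(0, 1, 'area')
for row, (feature_id, area) in enumerate([(1, 10.5), (2, 7.25)], start=1):
    sheet.write(row, 0, feature_id)
    sheet.write(row, 1, area)

# Save with a name that encodes the iteration, e.g. a date stamp
workbook.save('C:/workspace/results_2014_08_05.xls')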
Posted 08-05-2014 08:50 AM

POST
I've found that a pattern I use for Python toolboxes works pretty well here: place your code into a function with named parameters, and then, depending on how the script is called, pass in either the command-line arguments or the parameters from ArcPy. You should be able to use this same pattern by detecting whether you have any input command-line parameters (e.g. by checking len(sys.argv)); if so, use those, and if not, fall back to the GetParameterAsText arguments. Something like this:
import sys
import arcpy

def main(input_fc=None, output_fc=None):
    # your main script body goes here
    pass

# executed as a script
if __name__ == '__main__':
    if len(sys.argv) == 3:
        # we were passed command-line parameters, execute in a script context
        input_fc = sys.argv[1]
        output_fc = sys.argv[2]
    else:
        # no command-line arguments, so assume a toolbox context
        input_fc = arcpy.GetParameterAsText(0)
        output_fc = arcpy.GetParameterAsText(1)
    # call the main function with our parameters
    main(input_fc, output_fc)
Posted 08-05-2014 07:53 AM

POST
If I'm reading your code correctly, your TBX file lives at the top level of your 'Install' directory. When you build an add-in, the files inside 'Install' are all placed into a GUID-based directory within the user's home directory, and any paths should be relative to this. You can read more about using files from the 'Install' directory in the documentation, under the heading "File and folder structure". So, the path to the toolbox should be something like:
toolPath = os.path.join(relPath, "getIR.tbx")
One way you can test this is by changing your code to pop up a message box instead of executing the tool:
def onClick(self):
    pythonaddins.MessageBox("toolPath is set to {}.".format(toolPath), "Expected Path")
cheers, Shaun
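For reference, relPath in the snippet above is typically derived from the add-in script's own location; a minimal sketch, assuming the script sits directly in the Install folder alongside getIR.tbx:
import os

# The folder the add-in script was deployed to (the per-user, GUID-based copy
# of the Install directory)
relPath = os.path.dirname(__file__)

# Build the toolbox path relative to that folder
toolPath = os.path.join(relPath, "getIR.tbx")
print(toolPath)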
Posted 07-29-2014 01:46 PM

POST
This is a general issue when using SQL aggregate functions: you need to include the additional column you're grouping on in the query. In this case, that means using a subquery with something like MIN(OBJECTID), grouped by that column, in the WHERE clause of the SQL statement.
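A hedged sketch of that pattern (the table, layer, and field names are hypothetical, and the grouped subquery assumes a data source whose SQL dialect supports it, such as an enterprise geodatabase):
import arcpy

# Keep one record per GroupField value: the one with the lowest OBJECTID
where_clause = ("OBJECTID IN "
                "(SELECT MIN(OBJECTID) FROM my_table GROUP BY GroupField)")

arcpy.SelectLayerByAttribute_management("my_layer", "NEW_SELECTION", where_clause)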
Posted 06-10-2014 09:47 PM

POST
umgrad, I haven't used it recently, but unless you need the specific behavior GME provides, the approach I mentioned above should do what you're asking for. If you're getting errors with GME specifically, you'll probably have to contact its author for help; I'm not sure whether there is a public support forum specifically for GME. cheers, Shaun
Posted 04-01-2014 01:35 PM

POST
Marion, The problem is that you're initially creating a shapefile with type "POINT", but then trying to write polylines to it. Change the CreateFeatureclass call to:
arcpy.CreateFeatureclass_management("C:/workspace", output, "POLYLINE")
cheers, Shaun
Posted 03-31-2014 05:32 PM

POST
TA, First, create a raster that classifies your raster into a binary '>30%' or '<=30%' grid. You can use, for example, the Raster Calculator to compute this. Once you have that in hand, the operation you're looking for is called zonal statistics: you want sums based on the polygons you have. You should be able to use the Zonal Statistics as Table tool for that step. cheers, Shaun
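A rough sketch of the two steps using the Spatial Analyst module (the paths, zone field name, and output table are placeholders):
import arcpy
from arcpy.sa import Con, Raster, ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

# Hypothetical inputs: a percent-cover raster and a polygon zone dataset
percent_raster = Raster("C:/data/percent_cover.tif")
zones = "C:/data/zones.shp"

# Step 1: classify into 1 (> 30%) and 0 (<= 30%)
binary = Con(percent_raster > 30, 1, 0)

# Step 2: sum the binary cells within each polygon
ZonalStatisticsAsTable(zones, "ZONE_ID", binary,
                       "C:/data/zonal_sums.dbf", "DATA", "SUM")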
Posted 03-31-2014 05:02 PM

POST
Francisco, I don't think this is due to your Python installation; it looks like a bug in the particular function you're using. I've suggested one option for fixing it in the other thread you posted.
Posted 03-28-2014 06:16 PM

POST
Goerlich, It looks like that particular tool has a bug in how it handles Unicode characters beyond the base 7-bit ASCII set. In particular, it is failing on the 'á' character (0xE1 in hexadecimal). That character appears in one of your paths, so could you try running the script from a directory that has only plain ASCII characters in its name? This is a bug and should be filed as such, but the workaround will at least get you up and running for the time being. cheers, Shaun
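If it helps to confirm which character is the culprit, here's a small diagnostic sketch (the path below is hypothetical):
# Flag any characters in a path that fall outside 7-bit ASCII
script_path = u"C:\\datos\\análisis\\my_script.py"
non_ascii = [(i, ch) for i, ch in enumerate(script_path) if ord(ch) > 127]
print(non_ascii)  # e.g. [(11, u'\xe1')] -- the 'á' character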
Posted 03-28-2014 06:13 PM

POST
I don't think it's possible using ArcPy directly. If you really need it, you could call a DLL containing the relevant ArcObjects code from a script tool, but that's non-trivial to set up if you just want something simple. cheers, Shaun
Posted 03-28-2014 10:15 AM

POST
Rebecca, There shouldn't be any differences in support for GRID files between releases; I suspect it's something else. Is the geoprocessing environment set up to change the cell size? It's also possible that the rasters differ in resolution and some automatic conversion is going on to get them to match up before the calculation. Could you post some more details on the rasters themselves? If you do think it's explicitly an issue with them being GRID files, try converting the files to GeoTIFF and performing the same analysis -- if the same problem occurs, we'll know it's in the calculation step and independent of the file format. cheers, Shaun
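A quick way to check the first two possibilities (the raster paths below are hypothetical):
import arcpy

# Compare the native cell sizes of the two rasters
for path in (r"C:\data\grid_a", r"C:\data\grid_b"):
    desc = arcpy.Describe(path)
    print("{}: {} x {}".format(path, desc.meanCellWidth, desc.meanCellHeight))

# Check whether the geoprocessing environment overrides the cell size
print("Environment cell size: {}".format(arcpy.env.cellSize))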
Posted 03-28-2014 10:05 AM

POST
Ben, Those are good questions, and I think doing an accuracy assessment of the variation in the years of Landscan would be a great thing to have, and could probably stand up as its own publication. I'm not aware of any specific group which has done this, and in the past when I've worked with Landscan we've made some rather crude assumptions when looking at the data over time. I don't think this has been done before, in part because the Landscan data has a cost, and is less frequently used in analyses than its free counterparts like CIESIN's gridded population datasets. I haven't done time-effects models, but I imagine that with some careful modeling this could be done meaningfully. You may want to look more into the existing literature published by users of Landscan to see if others have tried similar approaches. If you're only interested in particular parts of the world, you may be able to incorporate census data to predict expected differences and account for the error in Landscan. Overall, it sounds like a great project, but I'm not aware of anyone who's done this already. cheers, Shaun
Posted 03-28-2014 10:01 AM

POST
It looks like it might be a 3D localization file produced by TOPCON's TopSURV, as it's documented in their manual. Your best bet is probably to try the tools they provide and see whether the file can be converted to something usable outside their software. It may also just be a text file, but I'm not sure what the values represent -- TOPCON can probably help in interpreting what it contains and how it can be applied in a GIS context. cheers, Shaun
Posted 03-28-2014 09:49 AM