Sorry for answering so late. I confirm that the gdb was created with ArcGIS 9.3. I had already seen that the da commands are not available in 9.3 and was hoping someone could give me a clue on how to work around the problem, as I tried to work around the arcpy function, also not available, with something I found on the net. But this is too much for my limited knowledge and I have to give up.
€3,600 for the software update seems too much for the needs of this work and for the company's budget.
Thank you anyway
Bye
Ialina
Hi Ialina Vinci,
If you don't use ArcGIS on a regular basis, or GIS is not your core business, I can imagine that paying that amount for maintenance can simply be too much. However, if you do work with ArcGIS intensively, paying maintenance and having access to the newest versions can provide a lot of benefits.
For now, I can offer to run the process for you if you send me your data, but if this is something you want to apply to multiple datasets, it might be better to adapt the script to 9.3. I would offer to help you with that, but I don't have access to this version of ArcGIS, which will make development a lot harder. A lot changed at 10.x with the introduction of arcpy.
Kind regards, Xander
I use GIS every day and have an ArcMap 10.1 licence on my PC, but the PC with the Spatial Analyst and 3D Analyst licences has remained at 9.3 due to budget cuts at the office where I work.
We don't use Spatial Analyst on a regular basis, that's why.
Since I would have to run the script on several maps, and our DTM is very heavy, I would prefer to have the script adapted to 9.3, but I understand that might be very difficult.
bye
Ialina
Hi Ialina Vinci,
I understand. I can help you with the conversion to 9.3, but this will be an interactive process where you will have to do the testing and post back any errors. This may take some time. If that is no problem for you, I can give it a shot.
Kind regards, Xander
Thanks for this. An approach you might consider is to use RasterToNumPyArray rather than looping through the extracts for each polygon. In my kludge to do something similar (I actually wanted cumulative frequency curves), I extracted using RasterToNumPyArray, flattened the array, sorted it ascending and was working on calculating the percentiles when I said, "heck with it", and sent it all to Excel to finish. But the processing might be more efficient working in numpy.
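For what it's worth, a rough sketch of that route might look like the code below. This is not a tested or definitive implementation: it assumes 10.x arcpy, a zone raster already aligned with the DEM (e.g. built with PolygonToRaster using the same cell size and the DEM as snap raster), and the paths, the -9999 NoData placeholder and the percentile list are all placeholders of my own, not part of any script posted in this thread.

import arcpy
import numpy as np

# Placeholder paths - substitute your own DEM and zone raster.
dem = arcpy.Raster(r"C:\data\dtm")
zones = arcpy.Raster(r"C:\data\zone_raster")

NODATA = -9999  # placeholder value used to flag NoData cells in the arrays

# Read both rasters into numpy once; they must share extent and cell size
# so the two arrays line up cell for cell.
dem_arr = arcpy.RasterToNumPyArray(dem, nodata_to_value=NODATA)
zone_arr = arcpy.RasterToNumPyArray(zones, nodata_to_value=NODATA)

# For each zone id, grab its cells with boolean indexing and let numpy
# do the percentile work instead of extracting a raster per polygon.
for zone_id in np.unique(zone_arr[zone_arr != NODATA]):
    values = dem_arr[(zone_arr == zone_id) & (dem_arr != NODATA)]
    if values.size:
        p10, p50, p90 = np.percentile(values, [10, 50, 90])
        print(zone_id, p10, p50, p90)

If the DEM is very heavy, RasterToNumPyArray also accepts a lower-left corner and a number of columns/rows, so it can be read in blocks rather than all at once.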
Hi All
Given this is over three years old... is this still the best way to calculate a percentile, since Zonal Statistics still doesn't support it?
Thanks
I haven't seen any new tool that does this. However, it might be a good thing to create an idea and upvote it.
And upvoted...
Hi Xander Bakker. Like other users, I followed this conversation with interest and I am using your excellent script to calculate percentiles from a polygon shapefile and a raster. I have tried several iterations of your code, and all of them run without errors, with one big exception: for each polygon, the value obtained is the same, regardless of which percentile is being calculated. I am aware that another user had the same problem, and I have tried the script you posted in response, but the problem remains. Moreover, in my case it is guaranteed that each polygon encompasses many, many raster cells.
Would you please take a look at the sample polygon shapefile and raster I am attaching, so that you may find what is wrong? Thanks a lot!
Best Regards,
Rafaello Bergonse