POST
As promised, please find attached my toolbox (ArcGIS 10.0) containing a model that iterates through each feature in a feature class, passes it to Zonal Statistics to calculate the average slope for each polygon (watershed catchment), and writes the results out to a single table. Please note that I've used this model to process over 1,200 polygons with no problems, and it runs fairly quickly. I'll post the updated model, which identifies the overlapping features and processes them separately to minimise the number of iterations, shortly; I'm having problems with my internet connection at the office. Regards
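For anyone who can't open the attached toolbox, the core of the model can be sketched in ArcPy. This is a minimal sketch, not the attached model itself: `zones_fc`, `slope_raster`, and `out_table` are placeholder names, and the `arcpy.da` cursor assumes 10.1 (at 10.0 you would use `arcpy.SearchCursor` instead). The pure-Python `merge_stat_rows` helper illustrates the aggregation step (collecting each iteration's single-row result into one table) independently of arcpy.

```python
def merge_stat_rows(per_zone_rows):
    """Combine per-iteration zonal statistics results (one dict per zone)
    into a single {zone_id: mean_value} table, flagging duplicate zones."""
    merged = {}
    for row in per_zone_rows:
        zone_id = row["zone_id"]
        if zone_id in merged:
            raise ValueError("zone %r appears twice" % zone_id)
        merged[zone_id] = row["MEAN"]
    return merged


def mean_slope_per_catchment(zones_fc, slope_raster, out_table):
    """ArcPy sketch of the model: iterate the catchment polygons one at a
    time so overlapping zones cannot shadow each other, run Zonal
    Statistics as Table on each, and append all rows to one table."""
    import arcpy
    from arcpy.sa import ZonalStatisticsAsTable

    arcpy.CheckOutExtension("Spatial")
    oid_field = arcpy.Describe(zones_fc).OIDFieldName
    layer = arcpy.management.MakeFeatureLayer(zones_fc, "zone_lyr")
    for (oid,) in arcpy.da.SearchCursor(zones_fc, [oid_field]):
        where = "%s = %d" % (oid_field, oid)
        arcpy.management.SelectLayerByAttribute(layer, "NEW_SELECTION", where)
        tmp = "in_memory/zs_%d" % oid
        # "DATA" ignores NoData cells; "MEAN" gives the average slope.
        ZonalStatisticsAsTable(layer, oid_field, slope_raster, tmp, "DATA", "MEAN")
        if arcpy.Exists(out_table):
            arcpy.management.Append(tmp, out_table, "NO_TEST")
        else:
            arcpy.management.CopyRows(tmp, out_table)
        arcpy.management.Delete(tmp)
```

The per-feature selection is what makes overlaps safe: each Zonal Statistics call only ever sees one zone, so no polygon can mask another.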
10-10-2012 11:18 AM | 0 | 0 | 2542

POST
I wouldn't say it's a thing of the past. Setting up the model is not trivial for people new to ModelBuilder, especially the way you aggregate the results. Please do post your model; IMHO this is a good "real-world" example of how iteration can be used to great advantage in Arc 10. The tool Sue just posted is better than a brute-force iteration, though: it separates your overlapping polygons into non-overlapping groups, so if you have thousands of overlapping polygons, this could mean a few hundred iterations instead of thousands.

Hi Curtis, I agree that using an iterator to deal with the overlaps is a bit brute-force. As a refinement, I could alter the tool to deal only with the features that overlap and pass them as a single "in_memory" process to Zonal Statistics, thereby reducing the number of iterations required. I'll post my original model tonight when I get home from work, as well as an adjusted model that deals with the overlaps only. Regards
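The grouping idea Curtis describes can be sketched without arcpy: greedily assign each polygon to the first group that contains nothing it overlaps, then run Zonal Statistics once per group. The sketch below stands in bounding boxes `(xmin, ymin, xmax, ymax)` for polygons; a real model would test true geometric overlap (e.g. with an Intersect), but the grouping logic is the same.

```python
def boxes_overlap(a, b):
    """True if two (xmin, ymin, xmax, ymax) boxes intersect."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])


def group_non_overlapping(boxes):
    """Greedily split polygons (here: bounding boxes) into groups whose
    members do not overlap each other, so each group can be sent to
    Zonal Statistics in a single pass instead of one run per feature."""
    groups = []  # each group is a list of indices into `boxes`
    for i, box in enumerate(boxes):
        for group in groups:
            # Join the first group this polygon does not clash with.
            if all(not boxes_overlap(box, boxes[j]) for j in group):
                group.append(i)
                break
        else:
            groups.append([i])
    return groups
```

With thousands of polygons and only local overlaps, the number of groups (and therefore Zonal Statistics runs) stays small, which is exactly the saving described above.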
10-09-2012 08:41 AM | 0 | 0 | 2542

POST
That's all I get from this tool: <type 'exceptions.RuntimeError'>: Row: Field FID does not exist. Failed to execute (StatisticsForOverlappingZones).

Hi everyone, with the release of ArcGIS 10 a set of iterators has been added to ModelBuilder, so the problem of overlapping polygons in Zonal Statistics is a thing of the past. If you iterate over the features within a feature class and pass each one to Zonal Statistics, it works perfectly and writes the results out to a single table. If anyone would like, I would gladly post my model. Regards
10-09-2012 07:24 AM | 0 | 0 | 2542

POST
Peter, a dataset with 7.5 million features is pretty large! Whatever you do, it will take time to process. Loading dictionaries with 7.5 million entries will probably cripple a machine anyway... One quick win is to ensure your fields have attribute indexes; have you created these? You can often get a significant performance boost from these alone. Duncan

Hi Duncan, I have created attribute indexes, but unfortunately ArcGIS joins are very slow. I found the following post, which suggests that Python dictionaries outperform ArcGIS joins: http://forums.arcgis.com/threads/55099-Update-cursor-with-joined-tables-work-around-w-dictionaries?p=238702#post238702 Regards
10-08-2012 03:26 AM | 0 | 0 | 4538

POST
I'm looking for a way to use Python dictionaries instead of ArcGIS joins. Please note that I'm new to Python and would need some assistance understanding your code, if you don't mind and have the time.

I have 7.5 million parcels saved as a feature class, with a field called "SG_Code". I also have two WARMS tables (WARMS_DW760 & WARMS_DW764); each has the fields "SG_Code" and "TIT_DEED_NUM". I then have two additional tables, RED (i.e. Redistribution) and REST (i.e. Restitution), which also have the two fields "SG_CODE" and "TIT_DEED_NUM".

I need to create a subset feature class of the 7.5 million parcels. First, I find matches on "SG_Code" between the parcels feature class and each WARMS table separately (WARMS_DW760, then WARMS_DW764). Next, I find matches between the original parcels feature class and the RED and REST tables, again on "SG_Code". Finally, starting from the parcels already matched against WARMS_DW760 and WARMS_DW764, I compare their "TIT_DEED_NUM" with the "TIT_DEED_NUM" in the RED and REST tables to pick up additional matches, since not all records in the RED and REST tables have an "SG_Code".

In short, I'm trying to identify where there is a match between the parcels and WARMS, and then a match between the parcels and RED and REST. I've used Add Join so far to accomplish this, but it runs forever. I've attached the model I've built so far to better illustrate what I'm trying to accomplish. Regards
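The matching described above can be expressed with plain dictionaries, which is the technique the linked thread recommends in place of Add Join. The sketch below is a simplified stand-in, not working ArcPy: the rows would really come from a `SearchCursor` over each table, the WARMS tables would each be indexed, and the field names are taken from the post (with made-up sample values). The key idea is that a dictionary lookup is O(1) per parcel, instead of a join evaluated across 7.5 million records.

```python
def build_index(rows, key_field):
    """Index table rows by a key field, skipping empty keys."""
    index = {}
    for row in rows:
        key = row.get(key_field)
        if key:
            index[key] = row
    return index


def match_parcels(parcels, warms_rows, claim_rows):
    """Dictionary-based stand-in for the chain of Add Joins.

    A parcel matches when its SG_Code appears in the RED/REST (claims)
    tables directly, or when its WARMS match carries a title deed number
    that appears in the claims tables -- covering claim records that
    lack an SG_Code."""
    warms_by_sg = build_index(warms_rows, "SG_CODE")
    claims_by_sg = build_index(claim_rows, "SG_CODE")
    claims_by_deed = build_index(claim_rows, "TIT_DEED_NUM")

    matched = []
    for parcel in parcels:
        sg = parcel["SG_CODE"]
        warms_hit = warms_by_sg.get(sg)
        if sg in claims_by_sg:
            matched.append(parcel)                      # direct SG_Code match
        elif warms_hit and warms_hit["TIT_DEED_NUM"] in claims_by_deed:
            matched.append(parcel)                      # match via title deed
    return matched
```

In a real script you would build one index per WARMS table (DW760 and DW764) and one per claims table (RED and REST), then write the matched OBJECTIDs out with an update or insert cursor to create the subset feature class.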
10-05-2012 09:23 AM | 0 | 11 | 14814

POST
Hi Mathew, I came across your thread and hope you can help me use Python dictionaries to accomplish what I'm trying to do. Please note that I'm new to Python and would need some assistance understanding your code, if you don't mind and have the time.

I have 7.5 million parcels saved as a feature class, with a field called "SG_Code". I also have two WARMS tables (WARMS_DW760 & WARMS_DW764); each has the fields "SG_Code" and "TIT_DEED_NUM". I then have two additional tables, RED (i.e. Redistribution) and REST (i.e. Restitution), which also have the two fields "SG_CODE" and "TIT_DEED_NUM".

I need to create a subset feature class of the 7.5 million parcels. First, I find matches on "SG_Code" between the parcels feature class and each WARMS table separately (WARMS_DW760, then WARMS_DW764). Next, I find matches between the original parcels feature class and the RED and REST tables, again on "SG_Code". Finally, starting from the parcels already matched against WARMS_DW760 and WARMS_DW764, I compare their "TIT_DEED_NUM" with the "TIT_DEED_NUM" in the RED and REST tables to pick up additional matches, since not all records in the RED and REST tables have an "SG_Code".

In short, I'm trying to identify where there is a match between the parcels and WARMS, and then a match between the parcels and RED and REST. I've used Add Join so far to accomplish this, but it runs forever. I've attached the model I've built so far to better illustrate what I'm trying to accomplish. Regards
10-05-2012 09:11 AM | 0 | 0 | 3659

POST
Sure, find attached. This piece will find its way into this document: http://resources.arcgis.com/en/help/main/10.1/index.html#//00s200000009000000 (right after the first large table, and before the "Maps" section). Let me know if you still have questions after having a read.

Thanks Kevin, the help document looks really good. Regards
10-02-2012 08:37 AM | 0 | 0 | 1945

POST
Are you using 10 or 10.1? At 10.0 it's mostly "what you see is what you get" coming from ArcMap/ArcGlobe via the Layer To KML tool. At 10.1 you have more options: what you see, or fine control over feature creation on a per-feature basis using field attributes. If you're using 10.1, I have the doc updated internally to explain how to use the field attributes (it's making its way through the copy-edit process). Hopefully it'll get pushed to the web in the next couple of weeks. If you want it before then, I'll cut out the appropriate pieces and post them here. Just let me know.

Hi Kevin, I'd really be interested in having a look at that document. Regards
10-02-2012 08:19 AM | 0 | 0 | 1945

POST
Hi there, I want to display a simple 3D polyline in Google Earth. The dataset is points collected from a total station. 1. I created a feature class in ArcCatalog by right-clicking a .CSV file with Point_ID, Easting, Northing and Height fields and creating it from the XY table. 2. I then invoke the 'Points To Line' tool from ArcToolbox (Data Management Tools > Features > Points To Line). At this stage the polyline can be viewed in ArcScene in 3D as normal. 3. I then export to .KML to view in Google Earth (Conversion Tools > To KML > Layer To KML). 4. When I import it into Google Earth, the polyline mysteriously has a very small, incorrect height profile; it appears to be sitting flat on the surface. The projection is GDA94 (I've also tried converting to WGS84, with the same result). My individual points also appear to be barely above the Google Earth surface when I import them. Any help would be appreciated! Thanks

Hi James, it seems that the KML height hasn't been interpreted correctly within Google Earth. If you look at the properties of the KML within Google Earth, you can set how your Z values relate to ground level. Not sure if this is the problem, but give it a try. Regards
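If adjusting the properties in Google Earth doesn't stick, the altitude mode can also be fixed in the KML file itself. A "flat on the surface" line is usually one whose `<altitudeMode>` is `clampToGround`, which tells Google Earth to drape the geometry on the terrain and ignore the stored Z values; `absolute` makes it honour them. This helper is an illustrative sketch, not part of any Esri tool, and assumes the exported KML actually contains `<altitudeMode>` elements (a KML with none at all defaults to clamped and would need the element inserted instead):

```python
import re


def set_altitude_mode(kml_text, mode="absolute"):
    """Rewrite every <altitudeMode> element in a KML document so that
    Google Earth honours the stored Z values instead of draping the
    geometry on the terrain (clampToGround)."""
    return re.sub(
        r"<altitudeMode>[^<]*</altitudeMode>",
        "<altitudeMode>%s</altitudeMode>" % mode,
        kml_text,
    )
```

Run it over the unzipped `doc.kml` (a .kmz is just a zip), re-zip, and reopen in Google Earth; `relativeToGround` is the other useful mode if the heights were recorded relative to the terrain.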
10-01-2012 09:34 AM | 0 | 0 | 1945

POST
I've done a fair amount of testing on the performance of ArcGIS 10.1, specifically the geoprocessing environment. I've tested ArcGIS tools within ArcToolbox, and ModelBuilder using both ArcGIS tools and ModelBuilder tools. I've also tested ArcPy within ArcMap and ArcCatalog, and run ArcGIS tools and ArcPy scripts in both the foreground and the background. There seem to be a few bugs associated with running scripts in the background: ArcMap and ArcCatalog become unresponsive on numerous occasions, and the Table To Table tool will not run in the background at all. The geoprocessing tools as a whole run far better on data that is saved locally or on an external drive. Data stored on a network file server is far slower to process, and even unresponsive at times. I've tested the network for speed and connectivity and could not find any reason for the poor geoprocessing times: the network is not dropping, and copy speeds are between 250 MB/s and 500 MB/s. I've attached screenshots of TeraCopy while copying data from a local drive to the file server, as well as the network speeds via Task Manager. Regards
10-01-2012 09:20 AM | 0 | 0 | 734

POST
Hi Kimberly, what you need to use is a feature set: ESRI Help: Feature Set. The following Esri training seminar shows you how to implement the feature set for what you are trying to do: Using Python in ArcGIS Desktop 10. Regards
10-01-2012 09:01 AM | 0 | 0 | 671

POST
The Append tool takes a really long time to append simple polygon features into a new, empty feature class: 13 minutes in my case. The ability to append existing records with the same schema has to be improved; this is really slow. Regards
10-01-2012 04:49 AM | 0 | 1 | 1493

POST
Hi Stephan, you are going to have to use the ModelBuilder tool "Calculate Value" and write an ArcPy expression using a search cursor: Calculate Value, Search Cursor. Hope this gets you going 🙂 Regards
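To make the suggestion concrete: Calculate Value takes an Expression plus an optional Code Block. A common pattern is to put a small function in the Code Block that opens a search cursor over a table and reduces one field to a single value, then call it from the Expression box. The table path and field name below are placeholder assumptions, and the arcpy call only runs inside ModelBuilder; the `pick_max` reduction is split out so the logic is visible on its own.

```python
def pick_max(values):
    """Reduce the cursor's field values to the single value that
    Calculate Value hands on to the next tool in the model."""
    return max(values)


def get_max(table, field):
    """Code Block function for Calculate Value: read one field with a
    search cursor and return its maximum.  In the Expression box you
    would call it as, e.g.:  get_max(r"%Input Table%", "ELEV")"""
    import arcpy
    return pick_max(row.getValue(field) for row in arcpy.SearchCursor(table))
```

Setting the tool's Data Type to match the field (e.g. Double) lets the output variable feed directly into a downstream tool parameter.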
09-27-2012 11:54 AM | 0 | 0 | 615

POST
Hi Fozia, this definitely sounds like a bug; I've not come across it yet. A workaround I can offer is to use an iterator instead of the batch process, which should sidestep the bug for now. I'd suggest that you report the bug to your local Esri office. Regards
09-27-2012 11:45 AM | 0 | 0 | 580

POST
Hi Jennifer, I'm not sure there is going to be any easy way of automating this. That said, what you are trying to accomplish is to measure the distance between two vertices of your polygon that lie at 90 degrees to each other, and there is no guarantee that a vertex will sit opposite each location within your polygon. Unless someone has a brilliant answer, you might have to do this by eye and build a geoprocessing tool that interactively splits your polygons and measures the distance on each split. Have a look at feature sets: http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//002w00000023000000 Regards
09-27-2012 11:32 AM | 0 | 0 | 1920
| Title | Kudos | Posted |
|---|---|---|
|  | 3 | 01-16-2012 02:34 AM |
|  | 1 | 05-07-2016 03:04 AM |
|  | 1 | 04-10-2016 01:09 AM |
|  | 1 | 03-13-2017 12:27 PM |
|  | 1 | 02-17-2016 02:34 PM |
Online Status: Offline
Date Last Visited: 03-04-2021 12:50 PM