POST
I do this all the time for the City I work for to "true-up" the land base layers. Here are two processes that seem to work well in identifying changes:

1. Detecting geometry changes. For both the existing polygons and the new polygons, convert the polygons to lines. Then run Erase on the line files to subtract the New from the Existing. Finally, run Multipart to Singlepart on the result. The output of all this will be lines that show where the geometry differs. This can be helpful in knowing exactly what needs to be edited, which can be an advantage over the "Select By Location, Identical" process (which is another way to do this).

2. Detecting attribute changes. If you have a common unique ID (some examples: APN, TaxID), an attribute join can be used to combine the two attribute tables. The ID used will obviously vary depending on your attribute fields. Once the two datasets are joined, queries can be run for each field of interest to see if the attribute has changed. Generalized example: Table1.APN <> Table2.APN.

Of course, one can do much more complex processing, but that will depend on knowing more of the specifics of the data.

Chris Donohue, GISP
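The attribute-comparison step can be sketched outside ArcGIS in plain Python. This is only an illustration of the joined-table query idea; the field names (ZONING, ACRES) and parcel records are hypothetical:

```python
# Hedged sketch: compare two attribute tables joined on a shared unique ID
# (APN here). Field names and records are made up for illustration.

existing = {
    "001-010": {"ZONING": "R1", "ACRES": 0.25},
    "001-020": {"ZONING": "C2", "ACRES": 1.10},
}
new = {
    "001-010": {"ZONING": "R2", "ACRES": 0.25},  # zoning changed
    "001-020": {"ZONING": "C2", "ACRES": 1.10},  # unchanged
}

def changed_parcels(old, updated, field):
    """Return IDs whose value for `field` differs between the two tables."""
    return sorted(
        apn for apn in old
        if apn in updated and old[apn][field] != updated[apn][field]
    )

print(changed_parcels(existing, new, "ZONING"))  # ['001-010']
```

In the real workflow the same comparison is expressed as a query (Table1.FIELD <> Table2.FIELD) against the joined tables, run once per field of interest.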
Posted 07-03-2018 10:08 AM

POST
You probably already thought of this, but I will throw it out there: would it be possible to do a query first on the full dataset to select just those items that need to be updated? Then sort all the data to find the last used ID number. In your code, use the last used ID number plus 1, and run the Python on just the selected records. See the link below for suggested Python code to do this:

How To: Create sequential numbers in a field using Python in the Field Calculator

There is probably a more elegant full-on Python solution out there. I'm curious to see what code the Python experts come up with for this one.

Chris Donohue, GISP
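A minimal sketch of the counter pattern behind that article, adapted to continue from the last used ID rather than starting at 1. The starting value (1000) is hypothetical, and the Field Calculator version wraps this in a per-row function; here it is plain Python:

```python
# Hedged sketch of the "sequential numbers" counter pattern:
# start at the last used ID plus one and increment once per record.

last_used_id = 1000  # hypothetical: found by sorting the full dataset

counter = last_used_id  # module-level state, as in the Field Calculator recipe

def next_id():
    """Return the next sequential ID, starting at last_used_id + 1."""
    global counter
    counter += 1
    return counter

# Applied to each selected record in turn:
new_ids = [next_id() for _ in range(3)]
print(new_ids)  # [1001, 1002, 1003]
```

In the Field Calculator, `next_id()` would be called in the expression for each selected row, so only the selected records receive new IDs.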
Posted 06-29-2018 08:46 AM

POST
I don't know the answer for this outright, but it's possible this may work:

Importing feature datasets, classes, and tables from an XML workspace document—ArcGIS Help | ArcGIS Desktop

Every now and then someone will throw an XML document my way, and it is easy to assume it is just metadata. It's always a surprise when the sender says it's the data too. Give this extraction process a try; it might work.

Chris Donohue, GISP
Posted 06-29-2018 08:16 AM

POST
I second what Curtis Price says. I've gone to many User Conferences and found each time that it has been easy to find a knowledgeable ESRI staff member in the ESRI area of the Exhibitor Hall. Go to the specific subject area and ask away! For example, one year I went to the Spatial Analyst area and peppered the ESRI staff with questions on how to do measurements of linear objects on a surface, a process I was hazy on. It was quite educational. Plus, the staff are very willing to follow up a few days after the conference if you toss them a question they don't have an immediate answer to. For example, the staff members sent me some Python code a few days later to help me figure out a trail length in true 3D distance traveled over rugged terrain.

Bring some sample data too if you are trying to resolve what seems to be a bug. I treat it like Tech Support, but you can usually use the computers they have set up there to show the staff what your process is and exactly where it seems to go awry. It's a step better than Tech Support, in my opinion, as they are right next to you seeing every move, and they are usually the total subject matter experts on that software/extension/process. This is one of the most important reasons why I go to the Conference.

Chris Donohue, GISP
Posted 06-28-2018 03:31 PM

POST
Like the original poster, I am somewhat curious about this too. Typically, when I have an editing task to do in SDE, I am pulling from both a default (non-editable production) SDE connection and an editable connection to the same data in SDE. When I go to make a version, it always defaults to the non-editable connection, which is annoying. The only way to create a version is to have only the editable SDE connection loaded into an mxd, create the version there, and then pull it across to the mxd with the non-editable connection. Not the end of the world, but annoying.

So, back to the original poster's question: how does one specifically choose the correct database? The selection window is a bit vague. And in my case, how does one force it to be the connection I want? Not to hijack this thread, but I am likewise curious.

Chris Donohue, GISP
Posted 06-28-2018 07:42 AM

POST
I did an analysis similar to what the original poster stated for a municipality about ten years ago with ArcGIS 9. We were evaluating which areas of a California city could be seen from major roads as part of their Zoning update, as concerns had been raised about unusual and outlandish construction affecting the historic Gold Rush character of the old town section, which had quite a tourist draw. To get a handle on this, the City first wanted to know which areas could even possibly be spotted from the major roads.

To accomplish this, the major roads were broken down into points every several feet. Via Modelbuilder, a Viewshed was run for each point, and then all the resultant several thousand viewsheds were combined to come up with a count for each cell. Thresholds were then chosen to differentiate the counts in each visible cell, so as to separate areas with no or just incidental visibility from those areas that had major exposure. If I remember correctly, runs took about 6 hours (the project area was about a dozen miles across, so far smaller than what others are considering). It was a fun project, and I ended up putting up a poster on it in the Map Gallery.

Chris Donohue, GISP
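The combine-and-threshold step can be sketched with toy data. The 0/1 grids below are hypothetical stand-ins for the per-point viewshed rasters, and the threshold value is arbitrary:

```python
# Hedged sketch: combine per-point viewshed rasters into a per-cell count.
# Each "viewshed" is a toy 0/1 grid (1 = visible from that road point);
# in the real workflow these came from running Viewshed once per point.

viewsheds = [
    [[1, 0], [1, 1]],  # visibility from road point 1
    [[1, 0], [0, 1]],  # visibility from road point 2
    [[0, 0], [1, 1]],  # visibility from road point 3
]

def cumulative_count(grids):
    """Cell-by-cell sum: how many road points can see each cell."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [
        [sum(g[r][c] for g in grids) for c in range(cols)]
        for r in range(rows)
    ]

counts = cumulative_count(viewsheds)
print(counts)  # [[2, 0], [2, 3]]

# A threshold then separates incidental visibility from major exposure:
threshold = 2
major = [[int(v >= threshold) for v in row] for row in counts]
print(major)  # [[1, 0], [1, 1]]
```

With raster data the same summation is usually done with map algebra (summing the viewshed rasters) rather than Python loops, but the counting logic is the same.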
Posted 06-25-2018 03:28 PM

POST
Found a possible solution for this in Modelbuilder:

How To: Create a model that makes a correctly formatted query from a string input with special characters

Chris Donohue, GISP
Posted 06-25-2018 08:19 AM

POST
That's a tricky one. I suspect the reason it doesn't like the filename is that percent signs are used in Modelbuilder for inline variable substitution. (You did not state it, so I am assuming you are not already using inline variable substitution.)

Examples of inline model variable substitution—Help | ArcGIS for Desktop

One potential solution would be to do some processing before running the main model so as to rename the files to not have the percent signs. This would likely have to be done outside Modelbuilder, as the typical way to easily rename files within it is to employ inline variable substitution. There may be a way to employ escape characters to let Modelbuilder know that the percent signs are not part of inline variable substitution; offhand, though, I don't know what those would be for Modelbuilder. As an analogy, in Python the backslash and the r prefix are used in certain cases to allow strings with special characters to be used without being interpreted as something else.

Chris Donohue, GISP
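The rename-first workaround can be sketched in plain Python run before the model. The filenames and the "pct" replacement token are hypothetical:

```python
# Hedged sketch: rename files so their names no longer contain "%",
# which Modelbuilder reserves for inline variable substitution.
# Filenames here are made up for illustration.
import os

def sanitized(name):
    """Replace percent signs with a safe token, e.g. 'pct'."""
    return name.replace("%", "pct")

filenames = ["slope_10%.tif", "slope_20%.tif", "aspect.tif"]
renamed = {old: sanitized(old) for old in filenames if "%" in old}
print(renamed)
# {'slope_10%.tif': 'slope_10pct.tif', 'slope_20%.tif': 'slope_20pct.tif'}

# Against a real folder the renames would then be applied with:
# for old, new in renamed.items():
#     os.rename(os.path.join(folder, old), os.path.join(folder, new))
```

The actual `os.rename` calls are left commented out since the folder path depends on your data; the point is just to strip the percent signs before the model ever sees the names.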
Posted 06-25-2018 08:17 AM

POST
Some of the folks in this group on GeoNet might know. Tagging the Imagery and Remote Sensing community for greater exposure: https://community.esri.com/community/gis/imagery-and-remote-sensing

Chris Donohue, GISP
Posted 06-24-2018 05:48 PM

POST
If you have a way of sorting down the buildings data to just the part that needs to be transferred to your standalone table, you can save this selection out as a separate dataset and then use a process to combine it with the standalone table. There are two common options available (besides many others not mentioned here):

1. Merge (Data Management) - takes two files and combines them to make a new file. http://pro.arcgis.com/en/pro-app/tool-reference/data-management/merge.htm

2. Append (Data Management) - adds one file to an existing file. http://pro.arcgis.com/en/pro-app/tool-reference/data-management/append.htm Note - this changes the existing target (destination) file, so make a backup of the target file first, just in case it doesn't work out as expected and you need to try again.

Also note - the tricky part of both of these processes will be the field mapping, i.e. setting the options so the fields from one dataset are correctly matched to the other dataset. For example, a field might be named "GISACRES" in one table but need to be added to the "ACRES" field in the second table, so the field mapping will need to be set to make this happen. http://pro.arcgis.com/en/pro-app/help/analysis/geoprocessing/basics/field-map.htm

Chris Donohue, GISP
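The field-mapping idea can be sketched in plain Python: records whose source field names differ from the target schema are renamed before being appended. The field names and records below are hypothetical:

```python
# Hedged sketch of the field-mapping step behind Append/Merge:
# rename source fields to match the target schema, then combine.
# Field names (GISACRES -> ACRES, etc.) and rows are made up.

field_map = {"GISACRES": "ACRES", "PARCEL_ID": "APN"}  # source -> target

target_table = [
    {"APN": "001-010", "ACRES": 0.25},
]
source_records = [
    {"PARCEL_ID": "002-030", "GISACRES": 1.75},
]

def append_with_mapping(target, source, mapping):
    """Append source records onto target, renaming fields per mapping."""
    for rec in source:
        target.append({mapping.get(k, k): v for k, v in rec.items()})
    return target

combined = append_with_mapping(list(target_table), source_records, field_map)
print(combined[-1])  # {'APN': '002-030', 'ACRES': 1.75}
```

In the geoprocessing tools this same renaming is configured through the Field Map parameter rather than written by hand, but the matching problem it solves is the one shown here.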
Posted 06-23-2018 08:06 PM

POST
I can't speak to ArcGIS Pro and how it works with Maplex, annotation, and labels, but here is my take from the ArcGIS Desktop world. There is no perfect solution to labeling, but Maplex probably goes the furthest of any method. I would play with it a bit, realizing it does take some experimenting with all the settings to label things effectively. However, it is not perfect.

One strategy that can also work is a hybrid: use Maplex to get as far as you can, then use a Definition Query to pull out the labels that aren't working well and either create labels or annotation for those, or apply a different set of Maplex settings to just that subset. This can be messy to set up, but it is well worth it.

What is the Maplex Label Engine?—Help | ArcGIS for Desktop
Display a subset of features in a layer—ArcGIS Pro | ArcGIS Desktop

Chris Donohue, GISP
Posted 06-21-2018 10:11 AM

POST
Just an idea on why things are going awry: I forget off the top of my head which ones, but some of the interpolation methods do not strictly honor the input values; in other words, they fit a surface that may not necessarily pass through the data points. That would explain why some attempts do not have the zero result where one expects it, even though zero-value data points are present. So one possibility may be to stick to just the interpolation methods that are true to the data points, and then make sure there are plenty of zero-value data points where the banks are.

EDIT: For example, Spline does honor the values (emphasis added):

"Spline: The Spline tool uses an interpolation method that estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points."

Source: Comparing interpolation methods—Help | ArcGIS Desktop

Chris Donohue, GISP
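The "honoring the data points" distinction can be shown with a 1-D toy example (the data values are made up): an exact interpolator returns the input value at every input location, while a smoother generally does not.

```python
# Hedged toy example: piecewise-linear interpolation passes exactly
# through every input point, while a simple moving-average smoother
# does not. Data values are made up for illustration.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 5.0, 0.0, 5.0]  # note the zero-value points

def linear_interp(x, xs, ys):
    """Exact interpolator: returns ys[i] exactly when x == xs[i]."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside data range")

def smoothed(i, ys):
    """Inexact smoother: average of a point and its neighbors."""
    lo, hi = max(0, i - 1), min(len(ys), i + 2)
    return sum(ys[lo:hi]) / (hi - lo)

print(linear_interp(2.0, xs, ys))  # 0.0 -- the zero point is honored
print(smoothed(2, ys))             # ~3.33 -- the zero point is not honored
```

The surface interpolators in ArcGIS differ in exactly this respect, which is why a method that smooths through the bank points can miss the expected zero.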
Posted 06-19-2018 08:45 AM

POST
I'm not super-familiar with ArcGIS Pro yet, but I do believe it offers the same functionality in this regard as ArcGIS Desktop, so take the following with a grain of salt in case that turns out not to be true.

Typically, one can do a Join to combine the spatial data table with the tabular data table. Part of the challenge will be determining the relationship: is it one-to-one, one-to-many, many-to-one, or many-to-many? Another part of the challenge will be finding a common bit of information to allow the Join to do the matching (a primary key). For example, is there a unique ID value in each table, like a customer ID, that can be used to link the data?

In terms of how one does this with the actual software, at least in ArcGIS Desktop it is done either by using a geoprocessing tool, or by loading the table and spatial data into a document (mxd), opening the properties of the spatial layer, going to the Join tab, and filling in the information. However, I don't know if it works the same way in ArcGIS Pro.

http://pro.arcgis.com/en/pro-app/help/data/tables/joins-and-relates.htm
http://desktop.arcgis.com/en/arcmap/10.3/manage-data/tables/about-joining-and-relating-tables.htm
http://desktop.arcgis.com/en/arcmap/10.3/manage-data/tables/essentials-of-joining-tables.htm

Anyway, I hope this steers you in the right direction.

Chris Donohue, GISP
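The matching that a one-to-one Join performs can be sketched in plain Python; the field names (CUST_ID, NAME) and rows are hypothetical:

```python
# Hedged sketch of a one-to-one attribute join on a shared key
# (a customer ID here). Field names and rows are made up.

spatial_rows = [
    {"CUST_ID": "A1", "X": -121.5, "Y": 38.6},
    {"CUST_ID": "B2", "X": -121.4, "Y": 38.7},
]
tabular_rows = [
    {"CUST_ID": "A1", "NAME": "Acme Hardware"},
    {"CUST_ID": "B2", "NAME": "Basin Supply"},
]

def join_one_to_one(left, right, key):
    """Attach fields from `right` to each row of `left`, matched on `key`."""
    lookup = {r[key]: r for r in right}
    joined = []
    for row in left:
        match = lookup.get(row[key], {})  # unmatched rows keep only left fields
        joined.append({**row, **match})
    return joined

result = join_one_to_one(spatial_rows, tabular_rows, "CUST_ID")
print(result[0]["NAME"])  # Acme Hardware
```

This is the one-to-one case; a one-to-many relationship needs a Relate (or a different join strategy) instead, which is why establishing the cardinality first matters.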
Posted 06-17-2018 11:00 AM

POST
As Xander Bakker is suggesting, I would suppose there are some preferences that may lead to the optimal shapes. The first thing that comes to mind is a pragmatic one: what is the minimum-sized strip of land that can be mined? This may be governed by something as simple as the minimum-sized bulldozer/excavator that is practical and economical to work with, and it may come into play because the output of the split may include a long thin sliver. Also, for economies of scale, would it be best if each polygon were compact, or would it be OK for it to sprawl? Or, if there is a central processing location, does the whole site need to be split into 5 with each "slice" touching the central area?

If you have not already, I'd come up with all the factors that need to be considered. This is important because once those are established, they may lead to totally different solution processes on the technical side for coming up with the split. There are several folks on GeoNet who are really good at the many ways to "slice and dice" a polygon, but to get them involved the criteria would need to be known, as the solutions are varied and explaining them can get complicated.

Chris Donohue, GISP
Posted 06-14-2018 08:25 AM

POST
As an example of how data may be different, check out the news story I posted in the thread below about a roads issue that came up in the City I work at. One of the realities of this business....

EDIT: Scratch that - even though it is listed with a lead-in, the story no longer seems viewable (though the ads still work), so there is no video to watch from the news network.

Can anyone offer suggestions on how to update Google maps with new roads and addresses in your City?

Chris Donohue, GISP
Posted 06-14-2018 08:08 AM