POST
Sallie Vaughn has some good points. Some added comments: at the City I worked at, we had Township/Section/Range information, but it was a separate feature class. On demand, you could spatially join it to find out where features are located, since in theory assets don't move. We found this much more effective than trying to hard-code it into each feature class; in a way, it's redundant data to have hard-coded in a feature class.

That brought up a thought, so let me back up. Does your municipality have an enterprise geodatabase, or are they using File Geodatabases, shapefiles, etc.? Knowing that will also determine to some extent what options you have in coming up with the most effective system. For example, in an enterprise geodatabase it is fairly easy to set up subtypes to help categorize data, which also helps limit entry errors by constraining the categories to only valid values. Introduction to subtypes—Geodatabases | ArcGIS Desktop

Also, who will use the data, and what for? For example, will it be hosted as a web service? Will other systems be tied into it? Who are the downstream users, and what are they/their systems expecting as input? Knowing that will then provide options and constraints for how one sets this up.

At the City I was at, we didn't try to come up with one universal unique ID for everything. Instead, we had unique IDs for each feature class. So I guess part of figuring this out will be determining the business need for one big universal ID. It may be that it is very much needed, but it would be good to know the why before moving forward. I would dig a bit more to find out the rationale. It may be that somewhere in your organization there is a non-GIS data consumer that reads the GIS data from a linked system and needs the proposed ID to be able to do its process (and has little capability to be adjusted).
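To make the per-feature-class ID idea concrete, here is a minimal sketch in plain Python. The prefixes ("SWR", "HYD") and the ID format are hypothetical examples, not a standard; the point is simply that each feature class keeps its own sequence, so IDs stay unique within a class without needing one universal scheme.

```python
# Sketch of per-feature-class IDs: each class gets its own counter and a
# short prefix. The prefixes and zero-padded format are made-up examples.
from itertools import count

class FeatureClassIds:
    """Issue IDs like 'SWR-000001', one independent sequence per prefix."""
    def __init__(self):
        self._counters = {}

    def next_id(self, prefix):
        counter = self._counters.setdefault(prefix, count(1))
        return f"{prefix}-{next(counter):06d}"

ids = FeatureClassIds()
first_sewer = ids.next_id("SWR")    # hypothetical sewer-main class
first_hydrant = ids.next_id("HYD")  # hypothetical hydrant class
second_sewer = ids.next_id("SWR")
```

In a geodatabase, the same effect is usually achieved with a per-class ID field populated on edit; the sketch just shows the bookkeeping.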
I was often surprised at the City that, even after working there a few years, I would discover there were systems using our data that we had never even heard of. We would propose changes, conduct outreach, and go through a review process with all the City departments; then, once we implemented even a small change (like eliminating one obsolete zoning code), someone would call us out of nowhere and ask why the data was broken now. Even my boss, who had been there many years and knew everyone and everything going on in this mid-sized city government, would often be surprised. So ask around. And then ask again. Seriously. Don't expect to be able to see the whole picture in one day. It may take asking many people in various parts of the city over a period of time to get the full picture. Anyways, keep at it. That's my 2 cents. Chris Donohue, GISP
Posted 09-11-2019 03:22 PM

POST
To second what kreuzrsk said, I have found that people use these terms in many different ways, which can add to the confusion, particularly among casual and new users. Some of these terms do have multiple uses. For example, even professionals use the term "layers" to describe different things. They could be referring to the many feature classes in a map layout or to the actual layer files (.lyr) that control how those feature classes are portrayed when drawn. At that point one has to know the context to understand what is meant by "layers" (or ask to confirm). Chris Donohue, GISP
Posted 09-11-2019 02:39 PM

POST
I probably should have worded that as "approximately 3-meter accuracy" to be more precise; otherwise people may assume an absolute statement of accuracy was made. So to clarify, the W3W system works on the basis of 3-meter grid cells, but that does not mean it has exactly 3-meter accuracy. Also, I hope people didn't assume that the W3W system somehow makes GPS-derived locations more accurate. It doesn't. The key point on the locational aspect is that W3W can often provide much better accuracy than the commonly-used alternative, which is triangulation. Note that triangulation does not use GPS as part of its locating. So the potential improvement offered by W3W comes down to comparing GPS on a cell phone (which, yes, is almost never survey-grade) versus triangulated accuracy. In that comparison, GPS usually wins (even with some level of inaccuracy). For example, W3W may place you in the 3-meter box adjacent to where you are really located, but that is often much closer to where you are than a triangulated location would come up with. So at least on the accuracy aspect, it has some benefits. By the way, I am not saying this is a perfect system and that everyone should run out and buy it. But as a person who does addressing, it is an interesting (and at times whimsical) alternative to traditional US addressing systems. Chris Donohue, GISP
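To illustrate the "adjacent 3-meter box" point (this is not the actual what3words algorithm, just a sketch of how any 3-meter grid quantizes a projected coordinate): a GPS fix with a couple of meters of error can land in the cell next to the true one, yet still be far closer to the truth than a cell-tower triangulated fix would be.

```python
# Illustration only: snap a projected coordinate (in meters) to a
# 3 m x 3 m grid cell. Coordinates and the error magnitude are examples.
import math

CELL = 3.0  # grid cell size in meters

def cell_index(x, y, cell=CELL):
    """Return the (column, row) of the grid cell containing (x, y)."""
    return (math.floor(x / cell), math.floor(y / cell))

true_pos = (100.5, 200.5)
gps_fix = (102.1, 200.9)   # roughly 1.6 m of GPS error

true_cell = cell_index(*true_pos)  # (33, 66)
gps_cell = cell_index(*gps_fix)    # (34, 66) -- the adjacent cell
```

So the reported cell can be one box over, but the position error stays bounded by a few meters, unlike triangulation, which can be off by hundreds of meters.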
Posted 02-19-2019 10:46 AM

POST
Another option: Identity—Help | ArcGIS Desktop Though Union and Intersect are probably easier to get started with. Chris Donohue, GISP
Posted 02-11-2019 10:25 AM

POST
Here is one way to tackle that. One can go to the Table of Contents in the ArcMap document being used and make a copy of the layer there, then display the full layer in the map and use a reduced one in the legend.

So to start, make a copy of the layer in the Table of Contents. Note that this is all done in the Table of Contents of the final map layout (we are not modifying or copying the actual file on disk). Then for that copy, go into its Properties from the Table of Contents, and in the legend symbology area delete the item you don't want to appear (in your case, populated place). While you are in the Properties, it is helpful to add something to the layer name so you can distinguish it from the original. For example, if the layer name is PlacesOfInterest, I would modify it to be PlacesOfInterestLegend. Then turn off the display of this layer in the Table of Contents (but leave the original layer checked on), so only the original layer is being displayed.

Finally, to finish this off, go to your legend box in the layout, right-click on it, then add the layer you modified (in my example, PlacesOfInterestLegend) to the layers list and remove the original layer from the list. Note that you may need to set the property in the legend box that controls display of layers that are not turned on, as this may suppress the legend layer you just made. Chris Donohue, GISP
Posted 02-11-2019 08:42 AM

POST
You can bookmark each data frame separately. To do so, "Activate" a data frame by right-clicking on it in the Table of Contents and selecting "Activate", then set the view and create a bookmark. It can be helpful to give each bookmark a distinctive name, as when working on a map with multiple data frames it can be easy to not realize which frame is active. For example, name the main map bookmark "Main Map", use "Locator Map" for the locator map bookmark, etc. A tidbit for helping to review the map: once you have your map all set up with multiple frames and each one bookmarked to the exact final scale, it is helpful to review it in Layout View after first hitting the 1:1 button on the Layout toolbar. Then use the Layout Pan tool to move around. This will show you the final appearance, barring slight changes in colors and lineweights from using different printers. Chris Donohue, GISP
Posted 02-06-2019 12:11 PM

POST
When creating maps in ArcGIS Desktop for printing, I usually set the Reference Scale to what is appropriate for the Layout View, then bookmark the view (while in Layout View) so it shows the full map as desired for output. Then if you want to zoom in on something, use the Layout View zoom tool so you stay at that scale (instead of the standard zoom tool). The bookmark is handy for returning the map to the correct scale and alignment once you are done, as you will often need to pan and zoom as you check the map and label/annotate it. However, I am not exactly sure this addresses what you are looking for. Is there any chance you actually need something like scale dependency instead? That is used for instances where you want information displayed only at certain scales. http://desktop.arcgis.com/en/arcmap/10.5/map/working-with-layers/displaying-layers-at-certain-map-scales.htm Chris Donohue, GISP
Posted 01-30-2019 04:24 PM

POST
For starters, in terms of viewing the data, can you check the coordinate system set in the Table of Contents in ArcMap to see if it is what is expected? (Right-click on Layers in the Table of Contents, then click on the Coordinate System tab.) The Table of Contents inherits the coordinate system of the first layer loaded into it, so make sure it is using the one you expect. Also, check the coordinate system of the DEM to see if it is what is expected. It is possible one or more of the layers you are using does not have a specifically-assigned coordinate system, and so is "free floating" and relies on the underlying Table of Contents coordinate system to align.

If the coordinate system is correct in each case, there are other possibilities that come to mind. For your first problem, it may be necessary to use a custom coordinate system if the one you are currently using does not support a range of 0 to 360. For the second problem, the issue may be that the XY data conversion did not have a correct coordinate system assigned. Be sure to specify an appropriate coordinate system when running the conversion to help avoid issues (many people forget to do this). Adding x,y coordinate data as a layer—Help | ArcGIS Desktop Chris Donohue, GISP
Posted 12-27-2018 08:40 AM

POST
In case this helps, I'll throw out a pragmatic approach I sometimes use for problematic maps:

1. Convert the feature class you want to label off of to a feature class in a File Geodatabase.
2. Add a text (string) field to this feature class called "LabelProcess" or similar.
3. Set the map to Layout View, with the desired scale and all the layers on that need to be displayed.
4. Make sure Maplex is active.
5. Start labeling. Play with the settings in Maplex until you get as many labels as possible looking good.
6. Select all features with labels that worked. Using Field Calculator, calculate the "LabelProcess" field to the value "Maplex Only" or similar.
7. Copy the feature class you are labeling off of in the Table of Contents so you now have 2 listed.
8. For one of them, set a Definition Query so that it displays just the features where the "LabelProcess" field = "Maplex Only".
9. For the other one, set a Definition Query so that it displays just the features where the "LabelProcess" field <> "Maplex Only".
10. Turn off the feature class that is based on "Maplex Only".
11. On the one that is <> "Maplex Only", try a different combination of settings. For example, consider allowing the label to be rotated, using a leader line, reducing the font size, splitting the label into more than one line, etc. Turn on the other layer to make sure there will be no overlap or other visual conflicts.
12. Once a batch of these works, select the features and then, using Field Calculator, populate the "LabelProcess" field with a good descriptive name.
13. Copy the feature class being labeled, and set a Definition Query equal to the latest description.
14. Update the Definition Query of the labels that still don't have a good solution to cull out the ones that worked in the last step.
15. Continue the above steps until you get to a point where Maplex will no longer effectively label features. At this point, one will have to switch gears and manually label the rest.
16. Convert this last batch to annotation, making sure you only export the ones that do not have a successful label from an earlier pass. Then manually align, rotate, add leader lines, etc.
17. As a final step, turn on all the copies of the feature class in the Table of Contents, along with the annotation created in the last step. Ensure that it all fits well together. There may be another cleanup step here if some things work well individually but not together. Example: a leader line generated in one step overlaps a label from another step.
18. Note: if it is possible the map will be shifted in extent after doing all this, it may be worth an additional step to blow out all the labels to annotation layers. This will "fix" them to one spot, whereas Maplex labels tend to "float around" somewhat if the map is shifted, which can cause visual conflicts.

I use this approach on large, complex maps with many labels and challenges like a variety of polygon sizes that need to be labeled. For example, we have a poster-sized citywide Zoning map where every polygon needs to be labeled clearly with the zoning designation, and the polygons vary in size from 70 square feet to several thousand square feet. There are areas with many "small" polygons with different zoning designations all closely lumped together beside areas of large polygons, so pure Maplex or Standard labeling is ineffective. Manually labeling all 1,400 polygons is a bit tedious, so this process uses the power of Maplex to do the bulk of the labeling and saves just the "troublesome" labels for the more time-consuming manual processing. Chris Donohue, GISP
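The bookkeeping behind the workflow above can be sketched in plain Python: stamp each feature with the labeling pass that handled it, then build the definition query strings that split the layer copies. The field name "LabelProcess" follows the post; the feature dicts and pass names are illustrative stand-ins for attribute-table rows.

```python
# Sketch of the tag-and-split bookkeeping from the steps above.
# Features are plain dicts standing in for attribute-table rows.

def tag_features(features, labeled_ids, pass_name):
    """Stamp 'LabelProcess' on features a labeling pass handled (steps 6, 12)."""
    for f in features:
        if f["id"] in labeled_ids and not f["LabelProcess"]:
            f["LabelProcess"] = pass_name

def definition_query(pass_name, matched=True):
    """Build the definition query used to split the layer copies (steps 8-9)."""
    op = "=" if matched else "<>"
    return f"LabelProcess {op} '{pass_name}'"

features = [{"id": i, "LabelProcess": ""} for i in range(5)]
tag_features(features, labeled_ids={0, 1, 3}, pass_name="Maplex Only")

remaining = [f["id"] for f in features if not f["LabelProcess"]]  # [2, 4]
query = definition_query("Maplex Only")  # "LabelProcess = 'Maplex Only'"
```

Each later pass re-runs `tag_features` with a new pass name over the shrinking `remaining` set, which mirrors steps 12 through 15.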
Posted 12-18-2018 10:26 AM

POST
First, check your data to see if it is all in the same projection/coordinate system. It is usually best to place all the data in the same projection/coordinate system before processing it. In theory one could ignore the warning, but it may be that the warning is relevant. Given the uncertainty, reduce that potential by putting all your data into the same system before processing. Data prep on the front end goes a long way toward heading off later issues. 001003: Datum conflict between input and output.—Help | ArcGIS Desktop

As to why the watershed may be derived "downstream" of the pour point, to troubleshoot this I would start by looking at the output surface and checking the elevation values. If the elevation values "downstream" of the pour point are the same as those above it, that usually indicates a data issue. Check the DEM used as the input to the Watershed process. Maybe it is too coarse for what is being attempted? For example, the cell size is some large value like 1 mile when what is needed is a resolution of 10 feet. Or a Fill process run before the Watershed geoprocessing tool caused the area "downstream" of the pour point to be "leveled out". These are just examples of what can go awry; you will have to check the data.

Along with this, confirm that the pour point is really in the correct and relevant location. Is it located right at the top of the spillway of the lake, or instead in the lake many feet away? If the latter, that will likely be the issue. Any chance the Snap Pour Point process didn't work out that well? Review the data and see if its location makes sense given other reference information. Chris Donohue, GISP
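One quick way to spot the "leveled out" symptom described above is to sample elevations along the flow path and look for long runs of identical values, which often point to a Fill pass (or an overly coarse DEM) having flattened the area. This is a hedged, stand-alone sketch; the profile values are made up, and in practice you would sample them from the DEM.

```python
# Flag suspiciously flat stretches in an elevation profile (illustrative).

def longest_flat_run(profile, tol=0.0):
    """Length of the longest run of (near-)equal consecutive elevations."""
    best = run = 1
    for prev, cur in zip(profile, profile[1:]):
        run = run + 1 if abs(cur - prev) <= tol else 1
        best = max(best, run)
    return best

# Elevations sampled downstream of a pour point (example values only):
profile = [120.0, 118.5, 117.0, 117.0, 117.0, 117.0, 115.2]
suspicious = longest_flat_run(profile) >= 4  # four equal cells in a row
```

A healthy profile should mostly decrease downstream; a long constant run where you expect a channel is a hint to re-examine the Fill step or the DEM resolution.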
Posted 12-14-2018 08:42 AM

POST
I find it is usually best to do them in Layout View, as this will be the final "look" of them, so if there are issues one will spot them fairly quickly. Getting the settings optimal can take some experimenting and several iterations. If one does them in Data View, they may seem great, but when put into the exact Layout View several issues may suddenly become apparent. As a side note, be sure to have the final map scale set in Layout View, and if there are reference scales for various annotation layers, be sure they are appropriate for the final output before generating the labels/annotation. Chris Donohue, GISP
Posted 12-14-2018 08:23 AM

POST
Another possibility: the Flow Accumulation step often results in an output raster with values for almost every cell. Since the Snap Pour Point process will evaluate every cell that is not null, sometimes it can be worthwhile to create an intermediate raster with all cells below a threshold set to null to make the Snap Pour Point process faster. To do this, one could use the conditional tools Con or Set Null to set all values below a threshold to null. Con—Help | ArcGIS for Desktop Set Null—Help | ArcGIS for Desktop Raster Calculator—Help | ArcGIS for Desktop Chris Donohue, GISP
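The Con/Set Null idea in miniature, as a plain-Python sketch rather than the actual Spatial Analyst tools: null out flow-accumulation cells below a threshold so downstream steps only evaluate meaningful cells. `None` stands in for NoData, and the threshold value is just an example.

```python
# Mimic Set Null semantics on a tiny nested-list "raster":
# cells below the threshold (or already NoData) become None.

def set_null_below(raster, threshold):
    """Return a copy with cells below `threshold` set to NoData (None)."""
    return [
        [cell if cell is not None and cell >= threshold else None
         for cell in row]
        for row in raster
    ]

flow_acc = [
    [0, 2, 150],
    [1, 300, 45],
]
streams_only = set_null_below(flow_acc, threshold=100)
# [[None, None, 150], [None, 300, None]]
```

In the real tools this is a one-liner in Raster Calculator, e.g. a SetNull expression with a threshold condition; the sketch just shows what the output looks like cell by cell.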
Posted 12-06-2018 01:49 PM

POST
I don't know the exact reason the tool failed, but I wanted to suggest some options, as these sometimes remedy the issue of raster processing tools not working. I can't tell how your data is structured from the screenshot, but if it is not already, try placing it all in a simple folder on your C: drive before running raster processes. Networked drives can be problematic when processing raster data for a whole host of reasons, so this may remedy the issue. In terms of simplicity, a folder name like C:/process or something similarly short is best: very short in terms of file path and very short in terms of folder names. Speaking of short, even though the tools don't explicitly require this, keep all your raster file names 9 characters or less. Some raster processes still seem to adhere to the older raster processing limits (like Grid Stack) even though we are in modern times. Don't ask me why, but sometimes just making raster file names 9 characters or less will suddenly allow the processing to occur. Output raster formats and names—ArcGIS Help | ArcGIS Desktop Yes, it will be a pain to move all your data around to do this, but it can be worth it to avoid the aggravation of raster tools that go awry for no apparent reason. Move the data to the C: drive, process it, then move it back to where it will be stored long-term (and rename as needed). Chris Donohue, GISP
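A small pre-flight check for the naming convention above can save an aggravating failed run. This is a sketch, not part of any Esri tool: it just flags raster names whose base name exceeds the old 9-character limit, so you can rename them before moving them into a short local workspace like C:/process.

```python
# Flag raster names longer than the old 9-character limit (example names).
import os

MAX_NAME = 9  # the old grid-era base-name limit referenced above

def check_raster_names(names):
    """Return the names whose base name exceeds MAX_NAME characters."""
    return [n for n in names if len(os.path.splitext(n)[0]) > MAX_NAME]

bad = check_raster_names(["dem10ft", "flowaccumulation2018", "slope_pct"])
# ['flowaccumulation2018'] -- 'slope_pct' is exactly 9 characters, so it passes
```

Running a check like this over a folder listing before processing is much cheaper than diagnosing a tool that fails with no useful message.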
Posted 12-06-2018 01:31 PM

POST
Can you post images or list the specific error message details? Also, there are some potential issues that come up when importing one format into another. Here are some resources that can be helpful. Note that they will probably not immediately solve your issue, but they can provide some background for troubleshooting: How data converts when importing—ArcGIS Help | ArcGIS Desktop Geoprocessing considerations for shapefile output—Help | ArcGIS for Desktop In terms of the specific error message mentioned, refer to this: Error: WARNING 000594: Input feature falls outside of output geometry domains Based on this warning, it sounds like the data is either a CAD file or a shapefile created from CAD. That could easily explain why the data is not importing, as CAD data often takes some extra processing or settings changes to import. See the solution they recommend in the link. I have run into this issue often, and typically use the "Disable M and Z values" solution to resolve it. Chris Donohue, GISP
Posted 11-29-2018 09:06 AM

POST
You've made it to the big time, Joe! You are on the map. Now get that on your business card pronto so people can find you. On a serious note, I wonder how many Response organizations will include this as part of one of their location methods for NextGen 911? Chris Donohue, GISP
Posted 11-29-2018 07:51 AM