POST
I guess it boils down to looking at all the factors and making a call. Time and money are two factors, but there are others.

A big factor to consider is risk/liability. What are the consequences of the data being off? For example, is the data being used in a life-critical application, or for another purpose where expectations that the data be solid are high and the consequences severe when the data is relied upon and fails? Evaluate this not just as liability in the legal sense, but also as the risk that poor data will cause other side effects, such as poor location data in a construction project causing delays and rework when things turn out not to be quite where the data indicated they were.

Along these lines, another aspect to consider is standards. Are there required accuracy standards that are not being met? External standards, agency standards, internal standards, client standards? Will the data in its current "not perfect" state meet them? If it doesn't meet an expected standard, that needs to be mentioned. Organizations I've been with have occasionally had issues with subcontractors who failed to meet the data standards we specified, which had major financial consequences for them as laid out in our contract agreement (and consequences for us as the client, as this often led to project delays and rework).

There is also the less tangible risk: if clients (internal and external) expect the data to be good and it's not quite up to snuff from their perspective, will your group's or organization's reputation take a hit? Will this impact the long-term ability to get clients and work with other organizations?

As part of this evaluation, also keep in mind the realities of any project: there are almost never enough resources to do it all perfectly. That's not to say one shouldn't strive to do better, but what is currently available may be sufficient for the tasks at hand. Keep in mind there can be a dimension where it was known the data was not great but would be workable, so a project was bid out with that knowledge, hence the reluctance of the organization to spend more money to improve the data. This can be a major factor in private-sector work.

So make a frank evaluation. Maybe the data is good enough as is? If not, "spend time and money now" to improve the data, or "spend time and money later" when things go awry?

Whatever decision is made, another item to consider is metadata. Is there an explanation of the data limitations in the metadata? Do you have a disclaimer? It might be worth documenting the known limitations to help prevent other users from running into trouble, as they may not see the data issues. Anyway, those are some things to consider.

Chris Donohue, GISP
Posted 05-18-2018, 08:49 AM

POST
Just to clarify, when the georeferencing was performed, was the process only "Update Georeferencing", or was a "Rectify" done as well? If a "Rectify" was done, note that it creates a new dataset, and it is important to choose an output format that supports the proper number of bands. In your case, it sounds like a format was chosen that only supports export of one band, hence the "gray" output. Rectifying or warping will create a new raster dataset that is georeferenced using the map coordinates and the spatial reference. Source: Fundamentals of georeferencing a raster dataset—Help | ArcGIS for Desktop Chris Donohue, GISP
Posted 05-18-2018, 07:58 AM

POST
There is probably a more elegant way to do this, but one approach would be to use the Dissolve (Data Management) geoprocessing tool. To make it dissolve based on spatial location only, do not check any of the checkboxes under the setting Dissolve Field(s) (optional). This will result in a feature class with one record: a multipart point. To restore the individual points, take the output and run the Multipart To Singlepart geoprocessing tool. Dissolve—Help | ArcGIS for Desktop Multipart To Singlepart—Help | ArcGIS for Desktop However, this process may not be the best depending on where you want to go with all this. A big issue to think about is whether the attributes need to be carried over from the original points. If that is the case, one or more Spatial Joins may be necessary. Spatial Join—Help | ArcGIS Desktop Chris Donohue, GISP
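The two steps above can be sketched in plain Python (arcpy is only available inside an ArcGIS install, so this just models the geometry handling; the function names are illustrative, not Esri API):

```python
# Conceptual sketch: dissolving point features into one multipart
# feature, then splitting back into single parts. A feature's geometry
# is modeled as a list of (x, y) tuples.

def dissolve(features):
    """Merge every part of every input feature into one multipart feature."""
    multipart = []
    for feature in features:
        multipart.extend(feature)
    return multipart

def multipart_to_singlepart(multipart):
    """Split a multipart feature into one single-part feature per part."""
    return [[part] for part in multipart]

points = [[(0, 0)], [(1, 2)], [(3, 4)]]   # three single-part point features
merged = dissolve(points)                 # one multipart feature
restored = multipart_to_singlepart(merged)
```

Note that the merged feature carries no attributes, which is exactly why a Spatial Join may be needed afterward to bring them back.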
Posted 05-11-2018, 07:57 AM

POST
One way to do this is with a script. If a consistent symbol or pattern can be identified for the cases where a change needs to be made, the script can use it to detect which records need to be modified and then make the modification. For example, perhaps all the records that need to be modified are cases where the information after a comma needs to be placed in front of the comma instead. As to specific code, that will depend on the script language of choice. Are you familiar with Python or another scripting language used in GIS? As one means to do this, check out the label expression examples in this link. They are not exactly what you need, but they will give a feel for how to employ a script in ArcGIS when labelling. Building label expressions—Help | ArcGIS for Desktop Chris Donohue, GISP
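For instance, if the fix turns out to be "move the text after the comma to the front" (an assumption on my part), the core logic in Python is only a few lines, and the same body could be dropped into a FindLabel label expression:

```python
def flip_comma(label):
    """If the label contains a comma, move the text after the comma
    to the front, e.g. 'Doe, John' -> 'John Doe'."""
    if "," not in label:
        return label  # no comma: leave the record unchanged
    head, tail = label.split(",", 1)
    return tail.strip() + " " + head.strip()
```

The `if "," not in label` guard is the "detect which records need to be modified" step; everything else is the modification.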
Posted 05-08-2018, 02:07 PM

POST
Just to clarify, the restriction caused by the overpass sign is limited to just the overpass road itself (and not restricting the highway passing under it)? Chris Donohue, GISP
Posted 05-08-2018, 12:03 PM

POST
So if I'm understanding this correctly, you want to clip a raster (top layer) by a polygon (London Boundaries), and the software you are using is ArcGIS Desktop 10.6. If that is the case, one workflow would be this: 1. As suggested by Xander Bakker, run the Dissolve (Data Management) geoprocessing tool on the London Boundaries to simplify them to one polygon boundary. 2. Run the Clip (Data Management) geoprocessing tool, specifying for "Output Extent (optional)" the feature class created from the Dissolve, and checking the box "Use Input Features for Clipping Geometry (optional)". Dissolve—Help | ArcGIS for Desktop Clip—Help | ArcGIS for Desktop Chris Donohue, GISP
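To show the idea behind step 2 without arcpy (which needs an ArcGIS install), here is a toy plain-Python sketch that clips a grid to a rectangular extent; the real Clip tool of course honors the full polygon geometry, not just a rectangle:

```python
NODATA = -9999

def clip_grid(grid, origin, cell_size, extent):
    """Clip a 2-D grid to a rectangular extent (xmin, ymin, xmax, ymax):
    cells whose centers fall outside the extent become NODATA."""
    xmin, ymin, xmax, ymax = extent
    x0, y0 = origin  # coordinates of the center of cell [0][0]
    out = []
    for r, row in enumerate(grid):
        out_row = []
        for c, value in enumerate(row):
            x = x0 + c * cell_size
            y = y0 - r * cell_size  # rows run north to south
            inside = xmin <= x <= xmax and ymin <= y <= ymax
            out_row.append(value if inside else NODATA)
        out.append(out_row)
    return out

grid = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
clipped = clip_grid(grid, origin=(0.0, 2.0), cell_size=1.0,
                    extent=(0.5, 0.5, 2.5, 2.5))
```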
Posted 05-04-2018, 01:46 PM

POST
Folks who do major TIN processing can probably provide specific workarounds, but I know in general that memory is often an issue. From the Tool Help: "...it's recommended that TINs be kept under several million points. Large input rasters and small Z tolerance settings may exceed this. If size is an issue, consider processing subsets or use Raster To Multipoint followed by building a terrain dataset." How Raster To TIN works—Help | ArcGIS for Desktop https://community.esri.com/community/gis/analysis/spatial-analyst Chris Donohue, GISP
Posted 05-04-2018, 08:56 AM

POST
Back in earlier versions of ArcGIS Desktop there was Map Algebra. For example: ArcGIS Desktop Help 9.3 - Single Output Map Algebra Nowadays, similar functionality in ArcGIS Desktop 10.x can be found in what is called Raster Calculator. "The Raster Calculator tool allows you to create and execute a Map Algebra expression that will output a raster." Raster Calculator—Help | ArcGIS for Desktop https://community.esri.com/community/gis/analysis/spatial-analyst Chris Donohue, GISP
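A Map Algebra / Raster Calculator expression is, at heart, a cell-by-cell computation. As a plain-Python illustration of that idea (grids as nested lists; in arcpy.sa you would write the expression directly on Raster objects instead):

```python
def map_algebra(a, b, expr):
    """Apply a cell-by-cell expression to two equally sized grids,
    mimicking what a Raster Calculator expression such as a + b
    does across two rasters."""
    return [[expr(x, y) for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

a = [[1, 2], [3, 4]]
b = [[10, 20], [30, 40]]
summed = map_algebra(a, b, lambda x, y: x + y)
```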
Posted 05-04-2018, 08:41 AM

POST
Are you using ArcGIS Pro or ArcGIS Desktop? I see that the post is tagged with ArcGIS Pro, but just wanted to be sure as there are some differences in how ArcGIS Pro does the processing compared to ArcGIS Desktop. My first thought on what is going awry (regardless of whether using Pro or Desktop) is that something went amiss in the processing of the data leading up to the Flow Accumulation tool. Review each of the process steps you did and check the outputs of each process to see if the results are expected. Also, be sure that all the needed steps/processes are being followed. ArcGIS Help (10.2, 10.2.1, and 10.2.2) Deriving runoff characteristics—Help | ArcGIS Desktop Chris Donohue, GISP
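For intuition about what the tool computes (not how Esri implements it), here is a toy D8-style flow accumulation in plain Python. It assumes the flow-direction grid is already valid, i.e. has no cycles, and encodes each cell's downstream neighbor as a (row, col) offset, with None at an outlet:

```python
def flow_accumulation(flow_dir):
    """Each output cell holds the count of upstream cells that drain
    through it: walk downstream from every cell, incrementing every
    cell passed along the way."""
    rows, cols = len(flow_dir), len(flow_dir[0])
    acc = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            step = flow_dir[r][c]
            cr, cc = r, c
            while step is not None:
                cr, cc = cr + step[0], cc + step[1]
                acc[cr][cc] += 1
                step = flow_dir[cr][cc]
    return acc

flow_dir = [[(0, 1), (0, 1), None]]   # one row, draining east to an outlet
acc = flow_accumulation(flow_dir)
```

If an upstream step (Fill, Flow Direction) produced bad values, this kind of walk goes wrong immediately, which is exactly why checking each intermediate output is worthwhile.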
Posted 05-04-2018, 08:14 AM

POST
Here's the process instructions: Using Select By Attributes—Help | ArcGIS for Desktop Chris Donohue, GISP
Posted 04-10-2018, 03:46 PM

POST
There are several ways to do this. Here's probably the easiest workflow: 1. With both layers loaded into ArcGIS, use Select By Location. Using Select By Location—Help | ArcGIS for Desktop 2. Export the resulting selection from the Census Block Groups to a new feature class. Chris Donohue, GISP
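Under the hood, a Select By Location on points with an "intersect" relationship boils down to a point-in-polygon test. A plain-Python sketch of that selection (ray casting; the variable names are illustrative only):

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: True if pt falls inside the polygon,
    given as a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray's level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

county = [(0, 0), (4, 0), (4, 4), (0, 4)]
blocks = {"a": (1, 1), "b": (5, 5)}
selected = {k: p for k, p in blocks.items() if point_in_polygon(p, county)}
```

Exporting the selection is then just writing `selected` out, which in ArcGIS is the "export selected features to a new feature class" step.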
Posted 04-10-2018, 12:06 PM

POST
If the data is all in 2D, one way to go would be to employ Generate Near Table, using the starting point and points representing the extents of the polygon. Be sure to read the fine points of the Tool Help on how the Angle is determined. Also, check how the Angle and Closest options are set, so that running the tool returns the angle to all the points. One way to get points of the polygon is to convert the polygon to lines first using Feature To Line, then convert the lines to points using Feature Vertices To Points with the All option. Then thin the points down to just the relevant two to use in Generate Near Table. Generate Near Table—Help | ArcGIS Desktop Feature To Line—Data Management toolbox | ArcGIS Desktop Feature Vertices To Points—Help | ArcGIS for Desktop Chris Donohue, GISP
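For the planar case, the angle reported is the arithmetic angle from the input point to the near point, which you can reproduce with atan2. This sketch follows the planar convention described in the tool help (0 = east, counterclockwise positive; geodesic angles use a different convention, so treat this as an approximation of the planar method only):

```python
import math

def near_angle(origin, target):
    """Planar arithmetic angle from origin to target, in degrees:
    0 = east, counterclockwise positive, range (-180, 180]."""
    dx = target[0] - origin[0]
    dy = target[1] - origin[1]
    return math.degrees(math.atan2(dy, dx))
```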
Posted 04-06-2018, 10:02 AM

POST
There are several ways to do this. One general workflow would be to add the coordinate of interest into GIS as a point in a point feature class using the "Adding x,y coordinate data as a layer" workflow (see link). Then add the other points into a second point feature class. Run a Buffer of 200 meters on the original point of interest. Clip the feature class with the other points based on the 200-meter buffer to determine which points are within the radius. Adding x,y coordinate data as a layer—Help | ArcGIS for Desktop Buffer—Help | ArcGIS Desktop Clip features using another feature—ArcGIS Pro | ArcGIS Desktop Chris Donohue, GISP
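Since the buffer-and-clip combination here is really just a distance test, the same selection can be sketched in plain Python (planar Euclidean distance, so it assumes the coordinates are in a projected system with meter units, e.g. UTM):

```python
import math

def points_within(origin, points, radius):
    """Return the points within `radius` of `origin`, using planar
    Euclidean distance in the units of the coordinates."""
    ox, oy = origin
    return [p for p in points
            if math.hypot(p[0] - ox, p[1] - oy) <= radius]

site = (1000.0, 2000.0)
candidates = [(1050.0, 2050.0), (1500.0, 2000.0)]
nearby = points_within(site, candidates, 200.0)
```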
Posted 04-05-2018, 03:34 PM

POST
Some possibilities: Is there currently a symbology layer stored in SDE for the feature classes? For example, is there a .lyr file for the feature class? If not, you may need to create .lyr files from the existing symbolized feature classes. Layer Files Or is the issue more specifically that the feature class is correctly symbolized in the Table of Contents in ArcMap (with either the use of a .lyr file or symbology created manually), but in the Create Features window the template does not show the symbology? If that is the case, you may need to work on the feature templates. About feature templates—Help | ArcGIS for Desktop Chris Donohue, GISP
Posted 04-05-2018, 12:13 PM

POST
Typically, a suitability map is created as an end product of a logical process based on several things one is trying to balance. The weighting controls how important each factor is. To get started, check out this link to get a better understanding of what goes into making a suitability layer: Using the conceptual model to create a suitability map—ArcGIS Help | ArcGIS Desktop Chris Donohue, GISP
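The weighting step can be sketched in plain Python: each factor grid, already rescaled to a common suitability scale, is multiplied by its weight and summed cell by cell (a simplified stand-in for the weighted-overlay idea, not Esri's implementation):

```python
def suitability(factors, weights):
    """Weighted overlay: combine equally sized factor grids into one
    score grid. Assumes the weights sum to 1 and the factors are
    already rescaled to a common suitability scale."""
    rows, cols = len(factors[0]), len(factors[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for grid, w in zip(factors, weights):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * grid[r][c]
    return out

slope   = [[1.0, 3.0]]   # toy rescaled factor grids
landuse = [[5.0, 1.0]]
score = suitability([slope, landuse], [0.6, 0.4])
```

Changing the weights is exactly the "how important is each factor" decision the conceptual model walks through.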
Posted 03-29-2018, 09:15 AM