Have you ever run a geoprocessing tool, only to have it fail with a generic error message? Perhaps your data fails to publish to ArcGIS Online or draws incorrectly on your map. You are using the same workflow you use every day, with the same settings and configuration, but you can't seem to find a cause for the problem.
You may be dealing with a data-specific issue. A few basic troubleshooting steps may resolve it, but everything starts with determining whether the issue is truly data-specific.

Determining if an Issue Is Data-specific
A quick test to determine if an issue is data-specific is to bring your dataset into a blank map document, map frame, or web map (depending on the environment in which you are working). If the issue does not persist in a new map, then the issue may be specific to the map document. If you experience the same issue in a new map document, the source of the problem may be the data.
Another way to determine if an issue is data-specific is to run the same process with a different dataset similar to the one that you are using. For instance, if using a point shapefile that fails to import into your file geodatabase, run the process on a different point shapefile of a similar size. If the tool or process succeeds on the new dataset, then the issue may be data-specific. Luckily, there are tools available that can help to resolve some of these data-specific issues.
If you find that the problem or error is reproducible with multiple datasets, you may want to investigate some of our additional resources to determine the source of the issue. Feel free to check out more resources from the first post in the WWTSD series (linked above).

Possible Data-specific Issues and Their Solutions
A geometry error is one potential source of a data-specific issue, and it has a quick fix. ArcGIS applications require that a feature's geometry meet certain standards. Issues can occur if any features have null or incorrect geometry. In ArcMap and ArcGIS Pro, you can determine if your dataset has any geometry errors by running the Check Geometry tool, which generates a table listing the geometry errors found in the data. If there are errors in the resulting table, run the Repair Geometry tool to fix them. It is recommended to make a copy of your data prior to running this tool, as the tool may delete records with geometry errors.
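If you prefer to script this check, both tools are available through arcpy. The sketch below uses hypothetical paths; Check Geometry writes its findings to a report table, and Repair Geometry then fixes the source data in place (again, work on a copy).

```python
import arcpy

# Hypothetical paths - substitute your own data.
fc = r"C:\data\project.gdb\parcels"
report = r"C:\data\project.gdb\parcels_geometry_errors"

# Check Geometry writes any problems it finds to a report table.
arcpy.management.CheckGeometry(fc, report)

# If the report table has rows, repair the source data (on a copy!).
if int(arcpy.management.GetCount(report)[0]) > 0:
    arcpy.management.RepairGeometry(fc)
```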
If your features appear in a different location on the globe than you would expect, there may be an issue with your data's projection. You can view the coordinate system of your data in the layer properties. If the data does not have a defined projection, you may need to use the Define Projection tool to assign the correct one (see the tool documentation here for more information). If your data has been assigned a different projection than the other layers in your map, you may need to use the Project tool (here) to change the coordinate system of your data. For more information about when to use the Define Projection tool versus the Project tool, take a look at the blog post found here. If you do not know what projection your data should be in, please see the technical article here for more information.
Data can become corrupt for various reasons, including incorrectly copied data or connection issues with a network drive. These issues can sometimes be resolved by exporting the data into a different format or location, such as to a different feature class or to a .tif rather than a .png raster file. If you are working in a file geodatabase, run the Recover File Geodatabase tool, which creates a new file geodatabase with repaired versions of the feature classes that the tool identifies as potentially corrupt.

Considerations for Raster Datasets
Raster datasets have many parameters and properties and, therefore, many potential sources of data-specific issues. The following by no means addresses every potential issue with raster datasets, but it covers a couple of common sources of data-specific issues for rasters and troubleshooting steps to address them.
Bit depth is a characteristic of a raster that defines the possible cell values allowed for the dataset (for more information, click here). If you run a geoprocessing operation on two or more rasters whose bit depths do not match, you may run into errors or issues. For instance, if you create a mosaic dataset containing rasters from multiple sources, you may want to confirm that the bit depths of the rasters are the same. You can determine the bit depth of a raster by navigating to the raster properties. If you must change the bit depth of your raster, you can use the Copy Raster tool to set the necessary bit depth manually and create a new output raster with those parameters.
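As a quick illustration of what bit depth controls (nothing ArcGIS-specific here, just the arithmetic of integer storage), the range of cell values a raster can hold follows directly from the number of bits:

```python
# Illustration of how bit depth constrains integer raster cell values.

def value_range(bits, signed=False):
    """Return the (min, max) cell values an integer raster of the
    given bit depth can store."""
    if signed:
        return (-(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return (0, 2 ** bits - 1)

# An 8-bit unsigned raster stores values 0..255
print(value_range(8))                # (0, 255)
# A 16-bit signed raster stores values -32768..32767
print(value_range(16, signed=True))  # (-32768, 32767)
```

Mixing, say, an 8-bit unsigned mosaic with 16-bit source rasters means cell values above 255 cannot be represented, which is why mismatched bit depths cause trouble.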
When adding a raster dataset to a map document or creating a new one, you are given the option to build pyramids, which control how the dataset is drawn at different scale levels. If you can view your raster dataset at some scale levels but not at others, the raster pyramids may have become corrupt. Exporting the raster into a different format, or deleting and rebuilding the pyramids, may help resolve this issue. If you would like more information about deleting and rebuilding pyramids, click here.

Contact Esri Support
These steps can help you begin narrowing down potential causes of an issue, but they may not resolve every problem. If you need additional assistance with diagnosing or resolving an issue, feel free to contact Esri Support. We are happy to help our customers resolve any technical issue they encounter. When contacting Esri Support, please be prepared to provide the following information so that an analyst can assist you as efficiently as possible.
With the addition of the Train Random Trees Classifier, Create Accuracy Assessment Points, Update Accuracy Assessment Points, and Compute Confusion Matrix tools in ArcMap 10.4, as well as all of the image classification tools in ArcGIS Pro 1.3, it is a great time to check out the image segmentation and classification tools in ArcGIS for Desktop. Here we discuss image segmentation, compare the four classifiers (Iso Cluster, Maximum Likelihood, Random Trees, and Support Vector Machine), and review the basic classification workflow.

Image Segmentation
Before you begin image classification, you may want to consider segmenting the image first. Segmentation groups similar pixels together and assigns the average value to all of the grouped pixels. This can improve classification significantly and remove speckles from the image.

Train Iso Cluster Classifier
The Iso Cluster classifier is an unsupervised classifier (that is, it does not require a training sample) that divides a multiband image into a user-specified number of classes. It is the easiest of the classifiers to use, as it does not require creating a training sample and can handle very large segmented images. However, it is not as accurate as the other classifiers due to the lack of a training sample.

Train Maximum Likelihood Classifier
The Maximum Likelihood Classifier (MLC) uses Bayes' theorem of decision making and is a supervised classifier (that is, it requires a training sample). The training data is used to create a class signature based on the variance and covariance. Additionally, the algorithm assumes a normal distribution for each class sample in the multidimensional space, where the number of dimensions equals the number of bands in the image. The classifier then compares each pixel to the multidimensional space for each class and assigns the pixel to the class it has the maximum likelihood of belonging to, based on its location in that space.

Train Random Trees Classifier
One supervised classifier introduced with ArcGIS 10.4 is the Random Trees classifier, which breaks the training data into random sub-selections and creates a classification decision tree for each sub-selection. The decision trees run for each pixel, and the class assigned to the pixel most often by the trees is selected as the final classification. This method is resistant to over-fitting caused by small amounts of training data and/or large numbers of bands. This classifier also allows the inclusion of auxiliary data, including segmented images and digital elevation model (DEM) data.

Train Support Vector Machine Classifier
Support Vector Machine (SVM) is a supervised classifier similar to the MLC, in that it looks at multidimensional points defined by the band values of each training sample. However, instead of evaluating the maximum likelihood that a pixel belongs to a class cluster, the algorithm divides the multidimensional space in such a way that the gap between class clusters is as large as possible. Each pixel is then classified according to where it falls in the divided space.

Image Classification Workflow
With the addition of the Create Accuracy Assessment Points, Update Accuracy Assessment Points, and Compute Confusion Matrix tools in ArcGIS 10.4, it is now possible to both perform and assess image classification in ArcMap and ArcGIS Pro.
The general workflow for image classification and assessment in ArcGIS is:
Use the measures of accuracy (the user’s accuracy, producer's accuracy, and Kappa index) calculated by the confusion matrix to assess the classification. Make changes to the training sample, as needed, to improve the classification.
The best part about this process is that it makes it pretty easy to compare different classification methods, and it's often important to compare them. Getting your training sites nailed down is usually the toughest part, but the remaining steps fly by since the analysis is done for you. In the end, you have several classified raster images to use in your work and can choose the best result based on your objectives.
As an example, we used this workflow to classify a Landsat 8 image of the Ventura area in Southern California. We used the MLC, SVM, and Random Trees (RT) methods to classify a single Landsat 8 raster captured on February 15, 2016. We classified the image into nine classes and manually selected training samples and accuracy assessment ("ground truth") points. We also supplied a segmented image as an extra input raster for each classifier. Once we classified the rasters, we computed a confusion matrix for each output to determine the accuracy of the classification compared to the ground truth points. The Kappa index in the confusion matrix gives us an overall idea of how accurate each classification method is.
The results showed that each method did pretty well in the classification when looking at the Kappa indexes, as well as based on a visual assessment. In order of accuracy (from the highest Kappa index to the lowest), we see that the SVM output was the most accurate (Kappa = 0.915), followed by Random Trees (Kappa = 0.88) and finally the MLC method (Kappa = 0.846).
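For reference, the Kappa index compares the observed agreement between the classification and the ground truth points against the agreement expected by chance. A minimal sketch of the calculation, using a made-up two-class confusion matrix:

```python
# Cohen's Kappa from a confusion matrix: observed agreement versus
# the agreement expected by chance. The 2x2 matrix below is made up
# purely for illustration.

def kappa(matrix):
    total = sum(sum(row) for row in matrix)
    # Observed agreement: fraction of points on the diagonal
    observed = sum(matrix[i][i] for i in range(len(matrix))) / total
    # Chance agreement: product of row and column marginals
    expected = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return (observed - expected) / (1 - expected)

# Rows = classified, columns = ground truth
m = [[45, 5],
     [5, 45]]
print(round(kappa(m), 3))  # 0.8
```

A Kappa of 1 means perfect agreement; 0 means the classification does no better than chance.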
We can see from the confusion matrix that some methods did better than others for specific classes. For example, the MLC didn't do too well with the Bare Earth classification, though RT and SVM weren't much better. This is great information for homing in on a better-classified image: now we know that we should focus on getting better Bare Earth training samples to improve our results. You could keep iterating until you get a very high accuracy for all classes, if that's what your analysis requires. If you need just a general idea of the area, you could take what you get in round one! Check out what we got:
It is difficult to troubleshoot serious problems with any type of software. When faced with repeated and persistent crashing, it is important to ask yourself questions like, "Does the problem occur on multiple machines?", "Does this occur for every user on a machine?", and "Is the problem specific to certain data?" One of the more common reasons that ArcMap may repeatedly crash is a problem with the display adapter. An easy way to check whether your display adapter is causing problems is to temporarily disable the adapter and see if that resolves the issue.

NOTE: Before doing this, take note of which display adapter is being disabled. This will make it easier to undo any changes made.
To disable your display adapter:
Click Start, right-click on Computer, and select Manage.
In the Computer Management window, click Device Manager and expand 'Display adapters'. You should see your display adapter(s) listed there.
Right-click your display adapter and select Disable. Windows will fall back to its generic display driver instead.
You may notice that your screen resolution is not as sharp as before, and any additional monitors you had running may have stopped working. To adjust the resolution so you can continue testing, right-click the desktop background and select 'Screen resolution'.
For screen resolution, select the drop-down menu under Resolution, and change it to the highest value possible.
Finally, open ArcMap and attempt to reproduce the behavior that caused the crash.
If the crash can no longer be reproduced after disabling the adapter, then you will need to troubleshoot what is wrong with the adapter. A few things to try:
Verify that the video driver currently in use is the most up-to-date driver available by visiting the video card manufacturer's website. You can check which video driver is installed by navigating to Start > Run and typing 'DXDIAG'. Click Yes when asked to check that the drivers are digitally signed. Once the DirectX Diagnostic Tool opens, click the Display tab. This provides information on the maker and version of the graphics device installed.
Try performing a clean uninstall and reinstall of the display adapter.
Visit the Can you run it? website, and check your system against the ArcGIS for Desktop system requirements. If your system does not meet the minimum requirements, you may have to upgrade your hardware.
If, however, you see the same crash happening after disabling the adapter, then you know that the display adapter is not the culprit. If the crash continues even after trying the steps listed above, you may want to reach out to Esri Support Services for additional help.

Alexander N. and Alan R. - Desktop Support Analysts
Raster datasets contain a large assortment of information beyond the basic pixel display. This information is stored in the properties and is helpful in understanding more about the data. Locating the properties of a raster dataset can be a tedious process if you are trying to compile information for many datasets, because the workflow to find the properties of a raster dataset in the Catalog is to right-click each dataset and choose Properties > General tab.
For example, you are working on a project that requires an elevation image service to be created. The image service foundation is a mosaic dataset that references source rasters for each county in the state. You create the mosaic dataset, add the rasters, build the overviews, and add it to the map document. The mosaic dataset looks great until you begin zooming into the source data. In one spot you find that the elevation appears very dark instead of a consistent grayscale.
After investigating the source raster, you find that the dataset is a 4-bit integer instead of the expected 32-bit float. This makes you wonder if there are other datasets that were created incorrectly. To find those datasets, you can either zoom in to areas across the entire state or open the properties for each individual raster. Either way, the process would be very time-consuming, as each raster would need to be reviewed individually.
Another option is to access the raster properties through a Python loop. This is a more efficient and simpler solution: the information can be queried through the Python Raster object and the Get Raster Properties geoprocessing tool.
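As a sketch, such a loop might look like the following (the workspace path is hypothetical; the properties printed are just examples of what the Raster object exposes):

```python
import arcpy

# Hypothetical workspace containing the county elevation rasters.
arcpy.env.workspace = r"C:\data\elevation.gdb"

for name in arcpy.ListRasters():
    ras = arcpy.Raster(name)
    # pixelType reports values such as 'U4' (4-bit unsigned) or
    # 'F32' (32-bit float), so mismatched datasets stand out quickly.
    print(name, ras.pixelType, ras.bandCount)
```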
Lidar data is becoming increasingly available, and there are likewise new tools to analyze and display the data. LAS is the public file format for the interchange of three-dimensional point cloud data between data users. LAS information can be downloaded and has a file extension of *.las. The LAS dataset was created to make it possible to use LAS data quickly in ArcMap. However, with the advent of the new format, new questions arise, including what types of analysis, geoprocessing tools, and toolbar buttons can be used on a LAS dataset. The ArcGIS 10.1 help documentation lists all of the tools that can be used on a LAS dataset within ArcToolbox and on the 3D Analyst toolbar, all with only a few small steps to enable them. What small steps am I referring to?
Full Extent of the LAS Dataset with the Tools greyed out.
The 3D Analyst toolbar tools are not always active when working with LAS datasets. They are only available when a LAS dataset is displayed as a full-resolution triangulated surface. A full-resolution surface is indicated in the table of contents when 100% of points are being used to construct the triangulated surface. When the LAS dataset is displayed as a point set, the tools are disabled.
What does 'full resolution' mean? According to the ArcGIS help on the topic, the definition is, "the scale threshold used to control when the LAS dataset will render itself without thinning, using 100% of the LAS points." So once you have modified the settings or zoomed in far enough to see 100% of the LAS points, you can use all the tools in the 3D Analyst toolbar, right? Almost.
The final step is to choose the Elevation option from the Display As TIN options on the LAS dataset. Once you make that selection, you should see the LAS dataset appear in the drop-down for the 3D Analyst toolbar and the tools should become active.
Now you can create a profile graph or generate a 3D feature just as you would with any other supported surface in ArcMap.
With the release of ArcGIS 10, new tools were made available that allow you to use 3D objects in geoprocessing tools, opening up new possibilities for proximity analysis. One of the new tools is the Near 3D tool, which calculates the three-dimensional distance from each input feature to the nearest feature in one or more nearby feature classes. However, when using the new 3D analysis tools, it is always important to consider the geometry of the features being used.
Suppose you have a 3D polygon and would like to determine whether 3D points are above versus below the surface of the polygon. Near 3D is a logical geoprocessing tool to determine which points are above and which are below, right?
3D Points with 3D Polygon
In this particular case, the answer is no. The results from the Delta Z created in the Near 3D analysis cannot be used because the geometry of the polygon is only maintained at the edge of the polygon. Therefore, the results will be based on the distance from the point to the edge of the polygon, not the distance from the point to the polygon surface directly above or below the point.
Points considered negative when only considering the Near 3D Delta Z
To determine whether a point is above or below the polygon surface, follow this workflow instead:
1. Convert the 3D polygon to a TIN with the Create TIN tool.
2. Use the Add Surface Information tool to write the TIN elevation values to the points.
3. Use the Add XY Coordinates tool if you do not have z-values for your 3D point data.
4. Add a field to hold the difference.
5. Calculate the field by subtracting the TIN elevation from the z-value of each point.
Correct output showing the difference field
With this workflow, if the result is positive, the point is above the surface; if it is negative, the point is below.
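The arithmetic behind that final comparison can be sketched without ArcGIS at all: interpolate the surface elevation directly beneath (or above) the point, then subtract it from the point's z-value. Below, a single flat triangle stands in for the TIN; all coordinates are made up for illustration.

```python
# Interpolate the elevation of a triangulated surface at a point's
# (x, y) location and compare it with the point's own z-value.

def plane_z(tri, x, y):
    """Elevation of the plane through three (x, y, z) vertices at (x, y)."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
    # Normal vector of the plane via the cross product of two edges
    ux, uy, uz = x2 - x1, y2 - y1, z2 - z1
    vx, vy, vz = x3 - x1, y3 - y1, z3 - z1
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    return z1 - (nx * (x - x1) + ny * (y - y1)) / nz

triangle = [(0, 0, 10), (10, 0, 10), (0, 10, 10)]  # flat surface at z = 10
point = (2, 2, 14)
delta = point[2] - plane_z(triangle, point[0], point[1])
print("above" if delta > 0 else "below")  # above (delta = 4)
```

Note that this measures the vertical difference at the point's own (x, y) location, which is exactly what Near 3D does not do when the point lies inside the polygon's edges.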
So, when using the new 3D Analysis toolsets, be sure to consider the geometry of the features being used.
"Why does my profile graph show an elevation change of 11 feet over 1,111,111 feet?"
In general, profiles show the change in elevation of a surface along a line. They help to assess the difficulty of a trail for hiking or biking, or to evaluate the feasibility of placing a rail line along a given route.
A Profile Graph represents height on the Y axis and horizontal distance on the X axis. The unit of distance along the X axis depends upon the units of the projected coordinate system (PCS) of the elevation raster data. For example, if data is in a Universal Transverse Mercator (UTM) PCS, the unit of distance will be in meters; if data is in State Plane PCS, the unit of distance will be in feet (US) as shown in the following figure.
However, a statement like the following can be inconvenient and hard to grasp: the elevation changes from 1,200 feet to 1,400 feet over a distance of 400,000 feet.
This may be a little like weighing a person in ounces, or counting age in minutes…
In fact, the data is more meaningful when displayed in the following manner: the elevation changes from 1,200 feet to 1,400 feet over a distance of 80 miles, as illustrated in the following figure.
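The conversion itself is simple arithmetic (1 mile = 5,280 feet), so 400,000 feet works out to roughly 76 miles, on the order of the 80 miles quoted above:

```python
# Feet-to-miles conversion behind the example above (1 mi = 5,280 ft)
feet = 400_000
miles = feet / 5280
print(f"{feet:,} ft is about {miles:.1f} mi")  # 400,000 ft is about 75.8 mi
```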
Often there is a need to display the horizontal distance on the X axis of a Profile Graph in a unit other than the PCS units of the data, so as to do a quick comparison between different units or for better understanding of the data.
Here are the steps to change the horizontal distance units (units of X axis) on a Profile Graph without changing the PCS of elevation data:
1) Open Data Frame Properties (View > Data Frame Properties)
2) Click on Coordinate System Tab
3) Click Modify
4) Select the desired units (miles, kilometer etc.)
5) Click Apply.
6) Click OK.
7) Create a new Profile Graph.
And remember, don’t weigh yourself in ounces. Consider changing the units.
There are many different kinds of rasters that can be used in ArcMap. To describe and explain these different kinds of rasters, Esri has created help documentation on technical specifications and supported raster formats. Included in the documentation is the ESRI GRID format, a very flexible format that many users are comfortable using to process and display data. There is a unique aspect to this format, mentioned in the help documentation, that many users are unaware of.
A grid dataset is always stored as 32 bit (either signed, unsigned, or floating point), but ArcGIS shows it as being the most appropriate bit depth with regard to the cell values it contains.
So while the bit depth of an ESRI GRID may be reported as 8-bit unsigned or 16-bit signed, the storage is still 32 bit. The properties on the Source tab of a particular ESRI GRID raster may therefore say something different. This is important when comparing the behavior of the GRID format with other formats like .img or .tif files.
This does not affect the average user of ESRI GRIDs; it is just something to keep in mind and understand about the format. So fear not if you create a new ESRI GRID and the bit depth does not match the bit depth you defined; it is still okay. Once the data is added, it should report the most appropriate bit depth. ESRI GRIDs are always 32-bit rasters, regardless of what the source properties report.

Jeff S. - Raster Support Analyst
I am sure many of you have already heard about CityEngine, and some of you may have even already contacted Support Services with questions related to it. For those who haven't been introduced, CityEngine is a stand-alone software product that provides professional users in architecture, urban planning, entertainment, GIS and general 3D content production with a unique conceptual design and modeling solution for creating 3D cities and buildings. It allows professional users in GIS, CAD and 3D to do the following:
Quickly generate 3D cities from existing 2D GIS data.
Do conceptual design in 3D, based on GIS data and procedural rules.
Efficiently model virtual 3D urban environments for simulation and entertainment.
CityEngine screenshots of Rotterdam
CityEngine also provides advanced capabilities for the direct export of the generated 3D city models out to other software tools, such as data management/analysis packages (like ArcGIS), 3D editing software (like Maya or 3DS Max), geo-visualization tools (like ArcGlobe or Google Earth), game engines (like Unity or Unreal), high-end rendering solutions (such as RenderMan), or web-ready cloud rendering services (like RealityServer).
If you work with 3D data, I would strongly recommend trying out CityEngine. You can start by downloading the CityEngine Free 30-day Trial. If you would like to know more about CityEngine, please contact us or review the information on the linked pages below.
If you are already using CityEngine and have ideas to further enhance the product, please post your ideas on the CityEngine Ideas portal and/or post your comments in the CityEngine Forums.

Pavan Y. - Raster Unit Development Technical Lead
Trail hiking can be a fun and exciting activity when you properly prepare for the hike. An important step in preparation is knowing the trail. Some key factors include length, minimum elevation, maximum elevation, and slope. If you do not have a trail information guide, you can easily calculate this information with ArcGIS for Desktop. First, you must gather a polyline file for the trail and an elevation raster for the area of interest.
The Add Surface Information tool takes the input surface and interpolates heights for the features by converting them to 3D in the background. It then calculates the 3D properties of these features and writes the property values as attributes to the input feature class.
From the Add Surface Information tool dialog, input these required parameters:
Input Feature Class: Hiking Trail Polyline Layer
Input Surface: Elevation Raster
Output Property: Select desired properties to be calculated
Once the Add Surface Information tool completes the calculation process, the selected Output Properties are appended to the input feature table.
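To see what those output properties represent, here is a back-of-the-envelope version of the same calculations in plain Python: 3D length, minimum and maximum elevation, and average slope from a trail's (x, y, z) vertices. The trail coordinates are made up; Add Surface Information itself derives the z-values from the elevation raster.

```python
import math

# Compute simple trail statistics from (x, y, z) vertices (meters).

def trail_stats(vertices):
    length_3d = rise = run = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(vertices, vertices[1:]):
        horizontal = math.hypot(x2 - x1, y2 - y1)
        length_3d += math.sqrt(horizontal ** 2 + (z2 - z1) ** 2)
        rise += abs(z2 - z1)   # total elevation change walked
        run += horizontal      # total horizontal distance
    zs = [v[2] for v in vertices]
    return {
        "length_3d": length_3d,
        "z_min": min(zs),
        "z_max": max(zs),
        "avg_slope_pct": 100 * rise / run,
    }

trail = [(0, 0, 100), (300, 400, 120), (600, 800, 150)]
print(trail_stats(trail))
```

For this hypothetical trail, the 3D length is just over a kilometer and the average slope works out to 5 percent.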
With this information, you are now able to properly plan your hiking trip. In addition to this hiking example, there are several other applications for this concept, such as bike paths, pipelines, streets, and drainage areas.

Timothy H. – Raster/3D Support Analyst