POST
Mark, Not that the idea is bad, especially if someone manually types in an extent and wants to "view" it... but if someone is choosing a layer, we already provide tools and visualization of the extent, depending on whether it is a feature class or a raster. For a raster, just display NoData in addition to the real values - that is the extent. For features, there is a tool called Minimum Bounding Geometry; choose the Envelope option to get a polygon output that is the extent. Best, Eric
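For reference, if you ever want to script the feature case, here is a minimal arcpy sketch (the paths and names are placeholders, not your data):

```python
# Sketch: build an extent polygon for a feature class with Minimum Bounding Geometry.
# Paths are placeholders; requires an ArcGIS install with arcpy.
import arcpy

in_features = r"C:\data\project.gdb\parcels"          # hypothetical input
out_envelope = r"C:\data\project.gdb\parcels_extent"  # hypothetical output

# ENVELOPE plus the ALL group option returns one rectangle covering the full extent.
arcpy.MinimumBoundingGeometry_management(
    in_features, out_envelope,
    geometry_type="ENVELOPE",
    group_option="ALL")
```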
Posted 08-19-2013 03:32 PM

POST
It's just 336.5979 points (sewage overflow locations) per square mile. If you haven't read the blog below, I think it may be helpful: How should I interpret the output of density tools? In particular, you may be interested in converting density into an expected count. Best, Eric
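To make the conversion concrete, a tiny sketch (the neighborhood area here is a made-up value):

```python
# Sketch: converting a density value back into an expected count.
# density (points per sq mile) * area (sq miles) ~= expected number of points.
density_per_sq_mile = 336.5979   # value read from the density surface
area_sq_miles = 0.25             # hypothetical area of interest
expected_count = density_per_sq_mile * area_sq_miles
print(expected_count)            # ~84 points expected over that area
```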
Posted 08-19-2013 02:51 PM

POST
If an area unit is selected, the calculated density for each cell is multiplied by the appropriate factor before it is written to the output raster. To see the effect of going to square miles, set the area unit to SQUARE_MAP_UNITS (your original map unit, feet) and compare the two outputs. Best, Eric
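If you want to compare the two side by side in a script, a minimal Spatial Analyst sketch (paths are placeholders):

```python
# Sketch: the same Kernel Density run with two different area units.
import arcpy
from arcpy.sa import KernelDensity
arcpy.CheckOutExtension("Spatial")

points = r"C:\data\project.gdb\overflow_points"  # hypothetical input points

# Density per square map unit (feet here), i.e. the raw, unscaled density.
dens_map_units = KernelDensity(points, "NONE", area_unit_scale_factor="SQUARE_MAP_UNITS")
dens_map_units.save(r"C:\data\dens_sqft.tif")

# Same analysis reported per square mile; each value is scaled by 5280 * 5280.
dens_sq_miles = KernelDensity(points, "NONE", area_unit_scale_factor="SQUARE_MILES")
dens_sq_miles.save(r"C:\data\dens_sqmi.tif")
```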
Posted 08-19-2013 11:11 AM

POST
Did you check out How Kernel Density works for point features? Best, Eric
Posted 08-19-2013 10:55 AM

POST
Hi Peter, I recommend some reading at this point. It's long, but worth it. As for the TIFF comment: it is simply faster to access than a FGDBR (file geodatabase raster). Here is an excerpt from Imagery: Data Management and Patterns and Recommendations:

"Optimized image formats - Some imagery can be slower to read than others due to their storage format or compression, and it is recommended that you convert these into more optimal formats. For example, an ASCII DEM image format is slow to read; therefore, it is recommended that you convert it to a format such as TIFF. Also, if the image is very large and not tiled, it is recommended that you convert this to a tiled TIFF format to optimize disk access."

Since you have to convert from ECW anyway, you might as well go to an optimized image format like TIFF. Also, while in ArcMap go to Customize > Options > Raster tab, and make sure Create Tiled TIFF is checked on. It should be on, since that is the default. You can even compress the TIFF file, so you might consider a tiled TIFF with JPEG compression. If you know Peter Becker and he is familiar with your data, he can probably guide you on this quite well.

For the format questions, please see the following: Esri GRID format; How raster data is stored and managed (this explains the FGDB parts, including PGDB and enterprise geodatabases). If you really want to know how a FGDBR is actually stored in the FGDB, you can read all about it in the SDE section, since they follow the same idea/storage: Raster datasets and raster catalogs in a geodatabase in SQL Server. Best, Eric
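For reference, a hedged arcpy sketch of the conversion using the raster storage environments (paths and setting values are placeholders, not a tested recipe for your imagery):

```python
# Sketch: convert a raster to a tiled, JPEG-compressed TIFF via the raster storage environments.
import arcpy

arcpy.env.compression = "JPEG 75"    # JPEG compression at quality 75 (placeholder value)
arcpy.env.tileSize = "256 256"       # write the output TIFF in tiles for faster access
arcpy.env.pyramid = "PYRAMIDS -1"    # build full pyramid levels on the output

arcpy.CopyRaster_management(r"C:\data\source.ecw", r"C:\data\optimized.tif")
```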
Posted 08-19-2013 09:52 AM

POST
Jay, Review the section "Controlling the visibility analysis" within the topic Using Viewshed and Observer Points for visibility analysis. It sounds like you need to use the OFFSETA field in your observer attribute table. Keep in mind the offset values are assumed to be in the same units as your projection. Best, Eric
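If you end up scripting it, a minimal Spatial Analyst sketch (paths are placeholders; per the topic above, OFFSETA is read from the observer attribute table when present):

```python
# Sketch: viewshed that honors an OFFSETA observer-height field.
import arcpy
from arcpy.sa import Viewshed
arcpy.CheckOutExtension("Spatial")

elev = r"C:\data\elevation.tif"              # hypothetical surface
observers = r"C:\data\project.gdb\towers"    # point observers with an OFFSETA field

# OFFSETA (in the same units as the surface) is added to each observer's elevation.
vshed = Viewshed(elev, observers)
vshed.save(r"C:\data\viewshed.tif")
```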
Posted 08-19-2013 09:05 AM

POST
Hi Peter, You should convert the data to TIFF. Peter Becker would agree with that, and I would be surprised if that wasn't already advised. TIFF should be faster than file geodatabase rasters. Please note that FGDB rasters and Esri GRID are two completely separate formats; it's impossible to store an Esri GRID in a file geodatabase. Best, Eric
Posted 08-19-2013 08:39 AM

POST
Greetings, Altering the range of values (the statistics min/max) you see in the TOC is quite different from your assertion that the software actually alters the pixel values when doing a copy. Most likely, during the conversion process pyramids are being created along with statistics (which are built off a pyramid for performance) if you have the default environment settings. The statistics on a pyramid are almost always different from statistics built on the source values. Have you used the Identify button to confirm that the original raster and the copied raster have different pixel values at any given pixel location?

If you want statistics to be built off the source data, then you have to delete the pyramids and stats, then build stats, then build pyramids if you want them. To avoid this in the future, you can change your environment settings and turn off building stats/pyramids. Best, Eric
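If you want to script that workflow, a hedged sketch of the environment settings involved (paths are placeholders):

```python
# Sketch: copy a raster without auto-building pyramids or statistics,
# then build statistics directly from the source cells.
import arcpy

arcpy.env.pyramid = "NONE"            # don't create pyramids during the copy
arcpy.env.rasterStatistics = "NONE"   # don't create statistics during the copy

arcpy.CopyRaster_management(r"C:\data\source.tif", r"C:\data\copy.tif")

# Skip factors of 1 sample every cell, so the stats reflect the source values.
arcpy.CalculateStatistics_management(r"C:\data\copy.tif",
                                     x_skip_factor=1, y_skip_factor=1)
```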
Posted 08-16-2013 12:38 PM

POST
You can't just click Add Data, browse to a .las file, and display it; that isn't supported. See Using lidar in ArcGIS as well as What is a LAS dataset for good starting points. Best, Eric
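For scripting, a minimal sketch of the LAS dataset route (ArcGIS 10.1+; paths and names are placeholders):

```python
# Sketch: make .las files usable in ArcGIS by wrapping them in a LAS dataset.
import arcpy

las_folder = r"C:\data\lidar"            # folder containing the .las files
lasd = r"C:\data\lidar\survey.lasd"      # hypothetical LAS dataset to create

arcpy.CreateLasDataset_management(las_folder, lasd)
# A LAS dataset layer can then be added to ArcMap or passed to other tools.
arcpy.MakeLasDatasetLayer_management(lasd, "survey_lasd_layer")
```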
Posted 08-16-2013 09:37 AM

POST
On the main menu go to Geoprocessing > Geoprocessing Options and uncheck "Enable" under Background Processing. 🙂 I read a bit more... it should still work with a PGDB. The 2 GB limit is on the .mdb file, and a PGDB doesn't actually store the raster data inside it; it writes .img, .jpg, or .jp2 files. See How raster data is stored and managed. I would still personally use FGDB rasters, though. Eric
Posted 08-15-2013 12:30 PM

POST
Ryan, The display units are not actually connected to the coordinate system of the data frame. It's probably returning X,Y in Hammer-Aitoff projection space. To get X,Y in decimal degrees, the coordinate system of your data frame needs to be a GCS. To confirm, go to the General tab of the data frame properties. I bet your Map units are Meters and your Display units are Decimal Degrees. Best, Eric
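If it helps to sanity-check a coordinate outside the data frame, a minimal arcpy sketch (the coordinates are made up, and 54044 is assumed here to be the WKID for World Hammer-Aitoff):

```python
# Sketch: reproject a projected X,Y into decimal degrees.
import arcpy

hammer_aitoff = arcpy.SpatialReference(54044)   # assumed WKID for World Hammer-Aitoff
wgs84 = arcpy.SpatialReference(4326)            # GCS WGS 1984

pt = arcpy.PointGeometry(arcpy.Point(1234567.0, 765432.0), hammer_aitoff)
gcs_pt = pt.projectAs(wgs84)
print(gcs_pt.firstPoint.X, gcs_pt.firstPoint.Y)  # longitude, latitude in decimal degrees
```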
Posted 08-15-2013 11:41 AM

POST
Jay, Try running it in the foreground. That error is specific to background geoprocessing. I would also recommend NOT using a personal geodatabase due to its size limitations. See Types of geodatabases for additional details. Best, Eric
Posted 08-15-2013 10:43 AM

POST
Extrude the polygons in ArcScene, then run Layer 3D to Feature Class to create a multipatch feature class, which can be your Input Features in Skyline. Best, Eric
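A minimal sketch of the export step (the layer name and paths are placeholders; the layer must carry its 3D display properties, e.g. the extruded layer from your ArcScene document or a layer file saved from it):

```python
# Sketch: export an extruded polygon layer to a multipatch feature class.
import arcpy
arcpy.CheckOutExtension("3D")

# "Buildings" is the hypothetical extruded layer; the output is a multipatch.
arcpy.Layer3DToFeatureClass_3d("Buildings", r"C:\data\project.gdb\buildings_mp")
# The multipatch output can then be supplied as the input features to Skyline.
```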
Posted 08-15-2013 09:15 AM

POST
"Here's the problem: For some reason I cannot clip the paths from the 2006 NLCD or attribute the habitats encountered by the owls in their flight paths. The GIS analyst I am working with noted that the map from ArcGIS Online was in raster format, and in order to attribute the data it needs to be in vector format. I'm not sure I know what I am doing or what to do. Has anyone else had this problem?"

Re-read your post. You probably can't clip because the flight path buffer is larger than the max request size on the service. Try it first with a polygon known to be smaller than the limit. The layer is restricted to a 24,000 x 24,000 pixel request, which at NLCD's 30 m resolution is an area nearly 450 miles on a side. For the record, raster data can have attribute tables; see Raster dataset attribute tables for more information. Best, Eric
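Once you have a buffer small enough to stay under the limit, a hedged arcpy sketch of the clip step (paths are placeholders, not your service or data):

```python
# Sketch: clip a landcover raster to a single flight-path buffer polygon.
import arcpy

nlcd = r"C:\data\nlcd_2006.tif"                         # local copy of the landcover
buffer_fc = r"C:\data\project.gdb\flight_path_buffer"   # one buffer polygon
out_clip = r"C:\data\project.gdb\nlcd_clip"

# Use the buffer as the template and clip to its geometry, not just its extent.
arcpy.Clip_management(nlcd, "#", out_clip,
                      in_template_dataset=buffer_fc,
                      clipping_geometry="ClippingGeometry")
```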
Posted 08-07-2013 01:53 PM

POST
Eric, Zonal Stats is not going to give you the distribution of the values. It won't tell you the count of every value it finds in the zone - it just gives summary statistics. For your request, you should look into Zonal Geometry as Table or Zonal Histogram. I have a GPK in a thread below that converts the raw count distributions into percentages using Zonal Geometry as Table and some other tools I strung together:

Calculating the area and the percent of cells of each value in a raster (GPK)
extracting information from a Raster landuse file by a polygon shape file
ArcMap - Land use type as a percentage of total area

To use the GPK properly, clip the input raster to the zone (your buffer), since the tool has no parameter for polygon zones in addition to the value raster, despite its name. It is referring to the internal zones (values) inherent in the data; in other words, use the value field (or the landcover field). I haven't looked at the service, sorry. Also, the NLCD service you refer to has a max request size limit imposed. Assuming your buffers aren't so large as to exceed the max request size, you can clip the service to your buffer and take the result to my GPK. Or just give Zonal Histogram a go, but it won't give you percentages - just the raw distribution. Best, Eric
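If you go the Zonal Histogram route, a minimal sketch (the zone field and paths are placeholders):

```python
# Sketch: raw landcover value distribution inside each buffer with Zonal Histogram.
import arcpy
from arcpy.sa import ZonalHistogram
arcpy.CheckOutExtension("Spatial")

buffers = r"C:\data\project.gdb\owl_buffers"    # polygon zones
landcover = r"C:\data\nlcd_clip.tif"            # value raster, e.g. the clipped NLCD
out_table = r"C:\data\project.gdb\lc_histogram"

# Output table holds cell counts per landcover value for each zone;
# dividing a count by the zone's total count gives the percentage.
ZonalHistogram(buffers, "BUFF_ID", landcover, out_table)
```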
Posted 08-07-2013 01:31 PM