Hello everyone, how do you calculate the area of a raster dataset with a 0.00025-degree cell size when you don't plan to re-project it to a projection with units in meters? I thought about using Zonal Geometry As Table, but after running it I did not know what the unit of the area column was. Did Arc simply square the cell size of 0.00025, so that each pixel became 6.3*10^-8? That seemed ridiculous to me... I am confused, please help! (Notes: I am not allowed to reproject this dataset, so that it will not be distorted; and 0.00025 degrees is about 30 meters.)
There is no easy conversion because the lines of longitude converge at the poles.
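To see why there is no single degrees-to-meters factor, here is a quick back-of-the-envelope check (a spherical-Earth sketch of my own, not an arcpy workflow; the 6371 km radius and the function names are assumptions) showing how the east-west size of a 0.00025° cell shrinks with latitude while the north-south size stays put:

```python
import math

R = 6371000.0       # assumed mean Earth radius in metres (spherical model)
CELL_DEG = 0.00025  # cell size in decimal degrees

def cell_width_m(lat_deg, cell_deg=CELL_DEG, radius=R):
    """East-west width of one cell in metres at a given latitude.
    On a sphere, a degree of longitude shrinks by cos(latitude)."""
    return radius * math.radians(cell_deg) * math.cos(math.radians(lat_deg))

def cell_height_m(cell_deg=CELL_DEG, radius=R):
    """North-south height of one cell in metres; constant on a sphere."""
    return radius * math.radians(cell_deg)

for lat in (0, 30, 60):
    print(f"lat {lat:2d}: {cell_width_m(lat):5.2f} m wide x "
          f"{cell_height_m():5.2f} m tall")
```

At the equator the cell is roughly a 27.8 m square, but by 60° latitude it is only about half as wide, which is why "0.00025 degrees is about 30 meters" only holds near the equator.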
Read How the Cell Size Projection Method environment setting works—ArcGIS Pro | Documentation
and see the script example in
Cell Size Projection Method (Environment setting)—ArcGIS Pro | Documentation
where you can specify a projected coordinate system to use for the projection.
The simplest approach would be to actually project the raster using the information in the cell size projection method discussion. If you "can't" project it, this must be for a course or something (?)
Thanks for your answer, Dan. I thought this through and also did some extra reading: the raster I am dealing with only has a GCS, meaning it only has angular units (latitude and longitude) rather than the linear units we would use to calculate areas. Is that why I have to project it (give it linear units) before I can calculate area?
That is why I sent those links. Projecting the data is the best way to get planar area units, provided you consider the cell size projection environment settings. Note that decimal-degree data in raster form has an area and perimeter that vary by latitude, because the lines of longitude converge at the poles. One square degree covers a different area at the equator than it does at any other latitude! Simple solutions for the sphere shouldn't be used. Project to a projected coordinate system appropriate to your latitude and area of coverage; skipping this step will save you a couple of minutes at best.
You could just digitize the extent of the raster, or calculate/use the extent, or use Raster to Polygon (reclassify all raster values to a single value first) and then calculate the geodesic area (AREA_GEODESIC): https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/calculate-geometry-attribute...
Thank you, David. I tried this out; unfortunately the area comes out as 0... so I guess it does not work.
Which did you try? I'm very confident that any of those methods will work. If you can give more detail on what you did, and maybe a screenshot of the 0 result, I can provide some insight.
Thank you very much, David! Could the reason I got 0 be related to the output coordinate system? I set it as GCS WGS 1984...
Raster to Polygon isn't going to give you any better estimate of area than projecting the raster to a projected coordinate system.
30 meters will only equal 0.00025 degrees of longitude at one specific latitude, as @DanPatterson says. If it is a small area, however, that value will not vary by much, and there will not be much distortion when you project to a projected coordinate system.
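To put a number on how much that latitude dependence matters for area, here is a rough spherical-Earth sketch (my own illustration, not arcpy's geodesic calculation; the function name and the 6371 km radius are assumptions). On a sphere, the area of a lat/lon cell bounded by latitudes φ1, φ2 and spanning Δλ of longitude is exactly R² · Δλ · (sin φ2 − sin φ1):

```python
import math

R = 6371000.0       # assumed mean Earth radius in metres (spherical model)
CELL_DEG = 0.00025  # cell size in decimal degrees

def cell_area_m2(lat_deg, cell_deg=CELL_DEG, radius=R):
    """Area in square metres of one lat/lon cell whose southern edge
    sits at lat_deg, using the exact spherical-band formula
    A = R^2 * dlon * (sin(lat2) - sin(lat1))."""
    lat1 = math.radians(lat_deg)
    lat2 = math.radians(lat_deg + cell_deg)
    dlon = math.radians(cell_deg)
    return radius ** 2 * dlon * (math.sin(lat2) - math.sin(lat1))

print(f"equator: {cell_area_m2(0):6.1f} m^2")
print(f"lat 60 : {cell_area_m2(60):6.1f} m^2")
```

Near the equator each cell covers roughly 770 m² (about a 27.8 m square), while at 60° latitude it is only about half that. Over a small extent the variation is modest, which matches the point above about small areas.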
Why are you not allowed to project the data?