POST
I have returned to this thread looking for a way to calculate VRM. I tried your ArcScript, Mark Sappington, and it worked wonders. I was using an open source GIS that had a built-in tool for VRM, but for some reason that tool stopped working. I highly recommend the script (http://resources.arcgis.com/gallery/file/geoprocessing/details?entryID=F65FF927-1422-2418-A02A-EE72574A8C26) that Mark mentioned. Thanks for everyone's help!
Posted 06-23-2012 06:15 PM

POST
I am looking at the spatial patterns of 7 different landcover types and would like to know if they are arranged randomly or not. I was hoping I would be able to run Moran's I on my data, but have a few concerns:

- I originally had a raster map with 7 different landcover types, which I converted to polygons.
- The only values associated with each landcover type are the codes denoting each of the 7 types (1, 2, 3, ..., 7).
- Moran's I computes the mean and variance of the attribute being analysed to get the Moran's I Index value.

If I run this spatial autocorrelation tool on my data, I am concerned that I will not get an accurate output, as the files only have code values, not a measurement of anything. The values I get are:

- Index: 0.026753
- z-score: 4.32751
- p-value: 0.000015

An index value of 0 indicates randomly spaced attributes, and a positive value indicates a tendency toward clustering. Since my p-value is statistically significant, I can reject H0 (that feature values are randomly distributed across the study area), and the positive z-score then tells me that my features are more clustered than would be expected if the underlying spatial processes were random. Am I interpreting this correctly? More importantly, are these calculations valid with my data type, since I only have code values (1 to 7)?
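For intuition about why the code values matter, here is a minimal NumPy sketch of global Moran's I. The grid, codes, and rook-adjacency weights below are hypothetical, not the poster's data; the point is that the arithmetic treats the codes as interval measurements (deviations from a mean), which is exactly the concern raised above.

```python
import numpy as np

def morans_i(values, w):
    """Global Moran's I: I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2,
    where z is each value's deviation from the mean and S0 = sum of weights."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    s0 = w.sum()
    num = (w * np.outer(z, z)).sum()
    return len(z) / s0 * num / (z ** 2).sum()

def rook_weights(rows, cols):
    """Binary rook (edge-sharing) adjacency matrix for a rows x cols grid."""
    n = rows * cols
    w = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:                  # right-hand neighbour
                w[i, i + 1] = w[i + 1, i] = 1
            if r + 1 < rows:                  # neighbour below
                w[i, i + cols] = w[i + cols, i] = 1
    return w

w = rook_weights(4, 4)
clustered = [1, 1, 2, 2] * 4                  # two codes grouped into patches
dispersed = [1, 2, 1, 2, 2, 1, 2, 1] * 2      # checkerboard of the same codes
i_clustered = morans_i(clustered, w)          # positive: like codes adjoin
i_dispersed = morans_i(dispersed, w)          # negative: codes alternate
```

Note that the sign of I responds only to whether numerically similar codes are adjacent, so the result depends on the (arbitrary) numbering of the landcover classes.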
Posted 06-01-2012 03:15 PM

POST
So, for some reason the Sample tool won't work for me anymore. I have used it in the past without any problems, so I have no idea why it suddenly stopped working. Anyway, I need to extract the values from 2 different rasters so I can analyze them in SPSS. I wanted to use the Sample tool to extract the values along with their geographic coordinates, so I can later re-add them to ArcMap with new values calculated outside of ArcMap. But my Sample tool isn't working. Does anyone know how I can do this with different tools? Or has anyone had this same issue with Sample and knows how to fix it? Thank you!!
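As a stopgap, the core operation the Sample tool performs (looking up a cell value at an x, y map coordinate) can be sketched in plain NumPy, assuming a simple north-up raster described by its top-left corner and cell size. The array and coordinates below are made up for illustration:

```python
import numpy as np

def sample_raster(arr, x, y, x_min, y_max, cell):
    """Return the cell value at map coordinate (x, y) for a north-up
    raster whose top-left corner is (x_min, y_max) with square cells."""
    col = int((x - x_min) // cell)   # columns increase eastward
    row = int((y_max - y) // cell)   # rows increase southward
    return arr[row, col]

# Hypothetical 3x3 raster: top-left corner at (100, 300), 100 m cells.
raster = np.arange(9).reshape(3, 3)
value = sample_raster(raster, x=250, y=150, x_min=100, y_max=300, cell=100)
```

Running this per coordinate over both rasters would give a table of (x, y, value1, value2) rows that could be exported to SPSS and joined back later.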
Posted 03-05-2012 01:22 PM

POST
I found a better answer...batch processing! What a life saver. No need for iterations or Extract by Mask. Extract by Mask seems to work sporadically (as I found by trying to use it, and then also seeing several other people with the same problem in the forums). So this is what I did: to extract all cells from my fishnet, I used the Select Layer By Attribute tool. Arc10 freezes all the time...so I did it manually. I should have been able to make a model, set the expression and output layer as parameters, and then run it in batch processing...but Arc10 seems extremely crash-happy when I try to use a model or batch processing.
Posted 02-24-2012 12:41 PM

POST
I am having the same issue...Extract by Mask doesn't work, but it still outputs a file in Catalog. The output is usually incomplete (not the entire area within the mask) and doesn't have any projection information. I can look at it in Catalog and even add it to ArcMap, but since it is incomplete it is useless to me. Also, I can't delete the files for some reason; when I try to, Arc crashes. Did you find another way to extract from a raster using a polygon mask?
Posted 02-24-2012 11:00 AM

POST
So it does work! My next question is: how can I get the final output name to be different for each iteration automatically? I know how to set the output as a parameter, but it would be too time consuming for me to enter the output name every single time it runs...and that would defeat the purpose of an iterator.
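In ModelBuilder this is usually handled with inline variable substitution, e.g. an output name like extract_%Value%, where %Value% is the iterator's current value (the specific variable name depends on the iterator used). Outside of a model, the same idea is just string formatting over the iterator; the cell IDs and naming pattern below are hypothetical:

```python
# Hypothetical cell IDs coming from an iterator over fishnet features.
cell_ids = [101, 102, 103]

# Build one unique output name per iteration, so nothing is overwritten.
outputs = ["extract_{}.tif".format(cid) for cid in cell_ids]
```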
Posted 02-24-2012 07:09 AM

POST
Thanks, Darren. I just built the model and it looks like it is going to work. I will let it run and let you know the results when I get them!
Posted 02-24-2012 06:57 AM

POST
I guess a better question is: can I program Band Collection Statistics to run within the area of one selected fishnet cell, and then do that for all of the cells? I am not a programmer...so this is totally out of my realm...that's why I don't know if it is possible at all.
Posted 02-23-2012 12:58 PM

POST
I want to run the Band Collection Statistics tool between 2 different rasters to determine their correlation. I can run this on the entire region and get 1 value, but I would like to create several smaller areas within the region to get several r values for the entire region.

So far I have created a fishnet over the region, with cells that are 1000m x 1000m. I want to determine the correlation between the 2 rasters within each cell of my fishnet, resulting in several r values over the entire region. I don't know how to run the Band Collection Statistics tool within each cell of my fishnet layer, so I thought I would extract each cell as a new layer...my only problem is that I have ~4000 cells, so this would be very time consuming.

I have tried to create a model in ModelBuilder to select each cell and then extract the selection. I used the "iterate feature selection" iterator to select each cell, but I don't know what tool I should use to extract/export each cell selection as a new layer. I know how to export selected data (by right-clicking on the layer I want to export from), but I don't know how to get that step into my model.

I ultimately would like to run Band Collection Statistics on each cell (between the 2 rasters) and apply the r value to the cell, allowing me to display areas of high and low correlation with a stretched colour scheme. If anyone has any ideas on how I can run this a little more efficiently, please let me know.
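If both rasters are aligned on the same grid, the per-cell correlation idea can be sketched directly in NumPy without extracting ~4000 layers, by treating each fishnet cell as a block of the raster arrays. This is only a rough stand-in for the Band Collection Statistics workflow (the arrays below are synthetic, and the block size substitutes for the 1000m fishnet cells):

```python
import numpy as np

def blockwise_pearson(a, b, block):
    """Pearson r between two equally sized rasters, computed independently
    within each block x block window (one window per 'fishnet cell')."""
    out = np.full((a.shape[0] // block, a.shape[1] // block), np.nan)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            wa = a[i*block:(i+1)*block, j*block:(j+1)*block].ravel()
            wb = b[i*block:(i+1)*block, j*block:(j+1)*block].ravel()
            if wa.std() > 0 and wb.std() > 0:   # r is undefined for flat windows
                out[i, j] = np.corrcoef(wa, wb)[0, 1]
    return out

# Synthetic example: r2 is r1 plus a little noise, so every block
# should show a strong positive correlation.
rng = np.random.default_rng(0)
r1 = rng.random((8, 8))
r2 = r1 + 0.1 * rng.random((8, 8))
r_grid = blockwise_pearson(r1, r2, block=4)
```

The resulting r_grid is itself a coarse raster of correlation values, which matches the goal of displaying high and low correlation areas with a stretched colour scheme.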
Posted 02-23-2012 12:49 PM

POST
Thanks! I will give it a shot and let you know if it's what I am looking for!
Posted 01-27-2012 12:25 PM

POST
I want to combine different raster layers, but need the data to be measured on the same numerical scale. Therefore, I would like to normalize all of the different layers so I can add them together, and then renormalize the result. I am not very good with statistics in general, and am unsure of how to even normalize a data set. I know there are multiple equations to do this, but don't know which one is right for me. But if we were to use the equation:

Z = (X - u) / std

where Z is the normalized value, u is the mean, and std is the standard deviation, how would I calculate this for a raster?

I have tried using Calculate Statistics (Data Management Tools) because I assumed it would calculate the mean and standard deviation for me, but I don't know where the output goes, as I can't choose that in the window. Also, the information on what this tool actually calculates is pretty limited in the desktop help.

I have thought about using Focal Statistics to calculate the mean and standard deviation individually, but I run into the problem of what scale to use - I want the calculation to include the entire raster. This tool would give me separate outputs for the mean and standard deviation, which I could then put into the Raster Calculator to solve the normalization equation.

Can someone please let me know if Focal Statistics is the best way to go about this? Am I even using the right equation to normalize my data? Any input is GREATLY appreciated!!
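The equation above (a z-score standardization) can be sketched in a few lines of NumPy using whole-raster statistics rather than a focal window; the DEM values below are made up for illustration:

```python
import numpy as np

def zscore(raster):
    """Standardize a raster to zero mean and unit standard deviation,
    using statistics computed over the WHOLE raster (not a focal window)."""
    arr = np.asarray(raster, dtype=float)
    return (arr - arr.mean()) / arr.std()

dem = np.array([[10.0, 20.0],
                [30.0, 40.0]])   # hypothetical elevation values
z = zscore(dem)
```

After this transformation each layer has mean 0 and standard deviation 1, so layers measured on different scales contribute comparably when summed.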
Posted 01-26-2012 01:30 PM

POST
Awesome! Thanks for your help, that makes things more understandable.
Posted 07-19-2011 02:22 PM

POST
My mistake, I meant square, not square root. The difference between the version I saw and your equation was that mine lacked the squaring after applying the standard deviation. I still don't understand how squaring it will scale the TRI. What does the approximation of the TRI mean? Thank you for your help!!
Posted 07-19-2011 10:17 AM

POST
I am trying to figure out the same problem as you, Rachel. I looked into Jeffrey's post a little further because it seemed like a simple way to solve the TRI, but I have some questions about it.

The TRI is: SQRT((c - n1)^2 + (c - n2)^2 + ... + (c - n8)^2), where c is the focal centre and n1 to n8 are the eight surrounding cells in the 3x3 neighbourhood. The standard deviation of a dataset is the square root of the variance, and the variance is the expected value of the squared difference (variance(x) = E((x - u)^2)). Then wouldn't TRI = STDV(dem, rectangle, 3, 3)? The difference from Jeffrey's version is the lack of an extra square root over the whole equation, which I think is not necessary since the standard deviation already takes the square root of the sum of the squared differences within the 3x3 grid. If this is in fact the case, then the TRI can be calculated using the Focal Statistics tool with the standard deviation statistic in a 3x3 rectangle. I am not sure if I am correct, so any feedback is welcome.
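The claimed equivalence can be checked numerically for a single window. A small NumPy sketch with a hypothetical 3x3 elevation window, computing both the TRI as defined above and the focal standard deviation:

```python
import numpy as np

# Hypothetical 3x3 elevation window; the centre cell is win[1, 1].
win = np.array([[100.0, 102.0, 101.0],
                [ 99.0, 100.0, 103.0],
                [101.0, 100.0,  98.0]])

c = win[1, 1]
neighbours = np.delete(win.ravel(), 4)        # the 8 cells around the centre

# TRI: root of summed squared differences from the CENTRE cell (no 1/n).
tri = np.sqrt(((c - neighbours) ** 2).sum())

# Focal standard deviation: deviations from the window MEAN, divided by n=9.
std = win.std()
```

The two values differ: the standard deviation measures spread around the window mean and divides by the cell count, while the TRI measures spread around the centre cell and does not, so the 3x3 focal standard deviation only approximates a rescaled TRI rather than equalling it.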
Posted 07-19-2011 09:50 AM

POST
Thanks Bill! I didn't even realize that what I was doing would give the same result as the focal mean!! I will have to do the next few with the focal mean to see if it speeds things up. I have run correlations in Excel and SPSS etc., but I want the visual aspect of it, so I will try the Pearson correlation methods you have outlined to get a visual raster representation. Thank you for all of your replies, they really did help!! Tayler
Posted 03-31-2011 12:13 PM
Title | Kudos | Posted
---|---|---
 | 1 | 01-26-2012 01:30 PM

Online Status: Offline
Date Last Visited: 11-11-2020 02:23 AM