POST
Sorry for the truncated question text -- I'm a newbie to GeoNet. My concern was that I sometimes see distances of as much as 9-10 meters between vertices when streaming a feature at a walking pace, even with the streaming interval set to either 1 second or 1 meter.
06-09-2020 07:03 PM | 0 | 0 | 325

POST
I recall that Collector Classic used to defer placing vertices if there had been little movement since the preceding vertex had been recorded. For example, when stationary, but while collecting a line feature, a new vertex would not be added until GPS error had created an apparent movement. Now that the new Collector for Android allows setting either a distance or a time interval for streaming, I had expected different behavior.
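The behavior I expected from a distance-or-time streaming setting can be sketched as follows. This is a hypothetical illustration of the rule, not Collector's actual code; the function name and parameters are my own.

```python
import math

def should_add_vertex(prev_xy, curr_xy, elapsed_s,
                      dist_threshold_m=1.0, time_threshold_s=1.0):
    """Hypothetical streaming rule: place a new vertex once EITHER the
    distance moved or the time elapsed since the last vertex reaches its
    configured threshold (not Collector's actual implementation)."""
    moved = math.hypot(curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1])
    return moved >= dist_threshold_m or elapsed_s >= time_threshold_s
```

Under a rule like this, a 1-meter distance interval at walking pace should never leave 9-10 meter gaps between vertices, which is why the observed behavior surprised me.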
06-09-2020 06:59 PM | 0 | 1 | 373

POST
Hi Peter! Thanks very much for your guidance. Following it, I was able to export my data as a file geodatabase, then downloaded it and opened it in ArcMap on my desktop. There I was not only able to see the date and time together in a single field, but (also happily) was able to see the alias names of the GNSS metadata fields, rather than having to reconstruct them manually.

Since I didn’t have any important reason to be exporting as a shapefile – other than the fact that I’m more used to shapefiles than geodatabases – I haven’t yet tried your other suggestion of using the DatePart function in the Field Calculator. I will try that also since it seems a good additional arrow to have in my quiver.

I’ve copied Randy Burton on this message since he also replied to my original question. Thanks again for your prompt and very helpful message!

Sincerely,
JGWjr

John G. Whitman, Jr.
Green Forest Farm
PO Box 177 (1011 Potter Hill Rd)
Readsboro, VT 05350-0177
(802) 423-9917
02-18-2018 11:21 AM | 0 | 0 | 325

POST
I've used the Field Notes template to add GNSS metadata fields to a webmap. Some of those fields include both a date and a time. When opened in ArcMap on my desktop and then exported as a shapefile, those fields show only the date and omit the time. I've been advised by ESRI Support that this is probably because shapefiles support Date as a field type but don't allow the time to be included. (I'm told that the Tracking Analyst extension might allow dates and times to be concatenated in a field, but I don't have that extension.)

It would be advantageous to me if I could retain the time as well as the date when I'm working in ArcMap. So far, my workaround (practical for only a small number of records) is to write down the time values as I see them in the webmap, then manually populate a string field named "Time" with those values. It would seem desirable for Collector to put the date and time information for GNSS metadata into separate fields so that this cumbersome workaround would not be required.
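The manual string-field workaround could be automated wherever the data still carry a full datetime. Here is a minimal sketch in plain Python (the function name and formats are my own; in ArcMap the same split could presumably be done in the Field Calculator):

```python
from datetime import datetime

def split_datetime(dt):
    """Split a full datetime into shapefile-safe pieces: a date string
    for the Date field and a time string for a text 'Time' field."""
    return dt.strftime("%Y-%m-%d"), dt.strftime("%H:%M:%S")

# Example: a hypothetical GNSS fix timestamp
date_str, time_str = split_datetime(datetime(2018, 2, 16, 13, 17, 5))
print(date_str, time_str)  # 2018-02-16 13:17:05
```

Carrying the time in a text field this way survives export to shapefile, since only the Date-typed field loses its time component.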
02-16-2018 01:17 PM | 0 | 2 | 497

POST
Thanks to Eric Rice for his response to my initial post, noting that the tool is operating as designed, but acknowledging the desirability of an enhancement that would compute a weighted mean correctly, and for submitting a request for such an enhancement.

My use of the Focal Statistics tool was an attempt to circumvent limitations of the Spatial Analyst low pass filter tool. That tool specifies a 3x3 kernel with uniform weighting, but I needed to do weighted filtering. In a response to an earlier post that I had made about that low pass filter tool, ESRI had suggested that I might overcome its limitations by using the Focal Statistics tool with a weighted neighborhood to compute the mean statistic. As that tool now operates, this is not possible.

In Eric's response to my post, he states, "after all, we just need to sum the weights and use the sum in the denominator". This statement is correct, and it is sufficient in the case where the input array includes no NoData cells. In the general case, NoData cells may be present, and the design of an enhancement and its documentation must treat these appropriately. For a weighted mean calculation on an input array containing NoData elements, the divisor must be the sum of the weights associated with the valid data cells. When the (default) "Ignore NoData" option is selected, the NoData cells must be ignored both in the numerator computation of the weighted sum and in the denominator summation of weights. The way that the current Spatial Analyst low pass filter tool handles NoData cells (both internal to the input array and beyond its extent) is an appropriate model for the weighted mean calculation of an enhanced Focal Statistics tool as well. It needs only to be generalized to allow different kernel sizes/shapes and non-uniform weights.

Caution: my original post in this thread included an example in which there were no NoData cells in the input array. When there are no NoData cells, the workaround that I described in that post allows the current version of the Focal Statistics tool to correctly compute a weighted mean in the central portion of the array, but my workaround does not handle NoData cells properly. JGWjr
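The rule I'm describing -- drop NoData cells from both the numerator and the denominator -- can be written out explicitly for a single neighborhood. This is a sketch with NumPy, using NaN to stand in for NoData; it is my statement of the desired behavior, not Esri's code.

```python
import numpy as np

def weighted_mean_ignore_nodata(window, kernel):
    """Weighted mean over one neighborhood, ignoring NoData (NaN) cells.

    NoData cells are excluded from the weighted sum (numerator) AND
    their weights are excluded from the divisor (denominator)."""
    window = np.asarray(window, dtype=float)
    kernel = np.asarray(kernel, dtype=float)
    valid = ~np.isnan(window)
    weight_total = kernel[valid].sum()
    if weight_total == 0:
        return float("nan")  # no valid cells in the neighborhood
    return (window[valid] * kernel[valid]).sum() / weight_total

# All-valid neighborhood: the ordinary weighted mean
print(weighted_mean_ignore_nodata([[1, 2], [3, 4]], [[1, 1], [1, 1]]))  # 2.5

# One NoData cell: its weight (4) drops out of the denominator too,
# giving (1*1 + 2*2 + 3*3) / (1 + 2 + 3)
print(weighted_mean_ignore_nodata([[1, 2], [3, np.nan]], [[1, 2], [3, 4]]))
```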
09-08-2011 04:56 PM | 0 | 1 | 391

POST
When the Spatial Analyst Focal Statistics tool is used with the Weight neighborhood type and an arbitrary weight kernel file to compute a mean, the result computed for any processing cell is NOT the expected "weighted mean" of the values of the cells within its neighborhood. That this is the case may be seen from the following example, which uses an input processing raster filled with constant values and the kernel file shown below.

Input processing raster (10 x 10, every cell = 1000):

1000 1000 1000 1000 1000 1000 1000 1000 1000 1000
(all 10 rows identical)

Kernel file:

5 5
1 2 3 2 1
2 4 6 4 2
3 6 9 6 3
2 4 6 4 2
1 2 3 2 1

Output raster (NoData cells ignored):

4000 4000 3600 3600 3600 3600 3600 3600 4000 4000
4000 4000 3600 3600 3600 3600 3600 3600 4000 4000
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
3600 3600 3240 3240 3240 3240 3240 3240 3600 3600
4000 4000 3600 3600 3600 3600 3600 3600 4000 4000
4000 4000 3600 3600 3600 3600 3600 3600 4000 4000

The value of 3240 seen for the cells in the central part of the output raster is 81000 / 25. (The other values near the array edges are different because the processing has encountered NoData cells; no further discussion will be given here of any cells in the output array near the edges.)

The value 81000 for the cells in the central part of the output raster is the weighted sum of the products of the input raster cell values and the kernel weights (as it should be). This can be confirmed by computing the Sum statistic, instead of the Mean statistic, using the Focal Statistics tool on the same input raster and with the same kernel. The problem, then, is the tool's division of the weighted sum by 25 (the number of cells in the kernel neighborhood) rather than by 81 (the sum of the weights used in that kernel neighborhood). Conventionally, and intuitively, the divisor by which a weighted mean should be created from a weighted sum is the sum of the weights. That is not what this tool does.

A workaround for this error is easily contrived. Since the weighted Sum statistic is computed correctly for a non-uniform kernel, it can be used to compute the weighted mean for any kernel by proportionally reducing each of the kernel values so that the sum of all weights in the kernel is unity. For my example above, since the kernel weights add to 81, each should be divided by 81 to form a new kernel file. With that change, use of the Focal Statistics tool to compute a weighted Sum statistic does produce values of 1000 throughout the central portion of the output raster.
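The arithmetic and the workaround can be checked numerically for one interior cell. This NumPy sketch reproduces my hand calculation under the assumptions above; it is not the tool itself.

```python
import numpy as np

# The 5x5 kernel from the post; its weights sum to 81
kernel = np.array([[1, 2, 3, 2, 1],
                   [2, 4, 6, 4, 2],
                   [3, 6, 9, 6, 3],
                   [2, 4, 6, 4, 2],
                   [1, 2, 3, 2, 1]], dtype=float)

window = np.full((5, 5), 1000.0)  # an interior neighborhood, all cells 1000

weighted_sum = (window * kernel).sum()   # 81000.0

# What the tool reportedly does: divide by the cell count (25)
tool_mean = weighted_sum / kernel.size   # 3240.0 -- the erroneous value

# Conventional weighted mean: divide by the sum of the weights (81)
true_mean = weighted_sum / kernel.sum()  # 1000.0

# Workaround: normalize the kernel to unit sum, then the SUM statistic
# itself yields the weighted mean
workaround = (window * (kernel / kernel.sum())).sum()  # 1000.0
print(tool_mean, true_mean, workaround)
```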
09-06-2011 10:08 AM | 1 | 4 | 2817

POST
I'm having difficulty obtaining the expected result with the ArcGIS 10 Spatial Analyst Low Pass Filter tool. Specifically, I have been unable to reproduce the results shown in Example 2 of "Learn more about how the Filter tool works" on the Help page.

I first created a point shapefile containing data matching Example 2 (including NoData values) and then used the Point to Raster tool to create from these points a 5 x 5 raster for experimentation. The input array cell values are shown below.

Input raster values (Example 2 and my raster):

 2.000  3.000  4.000  5.000  6.000
 2.000  3.000  4.000 NoData  6.000
 2.000  3.000  4.000  5.000  6.000
 2.000 30.000  4.000  5.000 NoData
 1.000  2.000  2.000  3.000 NoData

I then ran the Low Pass Filter tool on that input raster, using both the "Ignore NoData" and "Don't Ignore NoData" switch options. Here are the results that I obtained.

My results (with Ignore NoData setting):

2.500 3.000 3.800 5.000 5.667
2.500 3.000 3.875 5.000 5.600
7.000 6.000 7.250 4.857 5.500
6.667 5.556 6.444 4.143 4.750
8.750 6.833 7.667 3.500 4.000

My results (with Don't Ignore NoData setting):

NoData NoData NoData NoData NoData
NoData  3.000 NoData NoData NoData
NoData  6.000 NoData NoData NoData
NoData  5.556  6.444 NoData NoData
NoData NoData NoData NoData NoData

Help page Example 2 output values:

2.333 3.000 3.889 5.000  5.778
2.333 3.000 3.889 NoData 5.778
5.333 6.000 6.889 4.889  5.778
5.000 5.556 6.444 4.333  NoData
4.667 5.111 5.889 3.111  NoData

No discrepancies exist between my results and the Example 2 output values for those processing cells interior to the array that encounter no NoData values in their 3x3 neighborhood. Near the edges, however, and where NoData values are encountered, the Help page explains that the processing algorithm operates as follows: "When an input raster cell on the edge of the filter has a NoData value, the z-value of the cell is substituted for the missing z-values." "On the edges of the raster, the filter lies partially outside the raster. When this occurs, the z-value of the cell at the center of the filter is substituted for the missing z-values."

I note that the 3x3 kernel of the LOW filter is equally weighted, with each weight = 1/9. Also, all cell values in the input array are integers. It should be expected, then, that multiplying the values of cells in the output array by 9 should produce integer values. This is indeed the case for the Example 2 output values (e.g., 5.778 x 9 = 52; 6.889 x 9 = 62, etc.). It is NOT the case, however, for some of the values in my results -- the ones where the discrepancies noted above have been observed. For example, 3.875 x 9 = 34.875; 4.857 x 9 = 43.713, etc. I do not understand how this situation could occur.

The Tool Help for the Filter tool's "Ignore NoData in calculations" switch describes its operation as follows: "Checked -- if a NoData value exists within the filter, the NoData value will be ignored. Only cells within the filter that have data values will be used in determining the output." "Unchecked -- if a NoData value exists within the filter, the output for the processing cell will be NoData."

The Help page doesn't explicitly state whether the Example 2 output values tabulated above were produced with the "Ignore NoData in calculations" switch checked or unchecked. By my reading of the above quotes it appears to have been checked, since data values are produced in that example output for many of the cells for which a NoData cell lay within the processing filter. My reading of the two quotes, however, doesn't lead me to understand why so many of my output cells in the "Don't Ignore NoData" case were set to NoData.

I'd very much appreciate help in understanding whether I have somehow misunderstood the processing algorithm for the low pass filter, most specifically the handling of NoData cells both in the interior of the input array and at its edges. Additionally, since I find in my work that the uniform weighting of the 3x3 low pass filter kernel provided in Spatial Analyst is more restrictive than would be desirable, I'd appreciate any advice as to whether and how I might be able to create a low pass filter with altered kernel cell values so as to accomplish weighted low pass filtering with either a 3x3 or 5x5 kernel. Thanks in advance for your help!
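To test a reading of the two Help-page quotes, one can implement them directly and compare against the tabulated outputs. This is only a sketch of my interpretation of the documented edge/NoData substitution (NaN stands in for NoData), not the tool's actual code.

```python
import numpy as np

def low_pass_as_documented(arr):
    """3x3 mean filter per one reading of the Help quotes: wherever a
    neighbor is NoData (NaN) or lies outside the raster, substitute the
    center cell's z-value, then average all nine positions."""
    arr = np.asarray(arr, dtype=float)
    rows, cols = arr.shape
    out = np.full((rows, cols), np.nan)
    for r in range(rows):
        for c in range(cols):
            center = arr[r, c]
            if np.isnan(center):
                continue  # no z-value available to substitute
            total = 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    inside = 0 <= rr < rows and 0 <= cc < cols
                    z = arr[rr, cc] if inside else np.nan
                    total += center if np.isnan(z) else z
            out[r, c] = total / 9.0
    return out
```

Running this over the Example 2 input and comparing cell by cell against both result tables above would show which substitution rule, if either, each set of outputs actually follows.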
06-11-2011 08:42 PM | 0 | 1 | 1477
Title | Kudos | Posted
---|---|---
| 1 | 09-06-2011 10:08 AM

Online Status: Offline
Date Last Visited: 11-11-2020 02:24 AM