How is accuracy computed and displayed in QuickCapture?

10-10-2019 09:23 PM
MapDivision
New Contributor

The set value of desired accuracy depends on the kind of project you have. How is the current accuracy displayed by QuickCapture computed? Can we force field staff to capture a location only when the desired accuracy is met, through the application design itself?

1 Solution

Accepted Solutions
IsmaelChivite
Esri Frequent Contributor

Hi. Great question.

The author of a QuickCapture project can define two horizontal accuracy thresholds:

  • horizontalAccuracyWarning: If the threshold is not met, a warning at the bottom of the app tells the user, but the user can continue capturing data. By default this value is 30, but you can change it or set it to null (no warning).
  • horizontalAccuracyError: If the threshold is not met, users cannot capture new points. For any active line or polygon capture, vertices that do not comply with the threshold are not added. By default this value is null (no error threshold), but you can change it.

Both thresholds are expressed in meters and can be changed through the JSON editor. You can use decimal numbers as long as the decimal separator is a period (.). At the moment you can also change the warning threshold visually from QuickCapture designer under Settings > General. We will be adding the error threshold to the designer shortly.
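As an illustration only, the two properties named above might appear in the project JSON like this (the values and surrounding structure are hypothetical; check your own project in the JSON editor for the exact nesting):

```json
{
  "horizontalAccuracyWarning": 5.0,
  "horizontalAccuracyError": 10.0
}
```

Both values are in meters; setting either to null disables that threshold, as described above.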

The horizontal accuracy reported by QuickCapture is derived from the location sensor you are using. On top of the actual value (say, 6.2 meters), it is important to understand the confidence level for that value. That is, with what level of confidence will the true location fall within a radius equal to the reported horizontal accuracy (say, within 6.2 meters of the point)?

 

The Google developer documentation defines horizontal accuracy as the radius of 68% confidence. That is, for Android devices there is a 68% chance that the true location is within a radius equal to the reported horizontal accuracy. Apple has not documented how the horizontal accuracy values it reports are defined.

 

When using external high-accuracy GNSS receivers with QuickCapture, the confidence interval can be defined more precisely if the receiver is able to report what is known as root mean square (RMS) accuracy. For RMS accuracy, the default confidence level (CL) is 68%. Some organizations require reporting accuracy at a 95% CL. In that case, the United States Federal Geographic Data Committee (FGDC), through its National Standard for Spatial Data Accuracy, establishes that the following factors should be applied:

 

  • 1.7308 for horizontal accuracy.
  • 1.9600 for vertical accuracy.

 

The conversion between 68% CL and 95% CL using the factors above should only be applied, again, if the GNSS receiver reports accuracy using RMS.
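The conversion above is a simple multiplication. A minimal sketch (the helper name `rms68_to_95` is ours, not part of QuickCapture; the factors are the FGDC values quoted above):

```python
# FGDC NSSDA factors for scaling RMS accuracy from 68% CL to 95% CL.
# Only valid when the GNSS receiver reports accuracy as RMS.
FGDC_HORIZONTAL_FACTOR = 1.7308
FGDC_VERTICAL_FACTOR = 1.9600

def rms68_to_95(accuracy_m: float, vertical: bool = False) -> float:
    """Scale an RMS (68% CL) accuracy value, in meters, to 95% CL."""
    factor = FGDC_VERTICAL_FACTOR if vertical else FGDC_HORIZONTAL_FACTOR
    return accuracy_m * factor

# A receiver reporting 6.2 m horizontal RMS accuracy (68% CL)
# corresponds to roughly 10.73 m at 95% confidence:
# rms68_to_95(6.2)
```

So a field worker seeing 6.2 m on screen is, under a 95% reporting requirement, working at roughly 10.7 m.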
