I am building an iOS app with the ArcGIS Runtime SDK (UIKit) that shows property parcel boundaries in AR using WorldScaleSceneView.
Problem: when the basemap is visible, the graphic exactly covers the property boundary on the basemap. But when I reduce the basemap opacity to 0, the graphic appears smaller than the actual real-world property.
Sample property data (ring coordinates are [longitude, latitude], WGS84):
rings: [ [78.275931, 17.442276], [78.275931, 17.441165], [78.277095, 17.441165], [78.277095, 17.442276], [78.275931, 17.442276]]
My Current Setup:
// Scene setup
let scene = AGSScene(basemapStyle: .arcGISImageryStandard)
scene.addElevationSource()   // my helper that attaches an elevation source to the scene
scene.baseSurface?.opacity = 0
arView.locationDataSource = AGSCLLocationDataSource()
arView.translationFactor = 1

// Graphics overlay setup
boundariesGraphicsOverlay.sceneProperties =
    AGSLayerSceneProperties(surfacePlacement: .drapedFlat)

// Boundary symbol
let boundarySymbol = AGSSimpleLineSymbol(
    style: .solid,
    color: UIColor(red: 1.0, green: 1.0, blue: 0.0, alpha: 1.0),  // yellow
    width: 5
)
What I Have Already Tried:
Questions:
One thing I'd like to confirm: when you set the basemap to be translucent, say opacity = 0.5, do the basemap and the graphics still align well with the real world?
Generally speaking, world-scale AR only works well at close range, because the vanishing point of the scene camera and that of the AR camera may not coincide. So if the graphics are fairly far from the device (say more than 50 m away), I can imagine them appearing smaller than the real-world field patches.
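To put the sample parcel in that context: from the ring coordinates above, each edge works out to roughly 120 m, so even standing at the parcel's center the corners are well beyond that ~50 m range. A quick sketch in plain Swift (equirectangular approximation, which is fine at this scale; `metersBetween` is a hypothetical helper for illustration, not part of the SDK):

```swift
import Foundation

// The sample parcel ring, as (longitude, latitude) pairs in WGS84.
let ring = [
    (78.275931, 17.442276), (78.275931, 17.441165),
    (78.277095, 17.441165), (78.277095, 17.442276),
    (78.275931, 17.442276)
]

/// Approximate meters between two (lon, lat) points using an
/// equirectangular projection around the mean latitude.
func metersBetween(_ a: (Double, Double), _ b: (Double, Double)) -> Double {
    let metersPerDegree = 111_320.0                 // one degree at the equator
    let meanLatRad = (a.1 + b.1) / 2 * .pi / 180
    let dx = (b.0 - a.0) * metersPerDegree * cos(meanLatRad)
    let dy = (b.1 - a.1) * metersPerDegree
    return (dx * dx + dy * dy).squareRoot()
}

let northSouth   = metersBetween(ring[0], ring[1])      // western edge
let eastWest     = metersBetween(ring[1], ring[2])      // southern edge
let halfDiagonal = metersBetween(ring[0], ring[2]) / 2  // corner to center

print(String(format: "edges ≈ %.0f m × %.0f m, corner-to-center ≈ %.0f m",
             eastWest, northSouth, halfDiagonal))
```

So the nearest corner is on the order of 87 m from the parcel's center, which would be consistent with the distance effect described above.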
I'll also check with our 3D folks to get a better understanding of this before I respond with more details…