Hi all,
Hoping to get some advice on something I want to undertake in the next few days. I am trying to build a fully offline map application on iOS using the SDK. I have already completed the first of two major features: rendering a map with the relevant feature layer geometry. Now I want to tackle the more complex feature.
I want to view all the feature layer geometry and/or scene layer meshes embedded inside my .mspk file in world-scale AR, so for example viewing a building mesh at its actual location in AR. Is this even possible? If so, does anyone know of any tutorials or samples I can use for reference? I have never worked with AR before, and the documentation and samples online never seem to mention offline rendering in AR. Any help is appreciated. Thanks.
Solved! Go to Solution.
Thanks for asking this question, Dominic!
The use case you described is a world-scale AR concept. Please see this doc for the AR workflows we support.
We support AR via the toolkit, a set of prebuilt components that facilitate development. The toolkit includes the WorldScaleSceneView for world-scale AR workflows, which lets you insert an ArcGIS scene view into the AR experience. Please see the following docs for it:
One thing to point out: when you use a mobile scene package in a world-scale AR experience, the 3D scene is "overlaid" on top of the AR view, rather than being a real 3D "object" immersed in the AR view. That leads to parallax effects when your device moves in the real world - it seems that the scene view is shifting along the path you are walking. Depending on your use case, this may or may not be a problem.
A side note: the mobile scene package also works in the tabletop AR experience, which lets you anchor the mobile scene package on a table top so you can examine it in AR. We have a code sample for it, and you can also download the app from the App Store to see it live: https://apps.apple.com/us/app/arcgis-maps-swift-samples/id1630449018
Please share any comments regarding the AR toolkit components! They were only released recently, and we would really like to learn how people feel about them and address feedback and complaints to make them more developer-friendly. 🙂
Thanks so much for the reply, @Ting! I'll be sure to share my progress as I go. I've been using the sample you linked as a reference thus far. However, one thing I've noticed is that despite having the toolkit installed, I'm not able to access many classes that belong to the runtime SDK, like AGSScene (https://developers.arcgis.com/ios/api-reference/interface_a_g_s_scene.html) for example. Xcode reports these as "out of scope". I've found this very strange, since I believe the toolkit comes bundled with the vanilla runtime SDK that these classes belong to, and I am of course importing both 'ArcGIS' and 'ArcGISToolkit' packages. Not sure if you might be able to shed some light on the matter?
Ah, are you using the 100.x Runtime SDK (UIKit)? It has its own companion toolkit here: https://github.com/Esri/arcgis-runtime-toolkit-ios
The links I've attached are for the latest 200.x versions. The 200.x Maps SDK moved to SwiftUI with many enhancements. If possible, we encourage people to move up to the new SDK to enjoy the better performance and new features.
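For anyone migrating, the 200.x SDK and its toolkit are distributed via Swift Package Manager. A sketch of the dependency declarations in a `Package.swift` (the repository URLs are the public Esri GitHub repos; the version shown is illustrative - pin whichever 200.x release you target, and in an Xcode app project you would add the same URLs via File > Add Package Dependencies instead):

```swift
// Package.swift (fragment) - dependency declarations only.
dependencies: [
    // ArcGIS Maps SDK for Swift (the 200.x successor to the 100.x Runtime SDK).
    .package(url: "https://github.com/Esri/arcgis-maps-sdk-swift", from: "200.0.0"),
    // Companion toolkit with the AR components (WorldScaleSceneView, TableTopSceneView).
    .package(url: "https://github.com/Esri/arcgis-maps-sdk-swift-toolkit", from: "200.0.0"),
]
```

Importing both `ArcGIS` and `ArcGISToolkit` then resolves the 200.x types (note the 200.x API drops the `AGS` prefix, e.g. `Scene` rather than `AGSScene`).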
The old toolkit also has AR components, although they don't include the latest ARKit advancements in raycasting, plane detection, and geo-tracking. You can still try it out and follow the tutorials here: https://developers.arcgis.com/ios/scenes-3d/display-scenes-in-augmented-reality/
Hi @Ting, I've been busy with other things for a little while. I think I got a little turned around between the 100.x and 200.x versions in the beginning. I was seeing documentation for the 100.x SDK and believing it applied to 200.x, and vice versa. That's my bad; I now know better. I have spent some time with it yesterday and today and made a little progress, which I will share here for others who are interested. Thus far, I've been able to adapt the sample you linked to create a scene in AR and load my .mspk file. However, despite my logs suggesting the .mspk loaded successfully, I'm not able to see it in AR as of yet - just the default white grid. I will post my Swift code below for reference. If you by chance spot anything glaringly wrong with my approach, I will happily accept the advice. Thanks!
import Foundation
import UIKit
import ArcGIS
import ArcGISToolkit
import ARKit
import SwiftUI
import Combine
import CoreLocation
// SwiftUI View
struct AugmentRealityToCollectDataView: View {
@StateObject private var model = Model()
@State private var statusText = "Tap to create a feature"
@State private var error: Error?
var body: some View {
VStack(spacing: 0) {
WorldScaleSceneView { _ in
// Render the loaded scene, or an empty placeholder until loading finishes.
// (Side-effecting print calls don't belong in a view-building closure.)
SceneView(scene: model.scene ?? ArcGIS.Scene(), graphicsOverlays: [model.graphicsOverlay])
}
.calibrationButtonAlignment(.bottomLeading)
.onSingleTapGesture { screenPoint, scenePoint in
if let scenePoint = scenePoint {
model.addGraphic(at: scenePoint)
statusText = "Feature added at: \(scenePoint)"
}
}
.task {
do {
// Avoid force-unwrapping the bundle URL; surface a readable error instead.
guard let mspkURL = Bundle.main.url(forResource: "East-London-Map", withExtension: "mspk") else {
throw CocoaError(.fileNoSuchFile)
}
try await model.loadMSPK(fileURL: mspkURL)
try await model.startLocationDisplay()
} catch {
self.error = error
print("Failed to load MSPK or start location display: \(error.localizedDescription)")
}
}
.overlay(alignment: .top) {
Text(statusText)
.multilineTextAlignment(.center)
.frame(maxWidth: .infinity, alignment: .center)
.padding(8)
.background(.regularMaterial, ignoresSafeAreaEdges: .horizontal)
}
// Use a real binding so tapping OK clears the error and dismisses the alert.
.alert(isPresented: Binding(
get: { error != nil },
set: { if !$0 { error = nil } }
)) {
Alert(title: Text("Error"), message: Text(error?.localizedDescription ?? "Unknown error"), dismissButton: .default(Text("OK")))
}
}
}
}
// View Model
class Model: ObservableObject {
@Published var scene: ArcGIS.Scene?
@Published var graphicsOverlay = GraphicsOverlay()
private let locationDisplay = LocationDisplay(dataSource: SystemLocationDataSource())
func startLocationDisplay() async throws {
do {
// Request location permission if not determined
let locationManager = CLLocationManager()
if locationManager.authorizationStatus == .notDetermined {
locationManager.requestWhenInUseAuthorization()
}
// Start the location display
try await locationDisplay.dataSource.start()
locationDisplay.autoPanMode = .recenter
if let location = locationDisplay.mapLocation {
let camera = Camera(latitude: location.y, longitude: location.x, altitude: 10, heading: 0, pitch: 0, roll: 0)
scene?.initialViewpoint = Viewpoint(center: camera.location, scale: 1000)
print("LocationDisplay started successfully at location: \(location.y), \(location.x)")
}
} catch {
print("Error starting location display: \(error.localizedDescription)")
throw error
}
}
func loadMSPK(fileURL: URL) async throws {
let mobileScenePackage = MobileScenePackage(fileURL: fileURL)
try await mobileScenePackage.load()
if let scene = mobileScenePackage.scenes.first {
// Publish on the main actor, since `scene` drives the UI.
await MainActor.run { [weak self] in
self?.scene = scene
print("MSPK loaded successfully; initial viewpoint: \(String(describing: scene.initialViewpoint))")
self?.printSceneLayers(scene: scene)
}
} else {
print("No scenes found in MSPK.")
}
}
func printSceneLayers(scene: ArcGIS.Scene) {
print("Scene contains \(scene.operationalLayers.count) operational layers:")
for layer in scene.operationalLayers {
print("Layer: \(layer.name), Visible: \(layer.isVisible), Max Scale: \(layer.maxScale), Min Scale: \(layer.minScale)")
}
}
func addGraphic(at point: Point) {
let symbol = SimpleMarkerSymbol(style: .circle, color: .red, size: 20)
let graphic = Graphic(geometry: point, symbol: symbol)
graphicsOverlay.addGraphic(graphic)
print("Graphic added at: \(point)")
}
}
// React Native Bridge
@objc(ARNativeViewManager)
class ARNativeViewManager: RCTViewManager {
override func view() -> UIView! {
return ARNativeView()
}
override static func requiresMainQueueSetup() -> Bool {
return true
}
}
class ARNativeView: UIView {
var hostingController: UIHostingController<AugmentRealityToCollectDataView>!
override init(frame: CGRect) {
super.init(frame: frame)
setupARView()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
setupARView()
}
func setupARView() {
let arView = AugmentRealityToCollectDataView()
hostingController = UIHostingController(rootView: arView)
addSubview(hostingController.view)
hostingController.view.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
hostingController.view.topAnchor.constraint(equalTo: self.topAnchor),
hostingController.view.bottomAnchor.constraint(equalTo: self.bottomAnchor),
hostingController.view.leadingAnchor.constraint(equalTo: self.leadingAnchor),
hostingController.view.trailingAnchor.constraint(equalTo: self.trailingAnchor)
])
print("AR view setup completed.")
}
}
OK, it took me too long to realise that my .mspk file was incorrectly formatted (it was categorised as a local web scene, according to the error message). I followed this tutorial on how to create an offline scene, which helped me address the issue: https://pro.arcgis.com/en/pro-app/latest/help/sharing/overview/create-an-offline-scene.htm. Afterwards, with a bit of fiddling, I was able to see the scene content! I also added a switch so that I could view it in tabletop mode as well. Good progress! I will share my code in case anyone else is interested. It still has issues: the scene content, despite being positioned correctly relative to itself, isn't quite in its correct geographical position. I will have to fiddle further to see what I can do about this. I also suspect this might be related to the parallax issue mentioned by @Ting, so I'm not sure if there is a workaround. All advice is welcome. Thanks!
Code below:
import Foundation
import UIKit
import ArcGIS
import ArcGISToolkit
import ARKit
import SwiftUI
import Combine
import CoreLocation
// SwiftUI View
struct AugmentRealityToCollectDataView: View {
@State private var scene: ArcGIS.Scene?
@State private var error: Error?
@State private var viewMode: ViewMode = .tabletop // Initial view mode
@State private var anchorPoint: Point? = nil
@State private var clippingDistance = 500.0
@State private var translationFactor = {
// The width of the scene, which is about 800 m.
let geographicContentWidth = 800.0
// The physical width of the surface the scene will be placed on in meters.
let tableContainerWidth = 1.0
return geographicContentWidth / tableContainerWidth
}()
enum ViewMode {
case tabletop
case worldScale
}
var body: some View {
VStack(spacing: 0) {
if let scene = scene, let anchorPoint = anchorPoint {
if viewMode == .tabletop {
TableTopSceneView(
anchorPoint: anchorPoint,
translationFactor: translationFactor,
clippingDistance: clippingDistance
) { _ in
SceneView(scene: scene)
}
.edgesIgnoringSafeArea(.all)
} else {
WorldScaleSceneView { _ in
SceneView(scene: scene)
}
.edgesIgnoringSafeArea(.all)
}
} else {
ProgressView("Loading scene...")
.task {
await loadScene()
}
.task {
await startLocationUpdates()
}
}
}
.overlay(alignment: .top) {
VStack {
Text(scene != nil ? "Scene Loaded" : "Loading scene…")
.multilineTextAlignment(.center)
.frame(maxWidth: .infinity, alignment: .center)
.padding(8)
.background(.regularMaterial, ignoresSafeAreaEdges: .horizontal)
if scene != nil {
Button(action: {
switchViewMode()
}) {
Text("Switch to \(viewMode == .tabletop ? "World Scale" : "Tabletop") View")
.padding()
.background(Color.blue)
.foregroundColor(.white)
.cornerRadius(8)
}
}
}
}
// Use a real binding so tapping OK clears the error and dismisses the alert.
.alert(isPresented: Binding(
get: { error != nil },
set: { if !$0 { error = nil } }
)) {
Alert(title: Text("Error"), message: Text(error?.localizedDescription ?? "Unknown error"), dismissButton: .default(Text("OK")))
}
}
func switchViewMode() {
viewMode = (viewMode == .tabletop) ? .worldScale : .tabletop
}
func loadScene() async {
guard let fileURL = Bundle.main.url(forResource: "3D-East-London", withExtension: "mspk") else {
print("MSPK file not found")
return
}
let mspk = MobileScenePackage(fileURL: fileURL)
do {
try await mspk.load()
print("MSPK loaded successfully.")
if let loadedScene = mspk.scenes.first {
print("Attempting to load scene: \(loadedScene)")
// Ensure the scene is fully loaded
try await loadedScene.load()
// Load all operational layers
for layer in loadedScene.operationalLayers {
try await layer.load()
print("Layer name: \(layer.name)")
if layer is FeatureLayer {
print("Layer is a FeatureLayer")
} else if layer is ArcGISSceneLayer {
print("Layer is a SceneLayer")
} else {
print("Unknown layer type")
}
}
self.scene = loadedScene
print("Scene loaded successfully.")
} else {
print("No loadable scenes found in MSPK.")
}
} catch {
print("Error loading MSPK or scene: \(error)")
// This runs from `.task`, which is on the main actor, so assign directly.
self.error = error
}
}
func startLocationUpdates() async {
let locationManager = CLLocationManager()
locationManager.requestWhenInUseAuthorization()
let locationDataSource = SystemLocationDataSource()
do {
try await locationDataSource.start()
} catch {
print("Error starting location updates: \(error)")
return
}
// Wait for the first fix; no Task wrapper is needed inside an async function.
guard let initialLocation = await locationDataSource.locations.first(where: { _ in true }) else {
print("Failed to get initial location")
return
}
// Use the reported position directly so its spatial reference is preserved
// (rebuilding a Point from raw x/y/z would drop the spatial reference).
anchorPoint = initialLocation.position
print("Current location: \(String(describing: anchorPoint))")
}
}
// Bridge to React Native
@objc(ARNativeViewManager)
class ARNativeViewManager: RCTViewManager {
override func view() -> UIView! {
return ARNativeView()
}
override static func requiresMainQueueSetup() -> Bool {
return true
}
}
class ARNativeView: UIView {
var hostingController: UIHostingController<AugmentRealityToCollectDataView>!
override init(frame: CGRect) {
super.init(frame: frame)
setupARView()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
setupARView()
}
func setupARView() {
let arView = AugmentRealityToCollectDataView()
hostingController = UIHostingController(rootView: arView)
addSubview(hostingController.view)
hostingController.view.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
hostingController.view.topAnchor.constraint(equalTo: self.topAnchor),
hostingController.view.bottomAnchor.constraint(equalTo: self.bottomAnchor),
hostingController.view.leadingAnchor.constraint(equalTo: self.leadingAnchor),
hostingController.view.trailingAnchor.constraint(equalTo: self.trailingAnchor)
])
print("AR view setup completed.")
}
}
Glad to hear that you made some progress! 🎉
The WorldScaleSceneView is kind of tough - it needs great GPS reception in an open area to work, and good cellular reception is a plus. There is a modifier, onCameraTrackingStateChanged, that reports the ARCamera's tracking state. The view performs best once the state reaches normal.
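To make that state visible to the user, one option is to map the tracking state to a status string and surface it from the onCameraTrackingStateChanged modifier mentioned above. A minimal, SDK-free sketch of the mapping (the enum below is a simplified stand-in for ARKit's `ARCamera.TrackingState`, so the logic can be shown without the AR dependencies):

```swift
import Foundation

// Simplified stand-in for ARKit's ARCamera.TrackingState; in the app you
// would switch over the real value delivered by onCameraTrackingStateChanged.
enum CameraTrackingState: Equatable {
    case notAvailable
    case limited(reason: String) // e.g. "excessiveMotion", "insufficientFeatures"
    case normal
}

// Map the state to a message suitable for the status overlay in the code above.
func statusMessage(for state: CameraTrackingState) -> String {
    switch state {
    case .notAvailable:
        return "AR tracking unavailable - try an open area with good GPS"
    case .limited(let reason):
        return "AR tracking limited (\(reason)) - placement may drift"
    case .normal:
        return "AR tracking normal - placement is most reliable now"
    }
}

print(statusMessage(for: .normal))
```

Feeding this into the existing status `Text` overlay gives the user a hint about when the scene placement can be trusted.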
Depending on the accuracy of the device's current location, placing the scene can have more or less deviation. During our prototyping, the scene looked most coherent in the AR camera when the device's location was within 3 meters horizontal accuracy.
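Given that observation, it may be worth gating scene placement on the reported horizontal accuracy before anchoring anything. A small helper sketch (the 3 m threshold comes from the observation above; the function name and the idea of feeding it from the location data source's reported accuracy are my own illustration):

```swift
import Foundation

/// Returns true when a GPS fix is accurate enough to anchor the AR scene.
/// A non-positive accuracy value indicates an invalid fix.
func isFixUsable(horizontalAccuracy: Double, thresholdMeters: Double = 3.0) -> Bool {
    horizontalAccuracy > 0 && horizontalAccuracy <= thresholdMeters
}

print(isFixUsable(horizontalAccuracy: 2.5)) // true: within 3 m
print(isFixUsable(horizontalAccuracy: 8.0)) // false: too coarse to anchor
```

In the app you would keep consuming location updates until one passes this check, rather than anchoring on the very first fix.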
Finally, parallax. Ah, parallax! 😩 Because we don't have the luxury of placing the scene as a 3D object (as in RealityKit) in the AR camera, we have to use some tricks to align the scene view on top of the AR view. That is OK when you are looking down or not too far away, but once you start walking, the relative movement of the scene and the AR view is fairly noticeable. So far, our mitigation is to lower the clippingDistance of the scene view and make the basemap transparent to reduce the visual strangeness. But sometimes the features and graphics in the scene view need some context to make sense, so that's the dilemma.
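As a rough sense of scale for the effect: a viewer who moves a baseline b sideways sees content at distance d shift by about the parallax angle atan(b / d). A quick sketch of the arithmetic (purely illustrative, not part of the SDK):

```swift
import Foundation

// Parallax angle in degrees for a viewer who moves `baseline` meters
// sideways while looking at content `distance` meters away.
func parallaxAngleDegrees(baseline: Double, distance: Double) -> Double {
    atan2(baseline, distance) * 180.0 / .pi
}

// One stride (~0.7 m) while looking at content 5 m away vs 100 m away:
print(parallaxAngleDegrees(baseline: 0.7, distance: 5))   // ~8 degrees
print(parallaxAngleDegrees(baseline: 0.7, distance: 100)) // ~0.4 degrees
```

Any misalignment between the camera feed and the overlaid scene scales with this geometry, which is why the mismatch is barely visible when standing still but becomes obvious while walking.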
I'd encourage you to share some compressed video recordings here so we can see how it performs and provide more specific feedback!
Hi @Ting, I completely missed your post originally, so apologies for the late reply on my end. I have been busy with other projects but was able to resume work on this again this week. My first impression about the incorrect geographical position of the models was, I think, down to GPS. I see this now. Despite initialising incorrectly, when I started to walk around, the content did seem to snap into place, which I suppose makes sense to attribute to GPS as it locks onto the device's location.
I had also already made the basemap transparent, which did make the overall experience look and feel better than before. I just can't help feeling that something is off, though, as if the content doesn't truly sit in real space, particularly when moving. I'm guessing this is simply the limitation of having the scene drawn on top of the AR view, as opposed to existing inside of it?
I have attached two short videos of me demonstrating in my office/hallway. One should show the GPS snapping (i.e. the models moving), and the other shows me walking towards the models down a hallway. I had to make them quite short and compressed to upload. I would be very interested to hear what you think and whether there are any further tips you can give. Thanks!