
ArcGIS Runtime SDK for iOS


The latest release of the ArcGIS Runtime Toolkit for iOS is here. It has been updated to work with the 100.8 release of the ArcGIS Runtime SDK for iOS.


This release includes:

  • JobManager changes: Use UIApplication.shared.beginBackgroundTask instead of background fetch. According to the Apple documentation, background fetch can take hours to be called; beginBackgroundTask keeps the app alive for up to 10 minutes, giving most jobs the chance to finish, or at least to get through the download phase.
  • Auto Layout fixes for JobManagerExample and MeasureToolbar.
  • Updates minimum iOS version to 12.0 and Xcode version to 11.0 to keep in line with the ArcGIS Runtime SDK for iOS.
  • Fixes deprecation of UISearchController.dimsBackgroundDuringPresentation resulting from iOS 12.0 update.
  • Change locationDataSource property of ArcGISARView to be of type AGSLocationDataSource instead of the subclass AGSCLLocationDataSource.
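The background-task change above follows a standard UIKit pattern. Here's a simplified sketch of that pattern (the function names are illustrative; only the UIApplication calls are the real UIKit API, and the actual toolkit code manages one task per job):

```swift
import UIKit

// A sketch of the pattern JobManager now uses.
var backgroundTaskID: UIBackgroundTaskIdentifier = .invalid

func beginKeepingAppAlive() {
    // Ask UIKit for up to ~10 minutes of background execution time.
    backgroundTaskID = UIApplication.shared.beginBackgroundTask(withName: "JobManager") {
        // Expiration handler: time is up, so release the task before iOS suspends us.
        endKeepingAppAlive()
    }
}

func endKeepingAppAlive() {
    guard backgroundTaskID != .invalid else { return }
    UIApplication.shared.endBackgroundTask(backgroundTaskID)
    backgroundTaskID = .invalid
}

// Call beginKeepingAppAlive() when a job starts, and endKeepingAppAlive()
// when it completes (or gets past its download phase).
```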


You can find the Toolkit here.
See this blog post for information on the SDK release.


We hope you enjoy the new release! Let us know what you're building with it.

The Runtime team is pleased to announce the latest release of the Runtime SDK for iOS (see the release notes, and the overall announcement on the ArcGIS blog).


It continues to follow the track-based approach, so there's lots of new functionality for Defense/Public Safety and Utility customers, as well as improved platform support. But there are some significant things for iOS developers that are worth highlighting:


  • Metal Support: This is a big one! Apple deprecated OpenGL ES in iOS and at some point down the line will likely remove support entirely. Runtime 100.8 for iOS introduces a new Metal-based rendering pipeline. It's faster and uses less memory. It also means you can actually use 3D Scenes in the simulator! But it does have an impact on using the simulator on Mojave or running on iOS 12. See this post for more details.
  • Minimum iOS 12: iOS 11 support has been dropped.
  • Minimum Xcode 11: Xcode 10 support has been dropped.
  • Toolkit updated: The toolkit has some updates and fixes. See this blog post for more details.


A handful of other highlights:

  • Image overlays (scene views only): Animate weather radar or other imagery.
  • Visual variables for point layers now render in dynamic mode. Smooth zooming, everybody!
  • MMPKs and MSPKs can now include online layers (e.g. traffic). Requires ArcGIS Pro 2.6 to author these.
  • Identify on raster layers (both service-based and local raster file-based).


Check out the release notes for more details, and as always we hope you enjoy the new capabilities. Please feel free to DM me and let me know what you're building!

In 2018, Apple announced they were deprecating support for OpenGL in macOS and iOS. Here's what we wrote about that at the time: The ArcGIS Runtime, OpenGL and Metal


We're pleased to let you know that the 100.8 release of the ArcGIS Runtime SDK for iOS, out now, introduces Metal support. There's nothing you need to do to take advantage of this other than upgrade your version of the Runtime SDK for iOS to version 100.8.


Testing has shown improved performance (especially when displaying scenes in the simulator).


Because OpenGL ES support is no longer included, any development or testing workflow that uses the iOS Simulator must use a simulator that supports Metal to view an AGSMapView or AGSSceneView. What this means in practice is that to use version 100.8 with an iOS Simulator you must meet the following minimum conditions:

  • Developing on macOS Catalina (10.15)
  • Using Xcode 11 (this is a requirement for Runtime 100.8 anyway).
  • Simulating iOS 13


If you need to develop or test with version 100.8 against iOS 12 or using an earlier version of macOS, you need to use a physical device in place of the simulator.
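If you want to verify your environment before hitting a blank map view, a quick runtime check works; this is the standard Metal API (not an ArcGIS call), and the check below is just a sketch:

```swift
import Metal

// Returns true when a Metal device is available. In the simulator this
// requires macOS Catalina, Xcode 11, and an iOS 13 simulator runtime.
func metalIsAvailable() -> Bool {
    return MTLCreateSystemDefaultDevice() != nil
}

if !metalIsAvailable() {
    print("No Metal device: AGSMapView/AGSSceneView cannot render in this environment.")
}
```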


You'll find this information and more in the release notes.


Please note that the Runtime SDKs for Qt, .NET (Xamarin.iOS), and Java will introduce Metal support in future releases.

The latest release of the Runtime SDK for iOS is here (see the release notes, and the general announcement over on the ArcGIS Blog), and it builds upon the foundations introduced in update 6.


Key highlights include a LOT of new network utility functionality, so you can now build really powerful field apps that include a full suite of network tracing capabilities and more. We've also added the ability to license Runtime at all levels using a named user (previously you could only license at Lite or Basic). See the announcement for more details.


Of significance from the iOS perspective:

  • Support for iOS 11 is being deprecated. At the next Runtime update, a minimum of iOS 12 will be required.
  • Augmented Reality Scenes can be clipped to help them fit better onto a real-world surface.
  • The OAuth panel in ArcGIS Online now honors the iOS device's dark mode state.


The Toolkit has already been updated for 100.7.


Download Update 7 today, and let us know what you're building.

The latest release of the ArcGIS Runtime Toolkit for iOS is here. It has been updated to work with the 100.7 release of the Runtime SDK for iOS.

This release includes:

  • Bookmarks component and example - The Bookmarks component displays a list of bookmarks in a table view and allows the user to select a bookmark and perform some action.
  • Swift 5 - The Toolkit and Example app projects and code have been updated to Swift 5. As a result, Xcode 10.2 is now required to build the Toolkit.
  • Unit Test target - A Unit Test target has been added to the Toolkit project. There are initial unit tests for the new Bookmarks component and a test covering component creation. The goal is for all future components to have unit tests, with tests for existing components being added over time.
  • ArcGISARView.clippingDistance - A clippingDistance property has been added to the ArcGISARView. The clippingDistance limits the data display to a radius around the origin camera. The Point Cloud - Tabletop and US - Mexico Border - Tabletop example scenes have been updated to include a clipping distance. Here's a screenshot of the updated Point Cloud - Tabletop scene: [screenshot: AR clipping distance]
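Using the new property is a one-liner. This sketch assumes `arView` is an ArcGISARView already configured elsewhere for a tabletop scene, and the 500 meter value is purely illustrative:

```swift
import ArcGISToolkit

// arView: an ArcGISARView set up for a tabletop experience.
// Limit the displayed data to a 500 m radius around the origin camera.
arView.clippingDistance = 500
```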


You can find the Toolkit here.

See this blog post for information on the SDK release.

We hope you enjoy the new release! Let us know what you're building with it.

The latest release of the ArcGIS Runtime Toolkit for iOS is here. It has been updated to work with the 100.6 release of the Runtime SDK for iOS.
This release includes:
• Augmented reality component and example
• Carthage package manager support
• SwiftLint support for both Toolkit components and the Example app
• Cleanup of the JobManager component for code consistency and style
• Minor internal tweaks


You can find the Toolkit here.


See this blog post for information on the SDK release.


To read more about Augmented Reality in the ArcGIS Runtime, read Rex's ArcGIS Blog post here.
We hope you enjoy the new release! Let us know what you're building with it.

The latest release of the Runtime SDK for iOS is here (see the release notes, and the general announcement over on the ArcGIS Blog), and it introduces some significant new functionality that will be built upon over the next few releases.


Some key highlights include Utility Network and the Navigation API, both of which merely scratch the surface of what we have in store. See the announcement for more details.


Some highlights from the iOS perspective:

  • You should now migrate to the Dynamic Framework if you haven't already. We had previously deprecated the Static Framework, and it is no longer included in the SDK installer. This is Apple's preferred approach and simplifies integrating the Runtime into your projects. See the release notes for how to migrate.
  • We've prepared the SDK for iOS 13's Dark Mode by ensuring that UI elements like pop-ups and the attribution bar adapt correctly.
  • We've improved 3D Scene interaction and included some configuration options.


The SDK also lays the foundations for bringing ARKit experiences into your mobile location/GIS apps. We will shortly be releasing open source components as part of the Toolkit that build upon these foundations to make great AR integration into your apps even easier.


So, download Update 6, dive in, and let us know what you're building.

This is part 3 of a 3 part series on working with Location in your Runtime applications.


In parts 1 and 2 we introduced the AGSLocationDisplay and the AGSLocationDataSource and discussed how they work together to display location on your map view, as well as how to configure location appearance on the map view and how the map view behaves as the location is updated.


We finished off with an understanding of how AGSLocationDataSources are created. In this post we'll create a new location data source that provides realtime location of the International Space Station, and show it in use in a simple application.



There's a very cool, simple, open source API that provides realtime locations of the International Space Station. You can find out about it here; put simply, you make an HTTP request and get JSON back with the current location.


Let's use that API to build a custom AGSLocationDataSource that provides the ISS's current location. We'll call it ISSLocationDataSource.


Building the data source

Starting with a simple project already linked to the ArcGIS Runtime, let's create a new Swift file named ISSLocationDataSource and define our subclass of AGSLocationDataSource:


class ISSLocationDataSource: AGSLocationDataSource {


Now let's implement doStart() and doStop():


class ISSLocationDataSource: AGSLocationDataSource {
    override func doStart() {
        startRequestingLocationUpdates()
        // Let Runtime know we're good to go.
        didStartOrFailWithError(nil)
    }

    override func doStop() {
        pollingTimer?.invalidate()
        // Let Runtime know we're done shutting down.
        didStop()
    }
}


Once we've started, we'll hit the API URL every 5 seconds using a Timer, and parse the response into an AGSLocation object:


private var pollingTimer: Timer?

func startRequestingLocationUpdates() {
    // Get ISS positions every 5 seconds (as recommended on the
    // API documentation pages):
    pollingTimer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { [weak self] _ in
        // Request the next ISS location from the API and build an AGSLocation.
        self?.requestNextLocation { newISSLocation in
            // To AGSLocationDisplay: new location available.
            self?.didUpdate(newISSLocation)
        }
    }
}

Reading the URL and turning the JSON response into an AGSLocation happens in the requestNextLocation() function, which the Timer calls every 5 seconds. Notice the call to didUpdate(). As discussed in part 2 of this series, that call passes the new AGSLocation to the AGSLocationDisplay, which in turn makes sure the location is updated on the AGSMapView as needed.


You can see a full implementation of the entire ISSLocationDataSource class here, including requestNextLocation() and the JSON Decoding logic.


Using our custom location data source

To use the new custom data source in a map view, we simply set AGSMapView.locationDisplay.dataSource to an instance of our new class and start the AGSLocationDisplay:


override func viewDidLoad() {
    super.viewDidLoad()

    // Set the map.
    mapView.map = AGSMap(basemap: AGSBasemap.oceans())
    // Use our custom ISS Tracking Location Data Source.
    mapView.locationDisplay.dataSource = ISSLocationDataSource()
    // Start the AGSMapView's AGSLocationDisplay. This will start the
    // custom data source and begin receiving location updates from it.
    mapView.locationDisplay.start { (error) in
        guard error == nil else {
            print("Error starting up location tracking: \(error!.localizedDescription)")
            return
        }
    }
}

It's that easy! Now you have a map that shows the live current location of the ISS.


Of course, it's a little counter-intuitive to see the blue dot tracking the space station (we've been trained to associate it with our own location), so we use some of the AGSLocationDisplay configuration options to change the symbol to use an icon of the ISS. Find the entire Xcode project here.



Additional details about ISSLocationDataSource.requestNextLocation():

  • We use an AGSRequestOperation to get the JSON response from the API (source code).
  • We create a new AGSOperationQueue that processes 1 operation at a time. This way we don't have duplicate simultaneous calls to the API (source code).
  • For the very first location we obtain, since we don't yet have a heading or velocity, we create an AGSLocation with lastKnown = true (source code) which means it will be displayed differently in the map view (by default a grey dot rather than a blue dot, indicating that we're still acquiring a location).
  • We use AGSGeometryEngine to calculate the velocity of the ISS by comparing the new location with the previous location.


There's also a slightly more detailed version with an overview map, reference lat/lon grids, and a path-tracking geometry here.


We hope you've enjoyed this series of blog posts. The Location Display and Location Data Sources provide a powerful configurable way to integrate location into your Runtime apps, no matter what source you're using for location.

This is part 2 of a series of 3 blog posts covering Location and the ArcGIS Runtime SDKs. In part 1 we introduced the AGSLocationDisplay which is responsible for working with the AGSMapView to give your app a "blue dot" experience, and we talked about customizing the behavior and appearance of the current location on your map.


In this post, we'll talk about the third component of AGSLocationDisplay, which is where that location information comes from: the location data source…


Location Data Sources


A location data source feeds location updates to the AGSLocationDisplay which in turn takes care of updating the map view according to the configuration options discussed in part 1.


It is accessed via the AGSLocationDisplay.dataSource property.


The ArcGIS Runtime SDK for iOS comes with a few location data sources out of the box:


  • AGSCLLocationDataSource: Follow your iOS device's built-in Core Location services (the default).
  • AGSSimulatedLocationDataSource: Follow a sequence of simulated locations or an AGSPolyline.
  • AGSGPXLocationDataSource: Follow the contents of a GPX file.


Each of these location data sources inherits from the base AGSLocationDataSource class:



By default an AGSLocationDisplay automatically creates and uses an AGSCLLocationDataSource. Just call AGSMapView.locationDisplay.start(completion) to start showing your location (but see the note here about configuring your app according to Apple's requirements for enabling location).
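In code, the default experience is just (assuming `mapView` is an AGSMapView outlet):

```swift
// Uses the default AGSCLLocationDataSource under the hood.
mapView.locationDisplay.start { error in
    if let error = error {
        print("Could not start location display: \(error.localizedDescription)")
    }
}
```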


Custom Location Data Sources

What's neat is that by inheriting from AGSLocationDataSource and implementing a few simple methods, you can create your own custom data source that Runtime integrates with in exactly the same way as the 3 out-of-the-box ones mentioned above (which are themselves built using the pattern we'll discuss).


This extensibility is very powerful if you're working with proprietary location technology, such as an external high-accuracy GPS unit, a beacon-based indoor location system, or some other location determining system whose manufacturers provide an API or SDK.


Step-by-step, here's how you build out your own custom AGSLocationDataSource:


  1. Inherit from AGSLocationDataSource.
  2. Implement doStart().
    • Called by Runtime when the app calls AGSLocationDisplay.start(completion).
    • This must call didStartOrFailWithError() to signal that your source started OK (or failed to start).
      When you call didStartOrFailWithError() you either pass in nil if your source started OK and is ready to start providing locations, or pass in an Error if it failed to start.
  3. Implement doStop().
    • Called by Runtime when the app calls AGSLocationDisplay.stop().
    • This must call didStop() to signal that your source stopped OK.


That takes care of starting and stopping the data source when instructed to, but you also need to notify Runtime when you get new locations:


  • Call didUpdate(location) whenever you get a new location you need to pass on. You construct an AGSLocation object and call didUpdate(), passing in that location.
    An AGSLocation combines an AGSPoint with a timestamp. It also includes velocity, heading, accuracy estimates, and whether this location should be considered current or is an old (or "last known") location. Some of these properties are used to determine how the location is displayed in the map view.
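Putting those steps together, a minimal custom data source might look like the sketch below, which simply reports a fixed (illustrative) coordinate once per second. didStartOrFailWithError(), didStop(), and didUpdate() are the inherited methods described above; check the reference doc for the exact AGSLocation initializer that fits your needs:

```swift
import ArcGIS

class FixedLocationDataSource: AGSLocationDataSource {
    private var timer: Timer?

    override func doStart() {
        // Nothing can fail here, so report success immediately.
        didStartOrFailWithError(nil)

        // Emit the same (illustrative) location once per second.
        timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { [weak self] _ in
            let point = AGSPoint(x: -116.5, y: 33.8, spatialReference: .wgs84())
            let location = AGSLocation(position: point,
                                       horizontalAccuracy: 10,
                                       velocity: 0,
                                       course: 0,
                                       lastKnown: false)
            self?.didUpdate(location)
        }
    }

    override func doStop() {
        timer?.invalidate()
        timer = nil
        didStop()
    }
}
```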


That's it. With that implemented, you have a functioning Location Data Source. Let's discuss the documentation available to us about this, and how it all fits together.


Understanding Location Data Sources

First, let's look at the AGSLocationDataSource (ForSubclassEyesOnly) API, described in the reference docs:



These methods determine how your custom location data source and the AGSLocationDisplay will communicate.


A note on naming:


  • do… Instructions from the AGSLocationDisplay to your custom AGSLocationDataSource begin with "do" (doStart() and doStop()). These indicate an imperative from the location display for your data source to do something.


  • did… Feedback from your data source to the AGSLocationDisplay is done by calling functions that begin with "did" (didUpdateLocation(), didStartOrFailWithError(), didStop() etc.). These indicate feedback from your data source to update the location display (e.g. the state changed, or a new location is available). 


The "do" methods are expected to exist in your subclass of AGSLocationDataSource, so it's up to you to implement them.


The 'did' methods are all inherited from AGSLocationDataSource, so they're already defined for use by your subclass. You don't implement them. You just call them directly.


Here's how these methods work together:


  1. When an app wants to show location updates (by calling AGSMapView.locationDisplay.start(completion)), the AGSLocationDisplay will call doStart() on its dataSource.
  2. The data source initializes itself and calls didStartOrFailWithError(nil) to show it's started OK. It starts providing location updates by calling didUpdate(location) for each location update.
  3. When the app wants to stop showing location, it calls AGSMapView.locationDisplay.stop() and the AGSLocationDisplay will call doStop() on its dataSource. The data source calls didStop() and makes sure it doesn't call didUpdate(location) any more.


That's it for the theory. In the next blog post, we'll look at creating a custom AGSLocationDataSource from scratch and show it in use in a simple iOS application.

The Blue Dot

Often your app needs to show your location, or a "blue dot", on a map. Maybe it's important to know where you are, or important to know where things are in relation to you. Often that context is critical to the core functionality of the app.


Runtime provides a robust blue dot experience. Out of the box, you simply call start() or stop() and Runtime takes care of talking to your device to get the best available location.


But Runtime takes this functionality further and provides a flexible framework to provide your own location source. Perhaps you need to connect to an external GPS unit or use a proprietary indoor location beacon system.


In this, the first of 4 posts on the topic, we'll take a look at how Runtime presents location in the map view. In part 2 we'll discuss how Runtime gets location updates from various location data sources and cover what it takes to create your own location data source. Then in part 3 we'll take what we've learnt so far and build a custom location data source. Finally, in part 4, we'll look at some advanced tips and tricks for providing your own twist on the default location data source.


Your location in the Runtime

There are 3 Runtime components that work together to put that blue dot on your map: AGSMapView, AGSLocationDisplay, and AGSLocationDataSource.


  • Every AGSMapView has an AGSLocationDisplay. It is responsible for showing and updating the current location on that map view.
  • You access the AGSLocationDisplay via the AGSMapView.locationDisplay property.
  • Your app starts and stops tracking location by calling AGSLocationDisplay.start(completion) and AGSLocationDisplay.stop().


Swift Note:
Swift translates method names automatically from Objective-C, so while the API doc references, for example, startWithCompletion:(), in Swift it will be written start(completion) and that's the form I'll use in this post.


Various properties on AGSLocationDisplay determine what the location display looks like on the map view (you don't have to use a blue dot if your app needs something else) and how the map view keeps up with the current location.


If you just want to use your device's built-in GPS, that's actually all you need to know. But let's look at some ways to control and configure that location display…



There are 3 key aspects of Location Display that you can change.

  1. Behavior: How does the map view update itself as the current location changes?
  2. Appearance: What does the location display look like and how does it change depending on what information is available?
  3. Data Source: Where does the location display get the current location from?


Configuring behavior

The auto-pan mode changes how the AGSMapView keeps up with location updates, that is, how the map view moves to follow the blue dot around. Set the auto-pan mode using AGSLocationDisplay.autoPanMode.


  • Off: This is the default. The blue dot is free to move around your map display (or beyond it) while the map view remains still.
  • Recenter: The map will follow the blue dot as it moves, based on the AGSLocationDisplay.wanderExtentFactor, a number between 0 and 1:
      • 0 means the map view will constantly recenter on the blue dot.
      • 1 means the map view will recenter when the blue dot reaches the edge of the map view.
      • The default is 0.5, which allows a bit of movement before the map view recenters.
  • Navigation: Good for driving. The map view updates for the current location, but orients to the direction of motion. Use the AGSLocationDisplay.navigationPointHeightFactor property to determine how far from the bottom of the display the location will be anchored.
  • Compass Navigation: Good for walking. The map view updates for the current location, but orients to the direction the device is pointing.


When you manipulate the map view's area of interest either programmatically or by panning/zooming it, the auto-pan mode will be reset to off. You can monitor for changes to the auto-pan mode with the AGSLocationDisplay.autoPanModeChangeHandler.


When you switch on auto-pan (i.e. it was off, but you set it to one of the other values), the map view will pan and zoom to the next location update. You can control the scale level for this pan and zoom with AGSLocationDisplay.initialZoomScale.

In most use cases, fixing the zoom like this is sensible behavior, but if you want to just recenter without zooming the map, simply set the initialZoomScale to the current AGSMapView.mapScale before you set the autoPanMode.
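That "recenter without zooming" trick looks like this in code (assuming `mapView` is your AGSMapView):

```swift
// Recenter on the blue dot without changing the current zoom level.
mapView.locationDisplay.initialZoomScale = mapView.mapScale
mapView.locationDisplay.autoPanMode = .recenter
```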

Configuring appearance

By default the location symbol will be a blue dot. It's augmented with a few visual aids to indicate accuracy, direction of motion (course), direction the device is pointing (heading) and whether locations are being updated (a "ping"). See these properties on AGSLocationDisplay for more details:



You control whether to use the courseSymbol when movement is detected with the AGSLocationDisplay.useCourseSymbolOnMovement property.


Lastly, if you don't want a visual indication that a new location was received, you can set AGSLocationDisplay.showPingAnimation to false.
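A quick sketch of tweaking the appearance; the symbol value is illustrative, and `defaultSymbol` is an assumption based on the AGSLocationDisplay API (check the reference doc for the exact property names):

```swift
import ArcGIS
import UIKit

let display = mapView.locationDisplay

// Use a custom picture symbol instead of the blue dot (e.g. an ISS icon).
display.defaultSymbol = AGSPictureMarkerSymbol(image: UIImage(named: "iss")!)

// Switch to the course symbol automatically when movement is detected.
display.useCourseSymbolOnMovement = true

// Suppress the "ping" animation on each location update.
display.showPingAnimation = false
```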


Location Data Source

One other important configurable item on AGSLocationDisplay is the location data source. We'll look at that in part 2, where I'll cover how data sources work, and in part 3 we'll create our own custom data source.

I'm not going to hide it: I love the ArcGIS Runtime's loadable design pattern. We find the loadable pattern across the ArcGIS Runtime; no doubt you interact with it often.

In short, the pattern allows you to work with an object that might take some time figuring stuff out before it’s ready to be used. For example, it might depend on multiple (possibly remote) resources, sometimes in sequence, before it knows enough about itself to be usable. Once these resources are retrieved, the loadable object executes a callback block signaling that it's ready. An AGSMap is a good concrete example as it might need to load multiple remote layers before it knows what extent and spatial reference to use.
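The AGSMap example looks like this in code; a sketch, where "YOUR_WEBMAP_ID" is a hypothetical placeholder for a real portal item ID:

```swift
import ArcGIS

// Create a map from a web map URL (YOUR_WEBMAP_ID is a placeholder).
guard let map = AGSMap(url: URL(string: "https://www.arcgis.com/home/item.html?id=YOUR_WEBMAP_ID")!) else {
    fatalError("Not a valid web map URL")
}

map.load { error in
    if let error = error {
        print("Map failed to load: \(error.localizedDescription)")
        return
    }
    // Only now are load-dependent properties like spatialReference reliable.
    print("Spatial reference: \(String(describing: map.spatialReference))")
}
```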

Some other qualities of a good loadable object include:

  • calling the completion block on a suitable thread based off the thread the load was started from (see this blog post).
  • providing an observable loadStatus and loadError, though generally you just wait for the completion block to be called.
  • handling load failure elegantly with the option to retry later if needed (perhaps the network connection was interrupted).

As an added bonus, a full implementation of <AGSLoadable> can be found in AGSLoadableBase, a class designed to be subclassed (and which saves you reinventing a lot of wheels, doing a lot of the heavy lifting of load state and callback thread considerations).

"It's only a protocol," you might say. "You can't always subclass AGSLoadableBase," you might suggest. "Try implementing AGSLoadable yourself, do you still love it?" you might pronounce.

Hot take, but of course you'd be right. As an engineer, I'm tickled by problems like this and eagerly look for solutions.

Today I'd like to share with you a delightful maneuver that allows any class to adhere to <AGSLoadable> with neither the need for a custom async implementation (yikes) nor subclassing AGSLoadableBase.

I'd like to introduce you to LoadableSurrogate & <LoadableSurrogateProxy>, a member/protocol solution that offloads the heavy lifting of async loading onto a surrogate loader.

There are two actors in this maneuver, engaged in a parent/child delegate-like relationship:

  1. LoadableSurrogate is a concrete subclass of AGSLoadableBase that routes messages to a proxy object.
  2. A proxy object, adhering to <LoadableSurrogateProxy>, that we'd like to make loadable with the help of a loadable surrogate.

A class can leverage this tool by creating a LoadableSurrogate member, adhering to <LoadableSurrogateProxy> and specifying the LoadableSurrogate's proxy.

In this example, I've built a simple loader that downloads an image of (my favorite muppet) Kermit the Frog hanging out on the legendary Hollywood Walk of Fame.

class KermitLoader: NSObject, LoadableSurrogateProxy { }

I can use the kermit loader object like any other <AGSLoadable>:

let kermitLoader = KermitLoader()

kermitLoader.load { (error) in
    if let error = error {
        print("Error: \(error.localizedDescription)")
        return
    }
    imageView.image = kermitLoader.kermitImage
}

The kermit loader is initialized with a LoadableSurrogate member, assigning the surrogate's proxy to self.

class KermitLoader: NSObject, LoadableSurrogateProxy {

    private let surrogate = LoadableSurrogate()

    override init() {
        super.init()
        surrogate.proxy = self
    }
    /* ...

For the kermit loader to conform to <LoadableSurrogateProxy> it must also conform to <AGSLoadable>. Conveniently, all <AGSLoadable> methods can be piped through the surrogate.

    ... */
    func load(completion: ((Error?) -> Void)? = nil) {
        surrogate.load(completion: completion)
    }

    func retryLoad(completion: ((Error?) -> Void)? = nil) {
        surrogate.retryLoad(completion: completion)
    }

    func cancelLoad() {
        surrogate.cancelLoad()
    }
    /* ...

Following the same pattern outlined above, you might opt to compute the <AGSLoadable> properties loadStatus and loadError on the fly, getting those values from the surrogate.

Instead, I've opted to persist those properties and thus, expose them to KVO.

    ... */
    @objc dynamic var loadStatus: AGSLoadStatus = .unknown

    @objc dynamic var loadError: Error? = nil

    // Proxy informs of changes to `loadStatus` and `loadError`.
    func loadStatusDidChange(_ status: AGSLoadStatus) {
        self.loadStatus = status
    }

    func loadErrorDidChange(_ error: Error?) {
        self.loadError = error
    }
    /* ...

Everything we've seen up until this point is boilerplate and can be copied and pasted. Let's get to the good stuff.

First, in order to perform the loadable operation we'll need to set up some resources and properties. We need a URL to the image, a data task, and of course a reference to the loaded image.

    ... */
    private let kermitURL = URL(string: "")!

    private var kermitSessionDataTask: URLSessionDataTask?

    var kermitImage: UIImage? = nil
    /* ...

What comes next is the custom loadable implementation. If you have ever subclassed AGSLoadableBase directly, this should feel familiar.

The proxy object is responsible for starting the load and completing with an error or nil, depending on the success of the operation. The proxy object is also responsible for canceling any async operations as well.

    ... */
    func doStartLoading(_ retrying: Bool, completion: @escaping (Error?) -> Void) {

        if retrying {
            // Cancel any in-flight request and clear state before retrying.
            let previousDataTask = kermitSessionDataTask
            kermitSessionDataTask = nil
            kermitImage = nil
            previousDataTask?.cancel()
        }

        kermitSessionDataTask = URLSession.shared.dataTask(with: kermitURL) { [weak self] data, response, error in

            guard let self = self else { return }

            if let data = data, let image = UIImage(data: data) {
                self.kermitImage = image
            }

            // Only complete if this response belongs to the current task.
            if response == self.kermitSessionDataTask?.response {
                completion(error)
            }
        }
        kermitSessionDataTask?.resume()
    }
    /* ...

The proxy object is also responsible for canceling running operations. If you want the surrogate to supply a generic CancelledError, you can return true. In this example the data task reliably provides its own cancel error in the task's callback and thus we return false.

    ... */
    func doCancelLoading() -> Bool {
        kermitSessionDataTask?.cancel()
        kermitSessionDataTask = nil
        kermitImage = nil
        // Returns `false` because the URLSession returns a cancel error in the completion callback.
        // Return `true` if you want the surrogate to supply a generic cancel error.
        return false
    }

Cool! Now let's take a look under the hood of the LoadableSurrogate, powering much of the kermit loader.

To start, a LoadableSurrogate is a subclass of AGSLoadableBase.

class LoadableSurrogate: AGSLoadableBase { /* ...

A LoadableSurrogate passes messages to a proxy object. As you saw in the KermitLoader.init(), the kermit loader specifies itself as the proxy.

    ... */
    weak var proxy: LoadableSurrogateProxy? {
        didSet {
            // Bring the new proxy up to speed with the current load state.
            proxy?.loadStatusDidChange(loadStatus)
            proxy?.loadErrorDidChange(loadError)
        }
    }
    /* ...

A LoadableSurrogate observes loadError and loadStatus so that it may immediately inform the proxy of changes to either of these properties.

... */
    // Cocoa requires we hold on to observers.
    private var kvo: Set<NSKeyValueObservation> = []

    override init() {
        super.init()

        // Inform the proxy whenever loadStatus changes.
        let loadStatusObservation = self.observe(\.loadStatus) { [weak self] (_, _) in
            guard let self = self else { return }
            self.proxy?.loadStatusDidChange(self.loadStatus)
        }
        kvo.insert(loadStatusObservation)

        // Inform the proxy whenever loadError changes.
        let loadErrorObservation = self.observe(\.loadError) { [weak self] (_, _) in
            guard let self = self else { return }
            self.proxy?.loadErrorDidChange(self.loadError)
        }
        kvo.insert(loadErrorObservation)
    }
    /* ...

And finally a LoadableSurrogate handles piping the loadable method calls to and from the proxy.

    ... */
    private let UnknownError = NSError(domain: "LoadableSurrogate.UnknownError", code: 1, userInfo: [NSLocalizedDescriptionKey: "An unknown error occurred."])         

    override func doStartLoading(_ retrying: Bool) {

        // We want to unwrap the proxy, if we have one.
        if let proxy = proxy {

            // Call start loading on the proxy.
            proxy.doStartLoading(retrying) { [weak self] (error) in

                guard let self = self else { return }

                // Finish loading with the response from the proxy.
                self.loadDidFinishOrFail(with: error)
            }
        }
        else {
            // No proxy; finish loading with an error.
            loadDidFinishOrFail(with: UnknownError)
        }
    }
    private let CancelledError = NSError(domain: "LoadableSurrogate.CancelledError", code: NSUserCancelledError, userInfo: [NSLocalizedDescriptionKey: "User did cancel."])     

    override func doCancelLoading() {

        // Call the proxy's cancel method.
        if proxy?.doCancelLoading() == true {
            // The proxy asked the surrogate to supply a generic cancel error.
            loadDidFinishOrFail(with: CancelledError)
        }
    }
}

To see this maneuver in action, have a look at this playground.

Happy loading!

The latest release of the ArcGIS Runtime Toolkit for iOS is here. It has been updated to work with the 100.5 release of the Runtime SDK for iOS.
This release includes:
• New PopupController component and example; the PopupController provides a complete feature editing and collecting experience.
• New TemplatePickerViewController component and example; the TemplatePickerViewController allows the user to choose from a list of AGSFeatureTemplates, commonly used for creating a new feature.
• Fix for a TimeSlider issue when using a combination of play/pause and manual dragging.
• Project updates for Xcode 10.2


You can find the Toolkit here.


See this blog post for information on the SDK release.
We hope you enjoy the new release! Let us know what you're building with it.

The latest release of the Runtime SDK for iOS is here (see the release notes, and the general announcement over on the ArcGIS Blog), and it brings with it a slew of great new features, and some cool new internals that pave the way for even more goodness in future releases.


Some highlights from the iOS perspective:

  • We're now promoting the use of the Dynamic Framework for integrating ArcGIS Runtime SDK for iOS into your projects (and have deprecated the Static framework, which will be removed in a future release). This is Apple's preferred approach and matches what CocoaPods has been doing since Runtime 100.1. It also makes integration with the project simpler (e.g. you no longer explicitly add ArcGIS.bundle to your projects). See Configure your Xcode Project in the iOS SDK Guide.
  • In alignment with Apple's policies, and since 100.5 now requires iOS 11 or higher, support for 32-bit devices has been dropped. You shouldn't notice any change as a developer, but it means that the SDK installer package is now much smaller to download!
  • If you've been using custom views in an AGSCallout, the Runtime now makes use of AutoLayout to fit the view. See this release note.


This is not specific to the iOS Runtime SDK, but if you work with selections in your map view, please note the new behavior we have introduced at 100.5. These changes were brought about as we prepare for significant features down the line (including Metal support).


As has been mentioned before, 100.5 is the last release of the dedicated Runtime SDK for macOS.


The Samples app and Toolkit have already been updated for 100.5, and the Open Source apps will be updated shortly (Data Collection already has been!).


We hope you enjoy the new release! Let us know what you're building with it.

After some discussion, the decision has been made to deprecate the ArcGIS Runtime SDK for macOS. The upcoming 100.5 update will be the last release of the dedicated Runtime SDK for macOS. As with any other release, support will be provided according to the Product Lifecycle Support Policy.


It's not a decision taken lightly but, for a number of reasons, we're confident that it's the right move. 


Firstly: interest in the Runtime SDK for macOS has been low. By freeing up team members from maintaining it (including not just the ArcGIS Framework, but also the Samples App, guide documentation, etc.), we'll be able to implement improvements across the entire Runtime SDK family more effectively.


Secondly: the decision was made easier given that developers targeting macOS as a platform still have the option of using the ArcGIS Runtime SDK for Java or ArcGIS Runtime SDK for Qt. If you are considering developing Runtime apps targeting macOS, we recommend you investigate those.


If you have questions about this deprecation, feel free to use the comments section below or, if you're coming to the Developer Summit in Palm Springs, come and see us at the developer island.

The Runtime SDK does a lot of work behind the scenes to make it as simple as possible for you to write great, interactive mapping apps with fast, smooth user interfaces.


Getting out of the way

Key to that is making sure that when you ask Runtime to do something asynchronous (query some features, autocomplete an address, load a service definition, etc.), it gets off the thread you called from as quickly as possible and does its work on another thread. Runtime does this really well, and you should never see it get in your way while it's doing something.


But when it's done doing that something, which thread does Runtime use to let you know?


The good news is iOS and Runtime work together so you might never have to worry about this. But you're a good developer, and you're doing some carefully thought out threading yourself, so you want to know the details, right?


UI and the main thread

Where you need to know which thread you're on is if you're updating your app's UI. In iOS, all UI updates must be done on the main thread (or the Main Thread Checker will come after you and your UI won't behave). So if you're not on the main thread and you want to enable that button and update that label, you need to fix that.


Luckily, any time iOS enters your code because of a user interaction or view-related hook (e.g. viewDidLoad()), you will already find yourself on the main thread. For a lot of developers, this means not having to worry about threading at all until they build something CPU-intensive, at which point they'll need to push that work onto another thread¹.
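That hand-off is straightforward with GCD. Here's a minimal sketch (the function name and the stand-in computation are illustrative, not SDK code): the heavy work runs on a background queue, and the caller hops back to the main queue before touching any UI.

```swift
import Foundation
import Dispatch

/// Illustrative: push CPU-intensive work off the calling thread.
/// The completion is invoked on the background queue, so hop back
/// with DispatchQueue.main.async before updating any UI.
func computeExpensiveValue(completion: @escaping (Int) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // Stand-in for real CPU-intensive work.
        let result = (1...1_000).reduce(0, +)
        completion(result)
    }
}
```

A caller would then wrap its UI update itself: `computeExpensiveValue { value in DispatchQueue.main.async { /* update the UI with value */ } }`.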


Efficient behavior

Even though Runtime will use other threads to get out of your way, there are some reasons it's better not to switch threads if possible.

  • Context switching between threads takes up CPU cycles.
  • There are some common workflows where even though the pattern is asynchronous, it's quite possible Runtime already has the answer for you and can hand it over immediately. In those cases, context switching would be a waste of time.


Adding it all together

These considerations combine to help Runtime determine how to call back to you on the various callback blocks, observers, and change handlers provided by the Runtime SDK.


Here's a quick rundown of the ways the ArcGIS Runtime SDK for iOS might call back to you, and the thread you should expect to be called back on:



Explicit Calls (Tasks and Jobs):

  • If you call from Main, Runtime will call back on Main.
  • If you don't call from Main, Runtime could call back on any thread.


Calling Runtime from the main thread is a pretty good indicator that you're calling because of a user interaction, so it's reasonable to expect that you'd want to update some UI when Runtime responds (and that always needs to happen on the main thread).


But if you didn't call Runtime from the main thread, maybe there isn't a pair of eyes waiting for the result so Runtime skips the overhead of making sure it's back on the main thread when it responds, ensuring your robots can continue at full speed.


Touch Events (AGSGeoViewTouchDelegate):

  • Runtime will always call back on Main.


Just as iOS enters your code on the main thread when there's some user interaction, Runtime makes sure to do the same through the AGSGeoViewTouchDelegate. So if you're handling a user tapping on the map, for example, it's safe to update your UI directly.


However, if you're observing some property that happens to be changing because of user interaction (e.g. using KVO to monitor mapScale), the "State Feedback" rule below applies.

Loadable (load() and retryLoad()):

  • Runtime will always call back on Main².


More often than not, customer code calls into load() or retryLoad() from the main thread. If an item is already loaded, Runtime can then respond immediately with no context switching required. It makes the loadable pattern very fast for the case where, as the result of a user interaction, you need to ensure something is loaded before doing something else (e.g. determining a feature layer's renderer). In that case, only the very first interaction will incur a performance penalty, but you write your code once and it'll handle the first time or the 100th.
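That fast path can be mimicked in miniature. This is a toy model, not SDK code: the first load() does asynchronous work, and once loaded, later calls complete immediately on the calling thread with no context switch.

```swift
import Foundation
import Dispatch

/// Toy model (not SDK code) of the loadable fast path: the first
/// load() does asynchronous work; once loaded, later calls complete
/// immediately on the calling thread with no context switch.
/// (Thread-safety is elided for brevity.)
final class ToyLoadable {
    private(set) var loaded = false
    private let workQueue = DispatchQueue(label: "toy.loadable")

    func load(completion: @escaping (Error?) -> Void) {
        if loaded {
            // Fast path: already loaded, respond immediately.
            completion(nil)
            return
        }
        workQueue.async {
            self.loaded = true   // stand-in for real loading work
            completion(nil)
        }
    }
}
```

Only the first call pays the asynchronous cost; every subsequent call returns synchronously, which is exactly why you can guard any operation with a load call without worrying about repeated overhead.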

State Feedback:

  • Runtime could call back on any thread.


Context switching back to the main thread could mean that feedback is delivered out of order or delayed, so Runtime provides that feedback immediately on whichever thread is current. If you need to update your UI as a result, you should dispatch your UI code to the main thread yourself with something like:

DispatchQueue.main.async {
    /* update the UI */
}

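Wrapping that pattern in a small helper (illustrative, not part of the SDK) also mirrors Runtime's own optimization: skip the hop when you're already on the main thread.

```swift
import Foundation
import Dispatch

/// Illustrative helper (not part of the SDK): run `work` on the main
/// thread, inline if we're already there, otherwise asynchronously
/// via the main queue.
func onMainThread(_ work: @escaping () -> Void) {
    if Thread.isMainThread {
        work()
    } else {
        DispatchQueue.main.async(execute: work)
    }
}
```

Because it runs the closure synchronously when called from the main thread, state-feedback handlers can call it unconditionally without worrying about which thread they were invoked on.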


The above can largely be summarized like this:

  • Things that update status or state (progress, KVO, etc.) can happen on any thread.
  • Deliberate actions (Tasks and Jobs) that are called from the main thread will receive their status updates and results on the main thread. If they're called from some other thread, the thread used to respond could be any thread.
  • AGSLoadable and AGSGeoViewTouchDelegate will always call back on the main thread.


Let me know in the comments if you have questions about any of the topics brought up here. This can be a complex issue, but iOS and the Runtime SDK make sure that you usually don't need to worry about it.


¹ iOS includes a powerful thread abstraction API (Grand Central Dispatch, or GCD) that's worth learning about to help with concurrent programming and slick app experiences. Here's a great 2-part tutorial.

² Note: this load() behavior was updated at release 100.3 to provide a number of performance optimizations. Up until release 100.2.1, Runtime would promise to call back on the main thread only if you called in from the main thread (identical to the current Task/Job behavior).