POST
Thanks for that additional info. Yeah, something is causing the SDK to wait for a while when removing a vector tile layer. iOS doesn't tolerate that happening on the main thread (it makes the app unresponsive), and so it terminates the app. A couple of questions: Which version of the SDK are you on? Are you able to reproduce this yourself? If so, can you share a simple reproducer?
Posted 05-20-2024 02:36 PM | 0 | 3 | 858

POST
At first glance it looks like the user is setting a map on the MapView when the MapView already has a map on it (i.e. replacing the MapView's map). As part of tearing down the old map, it seems that we're disconnecting from a vector tile service and waiting in some way for it to disconnect. That shouldn't be happening on the main thread (we should certainly never call thread.sleep (see frame 3) on the main thread). Presumably setting the map is done as the result of a user action (e.g. tapping a button), but to help dig in: are there more frames after frame 14 in the crash (frames 15 and higher)?
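As a general pattern (this is not the SDK's internals, and the names below are hypothetical), blocking teardown work like a service disconnect belongs on a worker thread so the main/UI thread stays responsive. A minimal JVM sketch:

```java
import java.util.concurrent.CountDownLatch;

public class TeardownOffload {
    // Hypothetical stand-in for a blocking disconnect like the one
    // visible in the crash frames; not real SDK API.
    static String disconnectBlocking() throws InterruptedException {
        Thread.sleep(100); // simulates the wait seen in the stack trace
        return "disconnected";
    }

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        String[] result = new String[1];
        // Run the blocking call on a worker thread so the calling
        // (UI/main) thread never sleeps.
        Thread worker = new Thread(() -> {
            try {
                result[0] = disconnectBlocking();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            done.countDown();
        }, "teardown-worker");
        worker.start();
        // The main thread would keep servicing UI events here.
        done.await(); // in a real app, use a completion callback instead
        System.out.println(result[0]);
    }
}
```

The `await()` here is only to keep the demo deterministic; a real UI app would never block on the latch and would instead react to a completion callback.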
Posted 05-20-2024 01:22 PM | 0 | 5 | 893

POST
@NIANXI39 If this is still happening, please contact customer service. You should be able to copy and paste your Lite license key from the part of the web page shown in your screenshot.
Posted 05-17-2024 10:58 AM | 0 | 0 | 650

POST
Not sure why, but removing the default key has resolved the issue. If you set a default API Key, it is used for all requests, so even if you configure OAuth, the OAuth/Authentication Manager workflows won't be kicked off. By no longer setting it, the SDK will interrogate services and any auth configuration you've set up, and use that information to manage authentication. If you need API Keys for specific services, you can still set them directly on those SDK objects (just not globally) and use a mix of API Key and OAuth authentication.
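The precedence described above can be sketched generically. This is an illustrative model only, not real SDK API: a key set directly on a service object wins, otherwise a global default key is used if present, and only when no key exists anywhere do the OAuth/authentication workflows kick in.

```java
import java.util.Optional;

public class KeyPrecedence {
    // Hypothetical resolution logic mirroring the behavior described:
    // per-object key > global default key > OAuth workflows.
    static String resolveAuth(Optional<String> serviceKey, Optional<String> globalKey) {
        if (serviceKey.isPresent()) return "api-key:" + serviceKey.get();
        if (globalKey.isPresent()) return "api-key:" + globalKey.get();
        return "oauth"; // no key anywhere: auth workflows take over
    }

    public static void main(String[] args) {
        // With a global default key set, OAuth is never reached...
        System.out.println(resolveAuth(Optional.empty(), Optional.of("GLOBAL")));
        // ...but with no global key, un-keyed services fall back to OAuth,
        System.out.println(resolveAuth(Optional.empty(), Optional.empty()));
        // while individually keyed services still use their own key.
        System.out.println(resolveAuth(Optional.of("SVC"), Optional.empty()));
    }
}
```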
Posted 05-07-2024 09:19 AM | 1 | 0 | 852

POST
Hello,

HTTP caching, while useful for reducing duplicate requests for the same data as you pan and zoom around the map, and for improving performance for the end user, is not designed to take a map offline. It's not that we prevent that from working, but there is a lot of metadata beyond just the raw map content that needs to be cached and returned by the underlying HTTP stack, and you're at the mercy of how the caching works on a particular platform (with the Kotlin SDK, it's OkHttp). Being able to do this is something that's on our radar, but it's a way off yet.

The recommended and supported process for taking maps offline is to proactively download them using the OfflineMapTask API. This is a much more deliberate process; web map and data service configuration is required to enable this behavior, and the application needs to expressly make use of APIs to download the map. You can learn more here.

Hope this helps!
Posted 05-02-2024 09:04 AM | 1 | 0 | 359

IDEA
Posted 04-22-2024 07:06 PM | 0 | 0 | 411

IDEA
We've released the first public beta of the ArcGIS Maps SDK for Flutter. You can read about it here, and join the beta here. Thanks for all your interest to this point. Please join the beta, give it a whirl, and give us your feedback in the beta forums to help us deliver the best mapping and location SDK for Flutter there is.
Posted 04-22-2024 07:51 AM | 0 | 0 | 255

POST
Yesterday we released the first beta of the ArcGIS Maps SDK for Flutter, targeting iOS and Android. If you haven't already done so, please jump on over to the early adopter site and sign up! See this blog post for more details.
Posted 04-18-2024 02:32 PM | 0 | 0 | 2008

POST
This article should help. It will be on your portal item settings for the vector tile layer: https://doc.arcgis.com/en/arcgis-online/manage-data/manage-hosted-tile-layers.htm#ESRI_SECTION1_0561B66EE826482297DDE002A94048B0
Posted 04-02-2024 11:48 AM | 1 | 1 | 997

IDEA
Thanks for the question, @kris. There are a few more broad-ranging things to consider here:

- If a layer's opacity is set to zero, then set back to something non-zero, what is the expected behavior? Imagine an app wants to temporarily hide a few layers to help the user focus on a particular operation or workflow. Should the data (or metadata) be unloaded and then reloaded every time? That could be a poor user experience (especially if the network is slow, or the user is paying for bandwidth). Instead, you would opt in to that experience by removing the layer and deallocating it. That is, by design, different to merely making the layer invisible and then visible again, and gives developers a good set of options.
- Opacity is a visual property. The layer still participates in the map's layer collection, which has impacts on a number of things (table of contents, loaded metadata, related layers/tables, and so forth). The only way opacity 0 might impact a layer is that it will not be considered during identify (which is a visual/interaction-based operation).
- In addition, layers have a Visible property, and potentially also visible scale ranges. Both of those control whether a layer is visible in the map and can impact its state in a table of contents/legend, but if a layer is out of visible range it should remain in that ToC, just displayed differently.
- There's potential that we could do a better job of considering the Visible property in helping us prioritize removing data from the in-memory cache under memory pressure, but even then we wouldn't unload the metadata, and it would be in response to cache pressure heuristics.

Hopefully that helps explain why just setting opacity to 0 doesn't really signify that a layer can be deallocated. Instead, the recommended approach would be to remove the layer from the Map's layers collection and deallocate it. You'd need to keep track of where in the current layer stack to re-insert it when the user wants to see it again, but we believe that is the best balance when considering the broader picture of layers in a map.
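The remove-and-remember pattern above can be sketched with any ordered collection; here plain strings stand in for layer objects (this is not SDK API, just the bookkeeping idea):

```java
import java.util.ArrayList;
import java.util.List;

public class LayerStash {
    public static void main(String[] args) {
        List<String> layers = new ArrayList<>(List.of("polygons", "roads", "points"));

        // "Hide" roads by removing it entirely (freeing its resources),
        // but remember where it sat in the layer stack.
        int stashedIndex = layers.indexOf("roads");
        String stashed = layers.remove(stashedIndex);
        System.out.println(layers);

        // When the user wants it back, re-insert at the same position
        // so the draw order is preserved.
        layers.add(stashedIndex, stashed);
        System.out.println(layers);
    }
}
```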
Posted 03-07-2024 09:01 AM | 0 | 0 | 710

POST
Excellent. That's great to hear. Thanks for confirming. Seems strange to me that the Pro tool should create one layer per row. Perhaps the issue is with how the CSV file is being parsed. Either way, glad you've got a working solution!
Posted 01-08-2024 09:34 AM | 1 | 0 | 1149

POST
Hello. That's correct: non-Esri proprietary layer sources such as local rasters require a Standard license. This is true whether they're delivered as standalone files (e.g. .geotiff) or packaged up through a format such as Mobile Map Packages or Mobile Scene Packages. That information is included here, but I will see if we can be more explicit about it.

For the layers that did not display, you should be able to look at the loadError on the layer to understand why. Related: where a layer loads OK but subsequently fails to draw for some other reason (e.g. the network drops for a connected layer), you can also check the viewState. In this case, though, the license check error would occur on layer load, so the view state won't be useful. A reasonable rule of thumb: first check whether a layer loaded OK, and if it did, check the view state to see why it might have stopped working.
Posted 11-21-2023 12:49 PM | 0 | 0 | 666

POST
Hello,

The sample service has a very limited geographic extent (I'm not sure what it is precisely, but I think it's just a small area around San Diego). If your different points are outside that limited extent, you won't get results.

In answer to your questions: you can find all the documentation on the service here: https://developers.arcgis.com/documentation/mapping-apis-and-services/routing/closest-facility-routing/ That includes the REST endpoints, as well as examples of using the service with various SDKs. And yes.

Hope that helps. Nick.
Posted 11-20-2023 04:38 PM | 0 | 0 | 634

POST
How are you creating the points that you're using for the graphics being added to the graphics overlay? Could you share a code snippet? You should ensure that the points are created with a spatial reference (unless you're using this constructor, which initializes a point with WGS84). The Map will have a spatial reference derived from the layers that are present when the map is loaded/first displayed; typically that's the spatial reference of the basemap. Based on that spatial reference, the SDK will project geometries as needed, but those geometries also need spatial references of their own.
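The underlying idea is that a coordinate pair alone is ambiguous: without knowing its source spatial reference, no system can reproject it. A generic sketch (the types and well-known IDs here are illustrative, not SDK API; 4326 is the standard WKID for WGS84):

```java
public class SrCheck {
    // A point that may or may not carry a spatial reference (WKID).
    record Point(double x, double y, Integer wkid) {}

    // Projection to a target WKID is only meaningful if we know the
    // source spatial reference.
    static boolean canProject(Point p, int targetWkid) {
        return p.wkid() != null;
    }

    public static void main(String[] args) {
        Point noSr = new Point(-117.19, 34.05, null);
        Point wgs84 = new Point(-117.19, 34.05, 4326); // 4326 = WGS84
        System.out.println(canProject(noSr, 3857));  // SR unknown
        System.out.println(canProject(wgs84, 3857)); // can reproject
    }
}
```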
Posted 11-17-2023 12:38 PM | 1 | 1 | 795

POST
Hi,

Thanks for the question and the detailed info. In general, a map that gets into the hundreds of layers at a time raises some questions about the purpose or composition of the map. Given that a layer is meant to represent a type or category of data rather than a specific piece of data, you can see that 100 layers is a lot for a user to take in. Layers are database tables, not table rows, and you should really aim to consolidate data of one type into one table (i.e. layer). Often when you have that many types of data, they are for visual reference and best consolidated into a single layer (or basemap layer). That can be efficiently packaged up as a vector tile layer and often removes a lot of complexity.

You mentioned a few things that I'd like to ask about:

- "approximately 1800 FeatureLayers from 9 different mmpks, with each mmpk containing between 20 to 1000 layers" - are you opening all the maps within these 9 MMPKs and selectively moving layers to your app's visible map? How large are the MMPKs? Are the data layers in each MMPK the same but covering different geographies, or are they 9 completely different sets of layers?
- "only those [layers] within the current viewing range are added" - do you mean the visible geographic extent, or are you also considering scale range? Layers can be configured with a minScale and maxScale. I'm not sure that would help with performance in this case, but it's worth considering alongside other recommendations as it could simplify your code logic. Also, remember that Runtime will only load the data it needs for the visible extent; as you pan around the map, it will load more as needed. Unless you have other operational reasons to segregate your data by geography, you don't need to break it up that way for Runtime (in fact it can often complicate map content), though it should be an acceptable approach as long as you're removing layers you no longer display.

Some things to consider:

- When a layer or map is not being displayed, make sure you're not holding on to references to it.
- Consider the geometry types of these feature layers. Point layers render on the GPU in dynamic mode by default, and polyline/polygon layers render on the CPU in static mode. When adjacent layers in the layer stack all render in static mode, that static rendering is consolidated efficiently. However, if you have static and dynamic layers intermingled, each static layer is rendered independently, which is much more computationally expensive. You can explore a couple of things here:
  - A good general rule of thumb is to order your layers by type: polygons on the bottom, then polylines, then points. That aligns nicely with removing interleaving of static and dynamic layers.
  - You can control the rendering mode of feature layers (though there are some cases, like heat map renderers, where you can't), so if you do need to retain some interleaving of polyline or polygon layers between point layers, you can explicitly set the rendering mode on those polyline/polygon layers to dynamic to see if that helps. In a lot of situations, dynamic mode works well on polyline and polygon layers.
- If you have multiple layers showing the same kinds of data with the same symbology/renderers, that can be wasteful. It's much more efficient if that data is merged into one layer; at that point Runtime can render all that data with one renderer instance and do a lot more resource sharing within the rendering pipeline. If you are able to join multiple layers showing the same kind of data into one layer, you can use displayFilters and/or definitionExpressions to limit the features that are displayed at any given time.
- If you have picture marker symbols based on bitmaps/images, how large are those bitmaps? Each bitmap must be kept in GPU memory, and possibly also in CPU memory as part of the renderer definition. If multiple layers use the same bitmap, that is wasteful (see the previous bullet - if you can consolidate like layers into one, that will probably benefit you a lot). Likewise, if a bitmap is higher resolution than is needed to display the symbol, that can also lead to waste.
- Consider the renderers defined on each layer. When a layer is loaded and displayed, its renderer is created and lives in memory. For example, do you have multiple unique value renderers that use large bitmaps? It's not that unique value renderers themselves are expensive (in fact, they are a very efficient way to render, given that we can do a lot of resource sharing on the GPU), but if you have high-resolution bitmaps in use, a UVR is a quick way to multiply that problem.
- Also think about the spatial reference of the feature layer data in the MMPKs. If you're using a Web Mercator basemap (as all our standard basemaps are) but the data is in another spatial reference, it must be projected on the fly. With this much data, that could be impacting performance.

You can see more discussion on this and other performance considerations here: https://developers.arcgis.com/ios/programming-patterns/performance-considerations/

To answer your questions:

- Could the sheer number of layers within each mmpk be the cause of these performance issues? Yes.
- Is merging layers a potential solution? Definitely. Also consider whether any layers can be consolidated into a single vector tile layer. Remember that you can't interrogate vector tile layers to read attributes, though - they're purely for visual reference.
- What is the recommended number of layers within a single mmpk? We can't really give a hard number. It depends a lot on how many are visible at any one time given scale ranges, what type each layer is, and the complexity/density of the data. But from experience I would say that if you have over 150 layers, you should really be looking at how you're authoring your map. If layers have complex renderers or complex data, you could start seeing the impact sooner. Personally, I start to ask myself questions if my map has over 50 layers - not from a performance perspective, but from a "just what is my map supposed to be doing" perspective. Sometimes the answer is "yeah, it does need all those", but often it's a chance to simplify. It's also a bit different here since (if I understand correctly) you're not displaying all the layers at once and are consolidating things from multiple MMPKs into one Map. It's what ends up in that map that is probably more important.
- What is the recommended size for each layer in terms of square kilometers? There isn't a limit. Layers can be global, and I don't think the geographic extent of your data is a concern here. The spatial indexing on the data ensures that it is read from the MMPK's internal data store very efficiently by geography. However, make sure that your layers have suitable minScale and maxScale values defined so that data isn't displayed at inappropriate zoom levels. Also note that when accessing an ArcGIS service, data is generalized for display by the server when it's requested, but the data in an MMPK is raw, and no generalization happens until it has been read from the MMPK and passed to the GPU for rendering. Vector tile layers can help with this. Also, if the data needs to remain as feature layers and you need to see complicated polylines and polygons across a large range of scales, consider including a generalized copy of that data in the MMPK as a separate layer. Use scale ranges on the layers so that when zoomed out you view the generalized data and when zoomed in you view the original, denser data. You'll take a hit on the size of the MMPK, but you'll buy yourself a better experience viewing the data. It's not typically necessary, but worth bearing in mind.

Hopefully the above ideas will help you reduce the number of layers you're working with, which is almost certainly the primary thing you should focus on. But I am guessing a lot about what you might be doing and how your data is authored. If you need more help, then (depending on where you are globally) I would probably recommend a Professional Services engagement to dig deeper into your data workflows and use cases. Or if you have an Esri account manager or local distributor that you're working with, perhaps reach out to them and see if they have a preferred approach to getting some one-on-one help.
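The "order layers by geometry type" rule of thumb (polygons at the bottom, then polylines, then points on top) can be sketched as a stable sort over the layer stack. This is generic illustration, not SDK API; strings stand in for layers:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Map;

public class LayerOrdering {
    // Draw-order rank per geometry type: lower ranks sit at the bottom.
    static final Map<String, Integer> RANK =
            Map.of("polygon", 0, "polyline", 1, "point", 2);

    public static void main(String[] args) {
        // name:geometryType pairs in an arbitrary authored order
        List<String> layers = new ArrayList<>(List.of(
                "hydrants:point", "parcels:polygon",
                "poles:point", "mains:polyline"));

        // List.sort is stable, so relative order within each geometry
        // type is preserved while types are grouped together.
        layers.sort(Comparator.comparingInt(
                l -> RANK.get(l.split(":")[1])));
        System.out.println(layers);
    }
}
```

Grouping like this removes the static/dynamic interleaving described above, since the polygon and polyline (static-mode) layers end up adjacent.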
Posted 11-17-2023 09:56 AM | 3 | 2 | 1305
Title | Kudos | Posted
---|---|---
 | 1 | 12-09-2024 10:16 AM
 | 2 | 12-11-2024 11:12 AM
 | 3 | 11-25-2024 07:29 AM
 | 3 | 11-25-2024 07:36 AM
 | 4 | 11-25-2024 07:10 AM