We've got a Xamarin.Forms application that, for the most part, runs and performs pretty well. We started to see some performance degradation in the map with a new customer's data, so we decided to do some hard-core performance testing on it to try to find the breaking point. The last test I performed loaded around 2,000 feature layers from a SQLite geodatabase into the map. Map performance was horrible, and the app was consistently crashing with unhandled exceptions. Interestingly enough, all but a handful of those layers were turned off, so I figured things would be OK since 99% of the data was not being rendered. It seems that the presence of all that data, whether it's visible or not, has a negative impact on the map doing its job efficiently. And performance started degrading long before I got to 2,000 layers.
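For context, the loading side of the test looks roughly like this. This is a minimal sketch assuming ArcGIS Runtime for .NET; the method name, path, and visibility cutoff are illustrative, not our actual code:

```csharp
using System.Threading.Tasks;
using Esri.ArcGISRuntime.Data;
using Esri.ArcGISRuntime.Mapping;

public static class StressTest
{
    // Hypothetical helper: open a mobile geodatabase and add every feature
    // table as a layer, leaving all but the first few invisible.
    public static async Task LoadAllLayersAsync(Map map, string geodatabasePath)
    {
        Geodatabase geodatabase = await Geodatabase.OpenAsync(geodatabasePath);

        int index = 0;
        foreach (GeodatabaseFeatureTable table in geodatabase.GeodatabaseFeatureTables)
        {
            var layer = new FeatureLayer(table)
            {
                // Only a handful of layers are actually rendered; the rest
                // are hidden, yet their mere presence in OperationalLayers
                // still seems to drag map performance down.
                IsVisible = index < 5
            };
            map.OperationalLayers.Add(layer);
            index++;
        }
    }
}
```

Nothing fancy going on there, so I don't think it's our loading code itself that's the problem.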
I guess this is not unexpected behavior, but I'm interested to hear from the Esri folks on this. Have you done any performance testing on the map control, and can you let us know what kinds of limits you've found? Obviously the volume of data within each layer comes into play here, in addition to the way the layers were set up in the source MXDs and APRXs, but all things being equal, what should we expect?