After purchasing iPad Pros, we discovered that tapping the map with an Apple Pencil to identify features is not viable, at least not with the 3.x API. Finger taps work; Pencil taps do not. Can this be fixed while staying on the 3.x API? The 4.x samples seem to work.
With the Pencil you have to tap very rapidly (5-10 times per second), and eventually one of the taps may register a feature and run the identify task. Redline drawing also misbehaves, though at least consistently: when drawing a straight line, either a double-tap was required to start and stop the line, or the start/end gestures were the opposite of what you'd expect. I don't have a device with me right now to confirm the exact redlining behavior.
Summary of symptoms:
Unlike a finger tap, a single Apple Pencil tap is not detected by the identify task's click event on ArcGIS dynamic map service layers, nor by other click-driven tools such as redline drawing and the measurement tool.
- iPad Pro
- Apple Pencil (model number A1603)
Experimenting with the identify tolerance setting did not help. The problem is in detecting the screen touch itself, not in taking that screen point and selecting a feature. The Pencil is fully charged, and it responds perfectly in other apps and web mapping applications, including Google Maps and Cartegraph OMS, which pulls in feature services displayed over ArcGIS Server basemap services (I'm not sure what application framework Cartegraph uses).
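Since the touch itself seems to go undetected, one thing I plan to try is explicitly allowing Pencil touches on the map view's gesture recognizers. This is only a sketch under an unverified assumption: that the 3.x map view routes taps through standard UIKit gesture recognizers, and that one of them was restricted to direct (finger) touches. The `allowPencilTaps` helper name is mine, not part of any SDK:

```swift
import UIKit

// Hypothetical workaround sketch (untested against the 3.x API):
// UIGestureRecognizer accepts all touch types by default, but if the SDK
// or app code limited a tap recognizer to .direct touches, Pencil taps
// would be silently ignored. This walks the map view's recognizers and
// re-enables Pencil input alongside finger input.
func allowPencilTaps(on mapView: UIView) {
    for recognizer in mapView.gestureRecognizers ?? [] {
        guard recognizer is UITapGestureRecognizer else { continue }
        recognizer.allowedTouchTypes = [
            NSNumber(value: UITouch.TouchType.direct.rawValue), // finger
            NSNumber(value: UITouch.TouchType.pencil.rawValue)  // Apple Pencil
        ]
    }
}
```

If anyone has confirmed whether the 3.x map view exposes its recognizers this way, or whether the filtering happens lower down in its touch handling, that would narrow this down.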