Hi all,
I am upgrading a WPF application to a Runtime .NET application. In the WPF application, I had a requirement to be able to move a point by following the mouse, or a finger on a tablet. That worked very well; I even displayed a magnifier while doing it.
This feature is broken in the new runtime and I am trying to get similar behavior back. Since the method Editor.EditGeometryAsync doesn't support Point geometry, I can't reproduce this behavior. Is there any way to have the point follow the user's finger on a tablet?
The users love the way the old editor worked, and don't really understand why the latest version gives them behavior they see as less user friendly, so any help with this is much appreciated.
Best regards,
Fabien
Thanks to @Morten Nielson,
I got the answers I needed and came up with a custom solution inspired by the "Move Points" sample. To confirm: the editor does not support the behavior I was trying to get, hence some custom code. The workflow is the following:
1. On touch down, I set the IsDragEnabled option of my MapView.InteractionOptions to false. (Only after a successful hit test on my layer)
2. On touch move, I move the point selected by the hit test.
3. On touch up, I set the IsDragEnabled option of my MapView.InteractionOptions to true.
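For anyone wanting to try this, here is a minimal sketch of the three steps above in C#. It assumes a WPF MapView named MyMapView wired to the standard WPF touch events, and a GraphicsOverlay field named _pointsOverlay holding the point graphics; those names are placeholders, and the exact InteractionOptions property may differ between SDK versions, so treat this as a starting point rather than drop-in code.

```csharp
// Assumed fields: MyMapView (Esri.ArcGISRuntime.UI.Controls.MapView) and
// _pointsOverlay (Esri.ArcGISRuntime.UI.GraphicsOverlay) are placeholders.
private Graphic _draggedGraphic;

private async void MyMapView_TouchDown(object sender, TouchEventArgs e)
{
    var screenPoint = e.GetTouchPoint(MyMapView).Position;

    // Hit test the overlay; only start a drag if a graphic is under the finger.
    var result = await MyMapView.IdentifyGraphicsOverlayAsync(
        _pointsOverlay, screenPoint, tolerance: 10, returnPopupsOnly: false);
    if (result.Graphics.Count == 0)
        return;

    _draggedGraphic = result.Graphics[0];

    // Step 1: stop the map from panning while the finger moves.
    MyMapView.InteractionOptions.IsDragEnabled = false;
}

private void MyMapView_TouchMove(object sender, TouchEventArgs e)
{
    if (_draggedGraphic == null)
        return;

    // Step 2: move the hit-tested point to follow the finger.
    var screenPoint = e.GetTouchPoint(MyMapView).Position;
    _draggedGraphic.Geometry = MyMapView.ScreenToLocation(screenPoint);
}

private void MyMapView_TouchUp(object sender, TouchEventArgs e)
{
    // Step 3: restore normal map panning.
    MyMapView.InteractionOptions.IsDragEnabled = true;
    _draggedGraphic = null;
}
```

One design note: clearing _draggedGraphic on touch up means a stray TouchMove after release is a no-op, and the hit test on touch down ensures normal panning is untouched when the user isn't starting on a point.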
Hope this helps others.
Fabien