I am upgrading a WPF application to a Runtime .NET application. In the WPF application, I had a requirement to move a point by following the mouse, or a finger on a tablet. That worked very well; I even displayed a magnifier while the point was being dragged.
This functionality is broken in the new Runtime version, and I am trying to get similar behavior back. Since the Editor.EditGeometryAsync method doesn't support point geometries, I can't reproduce it. Is there any way to have the point follow the user's finger on a tablet?
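For reference, this is roughly the kind of drag-to-move behavior I am after, sketched with plain WPF pointer events on the MapView instead of the Editor. This is untested and the identifiers (`MyMapView`, `_pointGraphic`) are illustrative; I am assuming `MapView.ScreenToLocation` is the right call for converting a screen position to a map point:

```csharp
// Sketch: move a point graphic by dragging, bypassing Editor.EditGeometryAsync.
// Assumes a MapView named MyMapView and a point Graphic (_pointGraphic)
// already added to a GraphicsOverlay on that MapView.

private Graphic _pointGraphic;   // the point being moved (hypothetical field)
private bool _dragging;

private void MyMapView_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    _dragging = true;
    MyMapView.CaptureMouse();    // keep receiving move events during the drag
}

private void MyMapView_MouseMove(object sender, MouseEventArgs e)
{
    if (!_dragging) return;

    // Convert the current screen position to a map location
    // and snap the graphic to it.
    var screenPoint = e.GetPosition(MyMapView);
    var mapPoint = MyMapView.ScreenToLocation(screenPoint);
    if (mapPoint != null)
        _pointGraphic.Geometry = mapPoint;
}

private void MyMapView_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    _dragging = false;
    MyMapView.ReleaseMouseCapture();
}
```

My understanding is that WPF promotes unhandled touch input to mouse events, so a finger drag on a tablet should drive the same handlers, but I'd welcome a more idiomatic Runtime approach (and a way to bring back the magnifier).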
The users love the way the old editor worked, and don't really understand why the latest version gives them a behavior they see as less user-friendly, so any help with this is much appreciated.