[x3d-public] v4 proposed MultitouchSensor

Don Brutzman brutzman at nps.edu
Sun Jun 7 22:35:21 PDT 2020


Wow!!  Very impressive, Doug.  Have added links to

* Mantis 1293: HypersurfaceSensor/MultiTouchSensor for multi-touch environments
   https://www.web3d.org/member-only/mantis/view.php?id=1293

It is quite interesting how you appear to use a single mouse to emulate a multitouch device, also providing visual feedback.  Operation looks very intuitive.  Am also wondering if we can write this up in spec prose as an alternate approach to a multi-touch system as part of X3D4:

* X3D Architecture, Annex G, Recommended navigation behaviours
   https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/behaviours.html

Presumably it is OK to post your video to YouTube and Twitter later this week; please advise if not.


On 6/7/2020 3:49 PM, GPU Group wrote:
> I found bugs in my first implementation and spent another 4.5 days working through them;
> Results:
> http://dug9.users.sourceforge.net/web3d/tests/sensors/DragCascade_MultitouchSensorII.mp4
> 
> Method:
> MultitouchSensor had output fields for rotation_changed, scale_changed, translation_changed
> When computing the translation from 2 points you can end up with something like how Transform nodes are composed - with a center point C about which you rotate and scale:
> T x C x R x S x (-C)
> So to get a summary translation T' with no center, you need to solve:
> T' x R x S = T x C x R x S x (-C)
> I wasn't doing that properly at first. I found there are 2 ways (a sketch of the 2-drag case follows the two options below):
> 1) matrix decompose - you can chain transforms together, then decompose the final transform
> 
> https://webdocs.cs.ualberta.ca/~graphics/books/GraphicsGems/gemsiv/polar_decomp/
> 
> - GraphicsGems IV matrix decomposer
> 
> 2) least squares similarity 2D solver - which solves for 4 params: rotation, scale, xy translation
> 
> https://sourceforge.net/p/freewrl/git/ci/develop/tree/freex3d/src/lib/input/SensInterps.c
> 
> - lines 1725 - 2063 are marked MIT Lic or equivalent, and have the matrix solver and 2D similarity solver
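> 
> For the 2-drag case the similarity solve is exact; in simplified form (not the literal freewrl code - names here are illustrative) it looks roughly like this:
> 
> #include <math.h>
> 
> /* Exact 2D similarity (rotation, uniform scale, translation) mapping the two
>  * touch-down points (p1,p2) onto their current drag positions (q1,q2).
>  * With 3+ points the least-squares solver linked above does the same job
>  * in a best-fit sense. */
> typedef struct { double x, y; } vec2;
> 
> static void similarity_from_two_drags(vec2 p1, vec2 p2, vec2 q1, vec2 q2,
>                                       double *scale, double *rotation, vec2 *translation)
> {
>     /* treat points as complex numbers: the map is q = a*p + t with a = (q2-q1)/(p2-p1) */
>     double dpx = p2.x - p1.x, dpy = p2.y - p1.y;
>     double dqx = q2.x - q1.x, dqy = q2.y - q1.y;
>     double denom = dpx*dpx + dpy*dpy;            /* |p2-p1|^2, assumed nonzero */
>     double ax = (dqx*dpx + dqy*dpy) / denom;     /* Re(a) */
>     double ay = (dqy*dpx - dqx*dpy) / denom;     /* Im(a) */
> 
>     *scale    = sqrt(ax*ax + ay*ay);
>     *rotation = atan2(ay, ax);
>     translation->x = q1.x - (ax*p1.x - ay*p1.y); /* t = q1 - a*p1 */
>     translation->y = q1.y - (ay*p1.x + ax*p1.y);
> }
> 
> Note that the translation it returns is already the summary translation T' above - there is no separate center left to eliminate.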
> 
> I used a bit of both - least squares for computing the transform from 2 drags, and the matrix decomposer for combining with the previous offsets (offset translation, rotationOffset, scaleOffset):
> 
> Tout = Tcomputed_from_current_drags x Toffsets
> 
> - then decomposing Tout to set the translation_changed, rotation_changed, scale_changed fields.
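> 
> For the 2D case that compose-and-decompose step is simple if you keep the similarity in (scale, rotation, translation) form - again simplified, not the actual freewrl code:
> 
> #include <math.h>
> 
> /* Compose the similarity from the current drags (applied last) with the stored
>  * offsets, reading the result straight back into the three output fields.
>  * The general path instead multiplies 4x4 matrices and runs the GraphicsGems IV
>  * decomposer on Tout. */
> static void compose_and_emit(double drag_s, double drag_r, double drag_tx, double drag_ty,
>                              double off_s,  double off_r,  double off_tx,  double off_ty,
>                              double *scale_changed, double *rotation_changed,
>                              double *tx_changed, double *ty_changed)
> {
>     double c = cos(drag_r), s = sin(drag_r);
>     *scale_changed    = drag_s * off_s;   /* scales multiply  */
>     *rotation_changed = drag_r + off_r;   /* 2D rotations add */
>     /* translation of Tout = Tdrag x Toffsets: push the offset translation
>        through the drag's rotation and scale, then add the drag's translation */
>     *tx_changed = drag_s * (c*off_tx - s*off_ty) + drag_tx;
>     *ty_changed = drag_s * (s*off_tx + c*off_ty) + drag_ty;
> }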
> 
> 
> A few days of my time were spent trying to diagnose freewrl-specific issues. Primarily: we 'freeze' the transform to the sensor while a button is down. For 2 buttons, what happens if you add a 3rd button, or remove one of the 2 buttons? Having a way to update the transforms reliably took some time, and one thing that helped was rendering the touch/drag points in sensor space - by making the sensor a renderable node and rendering the drag points - as well as in screen / viewer space. Weird motions often made sense relative to the touch-down points in sensor space.
> 
> 
> Having a built-in emulator for multitouch helped a bit - didn't need to fiddle with other devices, just the mouse.
> 
> 
> Good luck - hope other x3d browsers are easier.
> 
> -Doug
> 
> 
> 
>     My experience implementing (v4 proposed) MultitouchSensor in freewrl:
> 
>     Results:
> 
>     http://dug9.users.sourceforge.net/web3d/tests/sensors/DragCascade_MultitouchSensor.mp4
> 
>     http://dug9.users.sourceforge.net/web3d/tests/sensors/DragCascade_MultitouchSensor.x3d
> 
> 
>     Method:
>     freewrl has had a multitouch emulator built in for several years.
>     - a command line option turns it on, then it uses the mouse differently:
>     -- an RMB click creates a new drag - with a touchID
>     -- an LMB click near an existing drag will grab it, and an LMB drag will drag it
>     -- another RMB click on an existing drag will delete it
>     - and then wherever we are passing around mouse coordinates and button status, we pass an additional touchID.
>     -- I verified the touch emulator approach with another emulator that puts touches onto the Windows desktop from a 2nd computer, and inhales the touches via Windows events.
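> 
>     In rough form the emulator's mouse handling boils down to something like this (a simplified sketch, not the literal freewrl code; names such as EmulatedTouch are illustrative):
> 
>     /* one emulated touch, created and moved with the mouse as described above */
>     typedef struct { int touchID; int active; float x, y; } EmulatedTouch;
> 
>     #define NTOUCH 10
>     static EmulatedTouch touches[NTOUCH];
>     static int next_touchID = 1;
>     static int grabbed = -1;              /* index of the touch currently held by LMB */
> 
>     static int nearest_touch(float x, float y)
>     {
>         int i, best = -1;
>         float bestd = 25.0f * 25.0f;      /* 25-pixel grab radius, squared */
>         for (i = 0; i < NTOUCH; i++) {
>             float dx, dy;
>             if (!touches[i].active) continue;
>             dx = touches[i].x - x; dy = touches[i].y - y;
>             if (dx*dx + dy*dy < bestd) { bestd = dx*dx + dy*dy; best = i; }
>         }
>         return best;
>     }
> 
>     /* RMB down: delete the touch under the cursor, or create a new one */
>     static void emulate_rmb_down(float x, float y)
>     {
>         int i = nearest_touch(x, y);
>         if (i >= 0) { touches[i].active = 0; if (i == grabbed) grabbed = -1; return; }
>         for (i = 0; i < NTOUCH; i++)
>             if (!touches[i].active) {
>                 touches[i].active = 1; touches[i].touchID = next_touchID++;
>                 touches[i].x = x; touches[i].y = y;
>                 return;
>             }
>     }
> 
>     /* LMB down grabs the nearest touch; LMB drag moves it.  From here the
>        (touchID, x, y, button) tuple is passed along wherever plain mouse
>        coordinates used to go. */
>     static void emulate_lmb_down(float x, float y) { grabbed = nearest_touch(x, y); }
>     static void emulate_lmb_drag(float x, float y)
>     {
>         if (grabbed >= 0 && touches[grabbed].active) { touches[grabbed].x = x; touches[grabbed].y = y; }
>     }
> 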
>     But all we did was drag 2 separate dragsensors with 2 separate touches.
> 
>     This new spec proposal adds a node that will take 2+ touches and do something interesting - output a rotation and scale, along with the usual PlaneSensor-type translation.
>     So I started by copying the PlaneSensor code and hacking it to count active drags: when it has only one, it acts like a PlaneSensor, and when it has 2+ it uses the first 2 to compute rotation and scale.
> 
>     Time:
>     Took 2.5 days - fiddly, hard-to-understand code in freewrl. Hope other browsers are more organized.
> 
>     And of the 2.5 days I spent 1/2 day reviewing a least-squares option which I didn't use, but which could be used with 3+ points to compute a best-fit affine (6 parameters: 2 scales, 1 shear, 1 rotation, and 2 translations in x and y).
> 
>     -Doug

all the best, Don
-- 
Don Brutzman  Naval Postgraduate School, Code USW/Br       brutzman at nps.edu
Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman


