[x3d-public] [x3d] Spec Comment by npolys on 19775-1: HypersurfaceSensor for multi-touch environments [corrected]

Don Brutzman brutzman at nps.edu
Sat Apr 11 17:20:22 PDT 2020


[correction: added Dr. Oppermann on addressee list]

Nicholas, thanks for posting this suggested addition for X3D4.

On 4/10/2020 8:36 AM, Spec Feedback wrote:
> -- Submitter indicates that this comment may be public: *Yes* --
> 
> Comment on 19775-1: Abstract X3D Definitions - V3.3
> 20.3.1 X3DDragSensorNode
> https://www.web3d.org/documents/specifications/19775-1/V3.3/index.html
> -----------------
> Adapting X3D for multi-touch environments
> 
> HypersurfaceSensor : X3DDragSensorNode {
[...]

Now entered as Mantis issue 1293.

==============================================================
[1] Mantis 1293: HypersurfaceSensor for multi-touch environments
     https://www.web3d.org/member-only/mantis/view.php?id=1293

Description: Consider a new node adapting X3D for multi-touch environments, extending X3DDragSensorNode.

[2] 20.3.1 X3DDragSensorNode
     https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/components/pointingsensor.html#X3DDragSensorNode

HypersurfaceSensor : X3DDragSensorNode {
  ...
  SFVec3f    [in,out] translationOffset          0 0 0
  SFRotation [in,out] rotationOffset             0 0 1 0
  SFVec3f    [in,out] scaleOffset                1 1 1
  SFVec3f    [in,out] minScale                   0.1 0.1 0.1
  SFVec3f    [in,out] maxScale                   10 10 10
  SFVec3f    [out]    translation_changed
  SFRotation [out]    rotation_changed
  SFVec3f    [out]    scale_changed
  MFVec3f    [out]    hitNormalizedCoord_changed
}

[3] Reference:
Yvonne Jung, Jens Keil, Johannes Behr, Sabine Webel, M. Zöllner, Timo Engelke, Harald Wuest, Mario Becker, "Adapting X3D for multi-touch environments," Web3D '08: Proceedings of the 13th International Symposium on 3D Web Technology, August 2008, pp. 27-30. https://doi.org/10.1145/1394209.1394218

ABSTRACT. Multi-touch interaction on tabletop displays is a very active field of today's HCI research. However, most publications still focus on tracking techniques or develop a gesture configuration for a specific application setup. Very few explore generic high-level interfaces for multi-touch applications. In this paper we present a comprehensive hardware and software setup, which includes an X3D-based layer to simplify the application development process.

We present a robust FTIR-based optical tracking system, examine to what extent current sensor and navigation abstractions in the X3D standard are useful, and finally present extensions to the standard that enable designers and other non-programmers to develop multi-touch applications very efficiently.
==============================================================
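
For concreteness, here is a minimal usage sketch in the X3D XML encoding, following the familiar pointing-sensor pattern in which a drag sensor affects sibling geometry under its parent grouping node. The node name and field names are taken from the proposed interface above; HypersurfaceSensor is not part of X3D 3.3, and the routing targets shown (driving a Transform's translation, rotation and scale, analogous to PlaneSensor/SphereSensor usage) are an assumption about the intended semantics rather than settled design.

   <X3D profile='Immersive' version='4.0'>
     <Scene>
       <Group>
         <!-- proposed node from Mantis 1293; field names per the interface above -->
         <HypersurfaceSensor DEF='MultiTouch' minScale='0.5 0.5 0.5' maxScale='4 4 4'/>
         <!-- sibling geometry that the sensor would manipulate -->
         <Transform DEF='Target'>
           <Shape>
             <Box/>
             <Appearance>
               <Material diffuseColor='0.2 0.5 0.8'/>
             </Appearance>
           </Shape>
         </Transform>
       </Group>
       <!-- assumed routing: multi-touch drag, rotate and pinch gestures drive the Transform -->
       <ROUTE fromNode='MultiTouch' fromField='translation_changed' toNode='Target' toField='set_translation'/>
       <ROUTE fromNode='MultiTouch' fromField='rotation_changed'    toNode='Target' toField='set_rotation'/>
       <ROUTE fromNode='MultiTouch' fromField='scale_changed'       toNode='Target' toField='set_scale'/>
     </Scene>
   </X3D>

A Script or prototype might instead consume hitNormalizedCoord_changed directly when an application needs the raw normalized touch points rather than the aggregated translation/rotation/scale events.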

I remember the Fraunhofer IGD demos in 2008; they were compelling. Very fast: essentially, users were unable to select and drag multiple documents on a display faster than the table system could handle!

Also found:

[4] 3D Multi-Touch Environment
     contact Dr. Leif Oppermann leif.oppermann at fit.fraunhofer.de
     https://www.fit.fraunhofer.de/en/fb/cscw/projects/3d-multi-touch.html

[5] igeedee: Multi-Touch 3D Architecture Application, 2 March 2008
     https://www.youtube.com/watch?v=TAanod1F6bI

[6] Fraunhofer IGD: instant3Dhub
     https://www.igd.fraunhofer.de/en/projects/instant3dhub
     https://www.igd.fraunhofer.de/sites/default/files/media/projekte/2018-03-21_innnovationswerkstatt_banner_1440x448px.jpg

Next steps for X3D4:
- current thinking: define functionality and semantics of multi-touch interaction,
- discussion,
- example specification prose,
- evaluate example scene and example implementation,
- refine specification, post examples to archive, multiple implementations,
- enjoy!

all the best, Don
-- 
Don Brutzman  Naval Postgraduate School, Code USW/Br       brutzman at nps.edu
Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman


