[x3d-public] [x3d] Spec Comment by dougsanden on 19775-1: X3DArchitecture - V3.0

GPU Group gpugroup at gmail.com
Sun Jun 14 13:23:18 PDT 2020


Yes Joe - great thoughts.
Keyboard: mobile devices 'pop up' a keyboard, or if you pin it, it takes up
valuable screen space. If it's not a prompt, then you have to summon the
keyboard too. Writing a spec so you _have_ to use a keyboard seems like it
would limit us to a desktop scenario, which would have been appropriate in 2006.
Having said that, if a mobile touch-centric, non-keyboard-invoking browser
had a menu bar, it could put 3 toggle buttons for SHIFT, ALT, CTRL and
then be pretty close to the EXPLORE specs.
But what about a game controller? It has buttons that would be
handier than clicking on a menu bar.
Something similar applies to HMDs: you might have wands in your hands, and
they may have buttons on them, so you would want to map to those
triggers somehow.
So I'm still in favor of generalizing input device mappings.
Having said that, there's the concept of a 'virtual device' which is
all-encompassing: specs would be written w.r.t. the virtual device, and
then deployments would map to it. That's common with games
- a virtual game controller - and if you have something else, its 2D,
scalar, and button inputs get mapped to the virtual device, so games can be
written w.r.t. the virtual device.
But what would a good all-encompassing virtual device look like? Something
that covers multitouch and two-thumb-stick game controllers; what about HMDs,
what about future stuff - will we have multi-user on the same machine, i.e. 2
game controllers?
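Here's a minimal sketch, in C, of what one such all-encompassing virtual
device might look like - every name and field below is my own illustration,
not anything from the spec:

#include <stdbool.h>

#define VDEV_MAX_POINTERS 4    /* multitouch points, thumb sticks, wands */
#define VDEV_MAX_TRIGGERS 10   /* primary .. denary trigger permutations */

typedef struct {
    bool  active;              /* is a down-drag in progress? */
    float x, y;                /* normalized viewport coordinates */
} VPointer;

typedef struct {
    VPointer pointers[VDEV_MAX_POINTERS];  /* pointers[0] = primary */
    bool     triggers[VDEV_MAX_TRIGGERS];  /* trigger permutation states */
    float    yaw, pitch, roll;             /* gyro / HMD orientation */
} VirtualDevice;

/* Deployments map physical hardware onto this struct: a mouse fills
   pointers[0] and a few triggers; a game controller fills pointers[0..1]
   from its two thumb sticks; a second controller could fill a second
   VirtualDevice for multi-user on the same machine. */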
-Doug



On Sun, Jun 14, 2020 at 1:48 PM Joseph D Williams <joedwil at earthlink.net>
wrote:

> Once, I recall, maybe gravity should be a default part of certain navigations
> and not others. Like how the no-grav of Fly is what you want to do, mostly,
> unless you are flying by aero; then of course there are other effects.
> Suppose you are flying and somebody turns gravity on?
>
>
>
> Also, please look for the current x3d ‘name’ for interactions - like maybe
> hover is isOver - and the use of select rather than click. Some superset of
> x3d and html event inOut styles?
>
> The important thing is that a competent input/output device must be able
> to describe itself as to its allowed data and interactions, and the author
> must be able to describe what is needed to make the show, maybe at various
> capability levels. Finally, the user must be able to tell what can be done
> and make some simple tests to observe the results of using various
> techniques with available tools.
>
>
>
> X3d recommended optional use of the keyboard and provided keys and events
> (not normative).
>
> However, for implementers there was enough behind it that most users in
> 2006 could expect the best browsers would at least support that set of keys.
> Again, the purpose was not to stop or start anything, just to document some
> current best practices that were being done by the most interesting tools.
>
>
>
> This challenge of creating usable abstractions for human user interaction
> using modern human interface arts was last documented by x3d around 2006, I
> think. I hope to see great advancements in documenting what will be
> regarded as World-class best practices for development of an important
> collection of x3d statements and sensor nodes to support the widest range
> of human input/output features. We need to get interaction data into the
> scene, out of the scene, interpreted by the user, then put back into the
> scene, etc.
>
>
>
> Thanks,
>
> Joe
>
>
>
> From: GPU Group <gpugroup at gmail.com>
> Sent: Sunday, June 14, 2020 11:45 AM
> To: Joseph D Williams <joedwil at earthlink.net>
> Cc: X3D Graphics public mailing list <x3d-public at web3d.org>
> Subject: Re: [x3d-public] [x3d] Spec Comment by dougsanden
> on 19775-1: X3DArchitecture - V3.0
>
>
>
> My attempt at specifying HELICOPTER and GAME navigationInfo.type
> generically
>
>
>
> "SPHERICAL" navigation freezes the location while allowing convenient
> yaw-pitch look around.
>
>
>
> "HELICOPTER" navigation is similar to "WALK" except avatar height
> adjustment is convenient, and "SPHERICAL" style yaw-pitch look around is
> convenient.
>
>
>
> "GAME" navigation is like "WALK" except yaw-ptich is done with up-drag
> primary pointing device, and Z travel with secondary pointing device.
>
>
>
>
>
> On Sun, Jun 14, 2020 at 11:57 AM GPU Group <gpugroup at gmail.com> wrote:
>
>
>
> Appendix G style mapping
>
> Example trigger mapping for desktop keyboard + mouse
>
>
>
> trigger permutation    LMB    RMB    SHIFT    ALT    CTRL
>
> 1.  primary             *
> 2.  secondary           *                             *
> 3.  tertiary            *              *              *
> 4.  quaternary          *                      *
> 5.  quinary             *              *       *
> 6.  senary              *              *
> 7.  septenary
> 8.  octonary
> 9.  nonary
> 10. denary
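> A sketch of how a browser might resolve the table, assuming button states
> arrive as a bitmask (the constants and the function are illustrative, not
> from any spec):
>
> enum { BTN_LMB = 1, BTN_RMB = 2, BTN_SHIFT = 4, BTN_ALT = 8, BTN_CTRL = 16 };
>
> /* Returns the trigger ordinal 1..6 from the table above, 0 if unmapped. */
> int trigger_from_buttons(unsigned mask)
> {
>     switch (mask) {
>     case BTN_LMB:                        return 1; /* primary    */
>     case BTN_LMB | BTN_CTRL:             return 2; /* secondary  */
>     case BTN_LMB | BTN_SHIFT | BTN_CTRL: return 3; /* tertiary   */
>     case BTN_LMB | BTN_ALT:              return 4; /* quaternary */
>     case BTN_LMB | BTN_SHIFT | BTN_ALT:  return 5; /* quinary    */
>     case BTN_LMB | BTN_SHIFT:            return 6; /* senary     */
>     default:                             return 0; /* unmapped   */
>     }
> }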
>
> On Sun, Jun 14, 2020 at 11:41 AM GPU Group <gpugroup at gmail.com> wrote:
>
> my attempt to abstract the EXPLORE NavigationInfo type:
>
>
>
> "EXPLORE" navigation is used to provide consistent keystroke *input
> device mapping* navigation for both geospatial and Cartesian modes. Common
> terms:
>
> Drag - moving primary pointing device with a trigger activated, primary
> trigger unless otherwise specified
>
> Click - activating and releasing a trigger, primary trigger unless
> otherwise specified
>
> up/down,left/right - relative to viewport sides
>
> trigger button state-permutation names:
>
> 1. primary
> 2. secondary
> 3. tertiary
> 4. quaternary
> 5. quinary
> 6. senary
> 7. septenary
> 8. octonary
> 9. nonary
> 10. denary
>
> When "EXPLORE" mode is active:
>
>    1. Dragging left and right while holding the left button down causes
>    viewpoint rotation about a vertical axis that passes through the point of
>    rotation. This vertical axis is always perpendicular to the viewpoint
>    vector. Motion in the left direction rotates the viewpoint clockwise (as
>    viewed from the top) about the vertical axis. Rotation is tied to the
>    motion of the pointing device; there is no damping or delay.
>    2. Dragging up and down while holding the left button down causes
>    rotation about a horizontal axis that passes through the point of rotation.
>    Motion in the up direction rotates the viewpoint clockwise (as viewed from
>    the right) about the horizontal axis. Rotation is tied to the motion of the
>    pointing device; there is no damping or delay.
>    3. Holding the Ctrl key (or other key that may be user-selectable) down
>    modifies the left-button-down drag movement (Secondary trigger) such that
>    up and down (Y-axis) movement causes the viewpoint to zoom toward and away
>    from the point of rotation. Left and right motion while Ctrl is held down
>    has no effect. Shift and Ctrl (or other keys that may be user-selectable)
>    held at the same time (Tertiary trigger) also enable zoom but disable
>    TouchSensors.
>    4. Holding the Alt key (or other key that may be user-selectable)
>    modifies the movement such that motion of the pointing device while the
>    left button is held down (Quaternary trigger) is translated into a pan of
>    the viewpoint in a plane passing through the viewpoint perpendicular to
>    the vector pointing to the point of rotation. Shift and Alt (or other keys
>    that may be user-selectable) held at the same time (Quinary trigger) also
>    enable pan but disable TouchSensors.
>    5. The point of rotation can be set by holding the Shift key (or other
>    key that may be user-selectable) while pointing at an object and clicking
>    the left button (Senary trigger). To provide feedback that the point has
>    been selected, the viewpoint shall zoom about twenty percent of the
>    distance toward that point.
>    6. If the pointer is positioned over a TouchSensor, the pointer icon
>    shall change its appearance to indicate that a primary click will
>    activate the TouchSensor.
>    7. Holding the Shift key (or other key that may be user-selectable;
>    Septenary trigger) overrides any TouchSensor that the pointer may be
>    over and forces the pointing device to function as the viewpoint navigation
>    tool; *i.e.*, drag operations cause rotation, click operations cause
>    center of rotation point selection.
>
> Whether user-selectable alternatives to the Shift, Ctrl, and/or Alt are
> provided is browser-dependent. If provided, the method by which such
> alternatives are specified is also browser-dependent.
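> As a rough illustration of items 1 and 2, here is how a browser's inner
> loop might orbit the viewpoint from a primary-trigger drag; the axis and
> sign conventions are mine, not the spec's:
>
> #include <math.h>
>
> typedef struct { float x, y, z; } Vec3;
>
> #define SENSITIVITY 3.14159f   /* radians per full-viewport drag */
>
> /* Item 1: yaw the viewpoint about the vertical axis through the point
>    of rotation, tied directly to the drag - no damping, no delay. */
> Vec3 orbit_yaw(Vec3 eye, Vec3 center, float dx)
> {
>     float a  = -dx * SENSITIVITY;   /* left drag => clockwise from top */
>     float ox = eye.x - center.x, oz = eye.z - center.z;
>     Vec3 out = eye;
>     out.x = center.x + ox * cosf(a) - oz * sinf(a);
>     out.z = center.z + ox * sinf(a) + oz * cosf(a);
>     return out;    /* item 2 (pitch) is the analogous rotation */
> }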
>
>
>
> On Sun, Jun 14, 2020 at 9:45 AM GPU Group <gpugroup at gmail.com> wrote:
>
> Yes, I see - there's terminology there; it looks like it's mapping the
> keyboard to work as a pointing device.
>
> Hypothesis: there could be a way to remap EXPLORE (which is heavy in
> button / pointing-device-motion specifics):
>
> a) write it more generally somehow
>
> b) then, as with Appendix G, do some device-specific mappings.
>
> Same with the (new in v4) HELICOPTER and GAME motions - could they be written
> generally, like WALK, FLY, and EXAMINE, with very little talk about buttons
> and pointing devices, and then somehow articulate mappings in Appendix G?
>
>
>
> Here are some device scenarios
>
> i) desktop 2-button + wheel mouse, full keyboard with arrow keys,
> ctrl, shift, alt
> ii) mobile - gyro, touch screen
> iii) non-mobile touch screen
> iv) HMD with gyro and viewport center
>
> v) desktop game controller
>
>
>
> EXPLORE could define 'drag / dragging' in such a way as to share it with
> HELICOPTER, GAME, and new ones, if they can't be specified motion-lessly.
>
>
>
>
>
> On Sun, Jun 14, 2020 at 9:36 AM Joseph D Williams <joedwil at earthlink.net>
> wrote:
>
> For example:
>
>
>
> WALK:      forward/backward/left/right
> FLY:       forward/backward/left/right
> EXAMINE:   orbit up/down/left/right around center of rotation
>            with camera pointed at center of rotation
>
>
>
>
>
>
>
>
>
>
> From: Joseph D Williams <joedwil at earthlink.net>
> Sent: Sunday, June 14, 2020 8:31 AM
> To: GPU Group <gpugroup at gmail.com>; X3D Graphics public mailing list
> <x3d-public at web3d.org>
> Subject: Re: [x3d-public] [x3d] Spec Comment by dougsanden on
> 19775-1: X3DArchitecture - V3.0
>
>
>
>
>
>    - Q. And could/should named navigation modes/types be
>
>
>
> How about some more work on:
>
> Extensible 3D (X3D) Part 1: Architecture and base components
>
> Annex G Recommended navigation behaviours
>
> (informative)
>
>
>
>
> https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/behaviours.html
>
>
>
> There were several reasons the annex was only informative at that time, or
> even attempted at all.
>
> Maybe more is understood now, and this offers some guidance from some
> point in time.
>
> Thanks,
>
> Joe
>
>
>
>
>
> From: GPU Group <gpugroup at gmail.com>
> Sent: Sunday, June 14, 2020 6:28 AM
> To: X3D Graphics public mailing list <x3d-public at web3d.org>
> Subject: Re: [x3d-public] [x3d] Spec Comment by dougsanden on 19775-1:
> X3DArchitecture - V3.0
>
>
>
> Forwarding some comments on other channels
>
> Q. should specs > NavigationInfo attempt to abstract input / pointing
> device terminology
>
> - so gyros, game controllers/pads, touch screens, 3D pointing devices(?),
> HMD / AR gyro/view-center - can all be mapped more generically:
>
>
>
> Instead of mouse xy, it would be a 'primaryXY channel' or a 'primary
> 2D pointing device'.
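> A tiny sketch of that renaming, with illustrative names only:
>
> typedef enum {
>     CH_PRIMARY_XY,    /* mouse xy, first touch, left thumb stick */
>     CH_SECONDARY_XY,  /* second touch, right thumb stick */
>     CH_ORIENTATION,   /* gyro / HMD view-center */
>     CH_SCALAR         /* wheel, trigger pressure */
> } InputChannel;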
>
>
>
> Q. And could/should named navigation modes/types be specified in terms of
> the order and transform element being mapped to:
>
> WALK: yaw and Z are applied to the last yaw-z pose, then pitch-roll is applied
>
> FREEFLY: yaw, z, pitch, roll are applied in any order
>
> Perhaps something in a table format?
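> Meanwhile, a sketch of the order idea in C - axis conventions and names
> are mine, purely to show how the two compositions differ:
>
> #include <math.h>
>
> typedef struct { float yaw, pitch, x, y, z; } Pose;
>
> /* WALK: yaw and Z travel update the persistent pose first; pitch is
>    view-only, so looking up/down never tilts the travel path. */
> void walk_step(Pose *p, float dyaw, float dz, float dpitch)
> {
>     p->yaw += dyaw;
>     p->x   += dz * sinf(p->yaw);
>     p->z   += dz * cosf(p->yaw);
>     p->pitch += dpitch;          /* applied last, never feeds travel */
> }
>
> /* FREEFLY: pitch joins the pose, so travel follows the view vector. */
> void freefly_step(Pose *p, float dyaw, float dpitch, float dz)
> {
>     p->yaw   += dyaw;
>     p->pitch += dpitch;
>     p->x += dz * sinf(p->yaw) * cosf(p->pitch);
>     p->y += dz * sinf(p->pitch);
>     p->z += dz * cosf(p->yaw) * cosf(p->pitch);
> }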
>
>
>
> Thanks,
>
> Doug Sanden
>
>
>
>
>
> On Sat, Jun 13, 2020 at 12:15 PM GPU Group <gpugroup at gmail.com> wrote:
>
> "have your ray loop forget about ID=2"
>
> One notable difference between a touch device and a mouse: a mouse has an
> up-drag. A touch doesn't.
>
> - that makes no difference to MultiTouch/MultiDragSensor, which only works
> with down-drags.
>
> Where you see the difference: isOver, and navigation modes that assume
> updrag is available - like the proposed GAME mode.
>
> And what you do when a button / touch is 'released':
>
> - updrag-capable input devices: you likely just change a button state
> and keep drawing the cursor at the last location
>
> - updrag-incapable input devices: you likely release/forget/recycle the
> ID number
>
> SUMMARY: web3d specs may need more terms to describe input device classes
> and capabilities more abstractly
>
> - up-drags
>
> - drag-ID
>
> - drag-ID recycling
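> A sketch of how those device classes might be described in code - the
> field and function names here are illustrative:
>
> #include <stdbool.h>
>
> typedef struct {
>     bool has_updrag;     /* mouse: true; touch screen: false */
>     bool recycles_ids;   /* touch IDs are reused after release */
> } DragDeviceCaps;
>
> #define MAX_DRAGS 10
> static bool id_in_use[MAX_DRAGS];
>
> void on_release(DragDeviceCaps caps, int id)   /* 0 <= id < MAX_DRAGS */
> {
>     if (caps.has_updrag) {
>         /* keep drawing the cursor at its last location;
>            only the button state changes */
>     } else {
>         id_in_use[id] = false;   /* release/forget/recycle the ID */
>     }
> }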
>
>
>
>
>
> On Sat, Jun 13, 2020 at 11:59 AM GPU Group <gpugroup at gmail.com> wrote:
>
> Drags need an ID, i.e. 1, 2, 3, and that comes normally in Windows 7
> desktop WM_TOUCH events; more precisely, you can get an index number into a
> lookup table of touches.
>
> A mouse-friendly use of a MultiDragSensor:
>
> - your regular pointing device ray might be ID=1
>
> - to create a second ray, you can park/freeze ID=1 with some mouse or
> keyboard button, i.e. MMB, and push a 2nd one with ID=2 onto a stack
>
> - then you would move drag ID=2 until done with scaling and rotation, and
> unfreeze/pop to get back to dragging ID=1 (maybe the same button), and
> have your ray loop forget about ID=2 after that, until the user repeats the
> cycle. Or something like that.
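> In code, the park-and-push cycle might look like this (a sketch with my
> own names; bounds checks mostly omitted):
>
> #include <stdbool.h>
>
> typedef struct { int id; bool frozen; } Ray;
>
> static Ray rays[4];
> static int top = -1;         /* index of the ray being dragged */
>
> void push_ray(int id)        /* e.g. on MMB: freeze ID=1, start ID=2 */
> {
>     if (top >= 0) rays[top].frozen = true;
>     top++;
>     rays[top].id = id;
>     rays[top].frozen = false;
> }
>
> void pop_ray(void)           /* done scaling/rotating: back to ID=1 */
> {
>     if (top <= 0) return;    /* nothing stacked */
>     top--;                   /* forget ID=2 until the next cycle */
>     rays[top].frozen = false;
> }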
>
> SUMMARY: yes - it can be abstracted from touch devices, but you still need
> a per-drag ID.
>
> -Doug
>
>
>
> On Sat, Jun 13, 2020 at 11:18 AM Spec Feedback <spec-comment at web3d.org>
> wrote:
>
> -- Submitter indicates that this comment may be public: *Yes* --
>
> Comment on 19775-1: X3D Architecture - V3.0
> MultiTouchSensor
>
>
> -----------------
> MultiTouchSensor > Touch vs Drag > MultiDragSensor
> In theory it should be called something that abstracts it from the
> particular type of input device.
> Drag is more input-device-neutral - there could be other device-neutral
> terms.
> For example, a typical game controller has 2 thumb sticks that could act
> like 2 touches.
> A combination of device gyro (in an HMD, mobile phone, or Wii Controller)
> and a freeze-button could shoot one ray, freeze it as one touch, then shoot
> another ray to drag.
> Internally, code works with a plane sensor xy origin when a button goes
> down / a ray is shot; then a drag xy is compared to the origin xy to see
> how far in what direction: internally it's drag-oriented and doesn't care
> what device shot the rays.
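> In code, that drag-oriented core is just a stored origin plus a
> difference (a sketch; the names are mine):
>
> typedef struct { float x, y; } XY;
> typedef struct { XY origin; int id; } Drag;
>
> void ray_down(Drag *d, int id, XY hit)   /* button down / ray shot */
> {
>     d->id = id;
>     d->origin = hit;
> }
>
> XY drag_delta(const Drag *d, XY hit)     /* how far, in what direction */
> {
>     XY delta = { hit.x - d->origin.x, hit.y - d->origin.y };
>     return delta;
> }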
> -----------------
>
> Submitted on Saturday, 2020,  June 13 - 11:18am
> by dougsanden (dougsanden )
> IP: 75.159.18.239
>
> See: https://www.web3d.org/node/1694/submission/4039
>
>
> _______________________________________________
> x3d mailing list
> x3d at web3d.org
> http://web3d.org/mailman/listinfo/x3d_web3d.org
>
>
>
>
>
>
>