[x3d-public] [x3d] Spec Comment by dougsanden on 19775-1:X3DArchitecture - V3.0

J. Scheurich mufti11 at web.de
Sun Jun 14 20:35:15 PDT 2020


:
>
> Appendix G style mapping
>
> Example trigger mapping for desktop keyboard + mouse
>
> trigger permutation | LMB | RMB | SHIFT | ALT | CTRL
> --------------------+-----+-----+-------+-----+-----
> 1. primary          |  *  |     |       |     |
> 2. secondary        |  *  |     |       |     |  *
> 3. tertiary         |  *  |     |   *   |     |  *
> 4. quaternary       |  *  |     |       |  *  |
> 5. quinary          |  *  |     |   *   |  *  |
> 6. senary           |  *  |     |   *   |     |
> 7. septenary        |     |     |       |     |
> 8. octonary         |     |     |       |     |
> 9. nonary           |     |     |       |     |
> 10. denary          |     |     |       |     |
>
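Read as data, rows 2-6 of the table correspond to the modifier combinations used in the EXPLORE text quoted below. A minimal TypeScript sketch of that mapping (names like TRIGGER_MAP and triggerNumber are illustrative, not from any spec):

    type Modifier = "LMB" | "RMB" | "SHIFT" | "ALT" | "CTRL";

    // Rows 7-10 (septenary..denary) are left unmapped, as in the table.
    const TRIGGER_MAP: ReadonlyArray<ReadonlySet<Modifier>> = [
      new Set<Modifier>(["LMB"]),                  // 1. primary
      new Set<Modifier>(["LMB", "CTRL"]),          // 2. secondary
      new Set<Modifier>(["LMB", "SHIFT", "CTRL"]), // 3. tertiary
      new Set<Modifier>(["LMB", "ALT"]),           // 4. quaternary
      new Set<Modifier>(["LMB", "SHIFT", "ALT"]),  // 5. quinary
      new Set<Modifier>(["LMB", "SHIFT"]),         // 6. senary
    ];

    // 1-based trigger number for the current button state, or
    // undefined if the permutation is unmapped.
    function triggerNumber(pressed: ReadonlySet<Modifier>): number | undefined {
      const i = TRIGGER_MAP.findIndex(
        (m) => m.size === pressed.size && [...m].every((k) => pressed.has(k)),
      );
      return i >= 0 ? i + 1 : undefined;
    }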
> On Sun, Jun 14, 2020 at 11:41 AM GPU Group <gpugroup at gmail.com>
> wrote:
>
>     my attempt to abstract the EXPLORE NavigationInfo type:
>
>     "EXPLORE" navigation is used to provide a consistent keystroke /
>     input-device mapping for navigation in both geospatial and
>     Cartesian modes.
>

Is it wise to define a keyboard mapping, when VR systems usually have
multiple buttons on their input devices, but are awkward to use with a
keyboard?

The Apple problem (the one-button mouse, in the past) could be solved by
keyboard usage...

so long
MUFTI
>
>     Common terms:
>
>     Drag - moving primary pointing device with a trigger activated,
>     primary trigger unless otherwise specified
>
>     Click - activating and releasing a trigger, primary trigger unless
>     otherwise specified
>
>     up/down,left/right - relative to viewport sides
>
>     trigger button state-permutation names:
>
>     1. primary
>
>     2. secondary
>
>     3. tertiary
>
>     4. quaternary
>
>     5. quinary
>
>     6. senary
>
>     7. septenary
>
>     8. octonary
>
>     9. nonary
>
>     10. denary
>
>     When "EXPLORE" mode is active:
>
>      1. Dragging left and right while holding the left button down
>         causes viewpoint rotation about a vertical axis that passes
>         through the point of rotation. This vertical axis is always
>         perpendicular to the viewpoint vector. Motion in the left
>         direction rotates the viewpoint clockwise (as viewed from the
>         top) about the vertical axis. Rotation is tied to the motion
>         of the pointing device; there is no damping or delay.
>      2. Dragging up and down while holding the left button down
>         causes rotation about a horizontal axis that passes through
>         the point of rotation. Motion in the up direction rotates the
>         viewpoint clockwise (as viewed from the right) about the
>         horizontal axis. Rotation is tied to the motion of the
>         pointing device; there is no damping or delay.
>      3. Holding the Ctrl key (or other key that may be
>         user-selectable) down modifies the left-button-down drag
>         movement (the secondary trigger) such that up and down
>         (Y-axis) movement causes the viewpoint to zoom toward and
>         away from the point of rotation. Left and right motion while
>         Ctrl is held down has no effect. Shift and Ctrl (or other
>         keys that may be user-selectable) held at the same time (the
>         tertiary trigger) also enables zoom but disables TouchSensors.
>      4. Holding the Alt key (or other key that may be user-selectable)
>         modifies the movement such that motion of the pointing device
>         while the left button is held down (quaternary trigger
>         movement) is translated into a pan of the viewpoint in a plane
>         passing through the viewpoint perpendicular to the vector
>         pointing to the point of rotation. Shift and Alt (or other
>         keys that may be user-selectable) held at the same time (the
>         quinary trigger) also enables pan but disables TouchSensors.
>      5. The point of rotation can be set by holding the Shift key (or
>         other key that may be user-selectable) while pointing at an
>         object and clicking the left button (the senary trigger). To
>         provide feedback that the point has been selected, the
>         viewpoint shall zoom about twenty percent of the distance
>         toward that point.
>      6. If the pointer is positioned over a TouchSensor, the pointer
>         icon shall change its appearance to indicate that a primary
>         click will activate the TouchSensor.
>      7. Holding the Shift key (or other key that may be
>         user-selectable) (the septenary trigger) overrides any
>         TouchSensor that the pointer may be over and forces the
>         pointing device to function as the viewpoint navigation tool;
>         i.e., drag operations cause rotation, click operations cause
>         center-of-rotation point selection.
>
>     Whether user-selectable alternatives to the Shift, Ctrl, and/or
>     Alt are provided is browser-dependent. If provided, the method by
>     which such alternatives are specified is also browser-dependent.
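A rough sketch of the orbit step described in items 1 and 2 (TypeScript; the sensitivity constant and the world-axis simplification are assumptions, not spec values):

    interface Vec3 { x: number; y: number; z: number; }

    const RADIANS_PER_PIXEL = 0.005; // assumed sensitivity, browser-dependent

    // Orbit `eye` about `center`: drag deltas become yaw about the
    // vertical (Y) axis, then pitch about a horizontal axis (world X
    // here, for brevity; a real browser would use the camera's right
    // axis). No damping or delay, per the text above.
    function orbit(eye: Vec3, center: Vec3, dxPx: number, dyPx: number): Vec3 {
      const yaw = -dxPx * RADIANS_PER_PIXEL;   // left drag: clockwise from top
      const pitch = -dyPx * RADIANS_PER_PIXEL; // up drag: clockwise from right
      let vx = eye.x - center.x;
      let vy = eye.y - center.y;
      let vz = eye.z - center.z;
      const cy = Math.cos(yaw), sy = Math.sin(yaw);     // yaw about Y
      const x1 = cy * vx + sy * vz;
      const z1 = -sy * vx + cy * vz;
      vx = x1; vz = z1;
      const cp = Math.cos(pitch), sp = Math.sin(pitch); // pitch about X
      const y1 = cp * vy - sp * vz;
      const z2 = sp * vy + cp * vz;
      vy = y1; vz = z2;
      return { x: center.x + vx, y: center.y + vy, z: center.z + vz };
    }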
>
>
>     On Sun, Jun 14, 2020 at 9:45 AM GPU Group <gpugroup at gmail.com>
>     wrote:
>
>         Yes, I see there's terminology; it looks like it is mapping
>         the keyboard to work as a pointing device.
>         Hypothesis: there could be a way to remap EXPLORE (which is
>         heavy in button / pointing-device-motion specifics):
>         a) write it more generally somehow
>         b) then, as with Appendix G, do some device-specific mappings.
>         Same with the (new in v4) HELICOPTER and GAME motions - could
>         they be written generally, like WALK, FLY and EXAMINE - with
>         very little talk about buttons and pointing devices - and then
>         somehow articulate mappings in Appendix G?
>
>         Here are some device scenarios
>         i) desktop 2 button + wheel mouse, full keyboard with arrow
>         keys, ctrl,shft,alt
>         ii) mobile - gyro, touch screen
>         iii) non-mobile touch screen
>         iv) HMD with gyro and viewport center
>         v) desktop game controller
>
>         EXPLORE could define 'drag / dragging' in such a way as to
>         share it with HELICOPTER, GAME and new ones, if they can't be
>         specified without reference to device motion.
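One way to picture such a shared, device-neutral 'drag' (hypothetical field names, just a sketch):

    // A drag that mouse, touch, thumbstick or gyro-ray input could all
    // produce; navigation modes would consume only this.
    interface AbstractDrag {
      id: number;                       // per-drag ID (see drag-ID notes below)
      trigger: number;                  // 1 = primary, 2 = secondary, ...
      x: number; y: number;             // current position, viewport-normalized
      originX: number; originY: number; // where the trigger activated
      upDrag: boolean;                  // true if moving with trigger released
    }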
>
>
>         On Sun, Jun 14, 2020 at 9:36 AM Joseph D Williams
>         <joedwil at earthlink.net> wrote:
>
>             For example:
>
>             *WALK: forward/backward/left/right*
>
>             *FLY: forward/backward/left/right*
>
>             *EXAMINE: orbit up/down/left/right around center of rotation*
>
>             *with camera pointed at center of rotation*
>
>
>             *From: *Joseph D Williams <joedwil at earthlink.net>
>             *Sent: *Sunday, June 14, 2020 8:31 AM
>             *To: *GPU Group <gpugroup at gmail.com>; X3D Graphics
>             public mailing list <x3d-public at web3d.org>
>             *Subject: *Re: [x3d-public] [x3d] Spec Comment by
>             dougsanden on 19775-1:X3DArchitecture - V3.0
>
>               * Q. And could/should named navigation modes/types be
>
>             How about some more work on:
>
>             Extensible 3D (X3D) Part 1: Architecture and base components
>
>             Annex G Recommended navigation behaviours
>
>             (informative)
>
>             https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/behaviours.html
>
>             There were several reasons the annex was informative at
>             that time, or even attempted at all.
>
>             Maybe more is understood now, and this offers some
>             guidance from some point in time.
>
>             Thanks,
>
>             Joe
>
>             *From: *GPU Group <gpugroup at gmail.com>
>             *Sent: *Sunday, June 14, 2020 6:28 AM
>             *To: *X3D Graphics public mailing list
>             <x3d-public at web3d.org>
>             *Subject: *Re: [x3d-public] [x3d] Spec Comment by
>             dougsanden on 19775-1: X3DArchitecture - V3.0
>
>             Forwarding some comments on other channels
>
>             Q. should specs > NavigationInfo attempt to abstract input
>             / pointing device terminology
>
>             - so gyros, game controllers/pads, touch screens, 3D
>             pointing devices(?), HMD / AR gyro / view-center can all
>             be mapped more generically:
>
>             Instead of mouse XY, it would be a 'primary XY channel' or
>             a 'primary 2D pointing device'
>
>             Q. And could/should named navigation modes/types be
>             specified in terms of the order and transform element
>             being mapped to:
>
>             WALK: yaw and Z are applied to the last yaw-z pose, then
>             pitch-roll is applied
>
>             FREEFLY: yaw, z, pitch, roll are applied in any order
>
>             Perhaps something in a table format?
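Or, as a sketch in code rather than a table (plain additive Euler angles; the Pose shape and function names are invented):

    interface Pose { yaw: number; pitch: number; roll: number; z: number; }

    // WALK: yaw and Z accumulate into the persistent yaw-z pose, then
    // pitch-roll is layered on top without being accumulated.
    function applyWalk(last: Pose, d: Pose): Pose {
      return { yaw: last.yaw + d.yaw, z: last.z + d.z,
               pitch: d.pitch, roll: d.roll };
    }

    // FREEFLY: yaw, z, pitch, roll all accumulate, in any order.
    function applyFreefly(last: Pose, d: Pose): Pose {
      return { yaw: last.yaw + d.yaw, pitch: last.pitch + d.pitch,
               roll: last.roll + d.roll, z: last.z + d.z };
    }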
>
>             Thanks,
>
>             Doug Sanden
>
>             On Sat, Jun 13, 2020 at 12:15 PM GPU Group
>             <gpugroup at gmail.com> wrote:
>
>                 "have your ray loop forget about ID=2"
>
>                 One notable difference between a touch device and a
>                 mouse: a mouse has an up-drag. A touch doesn't.
>
>                 - that makes no difference to
>                 MultiTouch/MultiDragSensor, which only works with
>                 down-drags.
>
>                 Where you see the difference: isOver, and navigation
>                 modes that assume updrag is available - like proposed
>                 GAME mode.
>
>                 And what you do when a button / touch is 'released':
>
>                 - updrag-capable input devices: you likely just
>                 change a button state, and keep drawing the cursor at
>                 the last location
>
>                 - updrag-incapable input devices - you likely
>                 release/forget/recycle the ID number
>
>                 SUMMARY: web3d specs may need more terms to describe
>                 input device classes and capabilities more abstractly
>
>                 - up-drags
>
>                 - drag-ID
>
>                 - drag-ID recycling
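A sketch of how that release handling could look for the two device classes (TypeScript; the DragState shape and onRelease name are assumptions):

    interface DragState { id: number; x: number; y: number; buttonDown: boolean; }

    const activeDrags = new Map<number, DragState>();

    function onRelease(id: number, upDragCapable: boolean): void {
      const drag = activeDrags.get(id);
      if (!drag) return;
      if (upDragCapable) {
        // mouse-like: just change a button state and keep drawing the
        // cursor at the last location
        drag.buttonDown = false;
      } else {
        // touch-like: release/forget/recycle the ID number
        activeDrags.delete(id);
      }
    }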
>
>                 On Sat, Jun 13, 2020 at 11:59 AM GPU Group
>                 <gpugroup at gmail.com> wrote:
>
>                     Drags need an ID, i.e. 1, 2, 3, and that comes
>                     normally in Windows 7 desktop WM_TOUCH events; or,
>                     more precisely, you can get an index number into a
>                     lookup table of touches.
>
>                     A mouse-friendly use of a MultiDragSensor:
>
>                     - your regular pointing device ray might be ID=1
>
>                     - to create a second ray, you can park/freeze ID=1
>                     with some mouse or keyboard button, e.g. MMB, and
>                     push a 2nd one with ID=2 onto a stack
>
>                     - then you would move drag ID=2 until done with
>                     scaling and rotation, and unfreeze / pop to get
>                     back to dragging ID=1 - maybe with the same button
>                     - and have your ray loop forget about ID=2 after
>                     that, until the user repeats the cycle. Or
>                     something like that.
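That park/freeze cycle might look something like this sketch (the two-entry stack and the toggle binding are assumptions):

    interface Ray { id: number; frozen: boolean; }

    const rayStack: Ray[] = [{ id: 1, frozen: false }];

    // Toggle bound to, e.g., MMB: first press parks ID=1 and pushes
    // ID=2; second press pops ID=2 (recycling it for the next cycle)
    // and unfreezes ID=1.
    function onFreezeButton(): void {
      if (rayStack.length === 1) {
        rayStack[0].frozen = true;               // park/freeze ID=1
        rayStack.push({ id: 2, frozen: false }); // drag ID=2 for scale/rotate
      } else {
        rayStack.pop();                          // forget about ID=2
        rayStack[0].frozen = false;              // back to dragging ID=1
      }
    }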
>
>                     SUMMARY: yes - it can be abstracted from touch
>                     devices, but you still need a per-drag ID.
>
>                     -Doug
>
>                     On Sat, Jun 13, 2020 at 11:18 AM Spec Feedback
>                     <spec-comment at web3d.org> wrote:
>
>             -- Submitter indicates that this comment may be public:
>             *Yes* --
>
>             Comment on 19775-1: X3D Architecture - V3.0
>             MultiTouchSensor
>
>
>             -----------------
>             MultiTouchSensor > Touch vs Drag > MultiDragSensor
>             In theory it should be called something that abstracts it
>             from the particular type of input device.
>             Drag is more input-device neutral - there could be other
>             device-neutral terms.
>             For example, a typical game controller has 2 thumb sticks
>             that could act like 2 touches.
>             A combination of a device gyro - in an HMD, mobile phone
>             or Wii controller - and a freeze-button could shoot one
>             ray, freeze it as one touch, then shoot another ray to
>             drag.
>             Internally, code works with a plane-sensor XY origin when
>             a button goes down / a ray is shot; then a drag XY is
>             compared to the origin XY to see how far and in what
>             direction. Internally it's drag-oriented and doesn't care
>             what device shot the rays.
>             -----------------
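A sketch of that device-neutral internal view (hypothetical names; a real sensor would also track which trigger and drag ID is involved):

    interface Point2 { x: number; y: number; }

    let origin: Point2 | undefined;

    // Record the sensor-plane origin when a button goes down / a ray
    // is shot.
    function onRayDown(hit: Point2): void {
      origin = hit;
    }

    // Compare the drag XY to the origin XY: how far, in what direction
    // - regardless of which device shot the rays.
    function onRayDrag(hit: Point2): Point2 | undefined {
      if (!origin) return undefined;
      return { x: hit.x - origin.x, y: hit.y - origin.y };
    }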
>
>             Submitted on Saturday, 2020,  June 13 - 11:18am
>             by dougsanden (dougsanden )
>             IP: 75.159.18.239
>
>             See: https://www.web3d.org/node/1694/submission/4039
>
>
>             _______________________________________________
>             x3d mailing list
>             x3d at web3d.org
>             http://web3d.org/mailman/listinfo/x3d_web3d.org
>
>
> _______________________________________________
> x3d-public mailing list
> x3d-public at web3d.org
> http://web3d.org/mailman/listinfo/x3d-public_web3d.org



