[x3d-public] [x3d] Spec Comment by dougsanden on 19775-1:X3DArchitecture - V3.0

GPU Group gpugroup at gmail.com
Sun Jun 14 17:37:17 PDT 2020


EXPLORE III

For navigation we mostly use the pointing device in the graphics window as
a 2D slidebar. Often the slidebar maps to parameters only indirectly
related to the position of the 2D slidebar.



In theory, a number of regular slidebars or 2D slidebars could be provided
separately from the graphics window to control those parameters.



Here's another re-write of EXPLORE that uses the notion of general scalar
and 2D scalar controls, leaving it to browsers to apply meaning to the
slidebar value - perhaps with named tabs on a 2D touch area separate from
the graphics area, or with modifier buttons named for their navigation
meaning.


"EXPLORE" navigation is used to provide consistent mapping of user inputs
to navigation for both geospatial and Cartesian modes. Browsers will
provide scalar and 2D scalar controls with meanings applied during drags.
When "EXPLORE" mode is active:

   1. vertical axis rotation - scalar control - Dragging the scalar control
   causes viewpoint rotation about a vertical axis that passes through the
   point of rotation. This vertical axis is always perpendicular to the
   viewpoint vector. Motion in the -ve direction rotates the viewpoint
   clockwise (as viewed from the top) about the vertical axis. Rotation is
   tied to the motion of the scalar control; there is no damping or delay.
   2. horizontal axis rotation - scalar control - Dragging the scalar
   control causes rotation about a horizontal axis that passes through the
   point of rotation. Motion in the +ve direction rotates the viewpoint
   clockwise (as viewed from the right) about the horizontal axis. Rotation
   is tied to the motion of the scalar control; there is no damping or delay.
   3. zoom - scalar control - Drag movement in the +ve direction causes the
   viewpoint to zoom toward the point of rotation; movement in the -ve
   direction zooms away from it.
   4. zoom with disabled TouchSensors - scalar control - identical to zoom,
   but TouchSensors are disabled during the drag.
   5. pan - 2D control - Motion of the 2D control is translated into a pan
   of the viewpoint in a plane passing through the viewpoint perpendicular
   to the vector pointing to the point of rotation.
   6. pan with disabled TouchSensors - 2D control - identical to pan, but
   TouchSensors are disabled during the drag.
   7. object / point centering - 2D control - The point of rotation can be
   set by casting a ray onto a scene object. To provide feedback that the
   point has been selected, the viewpoint shall zoom about twenty percent
   of the distance toward that point.
   8. isOver ray display - if the ray is positioned over a TouchSensor, the
   ray shall change its appearance to indicate that a click will activate
   the TouchSensor.
   9. navigation-only - 2D control - overrides any TouchSensor and
   functions purely as the viewpoint navigation tool.
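
Below is a minimal sketch, in C, of how a browser might implement items 1
and 3 above, assuming a simple Y-up Cartesian frame. Every name here
(Vec3, explore_vertical_axis_rotate, explore_zoom, the gain and step
values, the sign conventions) is a hypothetical illustration, not spec
API; re-aiming the view orientation at the point of rotation is omitted.

#include <math.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 vsub(Vec3 a, Vec3 b) { return (Vec3){a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 vadd(Vec3 a, Vec3 b) { return (Vec3){a.x + b.x, a.y + b.y, a.z + b.z}; }

/* Item 1: orbit the viewpoint about the vertical (Y) axis through the
   point of rotation. The angle tracks the control directly - no damping
   or delay. */
void explore_vertical_axis_rotate(Vec3 *viewpoint, Vec3 rotation_point,
                                  double control_delta)
{
    const double gain = 1.0;          /* browser-chosen sensitivity */
    double a = control_delta * gain;  /* sign convention is illustrative */
    Vec3 r = vsub(*viewpoint, rotation_point);
    Vec3 rotated = {  cos(a) * r.x + sin(a) * r.z,
                      r.y,
                     -sin(a) * r.x + cos(a) * r.z };
    *viewpoint = vadd(rotation_point, rotated);
}

/* Item 3: +ve control motion zooms toward the point of rotation, -ve away.
   Item 7's selection feedback is the same move with a fixed 0.2 fraction. */
void explore_zoom(Vec3 *viewpoint, Vec3 rotation_point, double control_delta)
{
    const double step = 0.1;  /* assumed fraction of distance per unit input */
    Vec3 toward = vsub(rotation_point, *viewpoint);
    viewpoint->x += toward.x * step * control_delta;
    viewpoint->y += toward.y * step * control_delta;
    viewpoint->z += toward.z * step * control_delta;
}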





On Sun, Jun 14, 2020 at 11:41 AM GPU Group <gpugroup at gmail.com> wrote:

> my attempt to abstract the EXPLORE NavigationInfo type:
>
> "EXPLORE" navigation is used to provide consistent keystroke *input
> device mapping* navigation for both geospatial and Cartesian modes. Common
> terms:
>
> Drag - moving primary pointing device with a trigger activated, primary
> trigger unless otherwise specified
>
> Click - activating and releasing a trigger, primary trigger unless
> otherwise specified
>
> up/down, left/right - relative to viewport sides
>
> trigger button state-permutation names:
>
> 1. primary
> 2. secondary
> 3. tertiary
> 4. quaternary
> 5. quinary
> 6. senary
> 7. septenary
> 8. octonary
> 9. nonary
> 10. denary
>
> When "EXPLORE" mode is active:
>
>    1. Dragging left and right while holding the left button down causes
>    viewpoint rotation about a vertical axis that passes through the point of
>    rotation. This vertical axis is always perpendicular to the viewpoint
>    vector. Motion in the left direction rotates the viewpoint clockwise (as
>    viewed from the top) about the vertical axis. Rotation is tied to the
>    motion of the pointing device; there is no damping or delay.
>    2. Dragging up and down while holding the left button down causes
>    rotation about a horizontal axis that passes through the point of rotation.
>    Motion in the up direction rotates the viewpoint clockwise (as viewed from
>    the right) about the horizontal axis. Rotation is tied to the motion of the
>    pointing device; there is no damping or delay.
>    3. Holding the Ctrl key (or other key that may be user-selectable)
>    down modifies the drag movement (secondary trigger) such that up and
>    down (Y-axis) movement causes the viewpoint to zoom toward and away
>    from the point of rotation. Left and right motion while Ctrl is held
>    down has no effect. Shift and Ctrl (or other keys that may be
>    user-selectable) held at the same time (tertiary trigger) also enable
>    zoom but disable TouchSensors.
>    4. Holding the Alt key (or other key that may be user-selectable)
>    modifies the movement (quaternary trigger) such that motion of the
>    pointing device while the left button is held down is translated into
>    a pan of the viewpoint in a plane passing through the viewpoint
>    perpendicular to the vector pointing to the point of rotation. Shift
>    and Alt (or other keys that may be user-selectable) held at the same
>    time (quinary trigger) also enable pan but disable TouchSensors.
>    5. The point of rotation can be set by holding the Shift key (or other
>    key that may be user-selectable) while pointing at an object and
>    clicking the left button (senary trigger). To provide feedback that
>    the point has been selected, the viewpoint shall zoom about twenty
>    percent of the distance toward that point.
>    6. If the pointer is positioned over a TouchSensor, the pointer icon
>    shall change its appearance to indicate that a primary (left) click
>    will activate the TouchSensor.
>    7. Holding the Shift key (or other key that may be user-selectable)
>    (septenary trigger) overrides any TouchSensor that the pointer may be
>    over and forces the pointing device to function as the viewpoint
>    navigation tool; *i.e.*, drag operations cause rotation, click
>    operations cause center-of-rotation point selection.
>
> Whether user-selectable alternatives to the Shift, Ctrl, and/or Alt keys are
> provided is browser-dependent. If provided, the method by which such
> alternatives are specified is also browser-dependent.
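>
> Below is a minimal sketch, in C, of how a browser might resolve
> modifier-key state into the trigger numbering above. The enum values and
> classify_trigger are hypothetical names, not spec API:
>
> typedef enum {
>     TRIGGER_PRIMARY = 1, /* plain drag: rotate                      */
>     TRIGGER_SECONDARY,   /* Ctrl drag: zoom                         */
>     TRIGGER_TERTIARY,    /* Shift+Ctrl drag: zoom, sensors disabled */
>     TRIGGER_QUATERNARY,  /* Alt drag: pan                           */
>     TRIGGER_QUINARY,     /* Shift+Alt drag: pan, sensors disabled   */
>     TRIGGER_SENARY,      /* Shift click: set point of rotation      */
>     TRIGGER_SEPTENARY    /* Shift drag: navigation only             */
> } Trigger;
>
> Trigger classify_trigger(int shift, int ctrl, int alt, int is_click)
> {
>     if (ctrl)  return shift ? TRIGGER_TERTIARY : TRIGGER_SECONDARY;
>     if (alt)   return shift ? TRIGGER_QUINARY  : TRIGGER_QUATERNARY;
>     if (shift) return is_click ? TRIGGER_SENARY : TRIGGER_SEPTENARY;
>     return TRIGGER_PRIMARY;
> }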
>
> On Sun, Jun 14, 2020 at 9:45 AM GPU Group <gpugroup at gmail.com> wrote:
>
>> Yes, I see there's terminology; it looks like it is mapping the keyboard
>> to work as a pointing device.
>> Hypothesis: there could be a way to remap EXPLORE (which is heavy in
>> button / pointing-device-motion specifics):
>> a) write it more generally somehow;
>> b) then, as with Annex G, do some device-specific mappings.
>> The same goes for the (new in v4) HELICOPTER and GAME motions - could
>> they be written generally, like WALK, FLY, and EXAMINE, with very little
>> talk about buttons and pointing devices - and then somehow articulate
>> the mappings in Annex G?
>>
>> Here are some device scenarios:
>> i) desktop - 2-button + wheel mouse, full keyboard with arrow keys,
>> Ctrl, Shift, Alt
>> ii) mobile - gyro, touch screen
>> iii) non-mobile touch screen
>> iv) HMD with gyro and viewport center
>> v) desktop game controller
>>
>> EXPLORE could define 'drag / dragging' in such a way as to share it with
>> HELICOPTER, GAME, and new modes, if they can't be specified without such
>> notions.
>>
>>
>> On Sun, Jun 14, 2020 at 9:36 AM Joseph D Williams <joedwil at earthlink.net>
>> wrote:
>>
>>> For example:
>>>
>>>
>>>
>>> WALK:      forward/backward/left/right
>>> FLY:       forward/backward/left/right
>>> EXAMINE:   orbit up/down/left/right around center of rotation
>>>            with camera pointed at center of rotation
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> *From: *Joseph D Williams <joedwil at earthlink.net>
>>> *Sent: *Sunday, June 14, 2020 8:31 AM
>>> *To: *GPU Group <gpugroup at gmail.com>; X3D Graphics public mailing list
>>> <x3d-public at web3d.org>
>>> *Subject: *Re: [x3d-public] [x3d] Spec Comment by dougsanden on
>>> 19775-1:X3DArchitecture - V3.0
>>>
>>>
>>>
>>>
>>>
>>>    - Q. And could/should named navigation modes/types be
>>>
>>>
>>>
>>> How about some more work on:
>>>
>>> Extensible 3D (X3D) Part 1: Architecture and base components
>>>
>>> Annex G Recommended navigation behaviours
>>>
>>> (informative)
>>>
>>>
>>>
>>>
>>> https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/behaviours.html
>>>
>>>
>>>
>>> There were several reasons the annex was only informative at that
>>> time, or even attempted at all.
>>>
>>> Maybe more is understood now, and it offers some guidance from some
>>> point in time.
>>>
>>> Thanks,
>>>
>>> Joe
>>>
>>>
>>>
>>>
>>>
>>> *From: *GPU Group <gpugroup at gmail.com>
>>> *Sent: *Sunday, June 14, 2020 6:28 AM
>>> *To: *X3D Graphics public mailing list <x3d-public at web3d.org>
>>> *Subject: *Re: [x3d-public] [x3d] Spec Comment by dougsanden on
>>> 19775-1: X3DArchitecture - V3.0
>>>
>>>
>>>
>>> Forwarding some comments on other channels
>>>
>>> Q. Should specs > NavigationInfo attempt to abstract input / pointing
>>> device terminology
>>>
>>> - so gyros, game controllers/pads, touch screens, 3D pointing
>>> devices(?), HMD / AR gyro / view-center can all be mapped more
>>> generically:
>>>
>>>
>>>
>>> Instead of mouse xy, it would be a 'primaryXY channel' or 'primary
>>> 2D pointing device'.
>>>
>>>
>>>
>>> Q. And could/should named navigation modes/types be specified in terms
>>> of the order in which each transform element is applied:
>>>
>>> WALK: yaw and Z are applied to the last yaw-z pose, then pitch-roll is
>>> applied
>>>
>>> FREEFLY: yaw, z, pitch, roll are applied in any order
>>>
>>> Perhaps something in a table format?
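>>>
>>> A minimal sketch of that composition-order difference, assuming 4x4
>>> column-vector matrices; mat4_mul and the pose variables are
>>> hypothetical names:
>>>
>>> /* WALK: yaw and Z-translation accumulate in a persistent yaw-z pose;
>>>    pitch and roll are re-applied on top each frame, never accumulated. */
>>> yaw_z_pose = mat4_mul(yaw_z_delta, yaw_z_pose);
>>> view_pose  = mat4_mul(pitch_roll_input, yaw_z_pose);
>>>
>>> /* FREEFLY: every delta folds directly into the full pose, so yaw, z,
>>>    pitch, and roll may be applied in any order. */
>>> view_pose  = mat4_mul(delta, view_pose);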
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Doug Sanden
>>>
>>>
>>>
>>>
>>>
>>> On Sat, Jun 13, 2020 at 12:15 PM GPU Group <gpugroup at gmail.com> wrote:
>>>
>>> "have your ray loop forget about ID=2"
>>>
>>> One notable difference between a touch device and a mouse: a mouse has
>>> an up-drag. A touch doesn't.
>>>
>>> - that makes no difference to MultiTouch/MultiDragSensor, which only
>>> works with down-drags.
>>>
>>> Where you see the difference: isOver, and navigation modes that assume
>>> an updrag is available - like the proposed GAME mode.
>>>
>>> And what you do when a button / touch is 'released':
>>>
>>> - updrag-capable input devices: you likely just change a button state,
>>> and keep drawing the cursor at the last location
>>>
>>> - updrag-incapable input devices - you likely release/forget/recycle the
>>> ID number
>>>
>>> SUMMARY: Web3D specs may need more terms to describe input-device
>>> classes and capabilities more abstractly:
>>>
>>> - up-drags
>>>
>>> - drag-ID
>>>
>>> - drag-ID recycling
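>>>
>>> A minimal sketch (all names hypothetical) of a drag-ID table covering
>>> those three terms: updrag-capable devices keep the ID and only change a
>>> button state on release; updrag-incapable devices recycle the ID:
>>>
>>> #define MAX_DRAGS 10
>>>
>>> typedef struct {
>>>     int   in_use;      /* slot allocated to a live drag             */
>>>     int   button_down; /* trigger state, for updrag-capable devices */
>>>     float x, y;        /* last known position                       */
>>> } Drag;
>>>
>>> static Drag drags[MAX_DRAGS];
>>>
>>> int drag_begin(float x, float y) {       /* returns the drag-ID */
>>>     for (int id = 0; id < MAX_DRAGS; id++)
>>>         if (!drags[id].in_use) {
>>>             drags[id] = (Drag){ 1, 1, x, y };
>>>             return id;
>>>         }
>>>     return -1;                           /* table full */
>>> }
>>>
>>> void drag_release(int id, int updrag_capable) {
>>>     if (updrag_capable)
>>>         drags[id].button_down = 0; /* keep cursor at last location      */
>>>     else
>>>         drags[id].in_use = 0;      /* recycle the ID for the next touch */
>>> }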
>>>
>>>
>>>
>>>
>>>
>>> On Sat, Jun 13, 2020 at 11:59 AM GPU Group <gpugroup at gmail.com> wrote:
>>>
>>> Drags need an ID, i.e. 1, 2, 3, and that comes normally with Windows 7
>>> desktop WM_TOUCH events; more precisely, you can get an index number
>>> into a lookup table of touches.
>>>
>>> A mouse-friendly use of a MultiDragSensor:
>>>
>>> - your regular pointing device ray might be ID=1
>>>
>>> - to create a second ray, you can park/freeze ID=1 with some mouse or
>>> keyboard button, e.g. MMB, and push a 2nd one with ID=2 onto a stack
>>>
>>> - then you would move drag ID=2 until done with scaling and rotation,
>>> then unfreeze / pop to get back to dragging ID=1 - maybe with the same
>>> button - and have your ray loop forget about ID=2 after that, until the
>>> user repeats the cycle (see the sketch below). Or something like that.
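>>>
>>> A minimal sketch (hypothetical names; drag_release is the routine from
>>> the drag-ID table sketched above) of driving two drag IDs from one
>>> mouse, with MMB toggling park/resume:
>>>
>>> void drag_release(int id, int updrag_capable); /* from the sketch above */
>>>
>>> static int active_id = 1; /* the drag the mouse currently moves */
>>> static int parked_id = 0; /* 0 means nothing is parked          */
>>>
>>> void on_middle_button(void)
>>> {
>>>     if (!parked_id) {          /* park ID 1, start dragging ID 2 */
>>>         parked_id = active_id;
>>>         active_id = 2;
>>>     } else {                   /* done: forget ID 2, resume ID 1 */
>>>         drag_release(active_id, 0);
>>>         active_id = parked_id;
>>>         parked_id = 0;
>>>     }
>>> }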
>>>
>>> SUMMARY: yes - it can be abstracted from touch devices, but you still
>>> need a per-drag ID.
>>>
>>> -Doug
>>>
>>>
>>>
>>> On Sat, Jun 13, 2020 at 11:18 AM Spec Feedback <spec-comment at web3d.org>
>>> wrote:
>>>
>>> -- Submitter indicates that this comment may be public: *Yes* --
>>>
>>> Comment on 19775-1: X3D Architecture - V3.0
>>> MultiTouchSensor
>>>
>>>
>>> -----------------
>>> MultiTouchSensor > Touch vs Drag > MultiDragSensor
>>> In theory it should be called something that abstracts it from the
>>> particular type of input device.
>>> Drag is more input-device-neutral - there could be other device-neutral
>>> terms.
>>> For example, a typical game controller has 2 thumb sticks that could
>>> act like 2 touches.
>>> A combination of a device gyro - in an HMD, mobile phone, or Wii
>>> Controller - and a freeze-button could shoot one ray, freeze it as one
>>> touch, then shoot another ray to drag.
>>> Internally, code works with a plane-sensor xy origin when a button goes
>>> down / a ray is shot; then a drag xy is compared to the origin xy to
>>> see how far it moved and in what direction. Internally it's
>>> drag-oriented and doesn't care what device shot the rays.
>>> -----------------
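>>>
>>> A minimal sketch (hypothetical names) of that device-neutral drag core:
>>> any device "shoots" a ray to set an origin on the sensor plane, and
>>> later rays report their offset from it:
>>>
>>> typedef struct { float x, y; } Point2;
>>> typedef struct { Point2 origin; int active; } DragState;
>>>
>>> void drag_down(DragState *d, Point2 hit) /* button down / ray shot */
>>> {
>>>     d->origin = hit;
>>>     d->active = 1;
>>> }
>>>
>>> Point2 drag_move(const DragState *d, Point2 hit)
>>> {
>>>     /* how far, and in what direction, from the origin -
>>>        independent of which device shot the rays */
>>>     return (Point2){ hit.x - d->origin.x, hit.y - d->origin.y };
>>> }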
>>>
>>> Submitted on Saturday, 2020,  June 13 - 11:18am
>>> by dougsanden (dougsanden )
>>> IP: 75.159.18.239
>>>
>>> See: https://www.web3d.org/node/1694/submission/4039
>>>
>>>
>>> _______________________________________________
>>> x3d mailing list
>>> x3d at web3d.org
>>> http://web3d.org/mailman/listinfo/x3d_web3d.org
>>>
>>>
>>>
>>>
>>>
>>