[x3d-public] [x3d] Spec Comment by dougsanden on 19775-1:X3DArchitecture - V3.0

Joseph D Williams joedwil at earthlink.net
Sun Jun 14 08:36:32 PDT 2020


For example:

WALK:      forward/backward/left/right
FLY:       forward/backward/left/right
EXAMINE:   orbit up/down/left/right around center of rotation
           with camera pointed at center of rotation
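
A minimal TypeScript sketch of how a browser might dispatch a 2D pointer delta per navigation type along the lines above. All names and the sign/axis conventions here are illustrative assumptions, not spec terms:

type NavType = "WALK" | "FLY" | "EXAMINE";

interface Viewer {
  position: [number, number, number];
  yaw: number;    // radians about world +Y
  pitch: number;  // radians
  centerOfRotation: [number, number, number];  // used by EXAMINE
}

function applyPointerDelta(v: Viewer, navType: NavType, dx: number, dy: number): void {
  switch (navType) {
    case "WALK":
    case "FLY":
      // left/right turns, forward/backward moves along the current heading
      v.yaw -= dx * 0.01;
      v.position[0] -= Math.sin(v.yaw) * dy * 0.05;
      v.position[2] -= Math.cos(v.yaw) * dy * 0.05;
      break;
    case "EXAMINE":
      // orbit up/down/left/right around centerOfRotation,
      // keeping the camera pointed at centerOfRotation
      v.yaw -= dx * 0.01;
      v.pitch -= dy * 0.01;
      // re-derive position from yaw/pitch and the distance to centerOfRotation here
      break;
  }
}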




From: Joseph D Williams
Sent: Sunday, June 14, 2020 8:31 AM
To: GPU Group; X3D Graphics public mailing list
Subject: Re: [x3d-public] [x3d] Spec Comment by dougsanden on 19775-1:X3DArchitecture - V3.0


➢ Q. And could/should named navigation modes/types be specified in terms of the order and transform elements being mapped to:

How about some more work on:
Extensible 3D (X3D) Part 1: Architecture and base components
Annex G Recommended navigation behaviours
(informative)

https://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/behaviours.html

There were several reasons the annex was only informative at that time, or was even attempted at all.
Maybe more is understood now, and it still offers some guidance from that point in time.
Thanks, 
Joe


From: GPU Group
Sent: Sunday, June 14, 2020 6:28 AM
To: X3D Graphics public mailing list
Subject: Re: [x3d-public] [x3d] Spec Comment by dougsanden on 19775-1: X3DArchitecture - V3.0

Forwarding some comments from other channels.
Q. Should the specs > NavigationInfo attempt to abstract input / pointing device terminology
- so gyros, game controllers/pads, touch screens, 3D pointing devices, HMD / AR gyro/view-center can all be mapped more generically?

Instead of mouse xy, it would be a 'primary XY channel' or 'primary 2D pointing device' (see the sketch below).
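
A minimal sketch of that kind of device-neutral channel, in TypeScript; the channel and adapter names are assumptions for illustration, not proposed spec terms:

interface Primary2DChannel {
  x: number;        // normalized 0..1, source-independent
  y: number;
  isDown: boolean;  // button pressed / touch in contact
}

// adapt a mouse (raw fields assumed to come from whatever windowing layer is in use)
function fromMouse(px: number, py: number, buttonDown: boolean, w: number, h: number): Primary2DChannel {
  return { x: px / w, y: py / h, isDown: buttonDown };
}

// adapt a game-controller thumb stick (axes in -1..1) to the same shape
function fromThumbStick(axisX: number, axisY: number, pressed: boolean): Primary2DChannel {
  return { x: (axisX + 1) / 2, y: (axisY + 1) / 2, isDown: pressed };
}

Once everything is mapped into the same channel shape, navigation behaviour can be written against the channel instead of against 'mouse xy'.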

Q. And could/should named navigation modes/types be specified in terms of the order of application and the transform elements being mapped to:
WALK: yaw and Z are applied to the last yaw-z pose, then pitch-roll is applied
FREEFLY: yaw, z, pitch, roll are applied in any order
Perhaps something in a table format? (A rough sketch of the distinction follows below.)
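
A rough TypeScript sketch of that distinction, under the assumption that WALK accumulates yaw and Z-translation into the persistent yaw-z pose first (so motion stays in the ground plane) and applies pitch-roll afterwards as a view-only offset, while FREEFLY lets motion follow the full view direction. Field names and sign conventions are illustrative:

interface Nav { x: number; y: number; z: number; yaw: number; pitch: number; roll: number; }

function stepWalk(n: Nav, dz: number, dyaw: number, dpitch: number, droll: number): void {
  n.yaw += dyaw;                // 1) yaw applied to the persistent yaw-z pose
  n.x += Math.sin(n.yaw) * dz;  //    Z motion follows yaw only, staying in the ground plane
  n.z += Math.cos(n.yaw) * dz;
  n.pitch += dpitch;            // 2) pitch-roll applied last, view-only
  n.roll += droll;
}

function stepFreefly(n: Nav, dz: number, dyaw: number, dpitch: number, droll: number): void {
  n.yaw += dyaw; n.pitch += dpitch; n.roll += droll;  // any order
  n.x += Math.sin(n.yaw) * Math.cos(n.pitch) * dz;    // motion follows the full view direction
  n.y -= Math.sin(n.pitch) * dz;
  n.z += Math.cos(n.yaw) * Math.cos(n.pitch) * dz;
}

A table in the spec could then list, per navigation type, which pose elements each input channel maps to and in what order they are composed.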

Thanks,
Doug Sanden


On Sat, Jun 13, 2020 at 12:15 PM GPU Group <gpugroup at gmail.com> wrote:
"have your ray loop forget about ID=2"
One notable difference between a touch device and a mouse: a mouse has an up-drag. A touch doesn't. 
- that makes no difference to MultiTouch/MultiDragSensor, which only works with down-drags.
Where you see the difference: isOver, and navigation modes that assume an up-drag is available - like the proposed GAME mode.
And what you do when a button / touch is 'released': 
- updrag-capable input devices: you likely just change a button state, and keep drawing the cursor at the last location
- updrag-incapable input devices - you likely release/forget/recycle the ID number
SUMMARY: web3d specs may need more terms to describe input device classes and capabilities more abstractly
- up-drags
- drag-ID
- drag-ID recycling
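
A minimal sketch of that release behaviour, in TypeScript; the DragTable name and the capability flag are assumptions for illustration:

interface Drag { id: number; x: number; y: number; buttonDown: boolean; }

class DragTable {
  private drags = new Map<number, Drag>();

  constructor(private deviceSupportsUpDrag: boolean) {}

  onRelease(id: number): void {
    const d = this.drags.get(id);
    if (!d) return;
    if (this.deviceSupportsUpDrag) {
      // mouse-like device: keep the entry, just change the button state,
      // and keep drawing the cursor at its last location (so isOver still works)
      d.buttonDown = false;
    } else {
      // touch-like device: the contact is gone; release/forget/recycle the ID
      this.drags.delete(id);
    }
  }
}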


On Sat, Jun 13, 2020 at 11:59 AM GPU Group <gpugroup at gmail.com> wrote:
Drags need an ID, i.e. 1, 2, 3, and that comes normally in Windows 7 desktop WM_TOUCH events; more precisely, you can get an index number into a lookup table of touches.
A mouse-friendly use of a MultiDragSensor:
- your regular pointing device ray might be ID=1
- to create a second ray, you can park/freeze ID=1 with some mouse or keyboard button, e.g. MMB, and push a 2nd one with ID=2 onto a stack
- then you would move drag ID=2 until done with scaling and rotation, then unfreeze / pop to get back to dragging ID=1 - maybe with the same button - and have your ray loop forget about ID=2 after that, until the user repeats the cycle. Or something like that.
SUMMARY: yes - it can be abstracted from touch devices, but you still need a per-drag ID.
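
A minimal TypeScript sketch of that freeze/push/pop cycle; the class and field names are assumptions for illustration, not spec terms:

interface Ray { id: number; x: number; y: number; frozen: boolean; }

class MouseMultiDrag {
  private stack: Ray[] = [{ id: 1, x: 0, y: 0, frozen: false }];

  move(x: number, y: number): void {
    const top = this.stack[this.stack.length - 1];
    if (!top.frozen) { top.x = x; top.y = y; }  // only the active ray follows the mouse
  }

  // e.g. on MMB: park/freeze ID=1 and push a second ray with ID=2 onto the stack
  freezeAndPush(): void {
    this.stack[this.stack.length - 1].frozen = true;
    this.stack.push({ id: 2, x: 0, y: 0, frozen: false });
  }

  // when done scaling/rotating: pop and forget ID=2, unfreeze ID=1 again
  popAndUnfreeze(): void {
    if (this.stack.length > 1) this.stack.pop();
    this.stack[this.stack.length - 1].frozen = false;
  }
}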
-Doug

On Sat, Jun 13, 2020 at 11:18 AM Spec Feedback <spec-comment at web3d.org> wrote:
-- Submitter indicates that this comment may be public: *Yes* --

Comment on 19775-1: X3D Architecture - V3.0
MultiTouchSensor


-----------------
MultiTouchSensor > Touch vs Drag > MultiDragSensor
In theory it should be called something that abstracts it from the particular
type of input device.
Drag is more input device neutral - there could be other device-neutral
terms.
For example a typical game controller has 2 thumb sticks that could act like
2 touches.
A combination of device gyro -in HMD or mobile phone or Wii Controller- and
freeze-button could shoot one ray, freeze it as one touch, then shoot another
ray to drag.
Internally, code works with a plane-sensor xy origin when a button goes
down / a ray is shot, then a drag xy is compared to the origin xy to see how far
and in what direction: internally it's drag-oriented and doesn't care what device
shot the rays.
-----------------
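
A minimal TypeScript sketch of those drag-oriented internals: capture an origin xy when the ray is shot, then compare later drag positions against it, regardless of which device shot the ray. Names are assumptions for illustration:

interface DragState { originX: number; originY: number; }

const active = new Map<number, DragState>();  // keyed by per-drag ID

function rayDown(id: number, x: number, y: number): void {
  active.set(id, { originX: x, originY: y });  // button goes down / ray is shot: record the origin
}

function rayMove(id: number, x: number, y: number): [number, number] | undefined {
  const d = active.get(id);
  if (!d) return undefined;
  return [x - d.originX, y - d.originY];  // how far, in what direction
}

function rayUp(id: number): void {
  active.delete(id);  // release/forget/recycle the drag ID
}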

Submitted on Saturday, 2020, June 13 - 11:18am
by dougsanden (dougsanden)
IP: 75.159.18.239

See: https://www.web3d.org/node/1694/submission/4039


_______________________________________________
x3d mailing list
x3d at web3d.org
http://web3d.org/mailman/listinfo/x3d_web3d.org



