[x3d-public] X3D VR

Andreas Plesch andreasplesch at gmail.com
Sun Jan 8 06:29:32 PST 2017


> Date: Sat, 7 Jan 2017 21:04:42 +0100
> From: Holger Seelig <holger.seelig at yahoo.de>
> To: x3d-public at web3d.org
> Subject: Re: [x3d-public] X3D VR
>
>
> I think that instead of adding a new field to the existing viewpoints, a
> new viewpoint node could be appropriate for this purpose, a node called
> something like "MatrixViewpoint". This node could have two fields,
> matrix and projectionMatrix. The matrix field would store the
> transformation matrix with orientation and position values. The
> projectionMatrix field would store the projection matrix. This
> viewpoint would be an all-purpose viewpoint. The main idea is that
> anyone able to define a "viewMatrix" field as Andreas proposes is also
> able to handle a node like MatrixViewpoint with its matrix fields.
> Defining a new node has the advantage of leaving the existing
> viewpoints untouched and handling this behavior in a new node.
>
> Holger

It turns out that x3dom and InstantReality have such a node, called
Viewfrustum:

https://doc.x3dom.org/author/Navigation/Viewfrustum.html

So this is a pretty strong argument for a new node instead of new fields.

New fields have the advantage that they should make it easier to adapt
existing scenes. I think default values could be such that they guarantee
full backward compatibility.

I will try to use Viewfrustum with the x3dom classroom example to gain some
experience.
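
As a rough, untested sketch of how the WebVR matrices could be fed into
such a node from external js (the modelview and projection field names
are taken from the x3dom documentation above; the exact attribute
ordering would need checking against the implementation):

  // Untested sketch: update an x3dom Viewfrustum from WebVR 1.1 each frame.
  var frustum = document.querySelector('Viewfrustum');
  var frameData = new VRFrameData();

  navigator.getVRDisplays().then(function (displays) {
    var vrDisplay = displays[0];
    function onFrame() {
      vrDisplay.getFrameData(frameData); // fills Float32Array(16) matrices
      frustum.setAttribute('modelview',
        Array.prototype.join.call(frameData.leftViewMatrix, ' '));
      frustum.setAttribute('projection',
        Array.prototype.join.call(frameData.leftProjectionMatrix, ' '));
      vrDisplay.requestAnimationFrame(onFrame);
    }
    vrDisplay.requestAnimationFrame(onFrame);
  });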

Andreas

>
> On 07.01.2017 at 18:07, Andreas Plesch wrote:
> > Hi,
> >
> > back from the holiday season, here are proposals for new fields for
> > Viewpoint.
> >
> > I do not have an x3dom implementation but the idea is to add two
> > Matrix4d fields:
> >
> > projectionMatrix :
> >
> > default: NULL
> >
> > purpose: allow control of the view projection from 3d to 2d by such a
> > matrix from sources such as HMD drivers. The projection matrix controls
> > the fov.
> >
> > The projectionMatrix should be used as is by the browser if not NULL. If
> > NULL, a projection matrix shall be constructed from the viewpoint fields.
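
For illustration, the usual fallback construction from fieldOfView, the
viewport aspect ratio and the near/far clip distances would be something
like this (simplified; the spec's rules about which viewport dimension
fieldOfView applies to are skipped here):

  // Standard perspective projection, column-major as WebGL expects.
  function perspectiveFromViewpoint(fieldOfView, aspect, zNear, zFar) {
    var f = 1.0 / Math.tan(fieldOfView / 2);
    var nf = 1.0 / (zNear - zFar);
    return [
      f / aspect, 0, 0, 0,
      0, f, 0, 0,
      0, 0, (zFar + zNear) * nf, -1,
      0, 0, 2 * zFar * zNear * nf, 0
    ];
  }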
> >
> >
> > viewMatrix:
> >
> > default: unit matrix
> >
> > purpose: allow control of the viewer orientation and location by such a
> > matrix from sources such as HMD drivers.
> >
> > The viewMatrix V shall be combined with the matrix M derived from the
> > location/orientation fields as M * V for rendering use.
> >
> > This means that if V is used exclusively in a scene, the location field
> > needs to be reset to 0 0 0 from its default by the scene.
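
In js pseudo-terms (the helper names below are only placeholders), the
intended composition would be roughly:

  // Placeholder sketch of the proposed composition, column-vector convention.
  var M = matrixFromPositionOrientation(position, orientation); // placeholder helper
  var V = viewMatrixFromHMD;                                    // routed in each frame
  var viewingMatrix = multiplyMat4(M, V);                       // placeholder helper, M * V
  // If V already carries the full pose, position must be reset to 0 0 0
  // so that M does not add a second translation on top of it.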
> >
> > Typical use would be as sinks of routes from (new) ProjectionSensor or
> > VRDevice node fields.
> >
> > A VRDevice delivers both matrices for each eye. A scene can then set up
> > a viewpoint for each eye and use Layers to show them side by side.
> >
> > In terms of efficiency, this would seem to require repeated traversals
> > of the scene unless both Layers can be recognized to show the same scene
> > content.
> >
> > A more convenient and integrated alternative would be to introduce a
> > stereo SFBool field and have left and right variants of the matrix
> > fields (and perhaps also left and right variants of the location,
> > orientation, fov, zfar, znear fields).
> >
> > stereo = false would ignore all left and right fields.
> > stereo = true would use the left/right fields and render them side by
> > side.
> >
> > Since in this case there is a guarantee that views for each eye use the
> > same scene content, only one traversal should be needed to render both
> > eyes.
> >
> > The idea would be that switching from mono to stereo and back only
> > requires toggling the stereo field. Navigation may have a similar switch
> > to and from mouse/keyboard modes?
> >
> > Implementation of viewMatrix and projectionMatrix in x3dom may be rather
> > straightforward. I do not see large obstacles (famous last words ...).
> > Implementation of viewMatrix and projectionMatrix in cobweb may also be
> > straightforward, but I am less sure.
> >
> > Without layers in x3dom, one could put two scenes side by side on the
> > web page, use external js to feed the matrix fields, and then also use
> > external js to grab the frames from both scene canvases, combine them
> > into a single frame on another canvas, and finally use that as a source
> > for the HMD display. (This sounds slow.)
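
A very rough, untested sketch of the compositing step (the element ids
are made up here, and whether the combined canvas could then be
presented to the HMD through WebVR would need checking):

  // Copy both scene canvases into one side-by-side canvas every frame;
  // the combined canvas would be the candidate source for the HMD.
  var left  = document.querySelector('#leftScene canvas');  // made-up ids
  var right = document.querySelector('#rightScene canvas');
  var out   = document.getElementById('hmdCanvas');
  var ctx   = out.getContext('2d');

  function composite() {
    ctx.drawImage(left, 0, 0, out.width / 2, out.height);
    ctx.drawImage(right, out.width / 2, 0, out.width / 2, out.height);
    window.requestAnimationFrame(composite);
  }
  window.requestAnimationFrame(composite);
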
> > In cobweb, it should not be hard to set up a scene with a layer for each
> > eye.
> >
> > Implementation of stereo and left/rightView/ProjectionMatrix in x3dom
> > is harder since it needs changes to the rendering code, which is a bit
> > unstructured. The first approach would be, for stereo=true, to go
> > through the rendering twice with the rendering surfaces appropriately
> > set somehow.
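
Independent of x3dom internals, the general shape of that two-pass
approach would be (drawScene stands in for the scene traversal):

  // Render the same scene twice per frame into the two halves of the canvas.
  function renderStereo(gl, canvas, frameData, drawScene) {
    var w = canvas.width / 2, h = canvas.height;
    gl.viewport(0, 0, w, h);   // left eye
    drawScene(frameData.leftViewMatrix, frameData.leftProjectionMatrix);
    gl.viewport(w, 0, w, h);   // right eye
    drawScene(frameData.rightViewMatrix, frameData.rightProjectionMatrix);
  }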
> >
> > -Andreas
> >
> > On Dec 21, 2016 6:00 AM, "Roy Walmsley" <roy.walmsley at ntlworld.com>
> > wrote:
> >
> >     Hi Andreas,
> >
> >     That's great to see what you have been doing here. Would you like
> >     to take your node suggestions one step further and propose the
> >     fields that might be specified for each node?
> >
> >     What we could do then, in the new year, is to host an open
> >     discussion on WebVR and X3D. Contributions from any other readers
> >     are welcome, and could lead to a lively discussion on this topic.
> >
> >     Thank you for the suggestions,
> >
> >     All the best,
> >
> >     Roy
> >
> >     From: x3d-public [mailto:x3d-public-bounces at web3d.org]
> >     On Behalf Of Andreas Plesch
> >     Sent: 21 December 2016 05:51
> >     To: X3D Graphics public mailing list <x3d-public at web3d.org>
> >     Subject: [x3d-public] X3D VR
> >
> >     Hi,
> >
> >     Working with x3dom and WebVR I am exploring what additions to X3D
> >     would be necessary or useful to effectively use current VR hardware
> >     such as the Rift, Vive, Gear, or Cardboard.
> >
> >     The proposed RenderedTexture node is currently used in x3dom with
> >     special stereo modes to generate left and right views from a single
> >     viewpoint in a single GL context.
> >
> >     A Layer for each left and right view could also be used with two
> >     coordinated but separate viewpoints. This would be Cobweb's
> >     pattern.
> >
> >     Since the HMD and its own runtime know best the internal optical
> >     characteristics of the display, the preferred interface, at least in
> >     WebVR, is view and projection matrices directly usable by consumers
> >     for 3d graphics generation, rather than position, orientation and
> >     fov. This leads to a new requirement for X3D to accept these raw
> >     matrices as input.
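
For concreteness, this is what a WebVR 1.1 runtime hands the application
per frame (vrDisplay assumed to have been obtained earlier from
navigator.getVRDisplays()):

  // Raw 4x4 matrices as 16-element Float32Arrays, one pair per eye.
  var frameData = new VRFrameData();
  vrDisplay.getFrameData(frameData);
  frameData.leftViewMatrix;         // Float32Array(16)
  frameData.leftProjectionMatrix;   // Float32Array(16)
  frameData.rightViewMatrix;        // Float32Array(16)
  frameData.rightProjectionMatrix;  // Float32Array(16)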
> >
> >     Since x3dom uses RenderedTexture for VR, I added additional special
> >     stereo modes there which directly receive these matrices from the
> >     HMD to be used for rendering at each frame. This in effect
> >     accomplishes a first, head- and body-based, level of navigation. In
> >     my first tests (on github) it works well and should be robust across
> >     devices. This approach does require some standard API to the HMD,
> >     such as WebVR.
> >
> >     Another possibility is to add view and projection matrix input
> >     fields to viewpoints which can be continually updated. One could
> >     combine the view matrix with the position/orientation fields, or
> >     optionally ignore them completely.
> >
> >     One could then use external SAI to keep updating them, or perhaps
> >     introduce an environment ProjectionSensor which would relay these
> >     matrix events from an HMD runtime, to then be routed to the
> >     viewpoints.
> >
> >     A second level of navigation is accomplished with handheld
> >     controllers. Until some standard gestures evolve, it will be
> >     necessary to expect custom per-scene navigation. X3D should have a
> >     way to sense the buttons and axis velocities of such controllers so
> >     that these are available to manipulate the view in some way.
> >     InstantReality has a general IOSensor to that effect. On the web,
> >     the Gamepad API is the standard that would need to be used.
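
For reference, polling such controllers with the Gamepad API looks
roughly like this (button and axis indices are device dependent):

  // navigator.getGamepads() returns the currently connected gamepads,
  // each exposing button states and analog axis values.
  function pollControllers() {
    var pads = navigator.getGamepads();
    for (var i = 0; i < pads.length; i++) {
      var pad = pads[i];
      if (!pad) continue;
      var pressed = pad.buttons[0].pressed;  // e.g. first button
      var x = pad.axes[0], y = pad.axes[1];  // e.g. thumbstick axes
      // ...route these values into the scene to drive navigation
    }
    window.requestAnimationFrame(pollControllers);
  }
  window.requestAnimationFrame(pollControllers);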
> >
> >     Summary: RenderedTexture with stereo modes, matrix fields for
> >     viewpoints, a ProjectionSensor, and a ControllerSensor would all be
> >     candidates for X3D VR support.
> >
> >     Thanks for reading, any thoughts welcome,
> >
> >     Andreas
> >
> >
> >
> >
> > _______________________________________________
> > x3d-public mailing list
> > x3d-public at web3d.org
> > http://web3d.org/mailman/listinfo/x3d-public_web3d.org
> >
>
>
> --
> Holger Seelig
> Mediengestalter Digital - Digital Media Designer
>
> Scheffelstraße 31a
> 04277 Leipzig
> Germany
>
> Cellular: +49 1577 147 26 11
> E-Mail:   holger.seelig at create3000.de
> Web:      http://titania.create3000.de
>
> Future to the fantasy ...