<p dir="ltr">Let's start with the RenderedTexture node since it is in actual use.</p>
<p dir="ltr">References first:</p>
<p dir="ltr"><a href="http://doc.x3dom.org/author/Texturing/RenderedTexture.html">http://doc.x3dom.org/author/Texturing/RenderedTexture.html</a></p>
<p dir="ltr"><a href="http://realism.com/blog/3d-google-cardboard">http://realism.com/blog/3d-google-cardboard</a></p>
<p dir="ltr"><a href="http://doc.instantreality.org/documentation/nodetype/RenderedTexture/?filter=None">http://doc.instantreality.org/documentation/nodetype/RenderedTexture/?filter=None</a></p>
<p dir="ltr"><a href="http://castle-engine.sourceforge.net/x3d_implementation_texturing_extensions.php#section_ext_rendered_texture">http://castle-engine.sourceforge.net/x3d_implementation_texturing_extensions.php#section_ext_rendered_texture</a></p>
<p dir="ltr"><a href="http://www.xj3d.org/extensions/render_texture.html">http://www.xj3d.org/extensions/render_texture.html</a></p>
<p dir="ltr">All follow the initial Xj3D extension, but only x3dom uses the node for stereo rendering, adding stereoMode and interpupillaryDistance fields. A stereo pair requires two RenderedTexture nodes, one per eye, with stereoMode values of 'LEFT_EYE' and 'RIGHT_EYE'.</p>
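<p dir="ltr">A minimal x3dom-style sketch of the two-node pattern (a sketch only: DEF/USE names, dimensions, and geometry are illustrative; field and containerField names follow the x3dom RenderedTexture documentation linked above):</p>

```html
<!-- Sketch: two RenderedTexture nodes, one per eye, sharing one
     viewpoint and one scene subgraph. Each texture would then be
     applied to a left- and right-half screen quad. -->
<RenderedTexture update='always' stereoMode='LEFT_EYE'
                 interpupillaryDistance='0.064' dimensions='640 960 4'>
  <Viewpoint USE='vp' containerField='viewpoint'></Viewpoint>
  <Group USE='world' containerField='scene'></Group>
</RenderedTexture>
<RenderedTexture update='always' stereoMode='RIGHT_EYE'
                 interpupillaryDistance='0.064' dimensions='640 960 4'>
  <Viewpoint USE='vp' containerField='viewpoint'></Viewpoint>
  <Group USE='world' containerField='scene'></Group>
</RenderedTexture>
```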
<p dir="ltr">Using the interpupillary distance, the node can then construct per-eye offsets from the viewpoint position for rendering.</p>
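<p dir="ltr">The offset construction amounts to shifting each eye by half the interpupillary distance along the viewer's right axis. A minimal sketch (not x3dom's actual code; function and parameter names are hypothetical):</p>

```javascript
// Derive per-eye positions from a viewpoint position and an
// interpupillaryDistance value (both in meters).
// rightAxis: unit vector pointing to the viewer's right in world space.
function eyePositions(position, rightAxis, ipd) {
  const half = ipd / 2;
  const offset = rightAxis.map(c => c * half);
  return {
    left:  position.map((c, i) => c - offset[i]),
    right: position.map((c, i) => c + offset[i]),
  };
}
```

<p dir="ltr">For a viewpoint at the origin looking down -Z, the right axis is +X, so the eyes end up at x = ∓ipd/2.</p>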
<p dir="ltr">My extension adds 'RIGHT_VR' and 'LEFT_VR' stereoMode values which trigger direct retrieval of the current left and right view and projection matrices for the VR display via the WebVR API. I currently multiply this view matrix with the view matrix derived from the viewpoint's position and orientation. I believe this behavior is more compatible with mouse navigation, which is also relative to the bound viewpoint. Mitch suggested that viewpoint orientation values do not apply for GearVR. This is consistent with leaving position and orientation values at their defaults, i.e. looking straight ahead.</p>
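<p dir="ltr">The combination step is an ordinary 4x4 multiply of the WebVR per-eye view matrix with the viewpoint-derived view matrix. A sketch, assuming column-major 4x4 arrays as WebVR's VRFrameData delivers them:</p>

```javascript
// Multiply two column-major 4x4 matrices: returns a * b.
function multiply4x4(a, b) {
  const out = new Array(16).fill(0);
  for (let col = 0; col < 4; col++)
    for (let row = 0; row < 4; row++)
      for (let k = 0; k < 4; k++)
        out[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
  return out;
}

// Per frame, per eye (illustrative variable names):
// combinedView = multiply4x4(hmdEyeViewMatrix, viewpointViewMatrix);
```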
<p dir="ltr">The interpupillaryDistance field can be safely ignored since the matrices provided by the VR display already take it into account (after the user has gone through a runtime calibration procedure). Similarly, the fov and width/height aspect ratios are device-specific and already properly accounted for. In addition, the runtime apparently fine-tunes the projection matrix to the lens characteristics.</p>
<p dir="ltr">There is a failure mode when 'RIGHT_VR' is requested but no VR display is available. Currently, I plan to fall back to 'RIGHT_EYE'.</p>
<p dir="ltr">My extension also has an additional SFInt32 field, 'vrDisplay', which selects the index of the WebVR display to use when multiple devices are available.</p>
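<p dir="ltr">The fallback and the display selection can be sketched together. This is a hypothetical helper, not x3dom code; the displays array would come from the WebVR API's navigator.getVRDisplays() promise:</p>

```javascript
// Resolve a requested stereoMode against the available VR displays:
// if a '*_VR' mode is requested but no display exists at the requested
// 'vrDisplay' index, degrade to the IPD-based '*_EYE' mode.
function resolveStereoMode(stereoMode, displays, vrDisplayIndex) {
  const vrRequested = stereoMode === 'LEFT_VR' || stereoMode === 'RIGHT_VR';
  const display = displays[vrDisplayIndex];
  if (vrRequested && !display) {
    return stereoMode === 'RIGHT_VR' ? 'RIGHT_EYE' : 'LEFT_EYE';
  }
  return stereoMode;
}
```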
<p dir="ltr">The RenderedTexture approach is a low-level solution which requires some changes to a scene for VR purposes. It turns out it is possible to build helpers on top, in the form of injectors or likely protos, which make it easy to apply this approach to existing scenes.</p>
<p dir="ltr">In terms of parallel rendering of stereo pairs, the RenderedTexture approach may not be ideal since the nodes for each eye are independent of each other. Presumably, a smart browser could search for matched pairs of nodes and combine their rendering, but that does not sound very promising.</p>
<p dir="ltr">Andreas </p>
<div class="gmail_extra"><br><div class="gmail_quote">On Dec 21, 2016 6:00 AM, "Roy Walmsley" <<a href="mailto:roy.walmsley@ntlworld.com">roy.walmsley@ntlworld.com</a>> wrote:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div lang="EN-GB" link="#0563C1" vlink="#954F72"><div class="m_-2036425259710453379WordSection1"><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">Hi Andreas,<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">That’s great to see what you have been doing here. Would you like to take your node suggestions one step further and propose the fields that might be specified for each node?<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">What we could do then is, in the new year, is to host an open discussion on WebVR and X3D. 
Contributions from any other readers are welcome, and could lead to a lively discussion on this topic.<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">Thank you for the suggestions,<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">All the best,<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif">Roy<u></u><u></u></span></p><p class="MsoNormal"><span style="font-size:11.0pt;font-family:"Calibri",sans-serif"><u></u> <u></u></span></p><p class="MsoNormal"><b><span lang="EN-US" style="font-size:11.0pt;font-family:"Calibri",sans-serif">From:</span></b><span lang="EN-US" style="font-size:11.0pt;font-family:"Calibri",sans-serif"> x3d-public [mailto:<a href="mailto:x3d-public-bounces@web3d.org" target="_blank">x3d-public-bounces@<wbr>web3d.org</a>] <b>On Behalf Of </b>Andreas Plesch<br><b>Sent:</b> 21 December 2016 05:51<br><b>To:</b> X3D Graphics public mailing list <<a href="mailto:x3d-public@web3d.org" target="_blank">x3d-public@web3d.org</a>><br><b>Subject:</b> [x3d-public] X3D VR<u></u><u></u></span></p><p class="MsoNormal"><u></u> <u></u></p><p>Hi<u></u><u></u></p><p>Working with x3dom and WebVR I am exploring what additions to X3D would be necessary or useful to effectively use current VR hardware such as the Rift, Vive, Gear, or Cardboard.<u></u><u></u></p><p>The proposed RenderedTexture node is currently used in x3dom with special stereo modes to generate left and right views from a single viewpoint in a single GL context.<u></u><u></u></p><p>A Layer for each 
left and right view could also be used with two coordinated but separate viewpoints. This would be Cobweb's pattern.<u></u><u></u></p><p>Since the HMD and its own runtime know best the internal optical characteristics of the display, the preferred interface at least in WebVR are view and projection matrixes directly usable by consumers for 3d graphics generation, and not position, orientation and fov. This leads to a new requirement for X3D to accept these raw matrixes as input.<u></u><u></u></p><p>Since x3dom uses RenderedTexture for VR, I added there additional special stereo modes which directly receive these matrixes from the HMD to be used for rendering at each frame. This in effect accomplishes a first, head and body based, level of navigation. In my first tests (on github) it works well and should be robust across devices. This approach does require some standard API to the HMD such as WebVR.<u></u><u></u></p><p>Another possibility is to add view and projection matrix fields input fields to viewpoints which can be continually updated. One could convolve the view matrix with position/orientation fields, or optionally completely ignore them.<u></u><u></u></p><p>One then could use external SAI to keep updating or perhaps introduce an environment ProjectionSensor. It would relay these matrix events from an HMD runtime to then be routed to the viewpoints.<u></u><u></u></p><p>A second level of navigation is accomplished with handheld controllers. Until some standard gestures evolve, it will be necessary to expect custom per scene navigation. X3d should have a way to sense buttons and axes velocities of such controllers so these are then available to manipulate the view in some way. InstantReality has a general IOSensor to that effect. 
On the web the GamePad API is a standard which would need to be used.<u></u><u></u></p><p>Summary: RenderedTexture with stereo modes, matrix fields for viewpoints, a ProjectionSensor, and a ControllerSensor would all be candidates for x3d VR support.<u></u><u></u></p><p>Thanks for reading, any thoughts welcome,<u></u><u></u></p><p>Andreas <u></u><u></u></p></div></div></blockquote></div></div>