[x3d-public] X3D and AR
andreasplesch at gmail.com
Wed May 11 10:13:55 PDT 2022
There is some interest in using WebXR for augmented reality (AR) on
phones. WebXR is built-in browser functionality for VR/AR: it provides
pose information (for the phone or headset) and hit testing (locating
a real surface along the look direction).
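For context, a minimal sketch of the WebXR calls involved. The session
mode and feature names ('immersive-ar', 'hit-test', 'anchors') come from
the WebXR Device API and its modules; passing the XR system object in as
a parameter (normally navigator.xr) is just for illustration:

```javascript
// Request an immersive AR session with hit testing enabled.
// Takes the XR system object (normally navigator.xr) as a parameter
// so the logic can be exercised outside a browser; returns the
// session, or null when AR is unavailable.
async function startARSession(xr) {
  if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
    return null; // AR not available on this device/browser
  }
  return xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'], // locate real surfaces
    optionalFeatures: ['anchors'],  // stable real-world poses
  });
}
```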
What are your thoughts on whether X3D is useful for AR as it stands,
and whether there is an opportunity or need for AR-specific X3D nodes
such as sensors, or navigation modes?
In phone AR, the overlay is rendered over the real world, which is
shown as a video-feed background from the rear camera. The overlay
could be an X3D scene. The phone would act as the viewpoint, e.g. one
moves and looks around by moving and rotating the phone. Could this
become a new first-person, non-mouse, non-touch navigation mode? Or is
it just Walk mode for AR?
The primary AR modality requires registering the virtual X3D world
spatially to the video feed (the real world). For geospatial X3D
scenes this registration occurs naturally, but regular X3D scenes need
an additional step. Should this registration step become part of X3D,
or be facilitated by it? WebXR has anchor objects which I think play a
role here.
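At its simplest, registration could be a translation computed against
an anchor: pick a reference point in the virtual scene, obtain an
anchor position from the device, and offset the whole scene so the two
coincide. A hedged sketch (positions are {x,y,z} in the shared WebXR
reference space; the function name is hypothetical):

```javascript
// Compute the translation that registers a virtual reference point
// onto a real-world anchor position. Applying the result as the
// translation of an X3D Transform wrapped around the scene moves the
// reference point onto the anchor. Rotation alignment would be an
// analogous extra step, omitted here.
function registrationOffset(virtualPoint, anchorPosition) {
  return {
    x: anchorPosition.x - virtualPoint.x,
    y: anchorPosition.y - virtualPoint.y,
    z: anchorPosition.z - virtualPoint.z,
  };
}
```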
A typical AR scenario is placing virtual objects (a new sofa) in real
space (your living room). That requires hit testing against real space
to find suitable locations, a capability the AR device provides. Since
TouchSensor only works on the virtual scene, and not on real space, do
we need another sensor? For X3D, the real-world feed would be
equivalent to the X3D Background. So perhaps a BackgroundTouchSensor?
That would only make sense for AR. For desktop or VR such a sensor
could perhaps switch to a "sense anything in the scene" mode.
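Whatever such a sensor is called, on the WebXR side it would boil down
to per-frame hit testing. getHitTestResults and getPose are real WebXR
methods; the wrapper function and its use for sensor output are my own
illustration:

```javascript
// Per-frame hit testing: given an XRFrame and an XRHitTestSource,
// return the pose of the nearest real-world surface hit, or null if
// nothing was hit. This is the raw capability a hypothetical
// BackgroundTouchSensor could expose as, say, a hitPoint_changed
// event; refSpace is the session's XRReferenceSpace.
function nearestHit(frame, hitTestSource, refSpace) {
  const results = frame.getHitTestResults(hitTestSource);
  if (results.length === 0) return null;
  return results[0].getPose(refSpace); // e.g. where to put the sofa
}
```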
Let's stop here. Can augmented or mixed reality work with X3D in some
standard way?