Discussions for Merging X3D AR Proposals
As described in [[Plans for Merging X3D AR Proposals]], here we discuss and produce a merged proposal for each functional component by investigating its features stepwise.
1. Camera video stream image into the scene (texture and background)
- A new node structure for supporting a live camera video stream as a background or texture.
Option 1. Explicit
Defining a node that represents the camera/image sensor, then routing its output to other nodes (e.g. a PixelTexture node, or a new background node such as ImageBackground or MovieBackground); see the sketch after the option list below.
Pros. - Open to being used for other purposes in the future (more extensible)
Cons. - Relatively more complicated to write scenes with and to implement in browsers
Option 2. Implicit
Defining a node that represents the "background" or "texture" itself with user media (either from ...); see the sketch after the option list below.
Pros. - Simpler from the content creator's perspective
- Easier to implement and test, since there is less interaction with other nodes
Cons. - A single-purpose node, which might not be of much use for other purposes
Option 3. Allowing both
Pros. - Lets users choose the option that meets their needs
Cons. - Cost to browser developers of implementing both
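To make the difference between the first two options concrete, here is a minimal sketch in the X3D XML encoding. Only PixelTexture, Shape, Appearance, Box, and ROUTE are existing X3D constructs; the node names CameraSensor and ImageBackground and their fields (image, source) are hypothetical placeholders for whatever names the merged proposal eventually adopts.

<!-- Option 1 (explicit): a dedicated sensor node produces the live video
     frames, and its output is routed to an existing texture node
     (or to a new background node such as ImageBackground). -->
<Scene>
  <CameraSensor DEF='CAM' enabled='true'/>    <!-- hypothetical node -->
  <Shape>
    <Appearance>
      <PixelTexture DEF='LIVE_TEX'/>          <!-- standard X3D node -->
    </Appearance>
    <Box/>
  </Shape>
  <ROUTE fromNode='CAM' fromField='image' toNode='LIVE_TEX' toField='image'/>
</Scene>

<!-- Option 2 (implicit): the background/texture node itself declares that
     its content is the live camera stream, so no ROUTE is required. -->
<Scene>
  <ImageBackground source='CAMERA'/>          <!-- hypothetical node/field -->
</Scene>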
- Selecting a device
Reference: HTML5 getUserMedia() API
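As a sketch only: following the getUserMedia() model, device selection could be exposed as a field on whichever node ends up representing the camera. The node name CameraSensor and the field deviceName below are hypothetical, used only to illustrate where such a selection would live.

<!-- Hypothetical device selection, analogous to a getUserMedia() constraint;
     an empty string would mean "use the default camera". -->
<CameraSensor DEF='CAM' deviceName='front' enabled='true'/>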
2. Tracking (including support for general tracking devices)
3. Camera calibration (viewpoints)
4. Others (color-keying, depth occlusion)