[x3d-public] Allowing Normal node to accompany PointSet, then achieving well-defined rendering

Don Brutzman brutzman at nps.edu
Fri Dec 27 18:32:57 PST 2019

Thanks John for considering and Andreas for the field-by-field "deep dive" analysis.

First may I suggest a "combination visualization" use case: any production of a point mesh with normal and possibly color/texture information is worth looking at in a variety of ways - points, lines, and polygons.  Often the superposition of (or switching among) such renderings can yield insight regarding connectivity, numerical noisiness, and other characteristics of interest.  A presentation technique already in fairly common practice is to superimpose lines over polygons in order to emphasize the edges (and this is where DEF-USE comes into play).
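As a small illustration of that edge-emphasis technique (node names like PTS are just for this sketch), a shared Coordinate node can be DEF'ed once and re-USEd so the line rendering exactly tracks the polygon rendering:

```xml
<!-- Filled triangle -->
<Shape>
  <Appearance><Material diffuseColor='0.8 0.8 0.8'/></Appearance>
  <IndexedFaceSet coordIndex='0 1 2 -1'>
    <Coordinate DEF='PTS' point='0 0 0  1 0 0  0 1 0'/>
  </IndexedFaceSet>
</Shape>
<!-- Same vertices re-USEd to emphasize the edges -->
<Shape>
  <Appearance><Material emissiveColor='1 1 0'/></Appearance>
  <IndexedLineSet coordIndex='0 1 2 0 -1'>
    <Coordinate USE='PTS'/>
  </IndexedLineSet>
</Shape>
```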

Second may I suggest that there is likely to be a lot of innovation in such visualizations as 3D scanning becomes commonplace.  We'll want to look at meshes in a number of ways, including nonphotorealistic rendering (NPR).

So, it seems prudent not to prohibit the inclusion of any such parameter information in a scanned mesh that is being used for point and/or line renderings.  Hence the recommendation to simply use X3DComposedGeometryNode consistently throughout.
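Under that recommendation, a PointSet might accept normal (and texCoord) children just as the polygonal nodes do.  A sketch only, since this X3Dv4 syntax is not yet specified:

```xml
<Shape>
  <Appearance><Material/></Appearance>
  <PointSet>
    <Coordinate point='0 0 0  1 0 0  0 1 0'/>
    <!-- hypothetical in X3Dv4: per-point normals enabling lighting/shading -->
    <Normal vector='0 0 1  0 0 1  0 0 1'/>
  </PointSet>
</Shape>
```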

Looks like the relevant geometry nodes are PointSet, LineSet and IndexedLineSet.

As we work through the specification prose for various point/line combinations with normals, colors, and textures, a helpful design rule of thumb might be "how does polygonal rendering draw that particular point or line?"  Consistency is helpful for implementers and authors.

If I dare attempt to remember back that far, I believe our VRML motivations for the initial careful distinctions made between polygonal, line and point rendering usually fell along the lines of "you can't do that with lines" and "you can't do that with points."  So in effect any harmonization in X3Dv4 object types is benefiting from many advances in 3D graphics rendering.

Default values might be tricky for scan results that don't utilize all fields... but at least it is good to have a default, so that ongoing progress in scanning hardware/software/tools can strive to achieve consistent results.  Such datasets are also "fixable" by authors and tools if the default values (for example, booleans like ccw etc.) don't initially match the actual data characteristics.

Suggest we square away all this work before considering new capabilities such as Tangents, in order to avoid confusion.  When we do get to that stage, a good question will be "why not convert your tangent information into a Normal node?"  Any "new" capability must show how it differs from existing capabilities.

[For simplicity I pushed all of these responses to the top Andreas - am planning to use your field-by-field analysis below when we review specification details.]

Again thanks everyone for continuing to think this through, feels like a path to convergence is becoming possible.

Have fun with X3Dv4 points and lines!  8)

On 12/27/2019 11:08 AM, John Carlson wrote:
> Sounds good, Don.   Are there any other geometry nodes we need to inherit from composed node?
> John

On 12/27/2019 1:10 PM, Andreas Plesch wrote:
> On Wed, Dec 25, 2019 at 6:06 PM Don Brutzman <brutzman at nps.edu> wrote:
>> Continuing with the subject concept, I think several issues are fairly clear.
>> a.  Since many 3D scanning tools generate normals as part of their producing point clouds and meshes, there is a clear use case for including Normal vectors with point values in X3D models.
> Agreed. I think we can assume that the generated normals are meant to
> represent (an estimate of) the normal of a plane tangential to the
> surface at the scanned point.
>> b.  When such information is available, there remain a number of issues with point rendering that need to be sorted out.
> Does that mean when no normals are available we want to keep the
> current Point cloud proposal (as long as there are no conflicts) ?
>> c.  Since Coordinate nodes can be re-USEd for making points lines and polygons, having clarity and consistent approaches for node relationships can have benefits for authors and tools.
> Not sure if DEF-USE is relevant.
>> d.  Any approach will need to be workable for classic X3Dv3 rendering and forthcoming X3Dv4 physically based rendering (which in turn is being designed for glTF compatibility).
> ok. I guess "workable" means that existing Shapes with Pointset nodes
> continue to render the same after introducing Point cloud features to
> Pointset.
> PBR may require a Tangent node/texture, in addition to a Normal node/texture.
>> So... I was looking at scene-graph structural relationships in the X3D Schema and DOCTYPE.  Fairly involved with a lot of variation; typically do-able but rather complex.
>> Interestingly, a simplifying principle may be possible.  If we look at X3DComposedGeometryNode, it is implemented by each of the polygonal nodes.
>> * org.web3d.x3d.sai.Rendering.X3DComposedGeometryNode
>>     https://www.web3d.org/specifications/java/javadoc/org/web3d/x3d/sai/Rendering/X3DComposedGeometryNode.html
>> * IndexedFaceSet, IndexedQuadSet, IndexedTriangleFanSet, IndexedTriangleSet, IndexedTriangleStripSet, QuadSet, TriangleFanSet, TriangleSet, TriangleStripSet
>> ==============================
>> 11.3.2 X3DComposedGeometryNode
>> https://www.web3d.org/specifications/X3Dv4Draft/ISO-IEC19775-1v4-WD1/Part01/components/rendering.html#X3DComposedGeometryNode
>> X3DComposedGeometryNode : X3DGeometryNode {
>>     MFNode [in,out] attrib          []   [X3DVertexAttributeNode]
>>     SFNode [in,out] color           NULL [X3DColorNode]
>>     SFNode [in,out] coord           NULL [X3DCoordinateNode]
>>     SFNode [in,out] fogCoord        NULL [FogCoordinate]
>>     SFNode [in,out] metadata        NULL [X3DMetadataObject]
>>     SFNode [in,out] normal          NULL [X3DNormalNode]
>>     SFNode [in,out] texCoord        NULL [X3DTextureCoordinateNode]
>>     SFBool []       ccw             TRUE
>>     SFBool []       colorPerVertex  TRUE
>>     SFBool []       normalPerVertex TRUE
>>     SFBool []       solid           TRUE
>> }
>> ==============================
>> One might reasonably make a case that, similar to use of Normal information, authors might want to apply fog or shader rendering techniques to lines and points.  Sure enough, the X3D specification already supports that:
>> ==============================
>> 11.4.11 PointSet
>> PointSet : X3DGeometryNode {
>>     MFNode [in,out] attrib   []   [X3DVertexAttributeNode]
>>     SFNode [in,out] color    NULL [X3DColorNode]
>>     SFNode [in,out] coord    NULL [X3DCoordinateNode]
>>     SFNode [in,out] fogCoord NULL [FogCoordinate]
>>     SFNode [in,out] metadata NULL [X3DMetadataObject]
>> }
>> ==============================
>> 11.4.9 LineSet
>> LineSet : X3DGeometryNode {
>>     MFNode  [in,out] attrib         []   [X3DVertexAttributeNode]
>>     SFNode  [in,out] color          NULL [X3DColorNode]
>>     SFNode  [in,out] coord          NULL [X3DCoordinateNode]
>>     SFNode  [in,out] fogCoord       NULL [FogCoordinate]
>>     SFNode  [in,out] metadata       NULL [X3DMetadataObject]
>>     MFInt32 [in,out] vertexCount    []   [2,∞)
>> }
>> ==============================
>> So here is a unifying proposal for treating points and lines similarly to polygons: *use X3DComposedGeometryNode consistently throughout*. Such a step simply adds normal and texCoord fields to point and line nodes.  We know that such a change is implementable in source code because such a large number of nodes already shares that abstract node interface.
> Let's confirm that PointSet and LineSet are the only geometry nodes
> which implement X3DGeometryNode directly. I think this is the case.
> Since X3DGeometryNode does not require any fields over X3DNode, it can
> be seen as a non-structural node in the hierarchy and could be
> eliminated without real consequences.
> So far so good.
> X3DComposedGeometryNode was defined to provide a shared basis for all
> polygonal geometries: "A composed geometry node type defines an
> abstract type that composes geometry from a set of nodes that define
> individual components". The individual components are the polygons.
> Apparently, there was a desire to define the node somewhat more
> generally than just for polygons as it could have been named
> X3DPolygonalGeometryNode. Perhaps volumetric components such as
> tetrahedra or hexahedra were envisioned.
> Similarly, going to fewer dimensions, Linesets and Pointsets can be
> seen as being composed. Linesets are composed of vertices,
> connectivity and attached information (very similar to polygons), and
> Pointsets are composed of vertices and information (color/size) for
> each vertex.
> So, yes, there does not seem to be a good reason why Linesets and
> Pointsets should not be considered a composed geometry, in an abstract
> way.
> Then, let's look at the specific fields of X3DComposedGeometryNode.
> Here are all fields which would become additionally available to
> Pointset and Lineset:
>     SFNode [in,out] normal          NULL [X3DNormalNode]
> as desired. Can be used for shading/lighting points.
>     SFNode [in,out] texCoord        NULL [X3DTextureCoordinateNode]
> defaults for Pointsets need to be clarified, but should be the same as
> for other nodes (longest dimension is U).
>     SFBool []       ccw             TRUE
> maybe meaningless for Pointset ? So ignore ?
>     SFBool []       colorPerVertex  TRUE
> Pointset would always be colorPerVertex. So ignored for Pointset ?
> Lineset is currently spec'ed as color always per vertex. It would make
> sense to additionally allow color per segment if colorPerVertex=false.
>     SFBool []       normalPerVertex TRUE
> same. Ignored ?
>     SFBool []       solid           TRUE
> There could be a front and a back side of a point/line with a normal. So
> solid=true would mean only render the front ? Is that useful ? But the
> default should then be "FALSE" for points/lines (and in general, but
> that would be backwards incompatible).
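An inline sketch of the per-segment color idea above (hypothetical, since LineSet currently has no colorPerVertex field and always applies color per vertex):

```xml
<Shape>
  <!-- hypothetical X3Dv4: colorPerVertex='false' gives one color per segment -->
  <LineSet vertexCount='3' colorPerVertex='false'>
    <Coordinate point='0 0 0  1 0 0  1 1 0'/>
    <!-- one polyline of 3 vertices = 2 segments, hence 2 colors -->
    <Color color='1 0 0  0 1 0'/>
  </LineSet>
</Shape>
```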
>> Certainly the specifics of expected rendering will then need to be specified and implemented as appropriate for X3Dv4 upgrades, but we will have much more regularity for all models utilizing X3D coordinate-related data.
>> Such an alignment would have no backwards compatibility issues with prior X3D/VRML content since those extra fields (normal and texCoord) are simply not present in X3Dv3 content.  Further, we have already decided that old-style models are rendered in old-style fashion, side by side with X3Dv4 PBR models, unless some modification has occurred to upgrade their rendering to X3Dv4.
> I think this makes sense. But we still need to sort out textures for
> Pointsets, including perhaps explicitly not dealing with them, and how
> to provide sprite texture data. Also, it is important to keep in mind
> that "not present" means the default value is used.
> Was size of point cloud points as a function of distance from
> viewpoint discussed ?
> Cheers, -Andreas
>> Hope this makes sense!  If so it should simplify and generalize a number of design issues, to good effect.
>> Thanks for considering the possibilities.  I hope that everyone has a great holiday season, full of good will and good cheer towards all.
>> v/r Don
>> On 12/12/2019 11:05 AM, Michalis Kamburelis wrote:
>>> czw., 12 gru 2019 o 19:16 Don Brutzman <brutzman at nps.edu> napisał(a):
>>>> On 12/5/2019 7:01 AM, Andreas Plesch wrote:
>>>>> Hi Michalis,
>>>>> thanks for thinking this through.
>>>> yes
>>>>> To me, then, there are really two options concerning normal maps for Pointset:
>>>>> (A) allow normal maps for Pointsets, and introduce a MFVec3f Tangent
>>>>> and probably Bitangent nodes, similar to the existing Normal node.
>>>>> Similar to the Normal node, values would be automatically computed for
>>>>> continuous surfaces using the MIKKTspace method when not provided.
>>>>> (B) If introducing Tangent nodes is not feasible at this point, do not
>>>>> allow normal maps for Pointsets and rely exclusively on the Normal
>>>>> node to provide normals, per vertex.
>>>> hmmm, aren't tangents unambiguously computable from surface geometry and Normal vectors?  Knowing the surrounding mesh geometry constrains the more-general case of a point.
>>>> If they are computable (tangent, bitangent, whatever) and accurately renderable only when another property or two is specified, then such properties might be included in PointProperties (i.e. tangent=true/false).
>>> The tangent and bitangent are not computable at all for PointSet.
>>> PointSet is missing the necessary connectivity information (that
>>> allows to compute tangents on polygons, like IndexedFaceSet).
>>> Also, "tangent" information is a list of 3D vectors. It's an
>>> additional list of 3D vectors (in addition to all existing stuff, like
>>> coordinates and normals). So it would not look like
>>> "tangent=true/false".
>>> About polygons: While the tangent/bitangent can be computed for
>>> polygons, like IndexedFaceSet, the result is not perfect.
>>> There are multiple ways to do this computation that yield similar but
>>> not exactly the same results. If you want the X3D browser to render
>>> normalmaps in exactly the same way as it was authored (e.g. Blender
>>> baking normalmaps), then the authoring tool should provide explicit
>>> tangent information.
>>> A nice screenshot of what happens if the calculation of tangents
>>> "mismatches" between the renderer and the authoring tool is at
>>> https://gamedev.stackexchange.com/questions/146855/how-do-you-compute-the-tangent-space-vectors-with-normals-given-in-the-mesh
>>> (the screenshot is in the answer,
>>> https://gamedev.stackexchange.com/a/147030 ).
>>> That's why, if we consider adding "tangent" to X3D, it makes sense to
>>> add it to all geometry nodes that already include "normal" field. This
>>> would be consistent. It would also be consistent with glTF.
>>> Regards,
>>> Michalis
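For concreteness, explicit tangents as Michalis describes might hypothetically look like a Tangent node paralleling Normal.  The node and field names below are assumptions, following glTF, which stores tangents as 4-component vectors with the w component holding handedness:

```xml
<IndexedFaceSet coordIndex='0 1 2 -1'>
  <Coordinate point='0 0 0  1 0 0  0 1 0'/>
  <Normal vector='0 0 1  0 0 1  0 0 1'/>
  <!-- hypothetical node and field names, following glTF's VEC4 tangent layout;
       w = +1 or -1 encodes the handedness of the tangent basis -->
  <Tangent vector='1 0 0 1  1 0 0 1  1 0 0 1'/>
</IndexedFaceSet>
```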
>> all the best, Don
>> --
>> Don Brutzman  Naval Postgraduate School, Code USW/Br       brutzman at nps.edu
>> Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
>> X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman

all the best, Don
Don Brutzman  Naval Postgraduate School, Code USW/Br       brutzman at nps.edu
Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman
