[x3d-public] wondering about Texture mapping specified in material nodes, X3D4 usage tradeoffs and draft meeting agenda

Don Brutzman brutzman at nps.edu
Fri Sep 25 07:21:20 PDT 2020


Thank you Michalis and Andreas for steady improvements.

Today we will review everything to make sure it all makes sense, and summarize next-step changes for the specification.

On 9/24/2020 8:33 AM, Andreas Plesch wrote:
> 
> See below
> 
> ---on the phone---
> 
> On Thu, Sep 24, 2020, 6:34 AM Michalis Kamburelis <michalis.kambi at gmail.com> wrote:
> 
>      From my point of view, we have already progressed here further with
>     Andreas, making some of these questions obsolete :) Today I wrote a
>     summary of our proposals with Andreas, filling in some details:
> 
>     A. https://github.com/michaliskambi/x3d-tests/wiki/Proposal:-Change-mapping-concept-into-a-number-of-xxxTexCoord-fields-in-geometry
> 
>     B. https://github.com/michaliskambi/x3d-tests/wiki/Proposal:-wrap-index-values-in-a-new-node
> 
>     Andreas: Please look at them, to see whether this captures what we
>     spoke about nicely. 
> 
> 
> Thanks for writing this up. I think it covers everything that had come up.
> 
> 
>     It also turns out that we don't need "proposal B
>     (Index node)" urgently, because your "proposal A (xxxTexCoord
>     fields in geometry)" already allows reusing meshes as a whole. So
>     "B" is a good thing to consider anyway, but it becomes a decision
>     entirely independent of "A".
> 
> 
> Agreed. B is independent.
> 
> Also agreed that xxxTexCoordIndex nodes are probably not necessary as explained in the last wiki section.
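To make proposal A concrete, here is a hypothetical sketch of what per-slot texture-coordinate fields on the geometry might look like. The field names (baseTexCoord, normalTexCoord) are my guesses from the proposal title, not settled syntax:

```xml
<Shape>
  <Appearance>
    <PhysicalMaterial>
      <ImageTexture containerField='baseTexture' url='"base.png"'/>
      <ImageTexture containerField='normalTexture' url='"normal.png"'/>
    </PhysicalMaterial>
  </Appearance>
  <IndexedFaceSet coordIndex='0 1 2 -1'>
    <Coordinate point='0 0 0  1 0 0  0 1 0'/>
    <!-- hypothetical per-slot fields on the geometry (proposal A) -->
    <TextureCoordinate containerField='baseTexCoord' point='0 0  1 0  0 1'/>
    <TextureCoordinate containerField='normalTexCoord' point='0 0  1 0  0 1'/>
  </IndexedFaceSet>
</Shape>
```

Because the coordinates live entirely in the geometry, the PhysicalMaterial above could be DEF/USEd on other shapes unchanged.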
> 
> 
>     As for Don's question "what can I live with": Personally:
> 
>     - "I can live with" my "mapping" design. I'm obviously heavily biased
>     here, as it's my own design and it is already tested in implementation.
> 
> 
> If it is possible to get rid of the MultiTexture* node requirements and overloading, it would become less foreign to X3D.
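For comparison, here is a rough sketch of the "mapping" design as described on the wiki: each texture carries an SFString mapping field that is matched by name against a mapping field on a texture coordinate node in the geometry. Exact field placement is per the proposal, so treat this as illustrative only:

```xml
<Shape>
  <Appearance>
    <Material>
      <ImageTexture containerField='diffuseTexture' mapping='uvDiffuse' url='"diffuse.png"'/>
      <ImageTexture containerField='normalTexture' mapping='uvNormal' url='"normal.png"'/>
    </Material>
  </Appearance>
  <IndexedFaceSet coordIndex='0 1 2 -1'>
    <Coordinate point='0 0 0  1 0 0  0 1 0'/>
    <!-- MultiTextureCoordinate holds one coordinate set per mapping name -->
    <MultiTextureCoordinate containerField='texCoord'>
      <TextureCoordinate mapping='uvDiffuse' point='0 0  1 0  0 1'/>
      <TextureCoordinate mapping='uvNormal' point='0 0  0.5 0  0 0.5'/>
    </MultiTextureCoordinate>
  </IndexedFaceSet>
</Shape>
```

The MultiTextureCoordinate wrapper here is exactly the overloading Andreas mentions above.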
> 
> 
>     - I'm afraid "I cannot live" with the design where texture coordinates
>     are inside a material. I gave my arguments above in this thread
>     already :) Texture coordinates are conceptually part of a mesh, not
>     the material, and putting them in the material creates problems.
> 
> 
> agreed.
> 
> 
>     - "I can live" with the solution in "proposal A". And right now I
>     think it's the best way forward (although it also requires most work,
>     to encode it in spec, and to test in implementation; but hey, we
>     worked on this for so long, I want to make X3Dv4 perfect :) ).
> 
> 
> If the mapping approach finds more support, I would not want to delay anything.
> 
> 
>     My proposed schedule for Friday is:
> 
>     - go over any questions (from Don and anyone). I feel that if
>     something was unclear from my email communication or wiki pages, then
>     it is best to answer it "live".
> 
>     - go over our proposals with Andreas (links A B above). If we agree
>     that "this is best", then I can work on encoding one or both of these
>     proposals in the spec (and also implementing them in CGE, to test that
>     we didn't miss some issue). I would start with A and defer B, but
>     that's my point of view.
> 
> 
> Deferring B, e.g. allowing reuse of index fields by making them SFNode values, would be OK with me as well. Conceptually it would be a small change, but it still requires standards work.
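For reference, a hypothetical sketch of proposal B; the node name Index and its fields are invented here from the wiki title ("wrap index values in a new node"). Making the index an SFNode value would let two meshes share one index array:

```xml
<!-- first mesh DEFines the shared index node (hypothetical Index node) -->
<IndexedFaceSet>
  <Index DEF='SharedIndex' containerField='coordIndex' index='0 1 2 -1  2 1 3 -1'/>
  <Coordinate point='0 0 0  1 0 0  0 1 0  1 1 0'/>
</IndexedFaceSet>

<!-- second mesh reuses the same topology with different coordinates -->
<IndexedFaceSet>
  <Index USE='SharedIndex' containerField='coordIndex'/>
  <Coordinate point='0 0 1  2 0 1  0 2 1  2 2 1'/>
</IndexedFaceSet>
```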
> 
> See below for short responses to Don's specific questions.
> 
> 
>     On Wed, Sep 23, 2020 at 20:28 Don Brutzman <brutzman at nps.edu> wrote:
>      >
>      > Really interesting discussion and investigation, thanks for continued review and pushing forward.
>      >
>      > Excerpt and suggestion follow:
>      >
>      > > On 9/22/2020 11:24 AM, Michalis Kamburelis wrote:
>      > >> We cannot really make diffuseTextureCoord, specularTextureCoord inside
>      > >> a mesh -- since diffuse and specular are for Phong. For PBR
>      > >> (PhysicalMaterial) we have base, metallicRoughness.
>      > >>
>      > >> That being said, I believe you're going toward something like my
>      > >> "Example F" on https://github.com/michaliskambi/x3d-tests/wiki/How-to-add-PBR-to-X3D%3F#can-we-make-texture-mapping-using-def--use
>      > >> . This presents a situation where
>      > >>
>      > >> A. the Material and/or Appearance can be reused
>      > >>
>      > >> B. the mapping is performed without the need for SFstring "mapping",
>      > >> instead it is done using DEF / USE
>      > >>
>      > >> C. the IndexedFaceSet cannot be reused
>      > >>
>      > >> So we win A and B, we lose C. Maybe this is a way forward? Preserving
>      > >> C has probably the lowest priority. I can agree here that merely
>      > >> reusing the "meat" of geometry node can be a solution.
>      >
>      > Yes, have been thinking in similar directions.  This is all boiling down to a tradeoff question.  If all the necessary information is known, then _where_ we put the information is the design choice.
>      >
>      > Let's consider whether this is less about DEF-USE bookkeeping, and more about modeling use cases and usage.  Divide and conquer:
>      >
>      > ---
>      >
>      > a. Defining a geometry node (frequently a mesh) can include information about common rendering tasks.
>      >
>      > - For example, texture coordinates (texCoord and texCoordIndex) are defined as part of the mesh so that texture images can be applied appropriately to the mesh.  Different rectangular images thus have comparable effects for a given piece of geometry.
> 
> 
> The main use case for texCoords is simply that there is no automatic way to map texture data from a rectangle to complex geometries.
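Classic (pre-PBR) X3D already illustrates this: the mesh pins each vertex to a point in the texture rectangle, since there is no automatic mapping for arbitrary geometry. For example, applying an image to a unit quad:

```xml
<Shape>
  <Appearance>
    <ImageTexture url='"brick.png"'/>
  </Appearance>
  <IndexedFaceSet coordIndex='0 1 2 3 -1' texCoordIndex='0 1 2 3 -1'>
    <Coordinate point='0 0 0  1 0 0  1 1 0  0 1 0'/>
    <!-- map the quad onto the full image rectangle -->
    <TextureCoordinate point='0 0  1 0  1 1  0 1'/>
  </IndexedFaceSet>
</Shape>
```

Swapping in a different rectangular image leaves the mapping intact, which is the comparable-effects property Don describes above.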
> 
>      >
>      > - We (of course) want best flexibility for both geometry and PBR appearance.
>      >
>      > ---
>      >
>      > b. Defining a PhysicalMaterial (and other PBR Material nodes) includes additional textures which modulate the basic colors in special ways.
> 
> 
> Texture maps are similarly useful for Phong materials.
> 
>      >
>      > - For example, RGB bits in a 2D array (an ImageTexture or PixelTexture perhaps) define special rendering perturbations on the base and emissive colors, normal vectors, etc.
>      >
>      > - Correctly stated?
> 
> 
> Yes. Texture maps for normal vectors are also widely used for Phong materials.
> 
>      >
>      > - It does not seem possible for a modeled IndexedFaceSet to foresee all of the different ways it might be rendered.  Indeed, the PhysicalMaterial might change at run time for whatever reasons.
>      >
>      > - Thus, a given PhysicalMaterial node needs some way to refer to the properties being used to render geometry, either via "mapping" relationships shared by multiple nodes, or each relationship specifically defined by fields.
> 
> 
> It is more standard to take the reverse point of view. A PhysicalMaterial cannot foresee how it might be associated with some geometry in a Shape. Thus, geometries need to have texCoord fields for texture mapping. Only the geometry knows the details of this mapping.
> 
>      >
>      > ---
>      >
>      > c. Do we have functionally viable alternatives?
>      >
>      > - Classic texture image application remains controlled by keeping texture coordinates (texCoord) with the geometry itself,
> 
> 
> I think it is useful to start considering texCoord as having the same function as diffuseTexCoord, and then, in another step, as being a fallback value for any texCoords.
> 
>      >
>      > - parameters for advanced PBR rendering by materials kept with those materials,
> 
> 
> yes, for the texture maps, no for the texCoords.
> 
>      >
>      > - Since DEF/USE can occur in any order, authors and modeling tools can reuse any definition anywhere desired within a model.
> 
> 
> I think the DEF node needs to come first?
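Andreas is correct: in the X3D encodings a DEF must appear before any USE of the same name in document order, so reuse is one-directional. A minimal example:

```xml
<!-- DEF must lexically precede USE -->
<Shape>
  <IndexedFaceSet coordIndex='0 1 2 -1' texCoordIndex='0 1 2 -1'>
    <Coordinate point='0 0 0  1 0 0  0 1 0'/>
    <TextureCoordinate DEF='TC' point='0 0  1 0  0 1'/>
  </IndexedFaceSet>
</Shape>
<Shape>
  <IndexedFaceSet coordIndex='0 1 2 -1' texCoordIndex='0 1 2 -1'>
    <Coordinate point='0 0 1  1 0 1  0 1 1'/>
    <!-- reuses the coordinates DEFed above -->
    <TextureCoordinate USE='TC'/>
  </IndexedFaceSet>
</Shape>
```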
> 
>      >
>      > - what about texCoordIndex: are multiple arrays of values for specular/diffuse/normal/etc. needed when perturbing the colors of a PBR material?
> 
> 
> probably not.
> 
>      >
>      > - if PBR material libraries can be defined that are independent of the geometry they are applied to, does reuse mean they have "mapping" fields embedded that the geometry meshes would need to match?
> 
> 
>      >
>      > ---
>      >
>      > d.  Hopefully this all reduces to a simple choice for our usage tradeoff:  are we optimizing for
>      >
>      > - PBR library reuse (i.e. matching "mapping" fields) or
>      >
>      > - geometry/Material independence (additional fields in Material nodes, more typical X3D approach).
>      >
>      > ---
>      >
>      > e. Next steps for X3D4 specification/examples/implementations.
>      >
>      > Consensus: is everyone able to answer "can I live with that" as our threshold to go forward with X3D4 PBR and glTF?
>      >
>      > - TODO multiple-node "mapping" approach (if selected; see my TODO recap message from about 2 hours ago)
>      >
>      > - TODO Materials field-centric approach (if selected; apply key points from Friday discussion)
>      >
>      > How does this look as default agenda for our discussions Friday?  Hoping we can make the most of our precious time together.  Pending feedback, will post.
>      >
>      > Thanks for continued posts and preparations for Friday.
>      >
>      > all the best, Don
>      > --
>      > Don Brutzman  Naval Postgraduate School, Code USW/Br brutzman at nps.edu
>      > Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
>      > X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman
> 

all the best, Don
-- 
Don Brutzman  Naval Postgraduate School, Code USW/Br       brutzman at nps.edu
Watkins 270,  MOVES Institute, Monterey CA 93943-5000 USA   +1.831.656.2149
X3D graphics, virtual worlds, navy robotics http://faculty.nps.edu/brutzman


