[x3d-public] beyond Blinn-Phong: PBR

Michalis Kamburelis michalis.kambi at gmail.com
Sun Feb 10 11:53:27 PST 2019


When I wrote

"""
Each xxxTexture coordinate should be accompanied by a
xxxTextureCoordinateId ...
"""

I meant

"""
Each xxxTexture field should be accompanied by a xxxTextureCoordinateId
field...
""""

:)

On Sun, Feb 10, 2019 at 20:50 Michalis Kamburelis <
michalis.kambi at gmail.com> wrote:

> 1. As for TextureCoordinates: Each xxxTexture coordinate should be
> accompanied by a xxxTextureCoordinateId , which is an index into node's
> MultiTextureCoordinate (in geometry texCoord). I deliberately omitted this
> for simplicity in my previous mail :), but in reality these are needed.
>
> This is the approach that CommonSurfaceShader is using (
> https://castle-engine.io/x3d_implementation_texturing_extensions.php#section_ext_common_surface_shader
> ) and glTF. CGE already uses it when converting glTF to X3D with
> CommonSurfaceShader, so a normal map can reuse (or not reuse) the texture
> coordinates of the base texture.
>
> So, this is something we already can do easily. MultiTextureCoordinate is
> simple and cooperates nicely with this approach.
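>
> A hypothetical sketch in classic encoding (field names follow the
> xxxTexture / xxxTextureCoordinateId convention discussed here; the
> actual CommonSurfaceShader field names may differ slightly):
>
>   Shape {
>     appearance Appearance {
>       shaders CommonSurfaceShader {
>         diffuseTexture ImageTexture { url "base.png" }
>         diffuseTextureCoordinateId 0    # first coordinate set below
>         normalTexture ImageTexture { url "normal.png" }
>         normalTextureCoordinateId 1     # second coordinate set below
>       }
>     }
>     geometry IndexedFaceSet {
>       texCoord MultiTextureCoordinate {
>         texCoord [
>           TextureCoordinate { ... }
>           TextureCoordinate { ... }
>         ]
>       }
>     }
>   }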
>
> 2. As for reUSEing the nodes: Hm, good point. If you reuse textures, then
> you probably reuse diffuse textures and specular textures and normalmaps at
> the same time. All these textures must be prepared to work on the shapes
> you're applying them on (e.g. having separate areas for each shape, if
> you're reusing the texture to achieve texture atlas optimization).
>
> Maybe, instead of Appearance.normalMap, it is better to just put normalMap
> (and normalMapCoordinateId) in a base material abstract class (like
> X3DOneSidedMaterialNode) and derive both Material and PhysicalMaterial from
> it.
>
> P.S. And change normalMap/normalMapCoordinateId to
> normalTexture/normalTextureCoordinateId, as these names are more
> consistent. CommonSurfaceShader also calls it normalTexture, I guess for
> the same reason.
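>
> In spec-style notation, the refactoring sketched above could look
> roughly like this (a hypothetical sketch, not final field names or
> defaults):
>
>   X3DOneSidedMaterialNode : X3DMaterialNode {
>     SFNode  [in,out] normalTexture             NULL [X3DTexture2DNode]
>     SFInt32 [in,out] normalTextureCoordinateId 0
>     ...
>   }
>
>   Material         : X3DOneSidedMaterialNode { ... }
>   PhysicalMaterial : X3DOneSidedMaterialNode { ... }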
>
> 3. P.P.S. I deliberately avoided thinking for now about what to do with
> TwoSidedMaterial ;) Introducing TwoSidedPhysicalMaterial, for consistency,
> feels uncomfortable -- it would be a node with many fields. Possibly a new
> node like GenericTwoSidedMaterial should be added, with two slots for the
> front and back X3DOneSidedMaterialNode, with the requirement that they
> be of the same class (that is, front and back should both be Material, or
> both should be PhysicalMaterial). I'm not convinced about the usefulness
> of TwoSidedMaterial in practice: while it seems very useful for authors,
> the 3D software I know doesn't support it, so in practice you just create
> a separate mesh with flipped normals and a different material, which
> removes the need for TwoSidedMaterial. So I don't have much of an
> opinion here yet.
>
> Anyway, TwoSidedMaterial is why we currently have X3DMaterialNode in spec.
>
> Regards,
> Michalis
>
> On Sun, Feb 10, 2019 at 17:10 Andreas Plesch <
> andreasplesch at gmail.com> wrote:
>
>> Super input. Just a few remarks.
>>
>> On Fri, Feb 8, 2019 at 5:48 PM Michalis Kamburelis <
>> michalis.kambi at gmail.com> wrote:
>> >
>> > Thanks for many good ideas! I agree with everything, I only want to
>> > add a note to the below paragraph:
>> >
>> > Andreas Plesch <andreasplesch at gmail.com> wrote:
>> > > A node design choice was to add the texture maps as fields to
>> > > PhyicalMaterial rather than Appearance. I think Timo's reasoning was
>> > > that this way all new functionality can be contained in one new node.
>> > > But this is a bit of a departure. Other designs are certainly
>> > > possible.
>> >
>> > I see advantages of this design, where a material node lets you
>> > configure every property with a constant factor (scalar, vector) and
>> > optionally multiply it by a texture.
>> >
>> > CommonSurfaceShader also uses this design; see the CGE docs (which link
>> > to the X3DOM and InstantReality docs):
>> >
>> https://castle-engine.io/x3d_implementation_texturing_extensions.php#section_ext_common_surface_shader
>> >
>> > E.g. in CommonSurfaceShader you have
>> >
>> >   SFVec3f diffuseFactor
>> >   SFNode  diffuseTexture
>> >   SFVec3f emissiveFactor
>> >   SFNode  emissiveTexture
>> >   SFVec3f specularFactor
>> >   SFNode  specularTexture
>> >   .. and so on
>> >
>> > (I'm simplifying a bit, in reality you need at least a way to provide
>> > texture coordinate index to each texture slot, so there are more
>> > fields.)
>>
>> > In my view, we should follow this approach to the consistent end :) So
>> > PhysicalMaterial would have
>> >
>> >   baseColorFactor
>> >   baseColorTexture
>> >   .. and so on
>> >   (or maybe just baseColor, baseTexture)
>>
>> It is called baseColorTexture in glTF, but baseTexture would follow X3D
>> conventions.
>>
>> > and to the regular Material we add
>> >
>> >   diffuseTexture
>> >   specularTexture
>> >   emissiveTexture
>> >   .. and so on
>> >
>> > So all factors (scalars/vectors) have a counterpart texture, and the
>> > texture field is present right next to the non-texture (scalar/vector)
>> > value.
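>>
>> > As a hypothetical sketch of such an extended Material (field names as
>> > proposed in this thread, not final):
>> >
>> >   Material {
>> >     diffuseColor 1 1 1
>> >     diffuseTexture ImageTexture { url "diffuse.png" }
>> >     specularColor 0.2 0.2 0.2
>> >     specularTexture ImageTexture { url "specular.png" }
>> >   }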
>>
>> > Advantages:
>> >
>> > - This is simple for authors. """Question: What does the texture in
>> > slot xxxTexture do? Answer: The same thing as xxxFactor, but as a
>> > texture it lets you vary this material property over the surface.
>> > See the xxx treatment in the lighting equations."""
>> >
>> > - It is simple to implement. Plug the texture xxxTexture into the
>> > shader at the same place where you use xxxFactor.
>> >
>> > Compare this to the current approach of X3D:
>>
>> Perhaps somebody has insight into the underlying reasoning behind the
>> current approach.
>>
>> In a way, the new Materials are on the level of the current Appearance.
>>
>> Structurally, the main question may be what works best for reuse of
>> Appearance, Material and Texture.
>>
>> Currently, reusing a Material gives you the option to use a different
>> diffuse texture with the same set of colors. This does not seem to be a
>> very common requirement. With the new design this flexibility disappears,
>> since colors (as factors) and existing maps are reused together. That is
>> the common requirement.
>>
>> > - The lighting equations say to use Appearance.texture for diffuse,
>> >
>> http://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/components/lighting.html#Lightingmodel
>> > . (And to treat RGB and grayscale textures differently, which is
>> > another point I find bad -- according to the spec, RGB textures should
>> > "replace" by default, while grayscale textures should "modulate".)
>> >
>> > - The multi-texturing specification says that when MultiTexture is
>> > present, "MultiTexture.mode" takes over (and the default mode is
>> > "modulate", regardless of RGB or grayscale,
>> >
>> http://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/components/texturing.html#MultiTexture
>> > ). And the "MultiTexture.source" determines whether the texture
>> > affects the diffuse or specular calculation.
>> >
>> > There are a couple of inconsistencies here. And it's not completely
>> > implemented by X3D browsers, as far as I have tested.
>> >
>> > And it still doesn't provide all the necessary flexibility. E.g. you
>> > can only modify "diffuse" and "specular" by "MultiTexture.source", you
>> > cannot modify "emissive" color, and it's unclear what modifies the
>> > "transparency". Although some of these things could be easily fixed in
>> > the multi-texturing spec, but I think that overall the approach of
>> > PhysicalMaterial/CommonSurfaceShader is better.
>> >
>> > I discussed some of the above in """How does this relate to the
>> > existing X3D multi-texturing nodes?""" on
>> >
>> https://github.com/michaliskambi/x3d-tests/wiki/How-to-add-PBR-to-X3D%3F
>> > . In short, I think we have to keep "Appearance.texture" supported,
>> > and when it's present it should override "Material.diffuseTexture" or
>> > "PhysicalMaterial.baseColorTexture" (depending whether
>> > "Appearance.material" is Material or PhysicalMaterial). But new models
>> > should be advised to instead add textures inside Material or
>> > PhysicalMaterial.
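>>
>> > For instance, in a sketch like this the legacy Appearance.texture
>> > would win, for backward compatibility (field names as proposed above,
>> > hypothetical):
>> >
>> >   Appearance {
>> >     texture ImageTexture { url "legacy.png" }        # takes precedence
>> >     material Material {
>> >       diffuseTexture ImageTexture { url "new.png" }  # overridden above
>> >     }
>> >   }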
>> >
>> > And the normalmap texture (which is not tied to the lighting equations)
>> > should be specified outside of the material, in an "Appearance.normalMap"
>> > field. This is also consistent with glTF, which places "normalTexture"
>> > outside of the "pbrMetallicRoughness" block, recognizing that normalmaps
>> > make sense for all lighting models, since "varying normal vectors over a
>> > surface" makes sense for any algorithm that looks at surface normals.
>> > CGE already has "Appearance.normalMap".
>>
>> Although I agree that normal maps are generated independently of the
>> lighting model, the use of a particular normal map is strongly coupled to
>> the use of a corresponding set of other maps. I do not see a situation where
>> you would want to reuse a PhysicalMaterial and its maps with a different
>> normal map. In fact, reusing a PhysicalMaterial should not require
>> repeating the use of the same normal map (outside of PhysicalMaterial).
>>
>> This does not leave much for Appearance to do, really only
>> TextureTransform, I believe. This may be appropriate, as variations in
>> TextureTransform produce variations in the appearance of the same PBR
>> material.
>>
>> We may also have to discuss the option to use more than one set of
>> texture coordinates for the same geometry, for different maps. I think
>> glTF requires that at least two UV sets be supported. I think x3dom
>> supports this for glTF but may not expose it for X3D. Let me check.
>>
>> Cheers,
>>
>> Andreas
>>
>> >
>> > Sorry for a long train of thought :) Hopefully this is informative.
>> >
>> > Regards,
>> > Michalis
>>
>>
>>
>> --
>> Andreas Plesch
>> Waltham, MA 02453
>>
>