[x3d-public] rendering the X3D standard...is it extensive enough

Michalis Kamburelis michalis.kambi at gmail.com
Wed Oct 18 10:41:50 PDT 2017


Various answers:

- This is not specific to X3D. The same is true of absolutely any 3D
format, and any 3D renderer in existence.

- We *could*, in theory, take the values from the 3D file and perform
perfectly precise calculations of the projection, lighting and other
equations, using some library for arbitrary-precision numbers (
https://en.wikipedia.org/wiki/List_of_arbitrary-precision_arithmetic_software
).

    But that would be prohibitively slow, and also not necessary for
the normal use-cases (when we render the images for humans to view).
So instead we use floating-point arithmetic (
https://en.wikipedia.org/wiki/Floating-point_arithmetic ), which is
not precise. And with various GPUs and various browsers making the
calculations in various orders, with various optimizations, and with
various floating-point precisions (single, double, half-float), the
results cannot be guaranteed to be equal pixel-by-pixel. The small
sketch below shows how the order of operations alone already changes
a floating-point result.
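
    For illustration, here is a minimal sketch in Python (using
double precision on the CPU; GPUs add their own variations on top of
this) of how merely reordering additions changes the result, and how
exact arbitrary-precision arithmetic avoids that, at a cost:

    from fractions import Fraction

    values = [1e16, 1.0, -1e16]

    # Summing left to right: 1e16 + 1.0 rounds back to 1e16,
    # so the 1.0 is lost and the final result is 0.0.
    print((values[0] + values[1]) + values[2])   # 0.0

    # Summing in a different order keeps the 1.0.
    print((values[0] + values[2]) + values[1])   # 1.0

    # With exact (arbitrary-precision rational) arithmetic the order
    # does not matter -- but every operation is far slower.
    exact = [Fraction(v) for v in values]
    print((exact[0] + exact[1]) + exact[2])      # 1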

- If you want to test automatically by comparing the resulting
images, you need to compare with some "tolerance". (I do such tests
for view3dscene around each major release, to catch regressions -- I
compare renders between the old and the new version.) So ignore
differences in colors smaller than some epsilon, and also allow some
percentage of pixels to have a totally different color; a rough
sketch of such a comparison follows at the end of this item. In
practice, this testing is not perfectly automated anyway, and
requires a human eye to judge at the end "what is an important
difference / what is an ignorable detail".

    For this reason, I doubt that the X3D standard certification uses
such automated testing. At best, the automated results can serve as a
starting point, but human testing is still required.

    (Also, X3D is about animations and interactions too. You would
need to simulate clicks, record movies, and compare those... which
introduces even more things that are absolutely not guaranteed (or
required) to be perfectly equal between browsers.)

- I see that this comes as a surprise to you, but it's really the
standard way of doing things. You can ask GPU vendors why they don't
guarantee pixel-perfect equality, but they'll answer the same thing:
it would make the calculations very slow, as it would prohibit a
*lot* of possible optimizations.

Regards,
Michalis


