[x3d-public] Gaussian splats

GPU Group gpugroup at gmail.com
Sun Nov 30 09:49:40 PST 2025


Hypothesis: Splat rendering could be done with a Proto containing, as its
first node, a Shape whose geometry is a PointSet with an attrib field
containing a (31.4.2) FloatVertexAttribute node carrying 1 float per vertex
representing scale, a programmable shader to render the points as
individually scaled 2D rectangles, a Script node to parse the .ply file into
the PointSet, and a new node type FileLoader to load the .ply file for
parsing.
-Doug
https://www.useblurry.com/blog/anatomy-of-a-ply-file
- shows a variant of .ply used for gaussian splats
- John noted that Aaron's .ply samples don't have spherical harmonic
parameters (so a simpler shader suffices)
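
For reference, here is a minimal offline sketch (Python/NumPy) of the parsing
step such a Script node would perform. It assumes the binary little-endian
property layout commonly seen in 3DGS .ply exports (x/y/z, f_dc_0..2, opacity,
scale_0..2, rot_0..3, all float32, single vertex element); the property names,
the exp/sigmoid activations, and the SH_C0 color conversion are assumptions
drawn from typical splat viewers, not confirmed against Aaron's samples.

import numpy as np

SH_C0 = 0.28209479177387814  # degree-0 spherical harmonic basis constant

def read_splat_ply(path):
    # Parse a binary_little_endian gaussian-splat .ply into per-point arrays.
    # Assumes one vertex element whose properties are all float32.
    with open(path, "rb") as f:
        props, count = [], 0
        while True:
            line = f.readline().decode("ascii").strip()
            if line.startswith("element vertex"):
                count = int(line.split()[-1])
            elif line.startswith("property float"):
                props.append(line.split()[-1])
            elif line == "end_header":
                break
        data = np.frombuffer(f.read(count * len(props) * 4),
                             dtype="<f4").reshape(count, len(props))
    col = {name: i for i, name in enumerate(props)}
    positions = data[:, [col["x"], col["y"], col["z"]]]
    scales = np.exp(data[:, [col["scale_0"], col["scale_1"], col["scale_2"]]])
    opacity = 1.0 / (1.0 + np.exp(-data[:, col["opacity"]]))   # sigmoid
    rgb = 0.5 + SH_C0 * data[:, [col["f_dc_0"], col["f_dc_1"], col["f_dc_2"]]]
    rotation = data[:, [col["rot_0"], col["rot_1"], col["rot_2"], col["rot_3"]]]
    return positions, scales, rgb, np.clip(opacity, 0.0, 1.0), rotation

Note that this layout carries three scale values plus a rotation quaternion
per point, so a single-float FloatVertexAttribute would cover only isotropic
splats; anisotropic splats would need additional attribute channels.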

On Fri, Nov 28, 2025 at 10:24 PM Don Brutzman <don.brutzman at gmail.com>
wrote:

> Just found several more interesting references relating to Gaussian splats.
>
>    - *NY Times, Spatial Journalism: A Field Guide To Gaussian Splatting*
>    - By A.J. Chavar, Oscar Durand, Mint Boonyapanachoti
>    - December 16, 2024
>    - Gaussian splatting holds a lot of promise for 3D recreation and
>    spatial storytelling. It’s faster and more photorealistic than
>    photogrammetry, and much easier to process and interact with than neural
>    radiance fields — giving journalists and readers the best of both worlds.
>    - These advantages are due to the novel way that splats reconstruct 3D
>    scenes. In a splat, people, places, and objects are made up of a point
>    cloud defined by gaussian functions. Each gaussian function is essentially
>    a 2D disc assigned to a point in 3D space with attributes for orientation,
>    color, transparency, and size. When viewed in aggregate, these coalesce
>    into a 3D scene that can very accurately represent certain things that
>    other volumetric captures cannot: reflection, transparency, fine detail and
>    the qualities of the light in a scene.
>    - In this guide, we give an overview of the practical takeaways we
>    learned exploring exploring gaussian splatting for spatial journalism. We
>    tested a variety of capture and processing techniques using a variety of
>    hardware and software, and found solutions on desktop and mobile devices.
>    We also assessed splatting against the benchmarks we typically associate
>    with photogrammetry, the current standard for 3D recreation.
>    - https://rd.nytimes.com/projects/gaussian-splatting-guide
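
To make the "oriented 2D disc" description above concrete, here is a tiny
NumPy sketch of how one splat's transparency falls off across the screen; the
2x2 covariance matrix standing in for orientation and size is an illustrative
assumption (in a real renderer it comes from projecting the 3D gaussian, as in
the SIGGRAPH paper cited further below).

import numpy as np

def splat_alpha(pixel_xy, center_xy, cov2d, opacity):
    # Alpha contribution of one splat at a pixel: opacity * exp(-0.5 * d^T inv(Sigma') d)
    d = np.asarray(pixel_xy, float) - np.asarray(center_xy, float)
    return float(opacity * np.exp(-0.5 * d @ np.linalg.inv(cov2d) @ d))

# Example: a splat stretched along x (variance 9 px^2) vs y (1 px^2),
# sampled one pixel to the right of its center.
cov2d = np.array([[9.0, 0.0],
                  [0.0, 1.0]])
print(splat_alpha((101.0, 100.0), (100.0, 100.0), cov2d, opacity=0.8))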
>
>
>    - *NY Times, Spatial Journalism: Pushing the Limits of Gaussian
>    Splatting for Spatial Storytelling*
>    - By AJ Chavar, Mint Boonyapanachoti, Juan Cabrera, Oscar Durand,
>    Yasmin Elayat, Claudia Miranda, Kojo Opuni
>    - December 16, 2024
>    - Abstract. Advancements in capturing and rendering 3D scenes have
>    the potential to create spatial stories faster and more easily than has
>    previously been possible. Building off of our earlier experiments with
>    photogrammetry and neural radiance fields (NeRF), R&D, with support from
>    the Graphics Department, set out to understand gaussian splatting and
>    continue to explore the future of photography and 3D media. Modern gaussian
>    splatting caught our eye after "3D Gaussian Splatting for Real-Time
>    Radiance Field Rendering" was named one of the best papers of SIGGRAPH
>    2023. While photogrammetry is a tried-and-tested method, it’s extremely
>    labor- and compute-intensive, and it reliably fails to reconstruct certain
>    textures and settings. NeRF technology addresses some of these concerns,
>    but it is extremely difficult to deliver these radiance fields to audiences.
>    We wanted to see how splatting fared under similar circumstances.
>    - https://rd.nytimes.com/projects/gaussian-splatting
>
>
>    - *3D Gaussian Splatting for Real-Time Radiance Field Rendering*
>    - SIGGRAPH 2023 (ACM Transactions on Graphics)
>    - Bernhard Kerbl*, Georgios Kopanas*, Thomas Leimkühler, George Drettakis
>    - Abstract.  Radiance Field methods have recently revolutionized
>    novel-view synthesis of scenes captured with multiple photos or videos.
>    However, achieving high visual quality still requires neural networks that
>    are costly to train and render, while recent faster methods inevitably
>    trade off speed for quality. For unbounded and complete scenes (rather than
>    isolated objects) and 1080p resolution rendering, no current method can
>    achieve real-time display rates.
>    - We introduce three key elements that allow us to achieve
>    state-of-the-art visual quality while maintaining competitive training
>    times and importantly allow high-quality real-time (≥ 100 fps) novel-view
>    synthesis at 1080p resolution.
>    - First, starting from sparse points produced during camera
>    calibration, we represent the scene with 3D Gaussians that preserve
>    desirable properties of continuous volumetric radiance fields for scene
>    optimization while avoiding unnecessary computation in empty space; Second,
>    we perform interleaved optimization/density control of the 3D Gaussians,
>    notably optimizing anisotropic covariance to achieve an accurate
>    representation of the scene; Third, we develop a fast visibility-aware
>    rendering algorithm that supports anisotropic splatting and both
>    accelerates training and allows real-time rendering. We demonstrate
>    state-of-the-art visual quality and real-time rendering on several
>    established datasets.
>    - https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting
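
A hedged NumPy sketch of the anisotropic-covariance step described above:
build Sigma = R S S^T R^T from a per-splat rotation quaternion and scale
vector, then project it to the 2x2 screen-space covariance
Sigma' = J W Sigma W^T J^T, where W is the camera rotation and J is the
Jacobian of the perspective projection. The formulation follows the paper;
the (w, x, y, z) quaternion convention and the focal-length handling below
are illustrative assumptions.

import numpy as np

def quat_to_rot(q):
    # Rotation matrix from a (w, x, y, z) quaternion (normalized first).
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def cov3d(quat, scale):
    # Sigma = R S S^T R^T: anisotropic 3D covariance of one gaussian.
    M = quat_to_rot(np.asarray(quat, float)) @ np.diag(np.asarray(scale, float))
    return M @ M.T

def project_cov(sigma3d, center_cam, W, fx, fy):
    # Sigma' = J W Sigma W^T J^T, keeping the 2x2 screen-space part.
    x, y, z = center_cam                 # gaussian center in camera coordinates
    J = np.array([[fx / z, 0.0, -fx * x / z**2],
                  [0.0, fy / z, -fy * y / z**2]])
    return J @ W @ sigma3d @ W.T @ J.T

sigma = cov3d((1.0, 0.0, 0.0, 0.0), (0.05, 0.02, 0.01))
print(project_cov(sigma, center_cam=(0.1, 0.0, 2.0), W=np.eye(3), fx=800.0, fy=800.0))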
>
>
> all the best, Don
> --
> X3D Graphics, Maritime Robotics, Distributed Simulation
> Relative Motion Consulting  https://RelativeMotion.info
>
>
> On Fri, Nov 28, 2025 at 9:04 PM Don Brutzman <don.brutzman at gmail.com>
> wrote:
>
>> Thank you for exploring this topic Doug.  Interesting developments.
>>
>> Traditionally, .ply files have been used for defining triangle or polygonal
>> meshes, similar in many ways to .stl files.
>>
>> A possible future extension for the X3D Architecture is to define a new
>> InlineGeometry node that can read .stl or .ply files to create a coordinate
>> mesh, perhaps returning a Coordinate node. Such offline conversions for
>> scanned meshes are already possible today.
>>
>> However, I am not sure how gaussian splats can utilize the .ply format.
>> This seems like a relatively new development - do you know where that is
>> written up?
>>
>> PLY references follow.
>>
>>    - Wikipedia PLY (file format)
>>    - https://en.wikipedia.org/wiki/PLY_(file_format)
>>
>>
>>    - Internet Archive: The PLY Polygon File Format, Author: Greg Turk
>>    -
>>    https://web.archive.org/web/20161204152348/http://www.dcs.ed.ac.uk/teaching/cs4/www/graphics/Web/ply.html
>>
>>
>>    - Wikipedia: Gaussian splatting
>>    - https://en.wikipedia.org/wiki/Gaussian_splatting
>>    - (no mention of files or formats that I saw)
>>
>> Since there is a lot of innovation ongoing with Gaussian splats, we want
>> to be thorough and careful when looking at potential X3D standardization.
>>
>> all the best, Don
>> --
>> X3D Graphics, Maritime Robotics, Distributed Simulation
>> Relative Motion Consulting  https://RelativeMotion.info
>>
>>
>> On Sat, Nov 22, 2025 at 9:31 AM GPU Group via x3d-public <
>> x3d-public at web3d.org> wrote:
>>
>>> "Gaussian Splatting"
>>> .ply files appear to deliver polygons similar to X3D IndexedTriangleSet
>>> https://en.wikipedia.org/wiki/PLY_(file_format)
>>> https://www.web3d.org/documents/specifications/19775-1/V4.0/index.html
>>> Methods to get .ply into x3d scene:
>>> 0. pre-process with a non-browser tool to convert to an x3d scene snippet
>>> for inlining (see the sketch after this message)
>>> 1. Special new IndexedTriangleSetPly node with a url field
>>> 2. Proto containing i) first node IndexedTriangleSet, ii) Script node to
>>> process .ply into IndexedTriangleSet, and either:
>>> a) an SAI function for JavaScript to read an arbitrary file URL into a blob
>>> for processing from .ply to IndexedTriangleSet by JavaScript
>>> b) a new node type that reads an arbitrary file into an SFImage as a
>>> blob, plus a route to move the SFImage to the Script for processing from
>>> .ply to IndexedTriangleSet
>>> Q. or is there another method that's easier / already working?
>>> -Doug
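
As an illustration of method 0 above, a minimal offline sketch (Python) that
converts a plain ASCII triangle-mesh .ply into an X3D IndexedTriangleSet
fragment suitable for referencing from an Inline; it assumes x, y, z are the
first three vertex properties and every face line has the form "3 i j k", so
anything fancier is out of scope here.

def ply_to_x3d_its(ply_path, x3d_path):
    # Convert a simple ASCII triangle .ply into an X3D IndexedTriangleSet fragment.
    with open(ply_path) as f:
        lines = [ln.strip() for ln in f]
    n_vert = n_face = 0
    body = []
    for i, ln in enumerate(lines):
        if ln.startswith("element vertex"):
            n_vert = int(ln.split()[-1])
        elif ln.startswith("element face"):
            n_face = int(ln.split()[-1])
        elif ln == "end_header":
            body = lines[i + 1:]
            break
    # Assumes x y z are the first three vertex properties.
    verts = [" ".join(ln.split()[:3]) for ln in body[:n_vert]]
    # Each face line is expected to be "3 i j k" (triangles only).
    faces = [" ".join(ln.split()[1:4]) for ln in body[n_vert:n_vert + n_face]
             if ln.split()[0] == "3"]
    with open(x3d_path, "w") as out:
        out.write('<Shape>\n'
                  '  <IndexedTriangleSet index="%s">\n'
                  '    <Coordinate point="%s"/>\n'
                  '  </IndexedTriangleSet>\n'
                  '</Shape>\n' % (" ".join(faces), ", ".join(verts)))

Methods 1 and 2 in the list above would, in effect, perform this same
transformation inside the browser at load time.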
>>>
>>> On Tue, Nov 18, 2025 at 9:31 AM Bergstrom, Aaron via x3d-public <
>>> x3d-public at web3d.org> wrote:
>>>
>>>> X3D and AI Stakeholders,
>>>>
>>>>
>>>>
>>>> Please join us for the X3D-AI Working Group meeting this Wed, Nov
>>>> 19th at 2pm GMT (9am US Eastern – 6am Pacific)
>>>>
>>>>
>>>>
>>>> The agenda for this week will include the following topics:
>>>>
>>>>    - Metaverse Standards Forum - AI Working Group
>>>>    - Gaussian Splatting – Proposed Node Text
>>>>    - AI Assisted Content Creation
>>>>    - Near-term Activities of the X3D-AI Working Group
>>>>
>>>>
>>>>
>>>> The Zoom link for the meeting and a more detailed agenda can be found
>>>> at the Calendar link below
>>>>
>>>>
>>>> https://www.web3d.org/calendar/2800/x3d-ai-working-group-meeting/2025-11-19t140000-2025-12-17t140000-2026-01-21t140000
>>>>
>>>>
>>>>
>>>> Hope to see everyone there,
>>>>
>>>>
>>>>
>>>> Aaron Bergstrom
>>>>
>>>> X3D-AI Working Group Chair
>>>>
>>>> https://www.web3d.org/working-groups/ai-x3d
>>>>
>>>>
>>

