[x3d-public] CreateHumanoid
John Carlson
yottzumm at gmail.com
Wed Dec 3 15:53:33 PST 2025
JSON parsing:
javascript_value = JSON.parse(json_string);
Walking the parsed javascript_value (the abstract syntax tree) to create a DOM document:
https://github.com/coderextreme/X3DJSONLD/blob/master/src/main/node/X3DJSONLD.js
That’s just JavaScript. This code is already in X3DOM and X_ITE, no need
to grab mine. It’s probably under something like JSON_Parser.js.
So take JSON and convert it to scenegraph. I’ve been working on that in
Pascal. The only thing I’ve been successful at is converting JSON to DOM
documents…my bread and butter. I’ve also converted C++ to DOM documents.
If you escape a " with \ in JSON, there should be no surprises as to the
parsed result.
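For instance, the backslash-escaped quote comes back as a literal quote after parsing:

```javascript
// The JSON text contains \" escapes; JSON.parse turns them back into
// plain double quotes in the resulting JavaScript value.
var json_string = '{"name": "say \\"hello\\""}';
var javascript_value = JSON.parse(json_string);
// javascript_value.name === 'say "hello"'
```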
If you want to take over the JSON to scenegraph task, that would be
terrific!
John
On Wed, Dec 3, 2025 at 5:30 PM GPU Group <gpugroup at gmail.com> wrote:
> "SFJSON/MFString to route to script from reader?"
> Good question. The MakeHuman files have one .obj (with the main mesh) and
> a bunch of .json files with modifiers, and a few images.
> One option is a FileLoader node that reads any file as a binary blob, and
> puts it in an SFImage (1 row, n columns, where n is the size of the file in
> bytes) then there could be Script node javascript parsers to parse the
> files. JSON has "" strings in the middle of it, so MFString wouldn't work
> as a carrier / transport for .json. A Script node script function could go
> from blob to filling out fields with the specific information. How would
> json parsing work in javascript?
> -Doug
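> A sketch of that blob-to-JSON step, assuming the (hypothetical) FileLoader
> hands the Script the file as an array of byte values, e.g. the pixel array
> of a 1-row SFImage:

```javascript
// Sketch only: "bytes" stands in for the byte array a FileLoader-style
// node might deliver. Convert the bytes to a string, then JSON.parse it.
function bytesToJSON(bytes) {
  var chars = [];
  for (var i = 0; i < bytes.length; i++) {
    // String.fromCharCode covers the ASCII range; a real implementation
    // would use TextDecoder for full UTF-8.
    chars.push(String.fromCharCode(bytes[i]));
  }
  return JSON.parse(chars.join(''));
}
```

> From there, a Script function can copy the specific values into typed fields.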
>
> On Wed, Dec 3, 2025 at 3:55 PM John Carlson <yottzumm at gmail.com> wrote:
>
>> And SFJSON/MFString to route to script from reader?
>>
>> On Wed, Dec 3, 2025 at 10:28 AM GPU Group via x3d-public <
>> x3d-public at web3d.org> wrote:
>>
>>> CreateHumanoid: MakeHuman adapted as a proto with first type
>>> HAnimHumanoid, and maybe a (new node type) FileReader for coveroid assets
>>> in JSON, like MakeHuman.
>>> -Doug
>>>
>>> H: for web3d discussions, don't say 'makehuman' or 'MPFB2'. Say
>>> createHumanoid. Then it's more generic.
>>> Copilot Prompt:
>>> What AI systems could I use to modify an existing computer program? In
>>> particular I'd like to adapt MakeHuman, or its MPFB Blender Python variant,
>>> to a Web3D Script node in JavaScript.
>>> Copilot Response:
>>>
>>> *Quick Answer:* To adapt *MakeHuman* or its MPFB Blender Python variant
>>> into a Web3D Script Node in JavaScript, you’ll want AI systems that
>>> specialize in *code translation, refactoring, and cross-language
>>> adaptation*. The most relevant tools are *GitHub Copilot, Amazon
>>> CodeWhisperer, OpenAI Codex, Tabnine, Replit Ghostwriter, and Syntha AI*,
>>> which can help you convert Python logic into JavaScript while preserving
>>> the 3D modeling functionality (blog.bluelupin.com
>>> <https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/>,
>>> syntha.ai).
>>> ------------------------------
>>> 🔧 AI Systems for Code Modification & Translation
>>>
>>> 1. *GitHub Copilot*
>>>
>>> - Built on advanced language models.
>>> - Excellent at *inline code suggestions* and adapting snippets
>>> between languages.
>>>    - Can help translate Python-based geometry or Blender API calls into
>>>    JavaScript/Web3D equivalents (blog.bluelupin.com).
>>>
>>> 2. *Amazon CodeWhisperer*
>>>
>>> - Strong at *cross-language code generation*.
>>>    - Useful for adapting *Python scripts into JavaScript*, especially
>>>    when targeting web frameworks or APIs (blog.bluelupin.com).
>>>
>>> 3. *OpenAI Codex (API-based)*
>>>
>>> - Specialized in *code conversion and explanation*.
>>>    - Can take Python functions (e.g., MPFB mesh generation) and output
>>>    equivalent JavaScript for Web3D Script nodes (blog.bluelupin.com).
>>>
>>> 4. *Tabnine*
>>>
>>> - Predictive AI coding assistant.
>>>    - Helps with *refactoring and filling in missing logic* when porting
>>>    code across languages (blog.bluelupin.com).
>>>
>>> 5. *Replit Ghostwriter*
>>>
>>> - Integrated into Replit IDE.
>>> - Supports *multi-language translation* and debugging.
>>>    - Handy if you want a *cloud-based environment* to test Python-to-JS
>>>    conversions (blog.bluelupin.com).
>>>
>>> 6. *Syntha AI*
>>>
>>> - Offers *code conversion agents* specifically designed for language
>>> translation.
>>>    - Includes tools like *Code Converter* and *Code Explainer* to move
>>>    from Python to JavaScript while documenting the changes (syntha.ai).
>>>
>>> ------------------------------
>>> 🧩 How These Apply to Your Case
>>>
>>> - *MakeHuman / MPFB Blender Python Variant*: These rely heavily on
>>> Python for mesh generation, parameterized models, and Blender integration.
>>> - *Web3D Script Node in JavaScript*: Requires porting logic into
>>> JavaScript, likely using *X3DOM, WebGL, or Three.js* to handle 3D
>>> rendering.
>>>
>>> Workflow Example:
>>>
>>> 1. *Extract core mesh generation logic* from MPFB Python scripts.
>>> 2. Use *Codex or Syntha AI* to translate Python functions into
>>> JavaScript.
>>> 3. Refactor with *Copilot or CodeWhisperer* to integrate into Web3D
>>> Script Node APIs.
>>> 4. Test rendering in a Web3D framework (e.g., X3DOM or Three.js).
>>>
>>> ------------------------------
>>> ⚡ Key Considerations
>>>
>>> - *API Differences*: Blender’s Python API vs. Web3D’s JavaScript API
>>> will require *manual adaptation*, not just direct translation.
>>> - *Performance*: Web3D runs in browsers, so optimization for *real-time
>>> rendering* is crucial.
>>> - *Hybrid Approach*: AI tools can automate translation, but you’ll
>>> need to guide them with *domain-specific knowledge* of 3D modeling.
>>>
>>> ------------------------------
>>>
>>> ✅ *Best Recommendation:* Start with *OpenAI Codex or Syntha AI* for
>>> initial Python-to-JS translation, then refine with *GitHub Copilot*
>>> inside your IDE to adapt the code to Web3D Script Node standards.
>>>
>>> Would you like me to sketch out a *step-by-step migration plan* (Python
>>> → JavaScript → Web3D integration) so you can see exactly how to structure
>>> the adaptation?
>>>
>>> yes.
>>> Migration plan from MPFB/MakeHuman Python to Web3D Script node in
>>> JavaScript
>>>
>>> You’re taking a parametric, Python-based human generator and porting it
>>> to a browser runtime that’s event-driven, sandboxed, and GPU-backed. The
>>> crux is to isolate pure geometry and parameter logic from Blender/MakeHuman
>>> APIs, then rebuild those parts against Web3D/X3D Script node semantics and
>>> WebGL-backed rendering.
>>> ------------------------------
>>> Scope and target environment
>>>
>>> - *Target runtime:* JavaScript in Web3D/X3D (e.g., X3DOM or X_ITE),
>>> using a Script node to generate/update geometry.
>>> - *Rendering backend:* Browser WebGL via the Web3D engine.
>>> - *Output format:* X3D geometry nodes (IndexedFaceSet + Coordinate +
>>> Normal + TextureCoordinate), materials, and possibly morph updates.
>>>
>>> ------------------------------
>>> Phase 1 — Audit and isolation (Python side)
>>>
>>> - *Inventory modules:* Identify files/functions responsible for
>>> - Parameter schema (age, gender, height, BMI, muscle, etc.)
>>> - Topology definition (vertex count, face indices)
>>> - Morph targets / sliders (vertex delta arrays)
>>> - UV layout and materials
>>> - Rigging (defer initially; focus on static mesh + morphs)
>>>    - *Separate pure logic from host APIs:*
>>>       - *Mesh math:* Extract functions that compute vertices, indices, normals, UVs.
>>>       - *Morph system:* Gather data structures mapping slider values to vertex deltas.
>>>       - *Asset loading:* Note where files (OBJ, JSON, MHX2) are read; plan browser-friendly formats.
>>>    - *Define a minimal core:* A Python “core” that, given parameters, returns:
>>>       - *Vertices:* Float32 array, shape N×3
>>>       - *Indices:* Uint32 array, shape M×3
>>>       - *Normals:* optional, or compute in JS
>>>       - *UVs:* Float32 array, shape N×2
>>>
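>>> A sketch of that output contract in JavaScript terms (the function name
>>> and the sizing are illustrative, not MPFB code):

```javascript
// Illustrative shape of the minimal core's output: typed arrays sized
// for N vertices and M triangles.
function makeCoreOutput(nVerts, nTris) {
  return {
    vertices: new Float32Array(nVerts * 3), // N x 3 positions
    indices:  new Uint32Array(nTris * 3),   // M x 3 triangle indices
    normals:  new Float32Array(nVerts * 3), // optional; can be computed in JS
    uvs:      new Float32Array(nVerts * 2)  // N x 2 texture coordinates
  };
}
```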
>>> ------------------------------
>>> Phase 2 — Target design (Web3D/X3D and JS)
>>>
>>> - *Choose Web3D stack:* X3DOM or X_ITE; confirm Script interface
>>> (fields, events).
>>>    - *Data structures in JS:*
>>>       - *Param object:* A JSON schema for sliders and constraints.
>>>       - *Geometry buffers:* Typed arrays for coordinates, indices, normals, UVs.
>>>    - *X3D nodes mapping:*
>>>       - *Geometry:* IndexedFaceSet with Coordinate, Normal, TextureCoordinate.
>>>       - *Materials:* Appearance/Material, optionally ImageTexture.
>>>    - *Event flow:*
>>>       - *Input:* Script node receives parameters via fields or UI.
>>>       - *Compute:* Update buffers; fire changed events to push values into the geometry nodes.
>>>
>>> Example Script node skeleton:
>>>
>>> <X3D>
>>> <Scene>
>>> <Shape>
>>> <Appearance>
>>> <Material diffuseColor='0.8 0.7 0.6'/>
>>> </Appearance>
>>> <IndexedFaceSet DEF="humanIFS" coordIndex="" solid="true">
>>> <Coordinate DEF="humanCoord" point=""/>
>>> <Normal DEF="humanNormal" vector=""/>
>>> <TextureCoordinate DEF="humanUV" point=""/>
>>> </IndexedFaceSet>
>>> </Shape>
>>>
>>> <Script DEF="HumanGenerator" directOutput="true">
>>>   <field name="set_params" type="SFString" accessType="inputOnly"/>
>>> <field name="coord" type="SFNode" accessType="initializeOnly">
>>> <Coordinate USE="humanCoord"/>
>>> </field>
>>> <field name="ifs" type="SFNode" accessType="initializeOnly">
>>> <IndexedFaceSet USE="humanIFS"/>
>>> </field>
>>> <field name="normal" type="SFNode" accessType="initializeOnly">
>>> <Normal USE="humanNormal"/>
>>> </field>
>>> <field name="uv" type="SFNode" accessType="initializeOnly">
>>> <TextureCoordinate USE="humanUV"/>
>>> </field>
>>>   <field name="set_update" type="SFBool" accessType="inputOnly"/>
>>> <![CDATA[ecmascript:
>>> function initialize() {
>>> // bootstrap default geometry
>>> }
>>> function set_params(p) {
>>> var config = JSON.parse(p);
>>> // recompute geometry with config
>>> computeHuman(config);
>>> pushBuffers();
>>> }
>>> function set_update(flag) {
>>> if (flag) { pushBuffers(); }
>>> }
>>> function computeHuman(config) {
>>> // TODO: implement param->geometry
>>> }
>>> function pushBuffers() {
>>> // Assign coord.point, ifs.coordIndex, normal.vector, uv.point
>>> }
>>> ]]>
>>> </Script>
>>> </Scene>
>>> </X3D>
>>>
>>> ------------------------------
>>> Phase 3 — Porting the core logic
>>>
>>>    - *Parameter schema translation:*
>>>       - *Constraints:* Preserve ranges, dependencies, and anatomical constraints.
>>>       - *Units:* Keep consistent scaling; define meters as the base unit for WebGL.
>>>    - *Topology and indices:*
>>>       - *Fixed topology:* Port static index arrays as JSON or binary blobs.
>>>       - *Variants:* If topology varies by gender/age, load the appropriate set at init.
>>>    - *Morph targets:*
>>>       - *Data format:* Store each slider’s vertex delta as a Float32Array aligned to the topology.
>>>       - *Application:* In JS, compute vertices: base + Σ(weight_i × delta_i).
>>>    - *Normals and UV:*
>>>       - *Normals:* Start by computing per-face; then switch to per-vertex with area/angle weighting.
>>>       - *UV:* Use the base UV; ensure no seams break under morphs.
>>>    - *Blender/MH API replacement:*
>>>       - *Mesh ops:* Replace bpy functions with your own array operations.
>>>       - *Materials:* Map materials to X3D Material/ImageTexture; bake procedural textures as images if needed.
>>>    - *Serialization pipeline:*
>>>       - *Assets:* Host the base mesh and morphs as .bin (ArrayBuffer) or .json; fetch via HTTP.
>>>       - *Caching:* Use IndexedDB for offline caching of large arrays.
>>>
>>> Minimal JS compute example (morph application):
>>>
>>> // base: Float32Array length 3*N
>>> // deltas: Array of Float32Array length 3*N
>>> // weights: Array of numbers length K
>>> function applyMorphs(base, deltas, weights) {
>>> const out = new Float32Array(base.length);
>>> out.set(base);
>>> for (let k = 0; k < deltas.length; k++) {
>>> const w = weights[k];
>>> if (!w) continue;
>>> const d = deltas[k];
>>> for (let i = 0; i < out.length; i++) out[i] += w * d[i];
>>> }
>>> return out;
>>> }
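>>> The per-vertex normal pass can be sketched the same way (accumulating raw
>>> cross products gives area weighting for free; this is generic mesh code,
>>> not ported from MPFB):

```javascript
// Accumulate each triangle's unnormalized face normal (a cross product,
// whose length is twice the triangle area) into its three vertices,
// then normalize per vertex.
// positions: Float32Array length 3*N, indices: index array length 3*M.
function computeVertexNormals(positions, indices) {
  const normals = new Float32Array(positions.length);
  for (let t = 0; t < indices.length; t += 3) {
    const a = 3 * indices[t], b = 3 * indices[t + 1], c = 3 * indices[t + 2];
    const abx = positions[b] - positions[a],
          aby = positions[b + 1] - positions[a + 1],
          abz = positions[b + 2] - positions[a + 2];
    const acx = positions[c] - positions[a],
          acy = positions[c + 1] - positions[a + 1],
          acz = positions[c + 2] - positions[a + 2];
    // Cross product ab x ac
    const nx = aby * acz - abz * acy;
    const ny = abz * acx - abx * acz;
    const nz = abx * acy - aby * acx;
    for (const v of [a, b, c]) {
      normals[v] += nx; normals[v + 1] += ny; normals[v + 2] += nz;
    }
  }
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len; normals[i + 1] /= len; normals[i + 2] /= len;
  }
  return normals;
}
```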
>>>
>>> ------------------------------
>>> Phase 4 — Performance and UX
>>>
>>>    - *Typed arrays and transfer:*
>>>       - *Buffers:* Use Float32Array/Uint32Array; avoid frequent reallocations.
>>>       - *Worker:* Move heavy morph computations to Web Workers; transfer ArrayBuffers in messages.
>>>    - *Incremental updates:*
>>>       - *Slider changes:* Recompute only affected regions if your morph system supports locality; otherwise do a full recompute.
>>>    - *LOD and decimation:*
>>>       - *Multiple LODs:* Offer low/high topology; switch based on device performance.
>>>    - *GPU considerations:*
>>>       - *Normals on GPU:* Optionally compute normals in a vertex shader from morphs if the engine allows custom shaders.
>>>    - *UI integration:*
>>>       - *Binding:* Bind sliders to Script params via routes/events; throttle updates.
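>>> The buffer-reuse point can be sketched by closing over one output array
>>> (a generic pattern, not any engine's API):

```javascript
// Allocate the output buffer once and reuse it for every recompute,
// instead of creating a new Float32Array per slider event.
function makeMorphApplier(base, deltas) {
  const out = new Float32Array(base.length); // single allocation
  return function apply(weights) {
    out.set(base);
    for (let k = 0; k < deltas.length; k++) {
      const w = weights[k];
      if (!w) continue; // skip zero-weight morphs
      const d = deltas[k];
      for (let i = 0; i < out.length; i++) out[i] += w * d[i];
    }
    return out; // same buffer every call
  };
}
```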
>>>
>>> ------------------------------
>>> Phase 5 — Testing, validation, and roadmap
>>>
>>>    - *Correctness:*
>>>       - *Snapshot tests:* Compare JS vertices against Python outputs for a set of parameter vectors.
>>>       - *Visual checks:* Side-by-side renders to catch UV stretching or normal artifacts.
>>>    - *Numerical stability:*
>>>       - *Constraints:* Clamp extreme combinations; add soft regularization to avoid self-intersections.
>>>    - *Interoperability:*
>>>       - *Export:* Provide GLB/OBJ export for downstream verification.
>>>    - *Future features:*
>>>       - *Rigging:* Port the armature and skin weights; map to X3D HAnim or custom animation.
>>>       - *Textures:* Parameterized texture generation or material variants.
>>>       - *Persistence:* Save parameter sets; reproducible IDs for models.
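>>> The clamping step can be as small as this (the schema shape here is an
>>> assumption for illustration, not the MPFB format):

```javascript
// Clamp each parameter into the [min, max] range declared in a schema
// object such as { height: { min: 1.4, max: 2.1 }, ... }. Missing
// parameters default to the midpoint of their range.
function clampParams(params, schema) {
  const out = {};
  for (const key of Object.keys(schema)) {
    const { min, max } = schema[key];
    const v = params[key] !== undefined ? params[key] : (min + max) / 2;
    out[key] = Math.min(max, Math.max(min, v));
  }
  return out;
}
```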
>>>
>>> ------------------------------
>>> Suggested next steps
>>>
>>> - *Define your parameter JSON* with ranges and a small test suite of
>>> 10 preset bodies.
>>> - *Extract and freeze the base topology and 5–10 key morphs* from
>>> MPFB to validate the pipeline.
>>> - *Pick X3DOM or X_ITE* and wire the Script node to push buffers
>>> into IndexedFaceSet.
>>> - *Run cross-checks*: emit vertices from both Python and JS for
>>> identical parameters; assert max deviation under a small epsilon.
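>>> The cross-check can be a small helper over the two vertex arrays:

```javascript
// Maximum absolute component-wise deviation between two vertex arrays
// (e.g. the Python reference output vs. the JS port).
function maxDeviation(a, b) {
  if (a.length !== b.length) throw new Error("vertex count mismatch");
  let max = 0;
  for (let i = 0; i < a.length; i++) {
    const d = Math.abs(a[i] - b[i]);
    if (d > max) max = d;
  }
  return max;
}
```

Usage: `if (maxDeviation(pyVerts, jsVerts) > 1e-5) throw new Error("ports diverge");`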
>>>
>>> If you share a snippet of the MPFB functions that generate
>>> vertices/morphs, I can map them one-to-one into the Script node compute
>>> stubs and propose exact data formats for the assets.
>>> /Copilot
>>> _______________________________________________
>>> x3d-public mailing list
>>> x3d-public at web3d.org
>>> http://web3d.org/mailman/listinfo/x3d-public_web3d.org
>>>
>>