<div dir="ltr">"SFJSON/MFString to route to script from reader?"<div>Good question. The MakeHuman files have one .obj (with the main mesh) and a bunch of .json files with modifiers, and a few images.</div><div>One option is a FileLoader node that reads any file as a binary blob, and puts it in an SFImage (1 row, n columns, where n is the size of the file in bytes) then there could be Script node javascript parsers to parse the files. JSON has "" strings in the middle of it, so MFString wouldn't work as a carrier / transport for .json. A Script node script function could go from blob to filling out fields with the specific information. How would json parsing work in javascript?</div><div>-Doug</div></div><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">On Wed, Dec 3, 2025 at 3:55 PM John Carlson <<a href="mailto:yottzumm@gmail.com">yottzumm@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="auto">And SFJSON/MFStrimg to route to script from reader?</div><div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Dec 3, 2025 at 10:28 AM GPU Group via x3d-public <<a href="mailto:x3d-public@web3d.org" target="_blank">x3d-public@web3d.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">CreateHumanoid: MakeHuman adapted as proto with first type HAnimHumanoid, and maybe (new node type) FileReader for coveroid assets in json like makehuman.</div><div>-Doug</div><div><br></div><div dir="ltr">H: for web3d discussions, don't say 'makehuman' or 'MPFB2'. Say createHumanoid. Then it's more generic.</div><div>Copilot Prompt: </div><div> What AI systems could I use to modify an existing computer program. In particular I'd like to adapt makehuman or its MPFB blender python variant, to web3d Script node in javascript.</div><div>Copilot Response:</div><div><p><strong>Quick Answer:</strong> To adapt <em>MakeHuman</em> or its MPFB Blender Python variant into a Web3D Script Node in JavaScript, you’ll want AI systems that specialize in <strong>code translation, refactoring, and cross-language adaptation</strong>. The most relevant tools are <strong>GitHub Copilot, Amazon CodeWhisperer, OpenAI Codex, Tabnine, Replit Ghostwriter, and Syntha AI</strong>, which can help you convert Python logic into JavaScript while preserving the 3D modeling functionality <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a> <a href="https://syntha.ai/" target="_blank">syntha.ai</a>.</p>
<hr>
<h2>🔧 AI Systems for Code Modification & Translation</h2>
<h3>1. <strong>GitHub Copilot</strong></h3>
<ul>
<li>Built on advanced language models.</li>
<li>Excellent at <strong>inline code suggestions</strong> and adapting snippets between languages.</li>
<li>Can help translate Python-based geometry or Blender API calls into JavaScript/Web3D equivalents <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a>.</li>
</ul>
<h3>2. <strong>Amazon CodeWhisperer</strong></h3>
<ul>
<li>Strong at <strong>cross-language code generation</strong>.</li>
<li>Useful for adapting <strong>Python scripts into JavaScript</strong>, especially when targeting web frameworks or APIs <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a>.</li>
</ul>
<h3>3. <strong>OpenAI Codex (API-based)</strong></h3>
<ul>
<li>Specialized in <strong>code conversion and explanation</strong>.</li>
<li>Can take Python functions (e.g., MPFB mesh generation) and output equivalent JavaScript for Web3D Script nodes <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a>.</li>
</ul>
<h3>4. <strong>Tabnine</strong></h3>
<ul>
<li>Predictive AI coding assistant.</li>
<li>Helps with <strong>refactoring and filling in missing logic</strong> when porting code across languages <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a>.</li>
</ul>
<h3>5. <strong>Replit Ghostwriter</strong></h3>
<ul>
<li>Integrated into Replit IDE.</li>
<li>Supports <strong>multi-language translation</strong> and debugging.</li>
<li>Handy if you want a <strong>cloud-based environment</strong> to test Python-to-JS conversions <a href="https://blog.bluelupin.com/best-ai-code-tools-for-python-javascript-more-compared/" target="_blank">blog.bluelupin.com</a>.</li>
</ul>
<h3>6. <strong>Syntha AI</strong></h3>
<ul>
<li>Offers <strong>code conversion agents</strong> specifically designed for language translation.</li>
<li>Includes tools like <strong>Code Converter</strong> and <strong>Code Explainer</strong> to move from Python to JavaScript while documenting the changes <a href="https://syntha.ai/" target="_blank">syntha.ai</a>.</li>
</ul>
<hr>
<h2>🧩 How These Apply to Your Case</h2>
<ul>
<li><strong>MakeHuman / MPFB Blender Python Variant</strong>: These rely heavily on Python for mesh generation, parameterized models, and Blender integration.</li>
<li><strong>Web3D Script Node in JavaScript</strong>: Requires porting logic into JavaScript, likely using <strong>X3DOM, WebGL, or Three.js</strong> to handle 3D rendering.</li>
</ul>
<h3>Workflow Example:</h3>
<ol>
<li><strong>Extract core mesh generation logic</strong> from MPFB Python scripts.</li>
<li>Use <strong>Codex or Syntha AI</strong> to translate Python functions into JavaScript.</li>
<li>Refactor with <strong>Copilot or CodeWhisperer</strong> to integrate into Web3D Script Node APIs.</li>
<li>Test rendering in a Web3D framework (e.g., X3DOM or Three.js).</li>
</ol>
<hr>
<h2>⚡ Key Considerations</h2>
<ul>
<li><strong>API Differences</strong>: Blender’s Python API vs. Web3D’s JavaScript API will require <strong>manual adaptation</strong>, not just direct translation.</li>
<li><strong>Performance</strong>: Web3D runs in browsers, so optimization for <strong>real-time rendering</strong> is crucial.</li>
<li><strong>Hybrid Approach</strong>: AI tools can automate translation, but you’ll need to guide them with <strong>domain-specific knowledge</strong> of 3D modeling.</li>
</ul>
<hr>
<p>✅ <strong>Best Recommendation:</strong> Start with <strong>OpenAI Codex or Syntha AI</strong> for initial Python-to-JS translation, then refine with <strong>GitHub Copilot</strong> inside your IDE to adapt the code to Web3D Script Node standards.</p>
<p>Would you like me to sketch out a <strong>step-by-step migration plan</strong> (Python → JavaScript → Web3D integration) so you can see exactly how to structure the adaptation?</p><p>yes.</p><h1>Migration plan from MPFB/MakeHuman Python to Web3D Script node in JavaScript</h1><p>You’re taking a parametric, Python-based human generator and porting it to a browser runtime that’s event-driven, sandboxed, and GPU-backed. The crux is to isolate pure geometry and parameter logic from Blender/MakeHuman APIs, then rebuild those parts against Web3D/X3D Script node semantics and WebGL-backed rendering.</p><hr><h2>Scope and target environment</h2><ul>
<li><strong>Target runtime:</strong> JavaScript in Web3D/X3D (e.g., X3DOM or X_ITE), using a Script node to generate/update geometry.</li>
<li><strong>Rendering backend:</strong> Browser WebGL via the Web3D engine.</li>
<li><strong>Output format:</strong> X3D geometry nodes (IndexedFaceSet + Coordinate + Normal + TextureCoordinate), materials, and possibly morph updates.</li>
</ul><hr><h2>Phase 1 — Audit and isolation (Python side)</h2><ul>
<li><strong>Inventory modules:</strong> Identify files/functions responsible for
<ul>
<li>Parameter schema (age, gender, height, BMI, muscle, etc.)</li>
<li>Topology definition (vertex count, face indices)</li>
<li>Morph targets / sliders (vertex delta arrays)</li>
<li>UV layout and materials</li>
<li>Rigging (defer initially; focus on static mesh + morphs)</li>
</ul></li>
<li><strong>Separate pure logic from host APIs:</strong>
<ul>
<li><strong>Mesh math:</strong> Extract functions that compute vertices, indices, normals, UVs.</li>
<li><strong>Morph system:</strong> Gather data structures mapping slider values to vertex deltas.</li>
<li><strong>Asset loading:</strong> Note where files (OBJ, JSON, MHX2) are read; plan browser-friendly formats.</li>
</ul></li>
<li><strong>Define a minimal core:</strong> A Python “core” that, given parameters, returns the following (a browser-side loading sketch follows this list):
<ul>
<li><strong>Vertices:</strong> Float32 array of shape N×3</li>
<li><strong>Indices:</strong> Uint32 array of shape M×3</li>
<li><strong>Normals:</strong> Optional, or compute in JS</li>
<li><strong>UVs:</strong> Float32 array of shape N×2</li>
</ul></li>
</ul>
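<p>As a concrete target for that core, here is a hedged sketch of how the browser side might fetch and reassemble those buffers. The file names and layout are hypothetical (whatever the exporter actually writes should be substituted), not part of MakeHuman or MPFB:</p><pre><code>// Minimal loader sketch: the Python core exports flat binary buffers,
// and the JS side rebuilds typed arrays from them.
async function loadBaseMesh(baseUrl) {
  // Hypothetical file names; adjust to the exporter's real output.
  const [vertBuf, idxBuf, uvBuf] = await Promise.all([
    fetch(baseUrl + "/vertices.bin").then(r => r.arrayBuffer()),   // N*3 float32
    fetch(baseUrl + "/indices.bin").then(r => r.arrayBuffer()),    // M*3 uint32
    fetch(baseUrl + "/uvs.bin").then(r => r.arrayBuffer())         // N*2 float32
  ]);
  return {
    vertices: new Float32Array(vertBuf),
    indices:  new Uint32Array(idxBuf),
    uvs:      new Float32Array(uvBuf)
  };
}
</code></pre>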
<hr><h2>Phase 2 — Target design (Web3D/X3D and JS)</h2><ul>
<li><strong>Choose Web3D stack:</strong> X3DOM or X_ITE; confirm Script interface (fields, events).</li>
<li><strong>Data structures in JS:</strong>
<ul>
<li><strong>Param object:</strong> A JSON schema for sliders and constraints.</li>
<li><strong>Geometry buffers:</strong> Typed arrays for coordinates, indices, normals, UVs.</li>
</ul></li>
<li><strong>X3D nodes mapping:</strong>
<ul>
<li><strong>Geometry:</strong> IndexedFaceSet with Coordinate, Normal, TextureCoordinate.</li>
<li><strong>Materials:</strong> Appearance/Material, optionally ImageTexture.</li>
</ul></li>
<li><strong>Event flow:</strong>
<ul>
<li><strong>Input:</strong> Script node receives parameters via fields or UI.</li>
<li><strong>Compute:</strong> Update the buffers, then write them to the geometry node fields to push the changes into the scene.</li>
</ul></li>
</ul><p>Example Script node skeleton:</p><pre><code><X3D>
<Scene>
<Shape>
<Appearance>
<Material diffuseColor='0.8 0.7 0.6'/>
</Appearance>
<IndexedFaceSet DEF="humanIFS" coordIndex="" solid="true">
<Coordinate DEF="humanCoord" point=""/>
<Normal DEF="humanNormal" vector=""/>
<TextureCoordinate DEF="humanUV" point=""/>
</IndexedFaceSet>
</Shape>
<Script DEF="HumanGenerator" directOutput="true">
<field name="params" type="SFString" accessType="inputOnly"/>
<field name="coord" type="SFNode" accessType="initializeOnly">
<Coordinate USE="humanCoord"/>
</field>
<field name="ifs" type="SFNode" accessType="initializeOnly">
<IndexedFaceSet USE="humanIFS"/>
</field>
<field name="normal" type="SFNode" accessType="initializeOnly">
<Normal USE="humanNormal"/>
</field>
<field name="uv" type="SFNode" accessType="initializeOnly">
<TextureCoordinate USE="humanUV"/>
</field>
<field name="update" type="SFBool" accessType="inputOnly"/>
<![CDATA[ecmascript:
// Handler names match the inputOnly field names ("params", "update"),
// following the X3D ECMAScript binding for Script nodes.
var vertices = new Float32Array(0);   // flattened x,y,z triples
var indices  = new Int32Array(0);     // coordIndex values incl. -1 face terminators
function initialize() {
  // bootstrap default geometry here (e.g., load base mesh buffers)
}
function params(p) {
  var config = JSON.parse(p);   // SFString payload carrying the parameter set
  computeHuman(config);         // recompute geometry with config
  pushBuffers();
}
function update(flag) {
  if (flag) { pushBuffers(); }
}
function computeHuman(config) {
  // TODO: implement param->geometry (fill vertices/indices,
  // e.g., base + weighted morph deltas)
}
function pushBuffers() {
  // Sketch only: build MF fields element by element per the ECMAScript SAI;
  // some engines also accept plain arrays, but that is not assumed here.
  var points = new MFVec3f();
  for (var i = 0; i < vertices.length; i += 3) {
    points[i / 3] = new SFVec3f(vertices[i], vertices[i + 1], vertices[i + 2]);
  }
  coord.point = points;
  var ci = new MFInt32();
  for (var j = 0; j < indices.length; j++) { ci[j] = indices[j]; }
  ifs.coordIndex = ci;
  // normal.vector and uv.point can be filled the same way when available
}
]]>
</Script>
</Scene>
</X3D>
</code></pre><hr><h2>Phase 3 — Porting the core logic</h2><ul>
<li><strong>Parameter schema translation:</strong>
<ul>
<li><strong>Constraints:</strong> Preserve ranges, dependencies, and anatomical constraints.</li>
<li><strong>Units:</strong> Keep consistent scaling; define meters as the base unit for WebGL.</li>
</ul></li>
<li><strong>Topology and indices:</strong>
<ul>
<li><strong>Fixed topology:</strong> Port static index arrays as JSON or binary blobs.</li>
<li><strong>Variants:</strong> If topology varies by gender/age, load the appropriate set at init.</li>
</ul></li>
<li><strong>Morph targets:</strong>
<ul>
<li><strong>Data format:</strong> Store each slider’s vertex delta as a Float32Array aligned to the topology.</li>
<li><strong>Application:</strong> In JS, compute vertices as base + Σ(weight_i × delta_i).</li>
</ul></li>
<li><strong>Normals and UV:</strong>
<ul>
<li><strong>Normals:</strong> Start by computing per-face normals; then switch to per-vertex normals with area/angle weighting (a sketch follows this list).</li>
<li><strong>UV:</strong> Use the base UV layout; ensure no seams break under morphs.</li>
</ul></li>
<li><strong>Blender/MH API replacement:</strong>
<ul>
<li><strong>Mesh ops:</strong> Replace bpy functions with your own array operations.</li>
<li><strong>Materials:</strong> Map materials to X3D Material/ImageTexture; bake procedural textures as images if needed.</li>
</ul></li>
<li><strong>Serialization pipeline:</strong>
<ul>
<li><strong>Assets:</strong> Host the base mesh and morphs as .bin (ArrayBuffer) or .json; fetch via HTTP.</li>
<li><strong>Caching:</strong> Use IndexedDB for offline caching of large arrays.</li>
</ul></li>
</ul>
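<p>A minimal sketch of the per-vertex normal computation mentioned above, assuming a fully triangulated index array (no -1 face terminators) and simple area weighting via the unnormalized cross product; angle weighting would be an extension:</p><pre><code>// vertices: Float32Array length 3*N, indices: Uint32Array length 3*M (triangles)
function computeVertexNormals(vertices, indices) {
  const normals = new Float32Array(vertices.length);
  for (let t = 0; t < indices.length; t += 3) {
    const a = indices[t] * 3, b = indices[t + 1] * 3, c = indices[t + 2] * 3;
    // edge vectors from vertex a
    const ux = vertices[b] - vertices[a], uy = vertices[b + 1] - vertices[a + 1], uz = vertices[b + 2] - vertices[a + 2];
    const vx = vertices[c] - vertices[a], vy = vertices[c + 1] - vertices[a + 1], vz = vertices[c + 2] - vertices[a + 2];
    // face normal; its length is proportional to triangle area (area weighting)
    const nx = uy * vz - uz * vy, ny = uz * vx - ux * vz, nz = ux * vy - uy * vx;
    for (const i of [a, b, c]) { normals[i] += nx; normals[i + 1] += ny; normals[i + 2] += nz; }
  }
  // normalize the accumulated normals per vertex
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len; normals[i + 1] /= len; normals[i + 2] /= len;
  }
  return normals;
}
</code></pre>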
<p>Minimal JS compute example (morph application):</p><pre><code>// base: Float32Array of length 3*N (flattened x,y,z)
// deltas: Array of K Float32Arrays, each of length 3*N
// weights: Array of K numbers (one weight per delta)
function applyMorphs(base, deltas, weights) {
const out = new Float32Array(base.length);
out.set(base);
for (let k = 0; k < deltas.length; k++) {
const w = weights[k];
if (!w) continue;
const d = deltas[k];
for (let i = 0; i < out.length; i++) out[i] += w * d[i];
}
return out;
}
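// Usage sketch (hypothetical sizes): apply two morph targets to a tiny base mesh.
// The resulting Float32Array can then be pushed into the Coordinate node's point
// field by the Script node's pushBuffers step.
const base = new Float32Array([0, 0, 0,  1, 0, 0,  0, 1, 0]);     // 3 vertices
const deltas = [
  new Float32Array([0, 0, 0.1,  0, 0, 0.1,  0, 0, 0.1]),          // "thickness" morph
  new Float32Array([0, 0.2, 0,  0, 0.2, 0,  0, 0.2, 0])           // "height" morph
];
const weights = [0.5, 1.0];
const morphed = applyMorphs(base, deltas, weights);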
</code></pre><hr><h2>Phase 4 — Performance and UX</h2><ul>
<li><strong>Typed arrays and transfer:</strong>
<ul>
<li><strong>Buffers:</strong> Use Float32Array/Uint32Array; avoid frequent reallocations.</li>
<li><strong>Worker:</strong> Move heavy morph computations to Web Workers and post ArrayBuffers as transferables (a minimal worker sketch appears at the end of this phase).</li>
</ul></li>
<li><strong>Incremental updates:</strong>
<ul>
<li><strong>Slider changes:</strong> Recompute only affected regions if your morph system supports locality; otherwise full recompute.</li>
</ul></li>
<li><strong>LOD and decimation:</strong>
<ul>
<li><strong>Multiple LODs:</strong> Offer low/high topology; switch based on device performance.</li>
</ul></li>
<li><strong>GPU considerations:</strong>
<ul>
<li><strong>Normals on GPU:</strong> Optionally compute normals in a vertex shader from morphs if the engine allows custom shaders.</li>
</ul></li>
<li><strong>UI integration:</strong>
<ul>
<li><strong>Binding:</strong> Bind sliders to Script params via routes/events; throttle updates.</li>
</ul></li>
</ul>
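<p>A minimal sketch of the Web Worker hand-off suggested above; the worker file name and message shape are illustrative only, not an existing API of any Web3D engine:</p><pre><code>// main thread: offload morph blending, receiving the result buffer back
const morphWorker = new Worker("morph-worker.js");   // hypothetical file name
function requestMorph(base, deltas, weights, onDone) {
  morphWorker.onmessage = (e) => onDone(new Float32Array(e.data));
  // Inputs are structured-cloned; only the output is transferred back.
  morphWorker.postMessage({ base, deltas, weights });
}

// morph-worker.js: compute, then transfer the ArrayBuffer (zero-copy hand-back)
onmessage = (e) => {
  const { base, deltas, weights } = e.data;
  const out = new Float32Array(base.length);
  out.set(base);
  for (let k = 0; k < deltas.length; k++) {
    const w = weights[k];
    if (!w) continue;
    const d = deltas[k];
    for (let i = 0; i < out.length; i++) out[i] += w * d[i];
  }
  postMessage(out.buffer, [out.buffer]);   // transfer, not copy
};
</code></pre>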
<hr><h2>Phase 5 — Testing, validation, and roadmap</h2><ul>
<li><strong>Correctness:</strong>
<ul>
<li><strong>Snapshot tests:</strong> Compare JS vertices against Python outputs for a set of parameter vectors.</li>
<li><strong>Visual checks:</strong> Side-by-side renders to catch UV stretching or normal artifacts.</li>
</ul></li>
<li><strong>Numerical stability:</strong>
<ul>
<li><strong>Constraints:</strong> Clamp extreme combinations; add soft regularization to avoid self-intersections.</li>
</ul></li>
<li><strong>Interoperability:</strong>
<ul>
<li><strong>Export:</strong> Provide GLB/OBJ export for downstream verification.</li>
</ul></li>
<li><strong>Future features:</strong>
<ul>
<li><strong>Rigging:</strong> Port armature and skin weights; map to X3D HAnim or custom animation.</li>
<li><strong>Textures:</strong> Parameterized texture generation or material variants.</li>
<li><strong>Persistence:</strong> Save parameter sets; reproducible IDs for models.</li>
</ul></li>
</ul><hr><h3>Suggested next steps</h3><ul>
<li><strong>Define your parameter JSON</strong> with ranges and a small test suite of 10 preset bodies (a hedged sketch follows this list).</li>
<li><strong>Extract and freeze the base topology and 5–10 key morphs</strong> from MPFB to validate the pipeline.</li>
<li><strong>Pick X3DOM or X_ITE</strong> and wire the Script node to push buffers into IndexedFaceSet.</li>
<li><strong>Run cross-checks</strong>: emit vertices from both Python and JS for identical parameters; assert max deviation under a small epsilon.</li>
</ul>
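<p>A hedged sketch of the first and last of these steps: an illustrative parameter schema (slider names and ranges are invented for the example, not taken from MPFB) and the epsilon cross-check:</p><pre><code>// Illustrative parameter schema: each slider has a range and a default.
const paramSchema = {
  height: { min: 1.4, max: 2.1, default: 1.75 },   // meters
  weight: { min: 0.0, max: 1.0, default: 0.5 },    // normalized slider
  muscle: { min: 0.0, max: 1.0, default: 0.5 }
};

// Clamp an incoming parameter set to the schema's ranges, filling defaults.
function clampParams(params) {
  const out = {};
  for (const [name, spec] of Object.entries(paramSchema)) {
    const v = params[name] !== undefined ? params[name] : spec.default;
    out[name] = Math.min(spec.max, Math.max(spec.min, v));
  }
  return out;
}

// Cross-check: maximum absolute deviation between Python and JS vertex outputs.
function maxDeviation(a, b) {
  let max = 0;
  for (let i = 0; i < a.length; i++) max = Math.max(max, Math.abs(a[i] - b[i]));
  return max;   // assert max < 1e-5, or a tolerance appropriate to the data
}
</code></pre>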
<p>If you share a snippet of the MPFB functions that generate vertices/morphs, I can map them one-to-one into the Script node compute stubs and propose exact data formats for the assets.</p></div><div>/Copilot </div></div>
_______________________________________________<br>
x3d-public mailing list<br>
<a href="mailto:x3d-public@web3d.org" target="_blank">x3d-public@web3d.org</a><br>
<a href="http://web3d.org/mailman/listinfo/x3d-public_web3d.org" rel="noreferrer" target="_blank">http://web3d.org/mailman/listinfo/x3d-public_web3d.org</a><br>
</blockquote></div></div>
</blockquote></div>