[semantics-public] RDF embeddings for X3D

John Carlson yottzumm at gmail.com
Wed May 25 20:32:51 PDT 2022


People wanting to do object2vec in Python should probably view this video:
AWS Partner Webinar: Object2Vec on Amazon SageMaker - YouTube
<https://www.youtube.com/watch?v=ggVWnnRXtYc>

I just want to make Parametric Geometry Modelling (beyond NURBS) more
prominent in X3D.  Yes, I am aware of FVRML and FX3D, and of some of
Sonja Gorjanc's work in Mathematica:  https://www.grad.hr/sgorjanc/

Enjoy!

John


On Wed, May 25, 2022 at 10:11 PM John Carlson <yottzumm at gmail.com> wrote:

> Indeed, RDF2Vec might be of interest to people here who are interested in
> data mining RDF documents (and Vince's embeddings).
>
> RDF2Vec: RDF Graph Embeddings for Data Mining | SpringerLink
> <https://link.springer.com/chapter/10.1007/978-3-319-46523-4_30>
>
> I have not read or downloaded the article.
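>
> The idea behind RDF2Vec, as I understand it, is to turn graph neighborhoods
> into "sentences" by taking random walks over the RDF graph and then training
> ordinary word2vec on those walk sequences.  Below is a minimal Python sketch
> of that idea using rdflib and gensim; it is not the paper's exact algorithm,
> and the input file name x3d-metadata.ttl is only a placeholder:
>
>     import random
>     from rdflib import Graph
>     from gensim.models import Word2Vec
>
>     g = Graph()
>     g.parse("x3d-metadata.ttl")          # hypothetical input file
>
>     def random_walk(graph, start, depth=4):
>         # Follow outgoing edges from `start`, recording node and predicate labels.
>         walk = [str(start)]
>         node = start
>         for _ in range(depth):
>             hops = list(graph.predicate_objects(subject=node))
>             if not hops:
>                 break
>             pred, obj = random.choice(hops)
>             walk.extend([str(pred), str(obj)])
>             node = obj
>         return walk
>
>     # One walk per subject; RDF2Vec proper extracts many walks per entity.
>     entities = list(set(g.subjects()))
>     walks = [random_walk(g, e) for e in entities]
>
>     # Treat each walk as a "sentence" and train skip-gram word2vec on them.
>     model = Word2Vec(walks, vector_size=64, window=5, min_count=1, sg=1, epochs=20)
>     print(model.wv[str(entities[0])])    # learned vector for one entity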
>
> On Wed, May 25, 2022 at 9:25 PM John Carlson <yottzumm at gmail.com> wrote:
>
>> Vince, embedding is a technical term from machine learning for taking
>> large amounts of text (word2vec) or objects (Amazon SageMaker's object2vec)
>> and converting word or object adjacency (in spatial or temporal terms) into
>> vectors.  This allows words (or things that otherwise can't be reduced to
>> scalar or matrix values) to be treated as vectors rather than manipulated
>> symbolically.  One can even do vector arithmetic on words, and it works!
>> Obviously, words farther from the central word should probably carry a
>> lower weight during embedding.
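>>
>> As a concrete illustration of the vector arithmetic, here is a small Python
>> sketch using gensim's pretrained GloVe vectors (GloVe rather than word2vec
>> proper, but the same kind of embedding; the first call downloads the model):
>>
>>     # Classic embedding arithmetic with pretrained vectors.
>>     import gensim.downloader as api
>>
>>     wv = api.load("glove-wiki-gigaword-50")
>>
>>     # "king" - "man" + "woman" lands near "queen" in the vector space.
>>     print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
>>
>>     # Every word is now just a dense vector you can compute with.
>>     print(wv["geometry"].shape)          # (50,)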
>>
>> There are many …2vec projects.
>>
>> For example, once the embedding is trained, one might use the six words
>> surrounding a position throughout a document to predict which word comes
>> next.
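>>
>> A toy gensim sketch of that context-window idea, purely to show the
>> mechanics (a real model needs a large corpus); window=3 means up to three
>> words on each side, i.e. six surrounding words:
>>
>>     from gensim.models import Word2Vec
>>
>>     sentences = [
>>         ["the", "indexed", "face", "set", "defines", "geometry"],
>>         ["the", "nurbs", "patch", "surface", "defines", "geometry"],
>>         ["the", "extrusion", "node", "defines", "geometry"],
>>     ]
>>     model = Word2Vec(sentences, vector_size=16, window=3, min_count=1,
>>                      sg=0, negative=5, epochs=200, seed=1)
>>
>>     # predict_output_word requires a model trained with negative sampling.
>>     print(model.predict_output_word(["the", "defines", "geometry"], topn=3))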
>>
>> One needs a vector space to compute gradient descent during neural network
>> backpropagation, since gradients cannot be computed on written words.  I
>> believe the embedding provides such a space, too.
>>
>> In other words, embedding means converting data into vectors so things
>> like TensorFlow will work!
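>>
>> A minimal Keras sketch of that point, assuming TensorFlow is installed: an
>> Embedding layer maps integer token ids to trainable dense vectors, which
>> gives backpropagation a space to work in (the data below is random, only
>> there to show the shapes involved):
>>
>>     import numpy as np
>>     import tensorflow as tf
>>
>>     vocab_size, dims = 1000, 16
>>     model = tf.keras.Sequential([
>>         tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=dims),
>>         tf.keras.layers.GlobalAveragePooling1D(),
>>         tf.keras.layers.Dense(1, activation="sigmoid"),
>>     ])
>>     model.compile(optimizer="adam", loss="binary_crossentropy")
>>
>>     # Fake token-id sequences and labels, just to exercise backpropagation.
>>     x = np.random.randint(0, vocab_size, size=(32, 10))
>>     y = np.random.randint(0, 2, size=(32, 1))
>>     model.fit(x, y, epochs=1, verbose=0)
>>
>>     # The learned embedding matrix: one dims-length vector per token id.
>>     print(model.layers[0].get_weights()[0].shape)   # (1000, 16)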
>>
>> The old story used to be that a book could be converted into a single
>> vector in a very large space.  Embedding takes that huge space, reduces
>> the number of dimensions, and increases the number of vectors.
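>>
>> An older, LSA-style sketch of that "huge space to few dimensions" step,
>> using scikit-learn (not a neural embedding, but the same reduction idea;
>> the toy documents are placeholders):
>>
>>     from sklearn.feature_extraction.text import CountVectorizer
>>     from sklearn.decomposition import TruncatedSVD
>>
>>     docs = [
>>         "indexed face set geometry in x3d",
>>         "nurbs patch surface geometry in x3d",
>>         "metadata nodes carry xmp packets",
>>     ]
>>     bow = CountVectorizer().fit_transform(docs)
>>     print(bow.shape)                 # (3, vocabulary size): the huge space
>>
>>     reduced = TruncatedSVD(n_components=2).fit_transform(bow)
>>     print(reduced.shape)             # (3, 2): far fewer dimensions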
>>
>> Hope this helps, and that this will eventually become another tool for 3D
>> practitioners.
>>
>> John
>>
>> On Wed, May 25, 2022 at 8:52 PM vmarchetti at kshell.com <
>> vmarchetti at kshell.com> wrote:
>>
>>> I do not know what capabilities Word2vec or SageMaker offer.
>>>
>>> I have been working on embedding XMP metadata packets in X3D Metadata
>>> nodes. XMP uses a subset of RDF. The conversions from XMP XML packets to
>>> and from X3D Metadata (XML Encoding) are implemented in XSLT stylesheets
>>> in the GitHub project https://github.com/vincentmarchetti/x3d-xslt-tools.
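>>>
>>> For anyone who wants to try those conversions from Python, a hedged sketch
>>> of applying an XSLT stylesheet with lxml follows; the file names here are
>>> placeholders, not the actual names used in the x3d-xslt-tools repository:
>>>
>>>     from lxml import etree
>>>
>>>     transform = etree.XSLT(etree.parse("xmp-to-x3d-metadata.xslt"))
>>>     xmp_packet = etree.parse("example-xmp-packet.xml")
>>>
>>>     x3d_metadata = transform(xmp_packet)
>>>     print(etree.tostring(x3d_metadata, pretty_print=True).decode())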
>>>
>>> This work is in support of the ISO JWG-16 STEP Geometry Services
>>> project, which has adopted XMP as a schema for metadata to describe CAD web
>>> services. It may also be useful for embedding XMP in X3D more generally.
>>>
>>> Vince Marchetti
>>>
>>> > On May 25, 2022, at 6:18 PM, John Carlson <yottzumm at gmail.com> wrote:
>>> >
>>> > Does anyone have RDF embeddings for X3D, like word embeddings for
>>> Word2vec or object embeddings for SageMaker?
>>> >
>>> > Thanks for info!
>>> >
>>> > John
>>> > --
>>> > semantics-public mailing list
>>> > semantics-public at web3d.org
>>> > http://web3d.org/mailman/listinfo/semantics-public_web3d.org
>>>
>>>

