[x3d-public] How does one create emotion in a virtual world
John Carlson
yottzumm at gmail.com
Sun Jun 7 20:17:10 PDT 2020
Remember I said I was getting into procedural generation again? Well, this
time it’s the generation of emotion. One can create emotion through
music, words, action, and situation or context. Our faces conform to the
emotion we are feeling. In HAnim2 we have mainly focused on action or
animation, without considering the emotion that generated the action. If
I want my avatar in a 3D world to express an emotion, currently I pick from
a list of actions.
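As a minimal sketch of what "picking from a list of actions" looks like in code: the mapping below from emotion labels to animation clip names is purely illustrative (the clip names and the `action_for` helper are my invention, not part of HAnim or X3D).

```python
# Hypothetical emotion-to-action lookup, as described in the post.
# Clip names are illustrative placeholders, not HAnim standard names.
EMOTION_TO_ACTION = {
    "joy":      "Smile_Wave",
    "sadness":  "SlumpShoulders",
    "anger":    "ClenchFists",
    "surprise": "RaiseEyebrows",
}

def action_for(emotion: str) -> str:
    """Return an animation clip name for an emotion label, defaulting to idle."""
    return EMOTION_TO_ACTION.get(emotion, "Idle")

print(action_for("joy"))      # Smile_Wave
print(action_for("boredom"))  # Idle (unlisted emotions fall back)
```

The limitation the post is pointing at is visible here: the table is fixed, so the avatar can only express emotions someone enumerated in advance.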
With AI’s help, we are getting much better at mapping our face onto an
avatar.
How do we bring this important capability into virtual worlds, where it might
be the key to the success of VR/AR/MR/XR?
I’m not really sure it’s just photons and sound.
How does context translate into emotion? Can computers guess how we might
feel?
How does this translate into XML/attachments?
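One possible answer to the XML question, sketched under assumptions: X3D does define `MetadataString` nodes, so an emotion label could ride along as metadata on an `HAnimHumanoid`. The node layout below is illustrative only; field spellings and placement should be checked against the X3D specification.

```python
# Sketch: attaching an emotion label to an HAnimHumanoid as X3D metadata.
# Uses only the standard library; the resulting XML is a hypothetical fragment.
import xml.etree.ElementTree as ET

humanoid = ET.Element("HAnimHumanoid", {"DEF": "Avatar", "name": "Avatar"})
ET.SubElement(humanoid, "MetadataString",
              {"name": "emotion", "value": '"joy"'})

print(ET.tostring(humanoid, encoding="unicode"))
```

A viewer or script that understands the convention could then read the metadata and pick or blend animations accordingly, instead of the emotion living only in the author's head.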
John