[Korea-chapter] [X3D] Agenda items for the Korea chapter meeting (Wed. 5:10pm PST / Thu 10:10am Korea)

Don Brutzman brutzman at nps.edu
Mon Nov 16 18:24:31 PST 2009


Meeting minutes 16 NOV 2009.

Attendees: Myeong-Won Lee, Kwan Hee Yoo, Gun Lee, Dick Puk, Don Brutzman

One more discussion item:  we are finalizing updates to the X3D Showcase
this week.  If there are any more code or example submissions for inclusion
in the Showcase, please let me know.

http://www.web3d.org/pipermail/x3d-public_web3d.org/2009-October/000418.html

Myeong Won Lee wrote:
> The following are the agenda items for this week's Korea chapter 
> meeting, [...]. Please provide updates if any amendments or additions are noted.
>  
> 1. Projective Texture Mapping (Kwan Hee Yoo, Chungbuk National U.)
>  >First, it is very important that these documents get posted to the 
> website.
>  >
> ...
>  >2 questions:  do your three proposed texture nodes work similarly to the
>  >other X3D texture nodes with scope restricted to the geometry in the
>  >same Shape node?  For example:
>  >
>  >   Shape
>  >     Geometry node
>  >     Appearance
>  >       Material
>  >       Texture node
>  >
>  >Hopefully the DEF and USE functionality is the same, for example
>  >the same OrthoTexture might have a single DEF and 2 USE instances
>  >to apply to the geometry found in 3 different Shape nodes.
>  >
>  >(I think the answers are yes, but will let Kwanhee Yoo answer by email.)

Kwan Hee described his thinking; here is his email of 8/9 November.

========================
Kwanhee Yoo wrote:
> Dear Prof. Don.
>
> Thank you for your interest in projective texture mapping.
>
> Your question about how projective texture mapping (PTM) is applied within
> the scope of given objects is very important.
>
> I totally agree with your opinion. In a hierarchical scene graph, I think
> that users should be able to apply PTM to only the desired objects, so PTM
> should be specified with respect to those objects as you described. The
> solution is to specify PTM as an Appearance component of the node describing
> the desired objects, as you suggested. Clearly, DEF and USE can be used for
> PTM.
>
> Thanks.
> Kwan-Hee Yoo
========================

We discussed scoping and other issues.  They are recorded and updated
on the member wiki at
http://www.web3d.org/membership/login/memberwiki/index.php/Projective_Texture_Mapping_Proposal#Issues

Today's updates:

_Key Points_

    *  This work also has potential application for X3D Earth and the X3D
Geospatial component, since aerial imagery might be projected more effectively
onto terrain elevation data.

_Issues_
    *  Has the X3D Medical Group reviewed this work? What are their comments?
          o No review yet... the work needs to be submitted to the Medical
working group mailing list. The email thread can then be linked here from the
hypermail archive. (Please cc: Dick Puk to ensure that the mail is successfully
received.)

    * Have any X3D browsers implemented something like this work?
          o Kwan Hee is investigating what X3D browser might be good for an
initial implementation. He already has some example code written in C++.

    * What about scoping? Might a PTM apply to all geometry in a scene? Do we
need a 'global' field to limit the geometry which might be affected?
          o Because a PTM is specified within a Shape node's Appearance node,
and that Appearance applies only to the geometry in the same Shape node,
scoping is already part of the design. A PTM can be applied to multiple pieces
of geometry (for example, 4 walls + ceiling + floor of a room) by DEF/USE of
the same PTM in each of the 6 Shape nodes for the room.

    * Example X3D scene(s) will be helpful; a minimal sketch follows below
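
A minimal sketch of the DEF/USE scoping described above, reusing the
illustrative node name "OrthoTexture" from the email thread; the node name and
its fields are placeholders for the proposal, not final definitions:

    <!-- one projective texture, DEFed in the first Shape and USEd in the rest -->
    <Shape>
      <Appearance>
        <Material/>
        <OrthoTexture DEF='RoomProjector' url='"slideImage.png"'/>
      </Appearance>
      <Box size='4 3 0.1'/>  <!-- wall 1 -->
    </Shape>
    <Shape>
      <Appearance>
        <Material/>
        <OrthoTexture USE='RoomProjector'/>  <!-- same PTM reused for wall 2 -->
      </Appearance>
      <Box size='4 3 0.1'/>
    </Shape>

Further walls, the ceiling and the floor would each add another Shape
containing <OrthoTexture USE='RoomProjector'/>, so a single DEF can cover all
6 Shape nodes of a room.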

> 2. Mixed reality functions (Gun Lee, ETRI)
> 
>  >> Gun has also defined the mixed reality functions as new nodes.
>  >> It is necessary to review this and decide if these node definitions are
>  >> appropriate for including the new functions in X3D.
>  >
>  >Unfortunately Gun has not yet received comments about these nodes.
>  >He would like to begin implementing them, likely using CyberX3D for C++.
>  >http://www.cybergarage.org/vrml/cx3d/cx3dcc/index.html
>  >He will contact the owner, Satoshi Konno in Tokyo.

Gun confirmed that he will work with Satoshi Konno.  This will be the
initial implementation.  It is OK for an example implementation to come
from an untested non-Web3D codebase.  However, a second implementation
will be needed at some point, and ultimately (once accepted) we hope to
have many implementations.

CyberX3D is open source C++.

>  >Johannes and Yvonne, Fraunhofer has some similar nodes.  Have you
>  >looked at this?  Should there be a conference to discuss the
>  >nodes before he implements?  Please advise what you think.
>  >
>  >Feedback from other browser implementers is welcome also.  Please advise.

No feedback has been provided yet - please help, Johannes and Yvonne (cc:ed).
Gun Lee has read your paper on this topic closely.  We could not find links
to it on your website.

Justin:  has Yumetech done work in this area?

Nick and Pablo:  has the user interface (UI) working group, or anyone else
in your lab, worked on MR devices?  Gun Lee is keen to learn about any other
device work.

John Stewart, might there be some appropriate work done in FreeWRL?

Gun Lee, please keep the wiki page updated with text and links to
these items.
http://www.web3d.org/membership/login/memberwiki/index.php/Supporting_Mixed_Reality_Visualization_in_X3D_Standards

> 3. Physical units specification (Myeong Won Lee, The University of Suwon)
> 
> The unit browser is being modified from the previous Physical node 
> definition to the UNIT statements.  There were conflicting opinions in 
> the X3D WG email discussion.  We must decide between the following 
> alternatives when determining how the length units should be implemented:
> 
> (1) Length units are defined but do not affect the visual scene at all; 
> no scaling is applied.  They convey length unit information only.  In this 
> case, we cannot compare real-world lengths when two separately designed X3D 
> objects are read into a scene using an Inline node or by reading two X3D 
> files together in a scene.  The resulting scene is no different from one 
> with no unit definition at all.
> 
> (2) Length units are defined and do not scale objects when reading a 
> single X3D file, but relative scaling is applied when two unit-specified 
> X3D files are read together into a scene using Inline nodes or by importing 
> another unit-specified X3D file.  We must not scale objects when reading a 
> single X3D file, because very large and very small objects, such as those 
> of astronomical or microorganism size, could not be viewed once scaled.  It 
> is desirable to use the same units across X3D files when relative scales 
> are necessary.
>     
> I prefer the latter case (2).

We had a discussion of the issues on the mailing list.
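
For reference, here is a minimal sketch of how a length unit declaration might
appear in the X3D XML encoding, assuming the UNIT statement form being drafted
for X3D V3.3; the file name and values are illustrative only:

    <!-- roomInFeet.x3d : all lengths in this file are expressed in feet;
         conversionFactor gives the number of metres per declared unit -->
    <X3D profile='Interactive' version='3.3'>
      <head>
        <unit category='length' name='foot' conversionFactor='0.3048'/>
      </head>
      <Scene>
        <Shape>
          <Box size='10 8 12'/>  <!-- 10 ft x 8 ft x 12 ft under this declaration -->
          <Appearance><Material/></Appearance>
        </Shape>
      </Scene>
    </X3D>

Under case (1) the declaration is informational only and the rendered scene is
unchanged; under case (2) a browser that Inlines this file into a metre-based
scene would apply the relative factor 0.3048 so that both files share
consistent real-world lengths.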

Myeong Won, I am hoping that we might spend an hour or 2 working on the
X3D V3.3 schema together during my visit.

http://www.web3d.org/membership/login/memberwiki/index.php/X3D_Physical_Units_Proposal

> 5.  H-Anim motion data definition (Myeong Won Lee, The University of Suwon)
> 
> We define a Motion Capture node for H-Anim figures in order to transfer 
> motion data for moving H-Anim figures.  It provides an interface between 
> H-Anim data and motion capture data.  An H-Anim viewer has been developed 
> and continues to be updated.  A summarized presentation file will be sent 
> to the X3D and H-Anim WGs soon.

Unfortunately the H-Anim group has been fairly inactive, but perhaps
some interest can be renewed.
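
As a purely hypothetical sketch of how the interface described above might be
used in a scene (the node name "MotionCapture" and all of its fields are
placeholders for illustration, not the actual proposal):

    <!-- hypothetical: stream one captured joint channel into an H-Anim joint -->
    <HAnimJoint DEF='hanim_l_shoulder' name='l_shoulder'/>
    <MotionCapture DEF='WalkClip' url='"walk.bvh"' loop='true'/>
    <ROUTE fromNode='WalkClip' fromField='l_shoulder_rotation_changed'
           toNode='hanim_l_shoulder' toField='set_rotation'/>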

When this is farther along, it would be helpful to also update the wiki
page at
http://www.web3d.org/membership/login/memberwiki/index.php/H-Anim_Motion_Data_Definition

> 6.  Mobile X3D functions (Myeong Won Lee, The University of Suwon)
> 
> A mobile X3D viewer is being developed according to the X3D Interactive 
> profile.  As presented at the SC24 London meeting, the progressive mesh 
> function is necessary for the server-side program, but it seems 
> unnecessary for the client-side viewer.  However, GPS functionality 
> needs to be included in mobile X3D.  Therefore, we propose a GPS node 
> for mobile X3D.

A GPS node is certainly interesting.  Please post it to the wiki or
the Web3D Korea Forum document directory.  We will be glad to discuss.

A GPS node will be of most interest to the X3D Earth working group, so
we can discuss and develop it on that mailing list.
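
As a purely hypothetical sketch of how such a node might be wired to the
existing Geospatial component (the node name "GPS" and all of its fields are
placeholders for illustration, not the Korea chapter proposal itself):

    <!-- hypothetical GPS sensor driving an existing GeoLocation node -->
    <GPS DEF='Receiver' enabled='true' updateInterval='1.0'/>
    <GeoLocation DEF='UserLocation'>
      <Shape>
        <Sphere radius='0.5'/>
        <Appearance><Material/></Appearance>
      </Shape>
    </GeoLocation>
    <ROUTE fromNode='Receiver' fromField='geoCoords_changed'
           toNode='UserLocation' toField='set_geoCoords'/>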

> 7. Web3D Korea Workshop, November 28, Seoul, Korea
> 
> We will have a Web 3D Standardization Workshop at Ewha Womans 
> University, which is the venue of the conference held twice annually by 
> the Korean Institute of Information Scientists and Engineers. Thirteen 
> standardization items will be presented in the areas of X3D, CAD, 
> education, medicine, GIS, image, mobile, etc.

Very interesting.  We hope that you have a good session.  Please feel free
to post a summary if you like.

> 8. Web3D Korea Chapter Seoul meeting, December 7-8, 2009, Seoul, Korea
> I will prepare the program and inform everyone.

Excellent!  I am really looking forward to it.  Dick wishes that he could go!

One new suggested agenda item:  plan for SIGGRAPH ASIA 2010, to be held in
Seoul, Korea, 7-10 December.  I recommend that we plan new-year goals and
future activities.

http://www.siggraph.org/asia2009/ (for Yokohama in a few weeks)

Another possible agenda item:  if you have a student or member who would
like to work on Korean-language X3D Tooltips, I would be happy to talk
about it then.
http://www.web3d.org/x3d/content/X3dTooltips.html

> 9. Our next Korea chapter meeting is planned for:
> 
> November 25 (Wed) 5:10pm (PST) / November 26 (Thu) 10:10am (Korea), OR
> December 2 (Wed) 5:10pm (PST) / December 3 (Thu) 10:10am (Korea)

It does seem like we are fully prepared for the 7-8 December meeting in
Seoul, so it looks like we don't need another phone meeting before then.
We can plan our next teleconference during that meeting - it will likely
be in January, since many people go on vacation in mid-December.

Thanks everyone for another productive teleconference today.
Comments, questions and improvements are welcome.

all the best, Don
-- 
Don Brutzman  Naval Postgraduate School, Code USW/Br           brutzman at nps.edu
Watkins 270   MOVES Institute, Monterey CA 93943-5000 USA  work +1.831.656.2149
X3D, virtual worlds, underwater robots, XMSF  http://web.nps.navy.mil/~brutzman



