[x3d-public] Netw. Sens. Protocol - WAS: 3d-publicDigest, Vol120, Issue 34

Joseph D Williams joedwil at earthlink.net
Sat Mar 16 01:37:18 PDT 2019


➢ But I do NOT understand what this means with respect to the RT capability of the network protocols.
Hi Christoph, maybe nothing; it's just a statement about how the ‘internal’ SAI responds to ‘internal’ events and what should happen as events from ‘external’ are converted to ‘internal’ events. Your representation of the scene, meaning the one on your local platform, gets to decide the time stamp of the one or more external events once the external source tells you that the group of events to be processed together is complete (beginUpdate/endUpdate).
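
A minimal, self-contained sketch of that grouping idea (hypothetical names, not a real SAI binding): events sent inside beginUpdate()/endUpdate() are held back and then delivered together, so the receiving scene can give the whole group one locally assigned time stamp. Everything here is illustrative Python, not code from any browser.

import time

class ExternalSAIStub:
    """Toy stand-in for an external SAI connection (names are assumptions)."""
    def __init__(self):
        self._buffer = None                 # None = not inside an update block

    def beginUpdate(self):
        self._buffer = []                   # start collecting external events

    def sendEvent(self, field, value):
        if self._buffer is not None:
            self._buffer.append((field, value))   # held until endUpdate()
        else:
            self._deliver([(field, value)])       # stamped individually

    def endUpdate(self):
        events, self._buffer = self._buffer, None
        self._deliver(events)               # released as one group

    def _deliver(self, events):
        stamp = time.time()                 # sink-side local time, one value
        for field, value in events:         # shared by the whole group
            print(stamp, field, value)

sai = ExternalSAIStub()
sai.beginUpdate()
sai.sendEvent("Box.translation", (1.0, 0.0, 0.0))
sai.sendEvent("Box.scale", (2.0, 2.0, 2.0))
sai.endUpdate()                             # both events get the same stamp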


This must be the tough part, where a scene has multiple inputs: your scene, which is sharing some state with other scenes, as well as your ‘external’ SAI traffic. Some of it goes to the NetworkSensor as just another node; the SAI accepts external events as usual, then routes some to NetworkSensor(s) and maybe some to other nodes. The NetworkSensor accepts/generates only internal SAI events and sends them to the external SAI interface to be sent out. I wasn’t thinking that the NetworkSensor would assign ‘internal’ time stamps.

➢ while the network sensor assigns time stamps based on when the network traffic is received instead of the time stamp of the event cascade into which it releases the event.
I would like the source system to tell me when it thinks the group of events it sent is complete and should all be processed using the same time stamp, which will be assigned by the sink system according to its local time. I was thinking the NetworkSensor would get events from the external interface the same as any other node, so they would already carry an ‘internal’ time stamp. Maybe I'd better read up.
➢ impossible to send traffic over the network, process it remotely and receive it again within the SAME event cascade
That might be a way to demonstrate scene entanglement – synchronization at a distance. Each scene has the same time, yet it is different in each one. I want to nail down the essentials of this because I really may need to know when all networked scenes are synchronized at some common time, or some animation gets started at some time and I need to know it is running everywhere. If your scene sends an event to another, then the sink scene can send back the time stamp of completion.
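
A hypothetical sketch of that "send back the time stamp of completion" idea: the source treats the networked scenes as synchronized only once every sink has reported its own local completion time. The names and data layout are made up for illustration.

def all_scenes_running(sink_replies, expected_sinks):
    """sink_replies maps scene name -> local completion time, or None if not yet reported."""
    if any(sink_replies.get(s) is None for s in expected_sinks):
        return None                       # not yet running everywhere
    return max(sink_replies[s] for s in expected_sinks)   # latest local completion time

replies = {"sceneA": 101.2, "sceneB": 100.9, "sceneC": None}
print(all_scenes_running(replies, ["sceneA", "sceneB", "sceneC"]))   # None
replies["sceneC"] = 101.5
print(all_scenes_running(replies, ["sceneA", "sceneB", "sceneC"]))   # 101.5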

Thanks and Best, 
Joe




From: Christoph Valentin
Sent: Friday, March 15, 2019 1:17 PM
To: Joseph D Williams
Cc: x3d-public at web3d.org
Subject: Aw: RE: [x3d-public] Netw. Sens. Protocol - WAS: 3d-publicDigest,Vol120, Issue 34

Joseph,
 
I understand your explanation, and I see an issue if a scene author relies on the time stamps of one event cascade (one frame) all being the same, while the network sensor assigns time stamps based on when the network traffic is received instead of the time stamp of the event cascade into which it releases the event.
 
But I do NOT understand what this means with respect to the RT capability of the network protocols. The network traffic must be decoupled from the framing anyway, as far as I understand.
 
I think it is impossible to send traffic over the network, process it remotely, and receive it again within the SAME event cascade, anyway.
 
KR,
Christoph
  
Sent: Friday, 15 March 2019 at 18:04
From: "Joseph D Williams" <joedwil at earthlink.net>
To: "Christoph Valentin" <christoph.valentin at gmx.at>, "x3d-public at web3d.org" <x3d-public at web3d.org>
Subject: RE: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest,Vol120, Issue 34
 
• REST is a paradigm (which can, e.g., be achieved using HTTP) which AFAIK does not specify whether the system behaviour will be RT or NRT.
 
The most important item in X3D realtime is the idea that there can exist a ‘cascade’ of related events that are represented as being executed instantaneously, all at the same time, so that the simulation is accurately represented at the specified frame time. The initial event is created with a unique time stamp and may result in a cascade of events having the same time stamp. When all events have executed, the frame for that time can be produced. We don’t want the cascade broken by presenting a frame before all related events have been completed.
When an X3D scene receives an external event, the author of the external event can choose to inform the scene either to execute each event as an internal event with its own honest time stamp as it is received, or to buffer one or more events until they are released together with a single time stamp.
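
A small illustrative sketch of the cascade idea (hypothetical, not taken from any specification text or browser): every event routed as a consequence of the initial event reuses its time stamp, and the frame is produced only after the cascade has drained.

import collections

def run_cascade(initial_event, routes, timestamp):
    """Process one cascade; every derived event keeps the initial time stamp."""
    queue = collections.deque([initial_event])
    processed = []
    while queue:                                  # the frame waits until empty
        name, value = queue.popleft()
        processed.append((timestamp, name, value))
        for target in routes.get(name, []):       # ROUTE fan-out (no loops here)
            queue.append((target, value))
    return processed                              # now the frame can be rendered

routes = {"touchSensor.touchTime": ["timer.startTime"]}
print(run_cascade(("touchSensor.touchTime", 12.5), routes, timestamp=100.0))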
 
Joe
 
 
 
 
 
 
 
 
 
 
 
From: Christoph Valentin
Sent: Friday, March 15, 2019 2:21 AM
To: x3d-public at web3d.org
Subject: Re: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest,Vol120, Issue 34
 
John,
You addressed me directly, therefore I answer to the best of my knowledge. If someone can improve my answer, please do.

REST is a paradigm (which can, e.g., be achieved using HTTP) which AFAIK does not specify whether the system behaviour will be RT or NRT.

Question: Why do you ask for RT?

Sorry, I don't know the differences between WebRTC 1.0 and ORTC. Can someone else help?

All the best
Christoph
--
This message was sent from my Android mobile phone with GMX Mail.
On 15.03.19, 02:32, John Carlson <yottzumm at gmail.com> wrote:
Is REST applicable in a real-time environment?
 
Should we consider ORTC?  Or WebRTC 1.0?
 
John
 
Sent from Mail for Windows 10
 
From: Christoph Valentin
Sent: Thursday, March 14, 2019 7:44 PM
To: Doug Sanden
Cc: X3D Graphics public mailing list
Subject: Re: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest,Vol120, Issue 34
 
>>>>>>>> such as XMPP, SIP, RTP, TCP, UDP, SCTP, MSRP, 3GPP MCX, ............
 
as long as it complies with the decision of the consortium to use WebRTC + AJAX, of course........
  
Sent: Thursday, 14 March 2019 at 20:46
From: "Christoph Valentin" <christoph.valentin at gmx.at>
To: "GPU Group" <gpugroup at gmail.com>
Cc: "X3D Graphics public mailing list" <x3d-public at web3d.org>
Subject: Re: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest, Vol120, Issue 34
Gina-Lauren, Doug,
 
I know I do not know the consortium very well, and I apologize if I mix somebody up, but AFAIK you - Doug - are the "head behind FreeWrl".
 
So if I address you, then I have to talk as if I were talking to a browser creator, true?
 
Why did you decide to "go for VRML/X3D"? I guess it has something to do with belief :-) , but there are reasonable pros and cons, too. I believe in VRML/X3D because it is easy to use and very efficient: just a few lines of definitions in a text file, and you have a box, a sphere, some light sources, and so on.
 
And you don't need to learn a programming language (thanks to the declarative principle, stolen :) from HTML).
 
And it is an ISO standard.
 
So if my browser supports VRML/X3D, then I can promise my users that their work will be preserved for decades. I could even promise them they can leave me without harm, because there are enough other browsers that support the same standard (I won't do that, of course).
 
The meaning of VRML/X3D is to create trust: trust in the future of authoring 3D scenes (which still takes a lot of effort that we do not want to waste).
 
Now coming to the idea of the Hyper Scene and to Gina-Lauren's suggestions. I think the idea of a Hyper Scene that takes care of the networking is a good one
   - as long as the shared state is not too complex (each value has to be transported via the "?")
   - as long as the Hyper Scene sticks to a standard (which would need to be defined, too)
 
If the scene becomes too complex, or if the scene is compiled from different sources, e.g. "module by module" and "model by model", then I would rather "bury" a network sensor within each model, just broadcasting the "network connection" to every model via an SFNode and defining the "streamName" for the network sensor; the rest would be done by the model internally.
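
A rough Python sketch of that per-model pattern (all names here are hypothetical): one shared connection object is handed to every model, the way an SFNode would be, and each model registers its own sensor under a streamName; incoming traffic is routed purely by stream name and handled inside the model.

class SharedConnection:
    def __init__(self):
        self._sensors = {}                        # streamName -> model callback

    def register(self, stream_name, callback):
        self._sensors[stream_name] = callback     # done by each model internally

    def send(self, stream_name, payload):
        print("outgoing:", stream_name, payload)  # would go out over the network

    def receive(self, stream_name, payload):      # called by the network layer
        if stream_name in self._sensors:
            self._sensors[stream_name](payload)   # the model handles the rest

connection = SharedConnection()
connection.register("elevatorModel", lambda p: print("elevator got", p))
connection.register("doorModel", lambda p: print("door got", p))
connection.receive("doorModel", {"open": True})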
 
Now to the misunderstanding between Gina-Lauren and me. I often used the term "network protocol" in the wrong manner. Of course I do NOT mean a "network LAYER protocol", such as IPv4 or IPv6; I mean an "application LAYER protocol".
 
I think this application layer protocol could be sketched with message flow charts and a textual description: which messages are defined, how the messages are combined into procedures, which parameters the messages can carry, and how the messages are routed.
 
This application layer protocol should be open to (m)any transport protocol(s), such as XMPP, SIP, RTP, TCP, UDP, SCTP, MSRP, 3GPP MCX, ............
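
A hypothetical sketch of what such a transport-agnostic message definition could look like: the message and its parameters are fixed by the application layer protocol, while the carrier underneath (WebRTC, TCP, XMPP, ...) is interchangeable. Nothing here is an existing X3D or Web3D API.

import json
from dataclasses import dataclass, asdict

@dataclass
class SharedFieldUpdate:                  # one message type of the protocol
    stream: str                           # which shared stream it belongs to
    field: str                            # which field changed
    value: list                           # the new value
    timestamp: float                      # sender's local time

class Transport:                          # any carrier only has to move bytes
    def send(self, data: bytes): raise NotImplementedError

class LoopbackTransport(Transport):       # stand-in for TCP/WebRTC/XMPP/...
    def send(self, data: bytes): print("on the wire:", data.decode())

def emit(msg: SharedFieldUpdate, transport: Transport):
    transport.send(json.dumps(asdict(msg)).encode())

emit(SharedFieldUpdate("door-1", "rotation", [0, 1, 0, 1.57], 42.0),
     LoopbackTransport())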
 
Such a stable description and the promise to support any transport protocol would create trust in X3D MU.
 
All the Best
Christoph
  
Sent: Thursday, 14 March 2019 at 17:53
From: "GPU Group" <gpugroup at gmail.com>
To: "X3D Graphics public mailing list" <x3d-public at web3d.org>
Subject: Re: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest, Vol120, Issue 34
One very fuzzy idea: HyperScene (HS) 
- one layer above scene
- networking would/could work on this layer, i.e.
Scene -?- HS - network - HS - ? - Scene
- with the ? being maybe EAI/SAI and/or a DISTransform-like node updater
- the LayeringComponent - some have been asking whether it could/should be dropped or fixed, and if it is fixed, maybe it could relate to a layer above the scene as well
HS - Layering - Scene
And maybe 2 scene instances rendering on the same web page, or side-by-side in one desktop browser instance, might be on some layer above the scene, such as Layering or HyperLayering.
But sorry, I can't explain any deeper. It's all fuzzy thinking.
-Doug Sanden
  
On Tue, Mar 12, 2019 at 2:37 PM GL <info at 3dnetproductions.com> wrote:


Christoph,

Personally, I don't think the NS should impose a network protocol standard at all. Instead, it should allow for as much flexibility as possible, remembering that we don't necessarily work within a web browser, and that present and future applications may involve different types of hardware and/or protocols not yet thought of, not yet widely in use, or not available today. We don't have to define implementations, but just leave the door open, so to speak, so that vendors and authors may build what they need without the standard getting in the way.

However, at a higher level, if we want different 3D implementations to be able to communicate with each other, there would be a need for a standard way of making worlds, avatars, objects, etc. compatible at the network level. In that case, we could envision an X3D application layer protocol (perhaps along the lines of FTP, SMTP, HTTP, etc., but specific to X3D), independent of the network transport layer, while the emphasis remains on flexibility. That being said, provide a set of mechanisms, a structure for conduits and data tunnels, but what passes through it should probably largely be left undefined as far as a standard is concerned, as long as the format is true to X3D form.

It would be nice to have data transmissions more defined, but the world being what it is and constantly evolving, I doubt we'd ever get there. I'd say "let the market forces play and the chips fall where they may".

So I am not too sure how to answer your question. I think that is what we are trying to figure out. Cheerz

GL


________________________________________________________
* * * Interactive Multimedia - Internet Management * * *
  * *  Virtual Reality -- Application Programming  * *
    *   3D Net Productions  3dnetproductions.com   *




From: x3d-public [mailto:x3d-public-bounces at web3d.org] On Behalf Of Christoph Valentin
Sent: Monday, March 11, 2019 4:21 AM
To: x3d-public at web3d.org
Subject: Re: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest, Vol120, Issue 34

Hi Gina-Lauren, Hi John

>>>>>>But yeah, whatever the browser vendors support as a standard would be OK with me.

John, you speak as a scene author, don't you? So you don't care which network protocol the browser uses.

I understand that, because I am a scene author, too.

Now, what is the position of the browser vendor?

Wouldn't the browser vendor say the same? "Whatever the server vendor supports as a standard would be OK with me."

@Gina-Lauren: What would you like to support as standard? Just to have an example.

Just my 2c

:-) Christoph



--
This message was sent from my Android mobile phone with GMX Mail.
On 11.03.19, 08:09, Christoph Valentin <christoph.valentin at gmx.at> wrote:
Hi Gina-Lauren, John, Andreas,

Now seriously.

Given that the Network Sensor (or whatever it ends up being called), i.e. the syntax for instantiating the related nodes in a Web3D scene, were specified sufficiently (e.g. in X3Dv4),

there would still be a lot to define regarding how to connect to the server, whether to connect to a server at all, whether to use a star topology or support a server hierarchy, and so on.

If those decisions were left to the implementors of Web3D browsers (and libraries), then the market would fall apart again as soon as it starts to bloom.

The standard doesn't need to be technically perfect, but it must be well regarded.

Just philosophy:-)

Just my personal opinion :-)

(And meanwhile we can use open-dis ;-) )
--
This message was sent from my Android mobile phone with GMX Mail.
On 11.03.19, 07:41, John Carlson <yottzumm at gmail.com> wrote:
I wish there were a reviewing site for open source software, with both developer and end-user reviews (kept separate).

John

Sent from Mail for Windows 10

From: Christoph Valentin
Sent: Monday, March 11, 2019 12:36 AM
To: John Carlson; x3d-public at web3d.org
Subject: RE: [x3d-public] Netw. Sens. Protocol - WAS: 3d-public Digest, Vol120, Issue 34


>>>>>>May I suggest a CSV-like protocol across the https or TLS/SSL network, similar to:



(SSL-initiation)

COMMAND1,TIME1,PARAMETER1,PARAMETER2,..\r\n

COMMAND2,TIME2,PARAMETER3,PARAMETER4,…\r\n

…

[Christoph:] I think it depends on the KIND of state/event you want to transmit. If the state does not change very often (let's say every second), then a text-based protocol will be the right choice. If you had state that changed in real time (the position of a jet plane), then I'd think about something like an RTP stream (Real-time Transport Protocol). Anyway, the first step is to agree THAT the protocol should be standardized.
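
For illustration only, a minimal Python sketch of parsing the CSV-like framing John suggested; the command names are placeholders and nothing is said here about the transport (HTTPS/TLS would simply carry these lines). Note that a naive comma split like this would break if a parameter itself contained a comma.

def parse_line(line):
    command, time_str, *params = line.rstrip("\r\n").split(",")
    return command, float(time_str), params

wire = "SETPOS,12.5,4.0,0.0,-3.2\r\nCHAT,12.6,hello world\r\n"
for raw in wire.splitlines():
    if raw:
        print(parse_line(raw))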




>>>>>>Will that satisfy the network nazis?

 [Christoph] I always thought the term Network Sensor (NS) was a bad choice :-)



>>>>>>>What type of constrictions do we have to jump through to satisfy network ops?  Should we make the protocol look just like an Excel file? 😊😊  Just wait until the first executive gets his excel file stopped by a network filter 😊.

 [Christoph]: I'd suggest using SIP as the signaling protocol, but this is my personal opinion; it is not backed by my employer or my former employer.


--
This message was sent from my Android mobile phone with GMX Mail.

On 11.03.19, 06:07, John Carlson <yottzumm at gmail.com> wrote:

May I suggest a CSV-like protocol across the https or TLS/SSL network, similar to:



(SSL-initiation)

COMMAND1,TIME1,PARAMETER1,PARAMETER2,..\r\n

COMMAND2,TIME2,PARAMETER3,PARAMETER4,…\r\n

…



Will that satisfy the network nazis?



Also, we might use a STEP/EXPRESS-like format, if we want a hierarchical format.



Can we leverage X3DUOM-style XSD and our tools to create APIs?



What type of constrictions do we have to jump through to satisfy network ops?  Should we make the protocol look just like an Excel file? 😊 😊  Just wait until the first executive gets his excel file stopped by a network filter 😊.



Maybe not, huh?



What types of interactions do we want to support?



John





Sent from Mail for Windows 10



From: Christoph Valentin
Sent: Sunday, March 10, 2019 10:34 PM
To: x3d-public at web3d.org
Subject: Re: [x3d-public] x3d-public Digest, Vol 120, Issue 34



>>>>>>It may be time to come up with some proof of concept implementations of nodes or infrastructure around a scene using web technology before more thought experiments.



>>>>>Andreas,

I don't mean to interject in your conversation, but my earlier comments (below) were not merely "thought experiments". Most of what I've talked about is already implemented and working at officetowers.com. (The server is currently suffering from a database malfunction, so login is not possible until next week, but I can assure you it works. Meanwhile, videos are available if you'd like to see them.)

That world and its associated X3Daemon multiuser server have been ready for a network sensor for years. But before potentially spending hundreds of hours implementing one in a new version based on more modern practices and available technologies, I'd like to see what we can come up with as a consensus.

What exactly is a network sensor? What are its functions? Within what boundaries and parameters? Do we refer to the earlier draft (from when work on it stopped) as a basis? IMO, those are some of the questions that need answers before we can contemplate creating specs for a NS.

Cheerz,
Gina-Lauren



[Christoph]

Hi Gina-Lauren, Hi Andreas,
It's me again, who probably should not disturb the fruitful discussion here, but while waiting for Andreas' response I'd like to add my two cents.

Just assume the members could agree on a test setup and could motivate enough manpower to "do it".

A) What could such a test setup look like?

B) Assuming the test setup and the specs were developed in parallel, what would we need from the specs?

As I said, just my 2c.

Have good luck :-)
Christoph

On 11.03.19, 03:38, GL <info at 3dnetproductions.com> wrote:


>It may be time to come up with some proof of concept implementations of nodes or infrastructure around a scene using web technology before more thought experiments.



Andreas,

I don't mean to interject in your conversation, but my earlier comments (below) were not merely "thought experiments". Most of what I've talked about is already implemented and working at officetowers.com. (The server is currently suffering from a database malfunction, so login is not possible until next week, but I can assure you it works. Meanwhile, videos are available if you'd like to see them.)

That world and its associated X3Daemon multiuser server have been ready for a network sensor for years. But before potentially spending hundreds of hours implementing one in a new version based on more modern practices and available technologies, I'd like to see what we can come up with as a consensus.

What exactly is a network sensor? What are its functions? Within what boundaries and parameters? Do we refer to the earlier draft (from when work on it stopped) as a basis? IMO, those are some of the questions that need answers before we can contemplate creating specs for a NS.

Cheerz,
Gina-Lauren


________________________________________________________
* * * Interactive Multimedia - Internet Management * * *
* * Virtual Reality -- Application Programming * *
* 3D Net Productions 3dnetproductions.com *





Hello Andreas, Christoph and all,


>I looked at the network sensor, and BS Collaborate nodes. I think the idea is to explicitly forward all events which need sharing to the server which then distributes those to connected clients.


I'd say that's a fair assessment, though there is also a little more: behind-the-scenes events, such as avatar authentication, login/logout, standby/resume status, and other similar network connections that are not necessarily shared in the scene per se.



>This leaves the definition of the shared state to the scene, and therefore requires careful design and code for MU. Perhaps there is a way that the browser can better assist with making a scene MU capable.


X3D is very capable of doing this internally, but I'd definitely want it to have its own script node. It may be possible to pass simple events through the browser, but that solution would impose severe limits on the ability to perform multiuser tasks in general. I mean, it can get convoluted enough as it is, without adding the extra complexity of passing events and network connections back and forth through the browser. I understand it isn't easy, but if it can be done, we probably should do it.




>What is the complete state a client needs when it is admitted to a shared scene ?


From my experience, that will depend very much on the use case, but generally the scene is complete within each client, though avatars and/or objects may be hidden and not necessarily visible at any given time. When the client connects to the shared network, that's pretty much when state is established. There may be exceptions to that, for example when an avatar has a specific identity and/or status in a game, or accumulated points; that information will typically first be obtained from the server's database before being passed on to the client and in turn to the network.



>Since most fields of most nodes accept input, and can therefore potentially change, complete state probably means just the value of all fields of all root nodes which means all nodes. The complete state may only need to be transferred when a client joins.


Ditto. Yes exactly. See above.
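
A hypothetical sketch of that join-time transfer: the server sends one full snapshot (the current value of every shared field of every node) to the newly admitted client, and everything afterwards travels as small incremental updates. The node and field names are invented for the example.

import json

scene_state = {
    "door-1":   {"rotation": [0, 1, 0, 1.57], "locked": False},
    "avatar-7": {"position": [4.0, 0.0, -3.2], "name": "Gina"},
}

def snapshot_for_new_client(state):
    return json.dumps({"type": "SNAPSHOT", "nodes": state})

def incremental_update(node, field, value):       # later changes stay small
    return json.dumps({"type": "UPDATE", "node": node,
                       "field": field, "value": value})

print(snapshot_for_new_client(scene_state))
print(incremental_update("door-1", "rotation", [0, 1, 0, 0.0]))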



>Plus the offsets of avatars from Viewpoints. And the clock. Perhaps other values.


Again yes, so avatars don't end up on top of each other for example.

As far as a clock goes, I have never shared one or felt I needed to thus far. In my implementation, each client simply polls the server at regular intervals of one, two or more seconds (depending on what other subsystems are in place). We cannot always allow a client which is experiencing latency, for one reason or another, to hold back all of the other clients. The show must go on, whether a client gets disconnected or happens to get out of sync. It generally seems to have little impact since, at least in some cases, users are not together in one room, so they are in essence unaware that they are out of sync, as the case may be.
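
A rough sketch of that polling approach (every name below is made up for illustration): each client asks the server for pending updates on its own schedule, so a slow or disconnected client never holds back the others.

import time

def fetch_updates(since):                  # stand-in for the real server call
    return [{"node": "door-1", "field": "open", "value": True, "t": since + 1}]

def poll_loop(interval_seconds=2.0, iterations=3):
    last_seen = 0.0
    for _ in range(iterations):            # a real client would loop forever
        for update in fetch_updates(last_seen):
            last_seen = max(last_seen, update["t"])
            print("apply", update)
        time.sleep(interval_seconds)

poll_loop(interval_seconds=0.1)            # short interval just for the demo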

But I can see how specific applications may need and want a shared clock. In that case, I believe we should take a good look at what the MIDI standard has to offer. MIDI is a well-established, solid standard that has many uses for timing and control, not just music, and is more than likely a very good fit for virtual reality.



>I was thinking about avatar initiated event cascades since the state of everything else should be deterministic and only depend on time. These other things should update themselves. There are exceptions such as scripts which generate random values.


Avatars with gestures, and also all kinds of visible objects, such as doors, cars, elevators (that's a tough one), a cup of coffee, a guitar, clouds, stars and anything else that moves. But various more subtle things, like group chat, items inside pockets or bags, money, ammunition and things of that nature, also need shared state, sometimes at various levels of privacy.



>Avatar initiated event cascades generally start with environment sensors. Not sure if there are other ways. The idea would be to just have to transmit this event cascade to other clients which then can replay it to update their scene instances.


Sure. General user interactions with the world and its interface, such as buttons and inworld devices, often need to be shared as well.


>I also was reading some of the github open-dis documentation. Interesting background, and pretty accurate discussion on coordinate systems. It may be possible to set up somewhere a very simple distributed simulation to join and test clients with.


I remember the DIS-Java-VRML work. That was very good. I will take a closer look at Open DIS as you mentioned; it sounds interesting. As I began to touch on above, each MU implementation will have its own particularities. Personally, I would very much like to see a general NetworkSensor node that is flexible enough to accommodate different protocols, but also one that has the ability to be adapted to specific use cases and needs.



>Single browser MU: I think an additional touchedByUser field would be required but perhaps there is a way to generalize tracking of Users.



I don't know if it's possible, but I'd like to see the possibility of adding any number of fields to the network node: fields that would be defined by each implementation as needed, since we may not always be able to predict its applications.


I feel there is much more to discuss, and I'd re-iterate my suggestion to open a NetworkSensor group or similar, so as to allow for more detailed exchanges and begin to have some basis for VR networking.

Gina-Lauren


P.S. Your work with Python is very interesting and falls in line with some of my own thoughts.

________________________________________________________
* * * Interactive Multimedia - Internet Management * * *
* * Virtual Reality -- Application Programming * *
* 3D Net Productions 3dnetproductions.com *








_______________________________________________
x3d-public mailing list
x3d-public at web3d.org
http://web3d.org/mailman/listinfo/x3d-public_web3d.org



 
 
