[x3d-public] X3D file conformance philosophy

Roy Walmsley roy.walmsley at ntlworld.com
Fri Sep 25 02:36:53 PDT 2015


Andreas,

 

Your concluding comment about 'add to user node/attribute dictionary' is, I would say, the equivalent of creating your own DTD/Schema extension. If you examine the end of the DTD file or the beginning of the Schema file you will see the provision to include your own extensions.
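 
For illustration only, and as a sketch I have not run through a validating parser, an author can even do the equivalent in a document's own internal DTD subset, since XML merges additional attribute-list declarations with those in the external DTD. Taking the x3dom 'id' attribute you mention as the example:

  <!DOCTYPE X3D PUBLIC "ISO//Web3D//DTD X3D 3.3//EN"
                       "http://www.web3d.org/specifications/x3d-3.3.dtd" [
    <!-- declare the extra attribute on the elements that need it, so a
         DTD-validating parser will accept it -->
    <!ATTLIST Transform id CDATA #IMPLIED>
    <!ATTLIST Shape     id CDATA #IMPLIED>
  ]>

That only helps DTD validation, of course; the Schema route needs a corresponding extension schema, and the registration process is what makes such an extension usable beyond one's own files.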

 

As for a DTD for x3dom, that may be something that has to be generated by x3dom users / developers if it is needed. While the Web3D Consortium will develop a DTD/Schema for the new version 4.0 of the specification set, harmonizing usage of X3D in HTML, x3dom is likely not to be fully aligned with this, because of its custom fields.

 

All the best,

 

Roy

 

From: Andreas Plesch [mailto:andreasplesch at gmail.com] 
Sent: 25 September 2015 05:41
To: Roy Walmsley
Cc: X3D Graphics public mailing list; John Carlson
Subject: Re: [x3d-public] X3D file conformance philosophy

 

Well, I am not sure if I can contribute much more to the discussion.

It might be a coincidence, but I noticed that x3dom has a custom render-to-texture option which can be used for mirror effects, mapping to curved projection surfaces, or stereo views. So things get reinvented when there is no easy way to first just ignore (but not repel) experimental features, then tentatively allow them, and finally absorb them into a content format.

I think a majority of x3dom's nodes have custom fields in addition to 'id', driven by application needs. Should we keep calling models that use these nodes x3d-based models, or x3dom models? Perhaps the answer is yes, at least until there is a DTD for x3dom?
But there is also the HTML model of validation.

To me, a mode of validation which is able to ignore unknown elements would be very useful. From what I understand it may be possible to add a DTD fragment which catches all such elements and declares them valid, but I would not know how. Or there could be an 'add to user node/attribute dictionary' action for each unidentified element.

Andreas



On September 24, 2015, at 7:06 PM, Roy Walmsley <roy.walmsley at ntlworld.com> wrote:



Andreas,

 

Thank you for raising this discussion in the first place. It has been enlightening for me in a number of ways.

 

With my example I did have nodes that contained all the original material properties plus the additional ones for infrared rendering. I also had additional Viewpoint nodes that specified the type of rendering to be undertaken. Also, we weren’t rendering to the screen directly, but rendering to texture. This meant that, at any instant, we could have as many viewpoints ‘simultaneously’ rendered as we desired, and they could be any mix of infrared and visual. Then we could display as many of those viewpoints in separate windows as we desired. We did have a separate viewer that rendered directly to the screen; with that we could choose to render either infrared or visual. We knew the models with custom content were not pure VRML, and we simply said they were Open Inventor / VRML based.

 

Turning to X3DOM now. You are probably aware that the Web3D Consortium is working on a new version of the specification (V4.0) to cover usage of X3D in HTML5. This will address some of the issues you are noting. In my opinion, though, there is still likely to be a significant gap between X3DOM and the new X3D Specification set, since X3DOM does not conform to X3D in quite a number of ways. The Consortium hopes to release the first public draft of V4.0 for 19775-1 at the end of the year.

 

The point you raise about ‘id’ usage and errors is noted. Scripts are likely to be one of the areas in the new version where there may be changes. Time will tell.

 

Regards,

 

Roy

 

From: Andreas Plesch [mailto:andreasplesch at gmail.com] 
Sent: 24 September 2015 21:46
To: Roy Walmsley
Cc: X3D Graphics public mailing list; John Carlson
Subject: Re: [x3d-public] X3D file conformance philosophy

 

Wow, thanks again for engaging in this discussion. First, let me acknowledge that there is a lot of value in having a way to strictly define a format using DTD and Schema, and that it is important for any long-term viability to require extensions to follow the provided mechanisms.

Also, I was not really aware that validation is a term reserved for a specific action in XML processing, one which probably does not exactly match what I would have associated with the term.

But, to use your example, would it not be useful to be able to claim that your infrared scenes have correct (allowable? viable?) VRML content, say if you included a Switch to choose between infrared and regular materials, which would allow approximate rendering in regular VRML renderers but at the same time allow special-purpose rendering with a custom renderer? And without going through the DTD extension/registration process? In fact, you may have referred to your scenes as VRML informally. Should there be a way to formally designate such a scene as viable (valid) VRML but not strictly conforming (pure)?
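
Roughly what I have in mind, sketched in X3D XML terms with a made-up InfraredMaterial node standing in for your custom one (so the names here are illustrative only):

  <Switch whichChoice='0'>
    <!-- child 0: standard material, approximate rendering in any VRML/X3D browser -->
    <Shape>
      <Appearance> <Material diffuseColor='0.8 0.8 0.8'/> </Appearance>
      <Box/>
    </Shape>
    <!-- child 1: custom node, selected only by the special-purpose infrared renderer -->
    <Shape>
      <Appearance> <InfraredMaterial emissivity='0.95'/> </Appearance>
      <Box/>
    </Shape>
  </Switch>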

What triggered this discussion for me was my attempt to use x3d-edit for x3dom scenes, which at first glance does not seem unreasonable. I was just surprised to find that the little 'id' attribute, which is required for x3dom elements to work with scripts, triggered so many validation errors that there was a "too many errors to report" (or similar) message. My expectation was simply that the validator would be silent about items it does not need to be concerned about. Of course, I did not realize that the validator needs to be concerned about everything due to the strict conformance rules.
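
To be concrete, a fragment like the following (DEF and id values invented for illustration) is enough per element to draw a complaint, even though x3dom needs the id so a script in the surrounding HTML page can reach the node:

  <Transform DEF='Tilt' id='tilt'>
    <Shape> <Appearance> <Material/> </Appearance> <Box/> </Shape>
  </Transform>

and then, in the HTML page, something like the usual x3dom interaction pattern:

  <script>
    // update a field by mutating the DOM attribute; x3dom picks up the change
    document.getElementById('tilt').setAttribute('rotation', '0 0 1 0.5');
  </script>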

 

Perhaps the real issue is the (perceived?) difficulty of writing a correct DTD and/or Schema, with the result that those rarely materialize. But this would be an XML issue, not an X3D issue, although it has these consequences.

I would argue that an exclusive approach to conformance becomes a significant problem if it slows adoption and helps to maintain the gap between x3d, the format, and x3dom, currently probably the most widely used x3d browser.

 

-Andreas

 

 

On Thu, Sep 24, 2015 at 2:20 PM, Roy Walmsley <roy.walmsley at ntlworld.com> wrote:

Andreas,

 

John raised an additional point that is also worth expanding on.

 

One of the general aims of the Web3D Consortium is round-trip-ability. What this means is starting off with a file in one encoding, converting it to a second encoding, and then converting it back to the first encoding. It is expected that the resulting file, while not textually identical, will have identical functionality and display identically.

 

Another general aim is maximising compatibility, both backwards and forwards. This maximises the lifetime of authored content.

 

Another general aim is maximising reuse and portability of authored scenes.

 

You can see that custom content is going to scupper these aims. So I personally believe that the specification, in defining what is a ‘conforming’ X3D file, is essentially correct and should be maintained (although see below for one exception).

 

Let me give an example of ‘custom content’ from my own experience. I was a co-developer of a 3D simulation. This was begun many years ago, so it used Open Inventor, and then VRML. The difference for us was that we wanted to render in the infrared spectrum, not the visual. That meant not only modifying the rendering process, but also creating additional nodes to characterise materials in ways appropriate to infrared rendering. We accepted that we could not use any standard viewer, but would have to write our own. Now, with the advance of time and the standardization of X3D, one course of action open to me would be to update the additional custom VRML nodes we created to X3D nodes. I could then create an extension to the Schema and DTD, covering the new nodes, and have it registered. This would also include a new COMPONENT definition. See 4.10 Component and profile registration http://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/concepts.html#componentprofilereg.
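
In file terms the result would amount to little more than an extra statement in the header; something along these lines, with a component name I have invented purely for illustration:

  <X3D profile='Immersive' version='3.3'>
    <head>
      <!-- hypothetical registered extension component -->
      <component name='InfraredRendering' level='1'/>
    </head>
    <Scene>
      <!-- scene content, including the new infrared material nodes -->
    </Scene>
  </X3D>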

 

Now, you are right about f. in that case, but e. is OK. With that one proviso my custom content becomes conforming X3D. Standard browsers would immediately recognise, through the COMPONENT statement, that they couldn’t display my custom content. Any other implementer could extend their browser to cover my custom content if they so desired. My custom content then becomes more widely usable.

 

Returning to the issue of the validator, I believe it should point out all the things it finds as errors. After all, you don’t have to fix them! And you probably have to check fully every time; we all know how easy it is to introduce typos! And one would assume that the majority of the X3D file is conforming, with only a relatively small portion that is custom content. Or are you thinking of something different here? Perhaps you could illustrate the sort of ‘custom content’ you are referring to, and why you believe it to be a significant problem.

 

Best regards,

 

Roy

 

From: John Carlson [mailto:yottzumm at gmail.com] 
Sent: 24 September 2015 17:04
To: Andreas Plesch
Cc: X3D Graphics public mailing list; Roy Walmsley
Subject: Re: [x3d-public] X3D file conformance philosophy

 

One thing to consider is schema-based conversion from one encoding to another.

On Sep 24, 2015 10:20 AM, "Andreas Plesch" <andreasplesch at gmail.com> wrote:

Hm,

to me a validation tool is simply not able to reason about, or make any judgement about, nodes and fields it does not know anything about (other than that it does not know them). So to me the main (perhaps only) point of a validation tool is to pick up out-of-bounds field values, missing DEFs for routes, missing required nodes, and such. And existing tools do that well, I think.

So, yes, I think a reduced-activity tool would be very useful, and there may be a way to disable warnings about unknown elements in the validator bundled with x3d-edit. Could it be done?

I think the existing phrasing in the spec. for conformance captures well what valid x3d content is, except for items e), f) and perhaps h) in 

http://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/conformance.html#ConformanceX3Dfiles

which would not relate to validity if understood in a more inclusive sense.

A concern may be that it would not be possible to flag '<Tansform>' as a syntax error, which is clearly desirable, and that as a consequence all unknown elements have to be defined as syntactically incorrect.

But would it not make sense to separate this concern from concerns which can actually be addressed more specifically by what is contained in the spec., such as out-of-bounds fields? Have two buttons, one for possible syntax errors, and one for deeper, x3d-related checking?

Best regards,

Andreas

 

 

On Thu, Sep 24, 2015 at 8:46 AM, Roy Walmsley <roy.walmsley at ntlworld.com> wrote:

Andreas,

Referring to your final comment below, I would say that a validation tool that does not complain about nodes and fields it does not know would be largely useless! After all, apart from picking up out-of-bounds field values, or detecting missing required content, the point of a validation tool is to complain about things it doesn't recognise. How else does it tell you about errors? Validators, if they are well written, will already tell you what sort of problem they have found, i.e. the equivalent of your classification of deviations.

It sounds like you either need a reduced activity validation tool, or else to teach your validation tool about the extra things that you permit in your X3D authored files.

Finally I would raise two questions: What is valid X3D content? How does a validation tool definitively distinguish custom content from errors in valid X3D content?

Best regards,

Roy

-----Original Message-----
From: Andreas Plesch [mailto:andreasplesch at gmail.com]
Sent: 24 September 2015 13:13
To: Roy Walmsley
Cc: X3D Graphics public mailing list
Subject: Re: [x3d-public] X3D file conformance philosophy

Roy,

you bring up many good points. I completely agree that there are many ways to extend the standard.

This includes the option of providing DTD and Schema definitions for the extensions which quite a few browsers support. However, in practice this is rarely done.

You are right that validation software can only detect deviations from the standard as a reference. But I think it could be designed to classify deviations into ones which contradict references and ones which it cannot further analyse because reference documents do not provide the necessary information. In fact, many validators may already do that.

So what could be done ?

In order to be more inclusive, Web3d could introduce a definition of validity in parallel to conformance which would allow more documents to claim that they have valid x3d content alongside custom content.

I think, in practice, all I may be looking for is a validation tool which does not complain about nodes and fields it does not know.

Andreas

On September 24, 2015, at 5:13 AM, Roy Walmsley <roy.walmsley at ntlworld.com> wrote:

Andreas,

I would also offer the following consideration in this debate, continuing with the current terminology just for the sake of maintaining clarity of understanding between us.

Let's say I am a software engineer developing software that validates an X3D file, utilising either or both of the DTD and the Schema. I design my software to detect 'deviations' in X3D files from the DTD/Schema reference documents. When such 'deviations' are found, how are they to be reported? My software isn't going to know if a 'deviation' is an author's intention or simply a mistake. It can only report (notwithstanding very clever heuristics) that the X3D file is not valid, pointing out the errors.

Of course, the author also has the option of creating extensions to the Schema and/or DTD that cover the differences. These would allow software validators to pass an X3D file containing extensions above and beyond the specifications.

Of course, when we are talking DTD and Schema we are also talking XML, which, of course, stands for Extensible Markup Language. So an author already has the option to make their own private additions.

And then, an author can just create their X3D file with additions, and ignore error messages from validation tools.

So, from the perspective of the Web3D Consortium (and here I am only expressing my own personal opinion), what can be done? The Consortium does not wish to control authors' content, merely to provide a standard to enable the widespread use and distribution of 3D graphics content. While this naturally has limits, it also includes methods to permit individual author extensions. HTML is standardized, with validators. It too provides methods for extension of the basic standards. Perhaps for HTML these are better known, whereas for X3D they are less publicized.

Best regards,

Roy

-----Original Message-----
From: Andreas Plesch [mailto:andreasplesch at gmail.com]
Sent: 24 September 2015 04:51
To: Roy Walmsley
Cc: X3D Graphics public mailing list
Subject: Re: [x3d-public] X3D file conformance philosophy

Roy, thank you for your thoughtful response.

Of course, authors who choose to include non-standard elements cannot expect the content to render as intended in all browsers. In a sense, labeling such content as not conforming is redundant. To me, perhaps a better use of the label 'conforming' would be simply an assurance that those nodes and fields which are actually standardized conform to the standard, e.g. <Box size='square' /> would not be conforming.
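
For contrast, since size is declared as a three-float SFVec3f, something like

  <Box size='2 2 2'/>

would conform (those values also happen to be the default).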

So I am still not quite sure what the benefit is of considering foreign elements in terms of conformity.

From a semantic point of view, perhaps a better fit would be purity or fidelity. A pure x3d file consists exclusively of standardized components.

Validity would be another concept which to me signals a tolerance. All x3d standardized components in a valid x3d file conform to the standard but it may also contain other (xml) content.

Considering that part of the success of HTML is often seen as due to its nonchalance towards unspecified content, this kind of dry discussion is probably more relevant than it seems.

Andreas

On Wed, Sep 23, 2015 at 6:51 PM, Roy Walmsley <roy.walmsley at ntlworld.com> wrote:

Hi Andreas,



You have expounded an interesting view.



What if conformance did not apply in the restricted sense? An X3D author would be able to create files that had additional undefined (and non-conforming) content. Browser implementers would then be free to support whatever extras they wished and still be classed as ‘conforming’. So, suppose an author creates an X3D file with extras that is supported on one particular browser. If the author were to share the file with others, they may be restricted to using only the one particular browser that supports those particular extras. Other ‘conforming’ browsers would be unable to display the content.



Obviously, it is desirable that X3D authors, should they wish, be free to share their X3D files with other users. The concept of restricted conformance ensures that if that same file is loaded into other conforming browsers then the display will be the same, reproducing the author's intentions.



There are mechanisms already available for authors to generate conforming content that has additional features. The first and most obvious is the use of Prototypes. The second is to have definitions for additional nodes, both in terms of the specification and of validation tools such as DTD and Schema. These can be registered and made publicly available if it is envisaged that they will have sufficiently wide usage. Of course, there is always the ultimate third case, whereby additional features are incorporated into the specifications. Anyone can present drafts for consideration to the Web3D Consortium, provided they have at least two implementations to support the new features.
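
As a rough sketch of the prototype route (node name and field invented here purely for illustration):

  <ProtoDeclare name='InfraredBox'>
    <ProtoInterface>
      <!-- extra data that a special-purpose renderer or script could read -->
      <field name='emissivity' type='SFFloat' accessType='initializeOnly' value='0.9'/>
    </ProtoInterface>
    <ProtoBody>
      <!-- fallback appearance for ordinary browsers -->
      <Shape>
        <Appearance> <Material diffuseColor='0.5 0.5 0.5'/> </Appearance>
        <Box/>
      </Shape>
    </ProtoBody>
  </ProtoDeclare>

  <ProtoInstance name='InfraredBox'>
    <fieldValue name='emissivity' value='0.95'/>
  </ProtoInstance>

This stays conforming in any X3D browser, while a custom renderer remains free to interpret the extra field.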



So my personal view is that it is better to retain the restricted conformance testing. But I am always open to further discussion ….



Best regards,



Roy



From: x3d-public [mailto:x3d-public-bounces at web3d.org] On Behalf Of Andreas Plesch
Sent: 23 September 2015 18:55
To: X3D Graphics public mailing list
Subject: [x3d-public] X3D file conformance philosophy



Using the x3d validator and quality assurance tools which come with x3d-edit, it quickly becomes apparent that an x3d file is deemed conforming (or syntactically correct) only after it passes a series of very strict tests, which are defined here:

http://www.web3d.org/documents/specifications/19775-1/V3.3/Part01/conformance.html#ConformanceX3Dfiles



For example, referencing a field of a node which is not listed in the spec. but is supported by an x3d browser (say "id") will put the x3d file in the non-conforming category. Is this correct?

Similarly, referencing nodes and behaviours which are not explicitly defined in the spec. will break conformance.

This is in contrast to the concept of validity of html files which are deemed valid even if they include elements not defined in the spec.

So, what is the philosophy of having a narrow definition of file conformance?

It may be that it defines minimum functionality for an x3d browser.



An x3d browser is said to be conforming when it allows browsing of a conforming file as a minimum requirement.

But this convention would still apply if a conforming file could include custom (non-spec.) nodes and fields which a conforming browser would be allowed to ignore.



Another way to think about the usefulness of strict conformance may be to explore the consequences of restricting tests for conformance only to nodes, fields, behaviours and such with which the spec. is actually concerned. What would break if ignoring all other elements in browsers and parsers were considered conforming?

In my view, strict file conformance shifts responsibility from the browser to the x3d content author. To me, this is a somewhat inverted arrangement which may need to be reconsidered.



-Andreas


--

Andreas Plesch
39 Barbara Rd.
Waltham, MA 02453




--

Andreas Plesch
39 Barbara Rd.
Waltham, MA 02453




-- 

Andreas Plesch
39 Barbara Rd.
Waltham, MA 02453


_______________________________________________
x3d-public mailing list
x3d-public at web3d.org
http://web3d.org/mailman/listinfo/x3d-public_web3d.org




-- 

Andreas Plesch
39 Barbara Rd.
Waltham, MA 02453
