Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives


Ben,

Good one about TCP!

The only downside I see is the resources it takes to move it through the process. I think the time *probably* would be well spent; I say 'probably' only because I think it will take a LOT of time to (a) create a description of the specification that is suitably clear and comprehensive, and (b) put in enough time on the process itself to see it through.
Well, there's a third potential time issue: I for one would want to see a few changes in CF before I would say it should be approved via OGC. And there may be others with their own concerns. So first we'd have to discuss whether OGC should rubber-stamp the existing standard, given its huge community and 20 TB of existing data, and worry about improvements later; or whether there is a 'minimum bar' of interoperability that has to be met by any OGC standard. (Presumably there are some criteria, or ESRI binary shapefiles would have been accepted. I don't know the history of that, though. Is there a set of criteria that gets applied to every standard in OGC?)
To desensitize any discussion on that point, let me cite an archetypal example. Recently I saw a complaint about another community protocol that has no embedded metadata and is very hard to parse. The protocol has been in use worldwide for a few decades and may have thousands of users (certainly hundreds), transmitting data in real time, 24x7, all that time. So it's been very successful in that sense. The question is: just because it has been shown to work and is widely adopted, is that enough to make it an interoperability standard? Or should a standards body say, "These are minimum criteria that any standard must fulfill"? If the latter, I am curious: what are those criteria?
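(For contrast, here's a minimal sketch of what 'embedded metadata' means in CF-netCDF terms, written with the netCDF4 Python library; the file name, variables, and values below are purely illustrative, not taken from any real dataset:)

    # Minimal sketch: a CF-style netCDF file whose metadata travels with the data.
    # Names, units, and values here are illustrative only.
    from netCDF4 import Dataset
    import numpy as np

    ds = Dataset("example_sst.nc", "w", format="NETCDF4_CLASSIC")
    ds.Conventions = "CF-1.6"   # the file itself declares which convention it follows
    ds.title = "Illustrative sea surface temperature grid"

    ds.createDimension("time", None)   # unlimited dimension
    ds.createDimension("lat", 3)
    ds.createDimension("lon", 4)

    time = ds.createVariable("time", "f8", ("time",))
    time.units = "hours since 2009-08-24 00:00:00"   # self-describing time axis
    time.standard_name = "time"

    lat = ds.createVariable("lat", "f4", ("lat",))
    lat.units = "degrees_north"
    lat.standard_name = "latitude"

    lon = ds.createVariable("lon", "f4", ("lon",))
    lon.units = "degrees_east"
    lon.standard_name = "longitude"

    sst = ds.createVariable("sst", "f4", ("time", "lat", "lon"))
    sst.units = "K"
    sst.standard_name = "sea_surface_temperature"   # controlled CF vocabulary term

    lat[:] = [30.0, 35.0, 40.0]
    lon[:] = [-130.0, -125.0, -120.0, -115.0]
    time[0] = 0.0
    sst[0, :, :] = 290.0 + np.random.rand(3, 4)

    ds.close()

Any generic netCDF reader can open that file and recover the coordinate systems, units, and variable semantics without out-of-band documentation, which is exactly what the protocol I mentioned above cannot offer.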
To avoid sending us totally off-topic here, let me return to my conclusion: I think it would be healthy, for both the science community and the CF-netCDF community, if CF-netCDF went through a standards process like OGC's. But it might be painful too.
John


On Aug 24, 2009, at 3:10 PM, Ben Domenico wrote:

Hi all,

These are really valuable discussions. In my mind they are just as important as the formal standards that result from that part of the process. In the various OGC working groups where I've been active, I think we all have a much better understanding of the other subgroups' needs and of their approaches to satisfying those needs. I certainly count myself among those who have received one heck of an education over the last few years.
In the current discussion though, one point I still don't grasp is what is to be gained by NOT specifying CF-netCDF as A standard for binary encoding. Not THE standard necessarily, but one possible formal standard option. It's as if people think that CF-netCDF is more likely to be replaced by a newly minted standard if CF-netCDF is not declared a standard. Those of us who've been at this long enough to remember the declaration of the ISO OSI transport layer in the late 70s realize that the non-standard TCP still has a modest following in many communities.
In the case at hand, I'm really convinced that it's a good idea to build on proven technologies while AT THE SAME TIME working on specifications (e.g., SOS, WFS, WCS, SWE Common, ...) that may be more comprehensive, fill gaps, and address shortcomings of the existing approaches -- approaches that have been shown to work, but may not be all things to all people. As we proceed, it's essential to keep this valuable dialog going so the individual components have a chance of fitting together into some sort of coherent whole in the end.
-- Ben

On Mon, Aug 24, 2009 at 3:26 PM, John Graybeal <graybeal@xxxxxxxxxxxxxxxxxx> wrote:
On Aug 24, 2009, at 10:42 AM, Steve Hankin wrote:

NetCDF (& associated tooling) is arguably emerging as the definitive standard for interchange of 3-dimensional, time-dependent fluid earth system datasets.

For the members of the NetCDF community who favor this argument, may I point out there are other communities that say similar things about their solutions? And I'm not referring to OGC, which to my knowledge has never pitched SWE (or anything else) as a straight replacement for NetCDF, notwithstanding Alex's claims for SWE's representational capabilities. I mean, it's not like apples and zebras, but the two seem really different to me.
I like NetCDF for a lot of things, including many-dimensional and time-dependent data representations.
But terms like "definitive standard" carry their own hyperbolic weight, especially in a world of multiple standards bodies and many different kinds of system requirements.
So it seems to me there will not be *a* winner, either in this argument or in the earth science data management community's choice of technologies. Thus, I'm much more interested in understanding the characteristics of each, so as to use them well and maybe even improve them. (Hmm, I suppose that would explain my project affiliation....)
John


---------------
John Graybeal
Marine Metadata Interoperability Project: http://marinemetadata.org
graybeal@xxxxxxxxxxxxxxxxxx




---------------
John Graybeal
Marine Metadata Interoperability Project: http://marinemetadata.org
graybeal@xxxxxxxxxxxxxxxxxx


