<wetBlanketMode>
I’m having some trouble digesting exactly the proposal here. From where I sit,
we already have a perfectly well-defined and standardised encoding format
(netCDF) – if we want a document, we can point to the NASA spec. We also have a
set of conventions for that format (CF) that are well-governed within an
existing community process. I’m having trouble seeing what OGC brings to this.
The added value, it seems to me, would come from integrating netCDF/CF within
the framework of the ISO/OGC abstract approach to data interoperability, which is
being adopted very widely across many domains (ref. the multi-billion € INSPIRE
infrastructure). That approach is very simple and very clear – you first define
a conceptual model for your universe of discourse (in which exchange and
persistence formats are explicitly out of scope), then you (auto)generate a
canonical encoding for that model, thereby enabling interoperable data
exchange. CSML was one attempt (ours) at the conceptual model bit, and we’ve
shown that, *at least for current usages* of CF-netCDF, the ISO/OGC standard
encoding of that model (i.e. GML) works perfectly well with netCDF *as-is*!
(Incidentally, the CSML feature types and CF Point Observations proposal are in
almost perfect alignment, meaning that the ISO/OGC standard approach works with
even more confidence for current and proposed CF/netCDF.) I’m not sure what
extra standardisation is being proposed. On the other hand, I am very nervous
that by merely ‘rubber-stamping’ CF/netCDF with an OGC logo, without first
getting the underlying foundations right (i.e. an agreed standards-based
conceptual model), we’ll ultimately be headed for even more confusion (this is
the reason there is so much hand-wringing about how exactly to bring KML into
alignment with the rest of the OGC standards family – it doesn’t share a common
base). I’d be very interested to hear David Arctur’s view on how exactly it was
proposed actually to *integrate* CF/netCDF into the OGC framework, as opposed to
just attaching an OGC label, and why such integration requires new
CF/netCDF standardisation activity. In my view, such integration is already
possible and happening.
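(To make the two-step approach concrete, here is a deliberately toy Python sketch of it — the model and element names are mine, purely illustrative, and the output is GML-flavoured XML, not conformant GML or CSML:

```python
import xml.etree.ElementTree as ET

# Step 1: a hypothetical minimal "conceptual model" for a point observation,
# deliberately free of any exchange/persistence format concerns.
observation = {
    "id": "obs-001",
    "phenomenon": "air_temperature",   # a CF standard name
    "uom": "K",
    "position": (51.5, -0.1),          # lat, lon
    "value": 285.2,
}

def to_gml_like_xml(obs):
    """Step 2: auto-generate a canonical XML encoding from the model.
    Illustrative only -- not a conformant GML/O&M document."""
    root = ET.Element("Observation", {"id": obs["id"]})
    ET.SubElement(root, "observedProperty").text = obs["phenomenon"]
    pos = ET.SubElement(ET.SubElement(root, "Point"), "pos")
    pos.text = "%s %s" % obs["position"]
    result = ET.SubElement(root, "result", {"uom": obs["uom"]})
    result.text = str(obs["value"])
    return ET.tostring(root, encoding="unicode")

print(to_gml_like_xml(observation))
```

The point of the toy is that the encoding is mechanically derivable from the model, so the model — not the file format — is where the standardisation effort belongs.)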
</wetBlanketMode>
Regards,
Andrew
From: galeon-bounces@xxxxxxxxxxxxxxxx [mailto:galeon-bounces@xxxxxxxxxxxxxxxx]
On Behalf Of Ben Domenico
Sent: 15 July 2009 19:29
To: Unidata GALEON; Unidata Techies
Cc: Mohan Ramamurthy; Meg McClellan
Subject: [galeon] plan for establishing CF-netCDF as an OGC standard
Hello,
At the galeon team wiki site:
http://sites.google.com/site/galeonteam/Home/plan-for-cf-netcdf-encoding-standard
I put together a rough draft outline of a plan for establishing CF-netCDF as an
OGC binary encoding standard. Please note that this is a strawman. Comments,
suggestions, complaints, etc. are very welcome and very much encouraged. It
would be good to have the plan and a draft candidate standard for the core in
pretty solid shape by early September -- 3 weeks before the next OGC TC meeting,
which starts on September 28.
One issue that requires airing early on is the copyright for any resulting OGC
specification documents. Carl Reed, the OGC TC chair, indicates that the
wording normally used in such documents is:
Copyright © 2009, <name(s) of organizations here>
The companies listed above have granted the Open Geospatial Consortium,
Inc. (OGC) a nonexclusive, royalty-free, paid up, worldwide license to copy and
distribute this document and to modify this document and distribute copies of
the modified version.
I'm sending a copy of this to our UCAR legal counsel to make sure we are not
turning over ownership and control of CF-netCDF itself.
-- Ben