Hi Steve,

Just to clarify: when I said NetCDF was a "NEW standard," I meant a new standard in OGC. As I was telling Ben in an offline email, I am fully aware of its penetration and usefulness in certain communities. However, I am not convinced that having two standards doing the same thing in OGC sends the right message, or that it is the best way to go for a standardization organization. There has also been a lot of experimentation with SWE technologies that you may not know about, in many communities, especially in earth science. What I'm saying is that perhaps it is worth testing a bridge from NetCDF to SWE before we go down the road of stamping two 100% overlapping standards as OGC compliant.

Regards,

-------------------------------------------------
Alexandre Robin
Spot Image, Web and E-Business
Tel: +33 (0)5 62 19 43 62
Fax: +33 (0)5 62 19 43 43
http://www.spotimage.com
Before printing, think about the environment

________________________________
From: Steve Hankin [mailto:Steven.C.Hankin@xxxxxxxx]
Sent: Thursday, August 20, 2009 20:58
To: Tom Whittaker
Cc: Robin, Alexandre; Ben Domenico; Unidata GALEON; wcs-2.0.swg
Subject: Re: [galeon] [WCS-2.0.swg] CF-netCDF standards initiatives

Hi Tom,

I am grateful to you for opening the door to comments "from 10 thousand feet" -- fundamental truths that we know from many years of experience, but that we fear may be getting short shrift in discussions of a new technology. I'd like to offer a comment of that sort regarding the interplay of ideas today between Robin ("I hope we don't have to define a NEW standard ...") and Carl Reed ("there are other organizations interested in bringing legacy spatial encodings into the OGC. There are sound business and policy reasons for doing so.").

The NEW standard in this discussion is arguably SWE, rather than netCDF. NetCDF has decades of practice behind it; huge bodies of data based upon it; a wide range of applications capable of accessing it (both locally and remotely); and communities that depend vitally upon it. As Ben points out, netCDF also has its own de jure pedigree.

A key peril shared by most IT standards committees -- a lesson that has been learned, forgotten, relearned and forgotten again so many times that it is clearly an issue of basic human behavior -- is that they will try to innovate. Too-common committee behavior is to propose, discuss and document new and intriguing technologies, and then advance those documents through a de jure standards process, despite an insufficient level of testing. The OGC testbed process exists to address this, but we see continually how large the gap is between the testbed process and the pace and complexity of innovations emerging from committees.

Excellent reading on this subject is Michi Henning's essay, The Rise and Fall of CORBA (2006 -- http://queue.acm.org/detail.cfm?id=1142044). Among the many insights he offers is: 'Standards consortia need iron-clad rules to ensure that they standardize existing best practice. There is no room for innovation in standards. Throwing in "just that extra little feature" inevitably causes unforeseen technical problems, despite the best intentions.' While it adds weight to an argument to be able to quote from an in-print source, this is a self-evident truth. We need only reflect on the recent history of IT. What we need is to work together to find ways to prevent ourselves from continually forgetting it.
There is little question in my mind that putting an OGC stamp of approval on netCDF is a win-win process -- for the met/ocean/climate community and for the broader geospatial community. It will be a path to greater interoperability in the long run, and it deserves to go forward. The merits of SWE (or GML) as an alternative approach to the same functionality also deserve to be explored and tested in situations of realistic complexity. But this exploration should be understood initially as a process of R&D -- a required step before a "standards process" is considered. If that exploration has already been done, it should be widely disseminated, discussed and evaluated.

- Steve

==================================

Tom Whittaker wrote:

I may be ignorant about these issues, so please forgive me if I am completely out of line... but when I looked at the examples, I got very concerned, since the metadata needed to interpret the data values in the "data files" is apparently not actually in the file, but somewhere else.

We've been here before: one of the single biggest mistakes the meteorological community made in defining a distribution format for realtime, streaming data was BUFR -- because the "tables" needed to interpret the contents of the files are somewhere else... and sometimes end users cannot find them!

NetCDF and NcML keep the essential metadata within the files: types, units, coordinates -- and I strongly urge you (or whomever) not to make the "BUFR mistake" again: put the metadata into the files! Do not require the end user to have an internet connection simply to "read" the data... many people download the files and then "take them along" when traveling, for example.

If I simply downloaded the file at <http://schemas.opengis.net/om/1.0.0/examples/weatherObservation.xml> I would not be able to read it. In fact, it looks like even if I also got the "metadata" file at <http://schemas.opengis.net/om/1.0.0/examples/weatherRecord1.xml> I would still not be able to read it, since it also refers to other servers in the universe to obtain essential metadata.

That is my 2 cents worth... and I hope I am wrong about what I saw in the examples...

tom
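[Editor's note: a minimal sketch of the self-describing property Tom describes above, using the netCDF4-python library; the file name and variable names are hypothetical, not taken from the thread. The point it illustrates is that types, units, and coordinates travel inside the netCDF file itself, so it can be written and read back with no network connection and no external tables.]

    # Sketch: metadata (types, units, coordinates) stored inside the file itself.
    # Assumes the netCDF4-python library; "weather_obs.nc" and the variable
    # names are hypothetical examples.
    from netCDF4 import Dataset
    import numpy as np

    # Write a small self-describing file.
    with Dataset("weather_obs.nc", "w") as nc:
        nc.createDimension("time", 3)
        time = nc.createVariable("time", "f8", ("time",))
        time.units = "hours since 2009-08-20 00:00:00"   # CF-style time units, in the file
        time.standard_name = "time"
        time[:] = [0.0, 1.0, 2.0]

        temp = nc.createVariable("air_temperature", "f4", ("time",))
        temp.units = "K"                                  # units live in the file, not on a server
        temp.standard_name = "air_temperature"
        temp[:] = np.array([288.2, 288.9, 289.4], dtype="f4")

    # Read it back entirely offline: everything needed to interpret the
    # values is carried in the file's own attributes.
    with Dataset("weather_obs.nc") as nc:
        t = nc.variables["air_temperature"]
        print(t.standard_name, t.units, t[:])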