Re: [galeon] Features and Coverages

NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.

Hi Jon,

I think you've cut right to the heart of the matter again.  Namely your
question:  "How does this translate into interoperable software?"    That's
a good way to ground the discussion for the upcoming joint session at the
OGC TC meeting where we will try to determine how the OGC Coverages and
Sensor Web Enablement (SWE) thrusts fit together.  (Note that the Observations
and Measurements (O&M) conceptual framework is part of SWE.)

Here's my take on how the O&M concepts fit into the GALEON WCS work -- in a
practical way with working, if not perfect, software.  First off,  WCS
provides us with a bounding box as a mechanism for describing a very
rudimentary "feature of interest."  In our metoceans world, the feature of
interest is the fluid within that bounding box. At present, WCS does limit
us to requesting "rectangular" bounding regions but it still makes it
possible to subset the data to a region in which we are interested.
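To make the subsetting mechanism concrete, here is a minimal sketch of how a
client might build a WCS 1.0 GetCoverage request with a BBOX constraint.  The
server URL and coverage name are hypothetical; the parameter names follow the
WCS 1.0 key-value-pair convention.

```python
from urllib.parse import urlencode

# Hypothetical WCS 1.0 GetCoverage request: subset a coverage to a
# lon/lat bounding box and ask for a CF-netCDF encoding of the result.
params = {
    "SERVICE": "WCS",
    "VERSION": "1.0.0",
    "REQUEST": "GetCoverage",
    "COVERAGE": "sea_water_temperature",   # hypothetical coverage name
    "CRS": "EPSG:4326",
    "BBOX": "-80,25,-60,45",               # minx,miny,maxx,maxy
    "TIME": "2008-10-01T00:00:00Z",
    "FORMAT": "NetCDF3",                   # format label varies by server
}
url = "http://example.org/wcs?" + urlencode(params)
print(url)
```

The point is simply that the "feature of interest" is expressed as nothing
more than a rectangular region plus a time; everything richer has to come
from the encoding of the returned coverage.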

The properties of the fluid in that region vary continuously in space and
time.  Nearly all the datasets available on our servers are actually
samplings of those continuously-varying properties at specific points within
the feature of interest.   Thus, in the language of O&M, the datasets we
serve are "sampling features."  In the case of GALEON 1, the sampling
features were delivered via WCS 1.0 in the form of coverages encoded in
CF-netCDF (among other possible encodings).  As others have pointed out, the
current WCS confines us to coverages that are regularly spaced in some
Coordinate Reference System (CRS).  In spite of these limitations, GALEON 1
showed clearly that WCS can be used to get real work done with real,
existing software systems. Several clients were able to access subsets of
data from several servers via the WCS interface in the form of CF-netCDF
encodings.   This included access to some of our collections of forecast
model output, mosaics of radar data cast onto a regular grid, and some
satellite imagery.

This is a significant win in two ways.  First, it actually works: some
members of traditionally GIS-oriented communities (e.g. the US hydrology
community) are using it to access metoceans data from our servers for use in
GIS applications.  Second, it provides a model for the direction in which our
data systems and the standards can evolve in order to expand the menu of
metoceans datasets available via standard interfaces.

Another way to look at this is that the information models of ISO 19123 and
the OGC O&M are indeed general enough at the conceptual level to encompass
our data collections.   However, they are so general that they don't provide
the detail needed to ensure that our data is usable by other communities.
On the other hand, WCS with CF-netCDF encoding has been shown to provide
sufficient detail to enable a handful of clients and servers to interoperate
with one another on a limited subset of the data collections in the GALEON
community.  Moreover, in the proposed WCS extension standard for CF-netCDF
encoding, we now have a very detailed specification of the key encoding
format for our community.

As noted in the summary of GALEON 1, this leaves us with the question of how
we expand this approach to the other major categories of data collections in
the metoceans community.   To me it seems the first step is to carefully
define explicit CF conventions for the other scientific data types.  John
Caron has taken a first step along that path with his proposal for
non-gridded datasets at:

 http://www.unidata.ucar.edu/software/netcdf-java/CDM/CFpoints.html
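For readers who haven't followed that link, the kind of structure such a
convention describes can be sketched as plain Python dicts standing in for
netCDF CDL.  This is only an illustration of the general shape of a CF
point/station dataset, not John Caron's proposal verbatim; the variable names
and the "featureType" attribute are assumptions for the example.

```python
# Illustrative shape of a CF station time-series dataset, written as
# nested dicts in lieu of netCDF CDL.  Names and attributes are
# assumptions, not the actual proposed convention.
station_dataset = {
    "dimensions": {"station": 3, "time": 4},
    "variables": {
        "lon":  {"dims": ("station",), "units": "degrees_east"},
        "lat":  {"dims": ("station",), "units": "degrees_north"},
        "time": {"dims": ("time",), "units": "hours since 2008-10-01"},
        "temp": {
            "dims": ("station", "time"),
            "units": "K",
            "coordinates": "lat lon",  # ties values to sampling locations
        },
    },
    "global_attributes": {
        "Conventions": "CF-1.x",
        "featureType": "timeSeries",   # identifies the sampling-feature type
    },
}
print(sorted(station_dataset["variables"]))  # → ['lat', 'lon', 'temp', 'time']
```

The essential move is that the dataset itself declares what kind of sampling
feature it is, so generic software can interpret the coordinates without
out-of-band knowledge.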

Note here that agreeing on such conventions for the other scientific data
types will be a boon to our own community in terms of facilitating the
development of useful code for dealing with these data collections.  Beyond
that, we now know that the CF conventions -- especially with explicit CRS
information -- are an important part of the specification for encoding these
datasets as standards-conforming coverages.
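The "explicit CRS information" mentioned above is carried in CF through the
grid_mapping mechanism: a container variable holds the projection parameters,
and each data variable names it.  A small sketch, with illustrative parameter
values and an arbitrary container-variable name:

```python
# Sketch of CF's grid_mapping mechanism for explicit CRS information.
# "lambert_projection" is an arbitrary container-variable name; the
# parameter values are illustrative, not from any real dataset.
crs_variables = {
    "lambert_projection": {            # container variable
        "grid_mapping_name": "lambert_conformal_conic",
        "standard_parallel": [25.0, 25.0],
        "longitude_of_central_meridian": -95.0,
        "latitude_of_projection_origin": 25.0,
    },
    "temp": {                          # data variable referencing the CRS
        "units": "K",
        "grid_mapping": "lambert_projection",
    },
}
```

It is precisely this kind of self-describing CRS detail that lets a CF-netCDF
file serve as a standards-conforming coverage encoding.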

In essence I propose that this is a step-wise process.  We've taken nearly
all the steps for those metoceans CF-conforming data collections that
contain regularly spaced grids.    One by one, we need to pick off the other
data types (including the unstructured meshes), develop CF conventions, and
map them to a standard coverage data model where appropriate.  And in the process, we
must continue to work with the OGC community  to determine what augmentation
or modification may be needed for  WCS  or other access protocol
specifications in order to deliver data from these non-gridded collections.

It's clear that this is not all going to happen for all data types at once,
but I am also convinced that we're homing in on a strategy that will allow
us to get there in the long run if each group takes on the tasks in areas
where they have a stake.

-- Ben

On Wed, Oct 8, 2008 at 3:29 AM, Jon Blower <jdb@xxxxxxxxxxxxxxxxxxxx> wrote:

Dear all,

Another great discussion, thanks everyone.  Particularly thanks to
those (esp George) who have corrected my faulty understanding of
features and coverages.  Just when I think I've finally grasped all
this OGC stuff, I find there's another level of complexity that's just
beyond my reach... ;-)

I must admit I still struggle greatly to see how all this stuff will
translate into actual software.  Further than this, I don't see how it
will translate into *interoperable* software (i.e.
independently-written clients and servers that can talk to each other
properly).  There seem to be way too many degrees of freedom.
Assuming that I'm allowed to define my own feature types to describe
absolutely any "thing" that I'm interested in, how can I expect a
generic W*S server to correctly serve up my features and provide
sensible subsetting facilities?  I could write my own W*S variant to
serve my features, but this seems to be missing the point.  Sorry,
maybe I'm slow but I still can't grasp how the "core plus extensions"
model of WCS actually helps interoperability substantially.

I sympathise with Peter Baumann - things were a lot simpler and more
workable when we had the (more restrictive) view that "WCS is for
raster data, WFS is for vector data".  I could at least see how this
translates to real software.  Now I can't make the link at all.

I think it's worth taking note of the WMS world at this point.  WMS is
a far simpler and more mature spec than WFS and WCS and has much
greater backing from industry.  Despite this I still have not found a
WMS server or client that fully implements the 1.3.0 specification,
particularly with respect to z and t axes (which are in the spec but
often ignored, with servers doing horrible things like putting time
information in the STYLES parameter).  If we can only achieve partial
success with WMS what hope is there for WFS and WCS, which are far
more difficult, with far fewer interested parties?
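For contrast, here is what a spec-conforming request looks like when z and t
are passed through the dimension parameters WMS 1.3.0 actually defines
(ELEVATION and TIME) rather than smuggled into STYLES.  Server URL and layer
name are hypothetical.

```python
from urllib.parse import urlencode

# WMS 1.3.0 GetMap request using the TIME and ELEVATION dimension
# parameters as the spec intends, with STYLES left to its proper role.
# Server URL and layer name are hypothetical.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_water_temperature",
    "STYLES": "",                      # empty: default style, no time hacks
    "CRS": "CRS:84",                   # lon/lat axis order
    "BBOX": "-80,25,-60,45",
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2008-10-01T00:00:00Z",    # t axis, first-class parameter
    "ELEVATION": "-10",                # z axis, e.g. 10 m below the surface
}
url = "http://example.org/wms?" + urlencode(params)
print(url)
```

Servers that instead encode time in STYLES break any client that builds
requests this way, which is the interoperability failure being complained
about.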

Regarding "unstructured" meshes - this is something that even the CF
community has yet to solve properly.  I think it's way too early to
start folding this into the ISO Coverages world.

I'm going to finish with a bald statement - I think the only hope for
WCS is to restrict its scope.  The scope can always expand later if
the case is proven by real systems.

Best wishes,
Jon


