1e. Is the UPC prepared to provide the same
quality of support to the newly engaged communities as it provides to its current constituents?
While the support for all
users will remain at a very high level, that does not mean it will be exactly the same. For example, for
the core community Unidata provides comprehensive support for a full suite of tools from data services, through
decoders, to complete analysis and display packages. For other communities, tools specialized to their
needs may not be available from or supported by the UPC. One example is
the community of users of GIS tools. In that case, Unidata supports standards-based web services that expose our
datasets through standard interfaces, so any tool that implements those interfaces can access Unidata data.
Thus these new communities can continue to use the analysis and display tools they are familiar with
while taking advantage of the data services of the traditional Unidata community.
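As a sketch of how such standard interfaces work in practice, the snippet below builds OGC WCS key-value-pair request URLs of the kind a GIS client would issue; the server URL and coverage name are invented for illustration, and only Python's standard library is used:

```python
from urllib.parse import urlencode

# Hypothetical endpoint for illustration; any OGC WCS 1.0 server is
# addressed the same way, which is the point of the standard interface.
BASE_URL = "https://example.edu/thredds/wcs/grids/sample.nc"

def wcs_request(base_url, operation, **params):
    """Build an OGC WCS 1.0 key-value-pair request URL."""
    query = {"service": "WCS", "version": "1.0.0", "request": operation}
    query.update(params)
    return base_url + "?" + urlencode(query)

# Ask the server what coverages (gridded fields) it offers.
caps_url = wcs_request(BASE_URL, "GetCapabilities")

# Request one (hypothetical) coverage over a bounding box.
cov_url = wcs_request(
    BASE_URL, "GetCoverage",
    coverage="Temperature",
    format="NetCDF3",
    bbox="-110,30,-90,50",
)

print(caps_url)
print(cov_url)
```

Because the request grammar is fixed by the standard rather than by any one server, a GIS tool that can form these URLs can pull data from a Unidata server it has never seen before.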
Excerpt from the proposal review panel report
Advocacy for Community Standards: "In particular, the UPC could play a significant leadership role within committees and consortiums like OGC
seeking to address the need to develop standards and technologies for data discovery. Unidata leadership and
advocacy in this area could facilitate expanded utilization of Unidata information resources for other research areas
like climate and provide Unidata users with easier access to other data sources like NASA satellite information. However,
the OGC letter of recommendation in the proposal and the Unidata responses to the review panel questions
regarding cyberinfrastructure did demonstrate that Unidata was actively involved in community discussion
of interface and data standards."
Summary of Recent Progress
Background on netCDF and CF formal standards efforts
Following on the success of Russ Rew and the netCDF team in
establishing netCDF and CF as NASA standards, efforts continue to have CF-netCDF recognized internationally by the
Open Geospatial Consortium (OGC) as a standard for encoding georeferenced data in binary form.
As the
official UCAR representative to the OGC Technical Committee, Unidata participates in 3-4 technical committee
meetings per year to ensure that Unidata and UCAR needs are met in the emerging international standards.
The
overall plan and status are maintained at http://sites.google.com/site/galeonteam/Home/plan-for-cf-netcdf-encoding-standard. In keeping
with the proposal and review panel recommendations, the goal of this effort is to encourage broader use of
Unidata's data by fostering greater interoperability among clients and servers interchanging data in binary
form. Establishing CF-netCDF as an OGC standard for binary encoding will make it possible to incorporate
standard delivery of data in binary form via several OGC protocols, e.g., Web Coverage Service (WCS), Web
Feature Service (WFS), and Sensor Observation Service (SOS). For over a year, the OGC WCS Standards Working
Group (SWG) has been developing an extension to the core WCS for delivery of data encoded in CF-netCDF. This independent
CF-netCDF standards effort is complementary to the WCS work and should facilitate similar extensions
for other standard protocols.
Progress on OGC standardization
In January
2011, the OGC Technical Committee voted to adopt netCDF Classic as an official OGC binary encoding
standard. As of the writing of this report,
the final standard specifications are being formatted for publication, but the draft standards are still
available in three documents:
an overview primer, the core standard spec, and the binary encoding spec.
http://www.opengeospatial.org/standards/requests/71
Ongoing Outreach Activities
AccessData (formerly DLESE Data Services) Workshops
The overall AccessData program is described at:
http://serc.carleton.edu/usingdata/accessdata/ and the most recent
workshop page is:
http://serc.carleton.edu/usingdata/accessdata/impacts/index.html. The AccessData team
is now working on several publications to document the results of the project.
Data Discovery Initiatives
In keeping
with the Unidata 2013 Proposal review panel recommendation relating to collaborating with others to enhance the
available data discovery facilities, the UPC and the Unidata community are following up on earlier
collaborations with George Mason University and NASA. The most recent work is with the University of Florence ESSI Labs team,
using their tools to harvest search metadata from THREDDS data servers, which can pose special challenges because
of the size and volatility of their holdings. A new release of the ESSI Labs GI-cat package has addressed
limitations of earlier versions, which ran into difficulty with the Unidata Motherlode THREDDS server. Members of
our community are finding this tool useful enough that Rich Signell has created a YouTube
video on "How to Configure GI-CAT for the first
time": http://youtu.be/28biJHTQSrM
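To give a sense of what harvesting search metadata from a THREDDS server involves, the sketch below parses a trimmed, made-up THREDDS catalog document and collects the leaf datasets a harvester such as GI-cat would index. The catalog contents are invented; only the XML namespace is the real THREDDS InvCatalog one.

```python
import xml.etree.ElementTree as ET

# A trimmed THREDDS catalog, inlined for illustration; a harvester would
# fetch XML like this from a server's /thredds/catalog.xml endpoint.
CATALOG_XML = """\
<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
         name="Sample catalog">
  <dataset name="GFS Grids" ID="grids/gfs">
    <dataset name="GFS run 00Z" ID="grids/gfs/run1" urlPath="grids/gfs/run1.nc"/>
    <dataset name="GFS run 06Z" ID="grids/gfs/run2" urlPath="grids/gfs/run2.nc"/>
  </dataset>
</catalog>
"""

THREDDS_NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"

def harvest(catalog_xml):
    """Collect (ID, urlPath) for every accessible dataset in a catalog."""
    root = ET.fromstring(catalog_xml)
    # iter() walks nested <dataset> elements recursively; only leaves
    # carrying a urlPath (actual data access points) are kept.
    return [(d.get("ID"), d.get("urlPath"))
            for d in root.iter("{%s}dataset" % THREDDS_NS)
            if d.get("urlPath")]

print(harvest(CATALOG_XML))
```

The size and volatility challenges mentioned above come from doing this walk over catalogs with many thousands of nested datasets whose contents change between harvesting runs.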
Work continues in our
ongoing efforts to coordinate our data discovery and access systems with those of the hydrology community. The most
recent undertaking was described in an invited paper with David Maidment as the lead author at the Fall 2010 AGU:
Hydrologic information science
requires several different kinds of information: GIS coverages of water features of the land surface and
subsurface; time series of observations of streamflow, water quality, groundwater levels and climate; and space-time
arrays of weather, climate and remotely sensed information. Increasingly, such information is being published as web
services, in standardized data structures that transmit smoothly through the internet. A large "Digital Divide"
exists between the world of discrete spatial objects in GIS and associated time series, and the world of continuous
space-time arrays as is used in weather and climate science. In order to cross this divide, it should be possible to search
for quantities such as “precipitation” and to find the information no matter whether it comprises time series of
precipitation at gage sites, or space-time arrays of precipitation from Nexrad radar rainfall measurements. This
means that servers of discrete space-time hydrologic data, such as the CUAHSI HydroServer, and servers of continuous
space-time weather and climate data, such as the Unidata THREDDS server, should be able to be indexed in a unified
manner that will permit discovery of common information types across different classes of information services. This
paper will explore options for accomplishing this goal using the CUAHSI HydroServer and the Unidata THREDDS server as
representative examples of information service providers. Among the options to be explored is GI-cat, a federated,
standards-based catalog service developed at the Earth and Space Science Informatics Laboratory of the University of
Florence.
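One way to picture the unified indexing the abstract calls for is a toy index in which records from both kinds of servers are keyed by the observed quantity, so a single search crosses the divide between discrete and continuous data. All server names, records, and fields below are invented for illustration:

```python
# Toy unified index: records from two different classes of information
# service are described with one common "quantity" key.
records = [
    # Discrete point observations, as a CUAHSI HydroServer might expose them
    {"server": "HydroServer", "kind": "time-series",
     "quantity": "precipitation", "site": "gage-0274"},
    # Continuous gridded fields, as a THREDDS server might expose them
    {"server": "THREDDS", "kind": "space-time array",
     "quantity": "precipitation", "dataset": "nexrad/stage2.nc"},
    {"server": "THREDDS", "kind": "space-time array",
     "quantity": "air_temperature", "dataset": "gfs/run1.nc"},
]

def search(quantity):
    """Find every record for a quantity, regardless of data structure."""
    return [r for r in records if r["quantity"] == quantity]

# A search for "precipitation" finds both the gage time series and the
# radar space-time array, which is the goal stated in the abstract.
for r in search("precipitation"):
    print(r["server"], r["kind"])
```

The real work, of course, is agreeing on the shared vocabulary for the quantity key, which is where CF standard names and catalog services like GI-cat come in.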
Some of these efforts are described in the August Unidata
E-letter:
http://www.unidata.ucar.edu/newsletter/2010aug/index.html#Article1
Other Collaborations:
- NCAR GIS Program (official program of NCAR as of this year)
- Marine Metadata Interoperability Project Steering Team
- IOOS DMAC Steering Team
- CUAHSI Standing Committee
- UCAR-wide representative to OGC Technical Committee
- AGU ESSI Focus Group Board
- ESIN Journal Editorial Board
- Host for OGC Technical Committee Meeting September 2011
- Liaison to OOI Cyberinfrastructure Project
- Possible collaboration with UCSD on a follow-on NSF proposal for the Marine Metadata Interoperability (MMI) project
Planned Activities
At the April EGU, there was a presentation on the
status and plans for the CF-netCDF standardization effort. A concise summary is given on the Google Sites
GALEON wiki:
https://sites.google.com/site/galeonteam/Home/status-update-2011-march
In the updated plan,
the next steps will be standardization of the
CF conventions, in particular those associated with gridded and point (discrete sampling) data types.
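As a rough illustration of what standardizing the CF conventions covers at the variable level, the sketch below checks a dict-based stand-in for netCDF variable metadata against two of the minimal CF attributes; the variable and the checker are illustrative inventions, not a real CF-compliance tool:

```python
# A CF-style description of one gridded variable, using plain dicts as a
# stand-in for real netCDF metadata (no netCDF library required here).
variable = {
    "name": "pr",
    "attributes": {
        "standard_name": "precipitation_flux",  # from the CF standard name table
        "units": "kg m-2 s-1",
        "coordinates": "lat lon",
    },
    "dimensions": ("time", "lat", "lon"),
}

# Two attributes at the heart of CF interoperability: a controlled
# vocabulary name for the quantity, and machine-readable units.
REQUIRED = ("standard_name", "units")

def cf_issues(var):
    """Report which of the minimal CF attributes a variable is missing."""
    attrs = var.get("attributes", {})
    return [a for a in REQUIRED if a not in attrs]

print(cf_issues(variable))  # an empty list means the minimal checks pass
```

For point (discrete sampling) data the conventions additionally prescribe how stations, profiles, and trajectories are laid out, which is why those data types are called out separately in the plan above.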
At the most recent OGC TC meeting, the CF-netCDF SWG strongly recommended consideration of the OGC
Fast Track process for the CF conventions specification as well as for the netCDF4 binary
encoding spec (which is actually based on a subset of the HDF5 format). At this point, it appears that a
case could be made for a Fast Track approach for the CF conventions based on the already adopted NASA
standard.
The OGC
Fast Track process is described in
There
is also a pending NASA standard for the NetCDF4/HDF5 encoding.
The
netCDF enhanced data model will be an additional undertaking as an OGC extension to the core netCDF
classic data model standard, but that will not come under the Fast Track process.
Relevant Metrics
The
list of "other collaborations" above includes ten
organizations with which we interact regularly. In most cases, our interactions are as
representatives of our community on their steering or policy groups, so we have at least some voice
in their direction.
The first three netCDF standards documents were
adopted by the OGC Technical Committee with no comments and no dissenting votes. Arguably
that is a negative metric in terms of counting, but a positive one in terms of outcome.
Over the years of these standardization efforts,
ESRI has incorporated netCDF among the input and output formats that their ArcGIS tools work
with directly. This represents a user community that numbers in the millions, but it isn't
possible for us to measure how many of those users now use it to access our data.
The standards efforts enable us to collaborate on
an ongoing basis with dozens of international organizations -- especially those represented in
the OGC MetOceans, Earth System Science, and Hydrology Domain Working Groups.