Re: [thredds] Running THREDDS on top of old OPeNDAP servers

Ok, I'll just add a few cents' worth of info.

1. THREDDS catalogs can be used to list arbitrary files, so in general
THREDDS metadata has to be listed explicitly.
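As a minimal sketch of what "listed explicitly" means, here is a hand-written metadata block using elements from the InvCatalog 1.0 schema (the dataset name, service name, and extent values are invented for illustration):

```xml
<dataset name="Example SST climatology">
  <metadata inherited="true">
    <serviceName>yoursiteopendap</serviceName>
    <documentation type="rights">Freely available</documentation>
    <!-- Spatial extent: start + size per axis, per the InvCatalog schema -->
    <geospatialCoverage>
      <northsouth>
        <start>-60.0</start>
        <size>120.0</size>
        <units>degrees_north</units>
      </northsouth>
      <eastwest>
        <start>0.0</start>
        <size>360.0</size>
        <units>degrees_east</units>
      </eastwest>
    </geospatialCoverage>
    <!-- Temporal extent as ISO 8601 start/end -->
    <timeCoverage>
      <start>2000-01-01T00:00:00Z</start>
      <end>2008-12-31T00:00:00Z</end>
    </timeCoverage>
  </metadata>
</dataset>
```

Because the block is marked inherited="true", nested datasets pick up the same metadata without repeating it.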

2. When the datasets can be read and understood by the CDM, we can in principle 
extract metadata like time and space extents. (Also things like author and 
rights, but those are less commonly placed into datasets and are typically not 
standardized.)

So far the only type we can do much for is the specialized TDS 
"InvDatasetFmrc", which understands the time and space coordinates of model 
runs, as well as which variables are the interesting ones. For example, look at 
any of the datasets under 

  http://motherlode.ucar.edu:8080/thredds/idd/models.html

such as

  http://motherlode.ucar.edu:8080/thredds/catalog/fmrc/NCEP/GFS/Global_0p5deg/catalog.html?dataset=fmrc/NCEP/GFS/Global_0p5deg/NCEP-GFS-Global_0p5deg_best.ncd

where the Variables, GeospatialCoverage, and TimeCoverage metadata are 
automatically extracted and placed in the catalog.

We plan to do similar things for Point Datasets, I hope soon.

3. The value of the THREDDS metadata elements (when present) is that they can 
be crosswalked to other standards such as ISO. We have Dublin Core and DIF 
crosswalks, and an OAI harvester from DLESE. These crosswalks work ok, but they 
don't always extract the maximal information that could be converted to the 
other standard.



Pauline Mak wrote:
> Hi all,
> 
> Simon - this is exciting stuff!!!  I was wondering - how are you
> generating the ISO documents?  Are you going to add extra information
> to the THREDDS "Digital Library Metadata Elements" when creating
> ISO19119 records for THREDDS services?  I'm particularly
> interested in the automatic extraction of geospatial boundaries.  I was
> hoping to use some standard document to extract this.  I thought WCS
> would be good for this, but it has just hit me (unfortunately, somewhat
> late in this project) that it only works on a small subset (i.e. only
> for regularly gridded datasets) :(
> 
> Looking at the catalog spec, it looks like the geospatial and temporal
> extent is something that has to be hand-crafted.  But given CDM can read
> these values (and for more than just regularly gridded datasets), it
> would seem to make sense if this could be automated somehow?
> Interestingly, the NetCDF Subsetter's dataset.xml document (e.g.
> http://opendap-ivec.arcs.org.au/thredds/ncss/grid/TERN_AUSCOVER/MODIS/L2/LPDAAC/aust/MOD13Q1.005/2000.02.18/MOD13Q1.2000.049.aust.005.b01.250m_ndvi.hdf.gz/dataset.xml)
> will produce the required information, but this doesn't follow any
> kind of metadata standard (as far as I know, anyway).
> 
> Any suggestions?  As Rich was saying, do you have any "ISO THREDDS
> catalogs" on a server, dev or otherwise, that we can have a look at?
> 
> Cheers,
> 
> -Pauline.
> 
> Richard Signell wrote:
>> Simon,
>>
>> This sounds extremely useful and I'd love to give it a try.
>>
>> Can you please tell us what the "trivial" changes are to NetCDF-Java?
>>
>> And do you have a real-life example of the catalog below that works
>> with publicly available OpenDAP data?
>>
>> Thanks,
>> Rich
>>
>> On Wed, Apr 8, 2009 at 8:28 PM,  <Simon.Pigot@xxxxxxxx> wrote:
>>> Hi Pauline,
>>>
>>> The following works ok for us (as an example - non-essential details
>>> removed):
>>>
>>> <?xml version="1.0" encoding="UTF-8"?>
>>> <catalog name="YOUR SITE OPeNDAP Catalog"
>>>        xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
>>>        xmlns:xlink="http://www.w3.org/1999/xlink">
>>>
>>>  <service name="yoursiteopendap" serviceType="OpenDAP"
>>> base="http://www.yoursite.com/dods/nph-dods/dods-data/"/>
>>>  <datasetScan name="climatology-netcdf" path="climatology-netcdf"
>>> location="http://www.yoursite.com/dods/nph-dods/dods-data/climatology-netcdf">
>>>
>>>    <serviceName>yoursiteopendap</serviceName>
>>>    <crawlableDatasetImpl
>>> className="thredds.crawlabledataset.CrawlableDatasetDods" />
>>>  </datasetScan>
>>>  <datasetScan name="bluelink" path="bluelink"
>>> location="http://www.yoursite.com/dods/nph-dods/dods-data/bluelink">
>>>    <serviceName>yoursiteopendap</serviceName>
>>>    <crawlableDatasetImpl
>>> className="thredds.crawlabledataset.CrawlableDatasetDods" />
>>>  </datasetScan>
>>> </catalog>
>>>
>>> I'm not sure if it's all documented somewhere - I worked it out the
>>> slow way by poking around in the netcdf java code and hunting through
>>> the archives of the thredds mailing list. There are also some trivial
>>> changes you need to make to the code (in netcdf-java) to filter out
>>> some unwanted artifacts created when the scan picks through the html
>>> from the OpenDAP server - otherwise you end up with some strange,
>>> non-functional things in your catalog. Maybe there is a better way to
>>> do the above?
>>>
>>> By way of introduction, we want this sort of catalog to work as part
>>> of a thredds metadata harvester I'm adding to GeoNetwork which
>>> produces ISO19115 metadata records and ISO19119 records for thredds
>>> services.  It's nearly at the stage where it works reliably, but
>>> there are a few more issues I need to solve and I'm still learning
>>> about THREDDS :-)
>>>
>>> Cheers and I hope this helps,
>>> Simon
>>>
>>> ________________________________________
>>> From: thredds-bounces@xxxxxxxxxxxxxxxx
>>> [thredds-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Pauline Mak
>>> [Pauline.Mak@xxxxxxxxxxx]
>>> Sent: Thursday, 9 April 2009 8:56 AM
>>> To: thredds@xxxxxxxxxxxxxxxx
>>> Subject: [thredds] Running THREDDS on top of old OPeNDAP servers
>>>
>>> Hi all,
>>>
>>> I'm figuring out ways to serve data using THREDDS on top of old OPeNDAP
>>> servers.  I'm aware that you can configure datasets based on a URL, but
> that's for a single file... (correct me if I'm wrong!)  However, is
> there a way to apply this to a directory?  Sort of like a datasetScan +
>>> filters for a directory URL?  When poking through the THREDDS catalog
> XSD, there's a crawlableDatasetImpl element.  Is that the sort of thing
> I need to look at?
>>>
>>> Thanks,
>>>
>>> -Pauline.
>>>
>>> -- 
>>> Pauline Mak
>>>
>>> ARCS Data Services
>>> Ph: (03) 6226 7518
>>> Email: pauline.mak@xxxxxxxxxxx
>>> Jabber: pauline.mak@xxxxxxxxxxx
>>> http://www.arcs.org.au/
>>>
>>> TPAC
>>> Email: pauline.mak@xxxxxxxxxxx
>>> http://www.tpac.org.au/
>>>
>>>
>>>
>>> _______________________________________________
>>> thredds mailing list
>>> thredds@xxxxxxxxxxxxxxxx
>>> For list information or to unsubscribe,  visit:
>>> http://www.unidata.ucar.edu/mailing_lists/
>>>
>>
>>
>>
> 
> 


