
[Datastream #IZJ-689237]: Additional Datafeeds



Hi Jeff,

re:
> A couple of things:
> 1> To test the latency issue, and to conserve diskspace, is it
>    possible to suggest a basic ldmd.conf feed setup and pqact
>    file(s) that would get me:
> 
>    Surface Observations for Garp
>    Model Data for Garp
>    Observed Soundings for NSharp

OK.  Here is the relevant information with some questions:

Surface Observations - these are contained in the IDS|DDPLUS datastream
Model Data           - this is a bit trickier since there are up to 5
                       different datastreams that contain model data:
                       HDS     - "lower" resolution data contained in NOAAPort
                       NGRID   - "higher" resolution data contained in NOAAPort
                       CONDUIT - high resolution data from NCEP
                       GEM     - model data from the Canadian Meteorological
                                 Centre (CMC)
                       FNMOC   - model data from Fleet Numerical (in the
                                 process of being revived, but no ETA on
                                 availability)
Observed Soundings   - these are contained in the IDS|DDPLUS datastream

So, the IDS|DDPLUS datastream contains two of the three things you are
asking about.  Which feed(s) to request for the third, model data, will
depend on how much data you want to ingest (i.e., how much your Internet
connection will allow) and process.
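
If you want a minimal starting point, the following pair of 'request' lines
in ~ldm/etc/ldmd.conf would get you the surface observations and soundings
plus the "lower" resolution NOAAPort model data.  This is just a sketch; I
am assuming here that idd.unidata.ucar.edu remains your upstream host for
these feeds, so adjust the host name to whatever your site is authorized to
request from:

# surface obs and upper-air soundings for GARP/NSharp
request IDS|DDPLUS ".*" idd.unidata.ucar.edu

# "lower" resolution NOAAPort model data for GARP
request HDS ".*" idd.unidata.ucar.edu

NGRID, CONDUIT, GEM, and/or FNMOC requests could be added later, once you
have a feel for how well your connection keeps up with these.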

> 2> Can you look at the statement below and let me know if that would,
>    in theory, get me all of the eta215 data coming in and write it
>    to a raw grib file, and what I would need to do to alter it to
>    write the file(s) based on date/time, i.e. 20090206_nam215.grb, etc.
> 
> HDS|CONDUIT   (/mNAM|/mNMM).*#215
>       FILE    data/grib_nam/nam_215.grib

This will _not_ save individual files by date or by time.  This action
will append every NAM grid 215 product from the HDS or CONDUIT datastreams
to the single file ~ldm/data/grib_nam/nam_215.grib, which will simply keep
growing.
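
To get date/time based file names instead, you add parenthesized
sub-expressions to the pattern that match the date portion of the product ID
and then refer to them as \1, \2, etc. in the FILE pathname.  As a sketch
only (I am assuming the GRIB portion of the product IDs on your feeds
contains the model reference time as a string starting with YYYYMMDD; the
'notifyme' logging described below will show you exactly what the IDs look
like), an entry along the following lines would produce daily files like
20090206_nam215.grb:

HDS|CONDUIT   (/mNAM|/mNMM).*#215.*/(20[0-9][0-9][01][0-9][0-3][0-9])
      FILE    data/grib_nam/\2_nam215.grb

Here '\2' refers to the second parenthesized sub-expression, the YYYYMMDD
string.  Remember that the fields in a pqact.conf entry must be separated by
actual tab characters; 'ldmadmin pqactcheck' will flag tab/syntax problems,
and 'ldmadmin pqactHUP' tells the running 'pqact' to reread the file after
you edit it.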
    
> I've been so sidetracked on other stuff that I've sort of lost track
> of what's been tried, with regards to my datafeed issue(s).

I know what you mean...

> Number 1 above would get me most of what Dr. Zehnder is looking for
> and Number 2 would allow me to see what exactly IS coming in, with
> regards to my missing eta/nam data.

The best thing to do to see what products you are actually receiving is
to use the LDM 'notifyme' utility.  'notifyme' can list out what is
being received both on the local host (whistler) and on upstream machines
that have been configured to allow you to request data.  The output from
'notifyme' can be written to a log file for later review.  Here are
two examples:

- log what is being received on the local host in the HDS or CONDUIT datastreams:

<as 'ldm'>
cd ~
notifyme -vl logs/model_ingest.log -f 'HDS|CONDUIT' 

- log what is being received on the upstream host idd.unidata.ucar.edu in the
  HDS or CONDUIT datastreams:

<as 'ldm'>
cd ~
notifyme -vl logs/model_avail.log -f 'HDS|CONDUIT' -h idd.unidata.ucar.edu

'notifyme' will run until it is manually stopped, so you can leave these
invocations running for several days.  A comparison of the log listing from
the local host and that from the upstream feed site(s) will show which products
were available but not received.  I believe that this is the information you
are really looking for.
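
Once both logs have something in them, a quick way to compare them is to
count the products of interest in each.  For example, to get a rough count
of how many grid #215 products the upstream host offered versus how many
were actually received locally (using the log file names from the
invocations above):

<as 'ldm'>
cd ~
grep -c '#215' logs/model_avail.log
grep -c '#215' logs/model_ingest.log

A large difference between the two counts points at products that were
available upstream but never made it to whistler.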

The other benefit of running the 'notifyme' invocations above is that you
will then have a log of the product IDs for the products in the stream(s)
you are interested in.  You can then use that information to fashion the
extended regular expression patterns used in pattern-action file (e.g.,
~ldm/etc/pqact.conf) entries.
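
Also, once you have a candidate pattern, you can test it before putting it
into pqact.conf by handing it to 'notifyme' with its '-p' option; only
products whose IDs match the pattern will be listed.  For example (the
pattern and log file name here are just illustrations, so adjust them to
match what your logs actually show):

<as 'ldm'>
cd ~
notifyme -vl logs/pattern_test.log -f 'HDS|CONDUIT' -p '(/mNAM|/mNMM).*#215' -h idd.unidata.ucar.edu

If the products you expect show up in that listing, the same pattern should
work in a pqact.conf entry.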

By the way, I logged onto whistler when I saw your email this morning and
made some changes to your ~ldm/etc/ldmd.conf file.  The changes were to:

- remove redundant requests for the same data

- revert back to a 5-way split request for CONDUIT data

The "problem" I saw was:

- You were requesting IDS|DDPLUS|UNIWISC in one line:

  request IDS|DDPLUS|UNIWISC ".*" idd.unidata.ucar.edu

  You were requesting HDS in another line:

  request HDS ".*" idd.unidata.ucar.edu

  You were requesting WMO in a third line:

  request WMO ".*" idd.unidata.ucar.edu

  The problem is that WMO is a compound feed type: it is the union of
  IDS|DDPLUS and HDS.  So, your WMO request duplicated the two requests
  above.  Since your network capacity is in question, it is best to
  eliminate all duplication of feed requests.

  I commented out the single request for WMO as this was redundant.

- You were requesting all of NMC2 from idd.unl.edu in a single request

  request NMC2 ".*" idd.unl.edu

  NMC2 is an alias for the CONDUIT feed.  Since the CONDUIT feed has a LOT
  of data, it is typically best to split requests for it into pieces.  I
  did this by commenting out your single NMC2 request and uncommenting the
  5-way split for the same data (each CONDUIT product ID ends in a sequence
  number, so keying on the last digit of the ID divides the stream into
  five roughly equal pieces that are requested over separate connections):

  #  request NMC2 ".*" idd.unl.edu
  # Broke the CONDUIT request into fifths, to help with latency - 10-14-08 - Jeff
  request CONDUIT "([09]$)" idd.unl.edu
  request CONDUIT "([18]$)" idd.unl.edu
  request CONDUIT "([27]$)" idd.unl.edu
  request CONDUIT "([36]$)" idd.unl.edu
  request CONDUIT "([45]$)" idd.unl.edu

- You had a single line request for everything from UNIDATA:

  request UNIDATA ".*" idd.unl.edu

  UNIDATA is also a compound feed.  It is the union of the WMO and
  UNIWISC datastreams.  Since WMO is the union of HDS and IDS|DDPLUS
  and since you were already requesting those feeds in another
  request line, I commented out the single line request for UNIDATA.
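
For reference, after those edits the active (uncommented) request lines in
your ~ldm/etc/ldmd.conf that are relevant to this discussion should be
essentially:

  request IDS|DDPLUS|UNIWISC ".*" idd.unidata.ucar.edu
  request HDS ".*" idd.unidata.ucar.edu
  request CONDUIT "([09]$)" idd.unl.edu
  request CONDUIT "([18]$)" idd.unl.edu
  request CONDUIT "([27]$)" idd.unl.edu
  request CONDUIT "([36]$)" idd.unl.edu
  request CONDUIT "([45]$)" idd.unl.edu

with the WMO, NMC2, and UNIDATA requests commented out.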

After making the changes to your ~ldm/etc/ldmd.conf file, I stopped
and restarted the LDM:

<as 'ldm'>
-- edit ~ldm/etc/ldmd.conf to remove unneeded duplication of feed requests
ldmadmin restart

Cheers,

Tom
--
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                   Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                       http://www.unidata.ucar.edu
****************************************************************************


Ticket Details
===================
Ticket ID: IZJ-689237
Department: Support IDD
Priority: Normal
Status: Closed