
Re: New Model Ag Service for TDS



Jennifer Adams wrote:

Dear All,
I'm really glad to see the TDS has a forecast aggregation prototype up and running! I have poked at the server on motherlode a little bit. Unfortunately, my version of dncdump dies when attempting to get the "DODS" attribute from the "Lambert Conformal" variable -- so I can't get a complete dump of the metadata from these data sets, but the .html output from the server fills in the gaps nicely.

I don't know why dncdump dies; perhaps you should report it as a bug?


As a client, GrADS can display some of the data from the server -- not the data sets with the 2D time axis, but the others are doable in theory. The data are served on their native Lambert projection, so a simple 'sdfopen' command will never work with these data. GrADS has to use a descriptor file with a PDEF entry to interpolate to a regular lat/lon grid. This interpolation is done in the GrADS I/O layer one grid point at a time, which makes it incredibly slow -- so slow that it becomes unusable as an interactive tool. If I skip the PDEF option and read the grid as an abstract 93x65 grid, it works reasonably fast, but then I have to forgo a map overlay.
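
For what it is worth, the same reprojection can be done in one vectorized pass outside GrADS. Here is a minimal Python sketch: the 93x65 grid size and the corner coordinates are the ones quoted elsewhere in this thread, the 81.27 km spacing is an assumption on my part, the Lambert parameters are the ones John lists further down, and the data field is a random placeholder; pyproj and scipy simply stand in for whatever a client would really use.

import numpy as np
from pyproj import Proj
from scipy.interpolate import griddata

# Native Lambert grid: size and corner values from this thread, spacing assumed.
x = -4226.107 + 81.27 * np.arange(93)          # km on the projection plane
y = -832.698 + 81.27 * np.arange(65)
data = np.random.rand(65, 93)                  # placeholder for a real field (y, x)

# CF grid mapping quoted below: standard parallel 25N, central meridian 95W.
lcc = Proj(proj="lcc", lat_1=25.0, lat_2=25.0, lat_0=25.0, lon_0=-95.0,
           ellps="sphere")
xx, yy = np.meshgrid(x * 1000.0, y * 1000.0)   # km -> m
lon, lat = lcc(xx, yy, inverse=True)           # lat/lon of every native grid point

# Resample onto a regular 0.5-degree lat/lon grid in a single call.
tlon, tlat = np.meshgrid(np.arange(-135.0, -59.5, 0.5),
                         np.arange(15.0, 60.5, 0.5))
regridded = griddata((lon.ravel(), lat.ravel()), data.ravel(),
                     (tlon, tlat), method="linear")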

It's a little early to say for sure, but I don't think the 2D time axis data sets will be readable by GrADS, even when the 5th ensemble dimension is completely implemented. This variable looks like it would work:

u_wind: Array of 32 bit Reals [run = 0..14][time = 0..10][isobaric = 0..18][y = 0..64][x = 0..92]

The coordinate variable called "run" would fit as the 5th ensemble dimension and the coordinate variable "time" would fit as a normal linear time axis. However, the coordinate variable "time" is two-dimensional:

time: Array of 32 bit Integers [run = 0..14][time = 0..10]

I don't know how to deal with that -- my knowledge of the netcdf API is limited to 1D coordinate axes. Is this a feature available with NetCDF 5?
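
Just to spell out what that two-dimensional coordinate means, here is a small sketch using the shapes from the dump above (15 runs x 11 forecast offsets); the 12-hourly run cycle and 3-hourly offsets are made-up numbers for illustration.

import numpy as np

run_hour = np.arange(15) * 12             # reference hour of each model run (hypothetical cycle)
offset_hour = np.arange(11) * 3           # forecast offset of each record (hypothetical steps)

# time[run, time]: the valid hour of every record. It differs from run to
# run, which is why it cannot be a single 1D, monotonic time axis.
time2d = run_hour[:, None] + offset_hour[None, :]

one_run = time2d[4, :]                    # fixing a run recovers an ordinary 1D time axis
one_offset = time2d[:, 1]                 # fixing an offset gives a "constant offset" series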

The development branch of netcdf-java (2.2.17) has some new routines to deal with these more complicated grids, but I assume you need the C library? If so, that's a long way off.

You can also look at the various 1D time subsets, of which the "best time series" might be the most interesting for general use, e.g.:

http://motherlode.ucar.edu:9080/thredds/catalog/fmrc/NCEP/NAM/CONUS_80km/catalog.html?dataset=fmrc/NCEP/NAM/CONUS_80km/best.ncd

or the dods URL:

http://motherlode.ucar.edu:9080/thredds/dodsC/fmrc/NCEP/NAM/CONUS_80km/best.ncd
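
If dncdump keeps dying, any DAP-aware netCDF client should give the same metadata dump from that URL. For example, a quick sketch with the Python netCDF4 module (assuming a build with OPeNDAP support; the Lambert_Conformal variable name is the one from the CDL quoted further down):

from netCDF4 import Dataset

url = ("http://motherlode.ucar.edu:9080/thredds/dodsC/"
       "fmrc/NCEP/NAM/CONUS_80km/best.ncd")
ds = Dataset(url)                                    # opens the OPeNDAP URL directly
for name, var in ds.variables.items():
    print(name, var.dimensions, var.shape)
print(ds.variables["Lambert_Conformal"].ncattrs())   # the grid-mapping attributes
ds.close()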



To continue testing, it would help A LOT if the data were not on a Lambert projection. John, can you serve up a similar data sample on a regular lat/lon grid?

I won't be back in the office until Monday, but then I will put up a dataset that uses lat/lon, e.g. one of the GFS global models.


Jennifer
--
Jennifer Miletta Adams
Center for Ocean-Land-Atmosphere Studies (COLA)
4041 Powder Mill Road, Suite 302
Calverton, MD 20705 USA
address@hidden




On Aug 16, 2006, at 10:36 AM, dan.swank wrote:

    1) This seems to be an issue of incompatible clients.
    GrADS expects to be spoon-fed the projection information
    in a particular fashion it understands, and it obviously
    is not smart enough to read the char Lambert_Conformal;
    information provided by the TDS and figure it out.
    Not much can be done here until someone updates the
    GrADS client.

    2) By grid-relative coordinates, I mean the x/y points as they appear
    in the GRIB files: x = 1 2 3 ... nx, y = 1 2 3 ... ny.
    This, along with the map projection (char Lambert_Conformal;)
    information needed to project the grid into lat/lon coordinates,
    is what GrADS is expecting (see the short sketch after point 3).
    I am not sure whether putting <variable> override tags in the
    TDS configuration can alleviate this.

    I've cc'ed Jennifer Adams; perhaps she can clarify/correct my
    understanding of the GrADS client...

    3) In our NCEP GRIB archive, it seems, GRIB was never
    designed to follow the convention of having a consistent reference
    time and adjusting the forecast time to create the valid date.
    Rather, the valid date of each record = the reference time of the
    analysis + the forecast hour. This is the cause of our
    aggregation issues.
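
    To make (2) concrete, here is a tiny sketch of the difference; the
    81.27 km spacing is my assumption, and the corner value is the X
    value from my data dump quoted further down.

    import numpy as np

    nx = 93
    dx = 81.27                               # km between grid points (assumed)
    x_grid = np.arange(1, nx + 1)            # grid-relative: 1 2 3 ... nx (what GrADS expects)
    x_proj = -4226.107 + dx * (x_grid - 1)   # projection coordinates served by the TDS, in km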

    We have just gotten non-aggregated NARR files on our TDS today
    in our test area:
    http://nomads.ncdc.noaa.gov:8085/thredds/catalog/narrDaily/catalog.html


    -Dan


    John Caron wrote the following on 8/15/2006 4:42 PM:


        Hi Dan:

        dan.swank wrote:

            John:

            I've tinkered with the new GRIB aggregation, made a few
            subsets.
            I noticed some funky x/y coords in the data dump:
            Y begins at -832.6982610175619 (?) and X at
            -4226.1069969154705.
            Noticed in the definition that the units are kilometers...
            (from what reference lat/lon?)



        These are projection coordinates, in "km on the projection
        plane". The projection is defined by

        char Lambert_Conformal;
        :grid_mapping_name = "lambert_conformal_conic";
        :standard_parallel = 25.0; // double
        :longitude_of_central_meridian = -95.0;
        :latitude_of_projection_origin = 25.0;

        following the CF-1 conventions. This is the projection
        contained in the
        GRIB file.
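
        To make the "km on the projection plane" concrete: with no false
        easting/northing given, (0, 0) is the projection origin, i.e.
        25N / 95W here, so the values Dan quotes are just offsets from
        that point. A hedged sketch with pyproj, assuming a spherical
        earth:

        from pyproj import Proj

        # Grid mapping parameters exactly as in the CDL above.
        lcc = Proj(proj="lcc", lat_1=25.0, lat_2=25.0, lat_0=25.0,
                   lon_0=-95.0, ellps="sphere")

        # First grid point from Dan's dump: x = -4226.107 km, y = -832.698 km.
        lon, lat = lcc(-4226.107e3, -832.698e3, inverse=True)
        print(lon, lat)   # about 133.5W, 12.2N: the grid's south-west corner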



            The particular client I used to access this (GrADS) cannot
            discern these x/y spatial coordinates. It is expecting them
            as Lambert grid-relative x[1:x], etc.

            Not sure whether this is something that can be configured
            with the aggregation, but it certainly would be nice to
            have the option to use grid-relative x/y coords.



        Sorry, I don't know what "grid relative x/y coords" are?



            Other than this, I see the TDS handles the varying Z
            dimension across variables quite well. Does it still require
            the files to be homogeneous? Or will it scan each one for
            the forecast hour = <specified> header tag and use that?



        Yes, that's the intention: to not need index homogeneity, but
        to use the times in the GRIB files and make it all work nicely.
        The GRIB files require another level of XML configuration to
        make that work. We are hoping to work closely with all of you
        to get this figured out, and I'll explain in more detail then.



            -Dan


            Glenn.Rutledge wrote the following on 8/15/2006 8:50 AM:

                Dan-
                You have the lead with Steven to implement this new
                aggregation service on your platform of choice. This is
                one of the Web Services I spoke of; please make this
                the #1 priority. See msg below. Glenn


                Subject: Forecast Model Run Collection Aggregation
                prototype
                available
                Date: Mon, 14 Aug 2006 18:02:05 -0600
                From: John Caron <address@hidden>
                Organization: UCAR/Unidata
                To: address@hidden, Steve Hankin
                <address@hidden>



                An experimental new TDS service "Forecast Model Run
                Collection
                Aggregation" is available for poking at on the
                motherlode development
                server:

                
http://motherlode.ucar.edu:9080/thredds/catalog/fmrc/NCEP/NAM/CONUS_80km/catalog.html

                (or .xml)

                This aggregates a collection of Forecast Model Runs
                (in this case the
                IDD NAM CONUS 80km runs), making it available as one
                dataset with a
                2D time coordinate.

                Then it creates various other logical datasets:

                1) data from one run (what we are already used to)
                2) data with the same forecast offset hour (e.g. all
                the 3 hour forecasts, from different runs)
                3) data with a constant forecast date (e.g. all the
                data for 2006-08-08T12:00:00Z, from different runs)
                4) the "best" time series, taking the data from the
                most recent run available
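
                As a sketch of what (4) amounts to (made-up run times;
                the real selection is done by the server):

                import numpy as np

                # valid_time[run, offset]: the 2D time coordinate, in hours.
                runs = np.arange(15) * 12
                valid_time = runs[:, None] + np.arange(11)[None, :] * 3

                # For each distinct valid time keep the record from the most
                # recent run that reaches it; later runs overwrite earlier ones.
                best = {}
                for r in range(valid_time.shape[0]):
                    for o in range(valid_time.shape[1]):
                        best[int(valid_time[r, o])] = (r, o)

                best_series = [best[t] for t in sorted(best)]   # (run, offset) index pairs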

                This is not production-ready, so don't clobber it, but
                any testing and comments are welcome.

                --
                Glenn K. Rutledge
                Services Team Leader
                Remote Sensing and Applications Division
                NOMADS Project Manager
                National Oceanic and Atmospheric Administration
                National Climatic Data Center
                Asheville NC 28801
                Phone: (828) 271-4097
                Fax: (828) 271-4328

                NOMADS: http://nomads.ncdc.noaa.gov/




    --
    Dan Swank <address@hidden>
    NOMADS Project: Software & Data Management
    Contractor - STG, Incorporated
    Veach-Baley Federal Building
    151 Patton Avenue
    Asheville, NC 28801-5001
    Phone: 828-271-4007