
Re: [decoders] First post - GRIB2 decoders



On Wed, 20 Feb 2008, Jelle Ferwerda wrote:


Hi Robb,

Thanks so much for your reply.

I realize that the Grib2 decoder is not intended for batch decoding. That being said, I am afraid it is the only thing that gets close enough for me at the moment. Also: the MPE files are quite small (2 MB), have only one grid, and after decompressing I can straight away subset them to my ROI (which is only central Kenya) and store them as a single spatial archive file (e.g., HDF) for each day or season.

Jelle,

I downloaded a test file from the site and it decoded the data fine.
You could concatenate all the grib files together instead of working with a bunch of small files; just create an index on the resulting larger file.

I looked at your code; one needs to seek to the GDS offset before calling Grib2GridDefinitionSection, or to the PDS offset before calling Grib2ProductDefinitionSection, because grib files are built of 8 consecutive sections that don't have the offsets included in the data. When the files are indexed, the offsets are stored for later use. There is a class Grib2Data.java that shows examples of this; one could modify Grib2Data.java to suit your needs. Your code only needs the minor offset modifications to make it work, so I left it up to you to finish it.
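
Roughly, the seek-then-construct pattern looks like the sketch below. The file name and offsets are just the ones from this thread, and I'm assuming the Grib2ProductDefinitionSection constructor parallels the GDS one; Grib2Data.java in the distribution has the definitive usage.

import ucar.grib.grib2.Grib2GridDefinitionSection;
import ucar.grib.grib2.Grib2ProductDefinitionSection;
import ucar.unidata.io.RandomAccessFile;

public class Grib2SeekSketch {
    public static void main(String[] args) throws Exception {
        // File name and offsets taken from the examples in this thread.
        String inFile    = "MPE_20080217_0900_M9_00.grb";
        long   gdsOffset = 121261;   // GDS offset from an index line
        long   pdsOffset = 121342;   // PDS offset from an index line

        RandomAccessFile raf = new RandomAccessFile(inFile, "r");

        // Seek to the GDS offset before reading the Grid Definition Section.
        raf.seek(gdsOffset);
        Grib2GridDefinitionSection gds = new Grib2GridDefinitionSection(raf, true);

        // Seek to the PDS offset before reading the Product Definition Section.
        // (Constructor assumed analogous to the GDS one; see Grib2Data.java.)
        raf.seek(pdsOffset);
        Grib2ProductDefinitionSection pds = new Grib2ProductDefinitionSection(raf);

        raf.close();
    }
}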

Happy coding,
Robb...


If you do have time and could have a peek (and possibly whip up a few lines of code that could help me decode a file into a more generally supported file type?), that would be amazing! The link for the data files is: http://oiswww.eumetsat.int/~idds/html/grib.html. I normally work in IDL/ENVI, which supports, among other things, the NetCDF format, which I think is the data backbone of the UCAR library? Or I might just look into running the Java decoder as part of an IDL data processing stream, whichever is easier to manage.

Eventually I expect to just run the program once or twice a month on updated files, retrieve the data for my AOI, archive the .grb files on the data archiver, and export a composite of the new data to my analysis machine. Just so you have some idea of what I have been trying so far, I have attached the Java file I was working on for the past few days, trying to implement the Unidata Java class. As you can see, my Java skills are not very good...

Thanks so much!

Jelle.


Jelle,

The Grib2 decoder isn't a batch file decoder for getting data. The idea is to make an index for the file using Grib2Indexer <fileName>, then use the index to find the parameter of interest and decode only that parameter's data. The design was the result of very large grib files.

% Grib2Dump <fileName>  shows the metadata of each record.
% Grib2Indexer <fileName>  gives the index of the file.
The file IndexFormat.txt in the root of the distribution explains the details of a Grib index. A sample line:

0 0 0 6 2 103 2.0 255 0.0 2008-02-11T14:00:00Z 0 -418705429 121261 121342

% Grib2GetData <fileName> <GDS offset> <PDS offset>  gets the data for a record.

From the above sample index line:

% Grib2GetData <fileName> 121261 121342

Don't forget to set the CLASSPATH variable to include the jar file.
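
If you'd rather get the data from your own Java program than from the command line, something along these lines should work, assuming Grib2Data exposes a getData(gdsOffset, pdsOffset) call; check Grib2Data.java in the distribution for the exact signature:

import ucar.grib.grib2.Grib2Data;
import ucar.unidata.io.RandomAccessFile;

public class Grib2GetDataSketch {
    public static void main(String[] args) throws Exception {
        // File name and offsets from the examples in this thread.
        RandomAccessFile raf = new RandomAccessFile("MPE_20080217_0900_M9_00.grb", "r");
        Grib2Data g2d = new Grib2Data(raf);

        // Assumed method: unpack the values for the record whose GDS/PDS
        // offsets were read from the index; verify against Grib2Data.java.
        float[] values = g2d.getData(121261L, 121342L);
        System.out.println("points decoded: " + values.length);

        raf.close();
    }
}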

I'll download a test file to see if there are any problems decoding this type of data.

Robb...



Exception in thread "main" java.lang.NegativeArraySizeException
      at ucar.grib.grib2.Grib2GridDefinitionSection.<init>(Grib2GridDefinitionSection.java:300)
      at exportgrb.Main.main(Main.java:71)


// Some code

   // Paths and stuff
   String InFile       = "c:\\java\\data\\MPE_20080217_0900_M9_00.grb";
   Boolean CheckTheSum = true;

   // Create a Unidata RandomAccessFile for reading
   ucar.unidata.io.RandomAccessFile InRA =
       new ucar.unidata.io.RandomAccessFile(InFile, "r");

   // Grid definition section
   Grib2GridDefinitionSection GDSd =
       new Grib2GridDefinitionSection(InRA, CheckTheSum);
   String GGdef = GDSd.getGridName(0);
   System.out.println(GGdef);

// -- end some code

I hope that someone can help me and explain how to fix this. Or even better: Is there someone who has already created a decoder specifically for the multisensor precipitation estimates?

Thanks so much,

Kind Regards,

Jelle Ferwerda

-----------------------------------------------
Jelle Ferwerda
Postdoctoral Research Fellow

Animal Behavior Research Group
Department of Zoology / University of Oxford
South Parks Road / Oxford / OX13PS
United Kingdom

email:         address@hidden
Office Phone:     +44 18652 71214
WWW:        http://www.bio-vision.nl
-----------------------------------------------







===============================================================================
Robb Kambic                                Unidata Program Center
Software Engineer III                      Univ. Corp for Atmospheric Research
address@hidden             WWW: http://www.unidata.ucar.edu/
===============================================================================