Kevin Tyle, Univ of Albany, Chair
Michael Baldwin, Purdue University
Martin Baxter, Central Michigan University
Anne T. Case Hanks, University of Louisiana Monroe (outgoing)
Sen Chiao, San Jose State University (incoming)
Jennifer Collins, University of South Florida
Bart Geerts, University of Wyoming (absent)
Steve Lazarus, Florida Institute of Technology
Sam Ng, Metropolitan State University of Denver
Russ Schumacher, Colorado State University
Stefan Cecelski, Univ of Maryland (outgoing)
Kimberley Hoogewind, Purdue University (incoming)
Becky Cosgrove (CONDUIT) (absent)
Michelle Mainelli (GEMPAK-NAWIPS/AWIPS II) (absent)
Richard Signell (absent)
(Note: Government representatives were not allowed to attend due to government shutdown.)
Larry Oolman, University of Wyoming
(DeSouza Award Recipient)
UPC Staff Attending
Lansing Madry
Next Meeting: Kevin Tyle will set up a Doodle poll to determine dates for the next meeting, tentatively set for March 2014.
ACTION 1: Tom Yoksas to check with RAP requesters to see whether what's on NOAAPORT is sufficient.
Tom: Gerry Creager at OU had requested more vertical-level products to initialize WRF runs. He posted to the e-mail lists to see whether others had similar needs, but no one responded. This is apparently not a pressing concern in the community.
ACTION 2: Michelle will ask Steve Schotz the following questions:
Michael: There is a limit of 500 connections for Qpid. Setups with 15-20 clients have been tested informally, but he is not aware of any testing done to determine connection limits or how the system degrades with increased use.
ACTION 3: Committee members with obs data sources/sensor data amenable to importing into Rosetta should contact Sean to serve as usability testers. (Mike Baldwin, Steven Lazarus offered)
So far we've had lots of data sets from EOL to test, so we haven't poked committee members for additional examples.
Would still like to get more observational data sets. Committee members please send ideas and usecases to
ACTION followup for spring meeting, committee members to suggest data sets to Sean.
ACTION 4: Jeff Weber will change to ingest pressure levels rather than sigma levels.
Sean: We are ingesting this data, but the data sets are very large.
ACTION 5: Tom Yoksas will check on having both N0R and N0Q at once.
Tom: Michael has already created N0Q composites as a test; we may add these to the IDD. NWS has been trying to retire N0R for years; this may or may not happen in December 2013. We can add the N0Q products to the IDD at any point. There are a number of products in the NEXRAD Level III feed that would make great national composites; we can make these available in test mode and let committee members take a look.
ACTION: Add hydrometeorological classification as a Level III product (per Russ Schumacher's request in April 2013). Committee members to communicate wishes for other composites to Tom Y.
ACTION 6: Kevin will follow up with UserComm members to carry out site contacts over the summer before the fall meeting.
Kevin will get updated contact list from Jen. Site contacts will be discussed by committee on Friday.
ACTION 7: Unidata (Linda Miller) should pursue gaining access to space weather data.
Linda: Brent Gordon made a presentation at the 2012 Users Workshop. He is eager to get the data out. He will write up something to describe the types of data that are available, and Linda will forward this to the committee. Maybe Brent Gordon can give a talk at the next meeting?
A good meeting at College Park, with a tour of the Climate Prediction Center. In addition to talking about the NSF proposal, the committee met with Ben Kyger about the Open Weather and Climate effort. Most of the meeting focused on finalizing the NSF proposal. The committee also met with the AWIPS II team and had a demo. There was an EarthCube presentation.
How is Unidata going to be involved with EarthCube at the AMS?
Mohan will be chairing a conference on cloud computing at AMS. Unidata is heavily involved in cloud computing activities. The AMS data policy statement is currently under review; make comments within the next couple of weeks. (The draft statement on Full and Open Access to Data is available here; requires AMS login.) Mohan also chaired the data stewardship board; it is hard to get support for an updated data policy.
Does Unidata want to participate at American Association of Geographers (AAG) conference?
Unidata will look into attending AAG. We skipped the NWA meeting this year, as we didn't get a significant response from attending last year.
Some additional comments:
Questions that followed Mohan's report:
How does training workshop attendance compare with users workshop?
The Users Workshop was 2/3 from universities, because they were invited. Explanations for training workshop attendance changes: in some cases training has happened and institutions are doing self-training. Rising travel costs are likely also a factor. The usefulness of the tools has broadened beyond the university community; the LDM is operational in the NWS, for example. The TDS is starting to become core infrastructure in other parts of the government/commercial community.
The total size of the .com/.net community is much larger than .edu; are they making profits from Unidata's free software and data streams? In the EarthCube concept, is data sharing free? Google is putting NASA data into EarthEngine...
Unidata is working with Google on EarthEngine. How Unidata navigates the edu vs. commercial space in the future is an issue; the question we must answer is whether access by non-edu groups benefits the larger geoscience community. For example, netCDF is now part of ESRI software, so users get software from us (directly or indirectly). Note that commercial users are not necessarily receiving data directly from Unidata.
Michael provided a live demo of the current version of the AWIPS II software (13.2). Upgrades to 14.1 (64-bit EDEX server and RHEL 6) are planned for October, although the government shutdown may interfere. Plans call for a first Unidata release incorporating 14.1; this may carry over to early 2014, depending on the quality of the initial 14.1 release.
The demo focused first on the D2D perspective which is designed for the forecast offices. This perspective is localized so that each WFO has its own look, emphasizing resources from the region. Changing from one localization to another is non-trivial. (Michael tried quickly switching from the default OAX to BOU but the interface did not populate with BOU data.)
Michael also demonstrated some other nice features of the interface, including the ability to import GeoTIFF files (with geolocation information) and export .kmz files for import into Google Earth interface.
Data decoders are configured in xml files, but the configuration looks a lot like an LDM pqact entry.
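For readers who haven't seen one, an LDM pqact entry pairs a feed type and a product-ID regular expression with an action (fields are tab-separated); the pattern and path below are hypothetical, not from the AWIPS II configuration being discussed:

```
# Hypothetical pqact entry: capture the HHMMSS from a NEXRAD Level III
# product ID and file the product under data/nexrad3/.
NEXRAD3	^SDUS5. .... ([0-3][0-9][0-2][0-9][0-5][0-9])
	FILE	-close	data/nexrad3/\1.raw
```

The AWIPS II XML decoder configuration expresses the same idea (pattern in, routing action out), which is why it looks familiar to LDM users.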
Mohan noted that a small group of WFOs are currently using AWIPS II, and that a second group will be switching soon.
Here is the list of active sites as of September 2013:
All NCEP Centers (except EMC and SWPC) are using AWIPS II in operations. NHC, OPC, and WPC create and disseminate their operational gridded products using AWIPS II. However, none of the Centers are using AWIPS II as a replacement for NAWIPS functionality.
The WFOs (Group 1) that are using AWIPS II operationally are:
Video of Larry's talk will be available here shortly.
Staff status reports were made available to the committee prior to the meeting. Kevin went through the list (in reverse order) asking for comments or questions.
Ben noted that several UPC staff were associated with work that won an award for outstanding publication in the Journal of Geoscience Education (JGE).
John mentioned that TDS 4.4 will be out in alpha in the next month. The 4.3 GRIB changes took a long time, but have worked well.
Kevin gave the staff kudos for communication of the 4.2-4.3 transition.
Ethan said that 4.4 is not nearly as big a step as 4.3, but is more infrastructure-oriented.
John added that 4.4 will make it easier for developers to add services into TDS framework. The release concentrates on point data and features developed by Marcos before he left.
Sean noted that Rosetta is closely tied to ACADIS, which recently had an NSF site visit. The NSF manager really liked Rosetta. The team is working with EOL to increase the number of forms of ASCII we can transform. One challenge is getting the message across that we use netCDF so that the CDM can make further translations, not necessarily because netCDF should be the end result. Sean also noted that they need more example data files. If anyone has old Campbell data loggers, it would be good to see that data. One proposal dealing with WaterML was funded.
Ben said that ESSI labs expressed interest in incorporating Rosetta into a brokering system.
Michael noted that GEMPAK 7 is the name for the release that will be bundled into AWIPS II as a local app. GEMPAK 7 will not include GUI programs. Don't know when that will be released, or if there will be another GEMPAK 6 before then. Unidata will incorporate GEMPAK 7 updates into the Unidata GEMPAK release, which does include the GUI (NAWIPS) programs.
Russ S. noted more attendees at the GEMPAK workshop; is enthusiasm growing?
Michael pointed out that GEMPAK and AWIPS II are now a single workshop, which probably accounts for the growing attendance.
Kevin noted that GEMPAK is now on GitHub; how many people are availing themselves?
Michael: some fixes have been submitted, and it has been forked several times.
Kevin: Has there been any thought given to making GEMPAK available to repository providers?
Michael: We have not considered that. Would the benefit to users justify the resources needed to do it?
Kevin: The build procedure is getting easier. Does Unidata maintain MacPorts? (No.) Python interfaces would be interesting. (netCDF for Python, created by Jeff Whitaker, exists now.)
Russ R: We don't maintain interfaces for Python, but we'd like to get into that in preference to, say, C++. Enthought Python provides a partial netCDF-4 API.
Kevin: Can you give us a sense of what is going on with McIDAS-V?
Tom: They just released version 1.4 today, adding Hydra. They are working on eliminating ISL in favor of jython. There is increasingly good interaction with SSEC developers.
Tom: Input into the IDD cluster averages 15 GB/hour, peaking at ~30 GB/hour.
Steve: The Virtual Circuit LDM experiments use specialized routers (which are already in use at Internet 2 sites).
Kevin: At a previous meeting, the committees made a request to Eumetsat about satellite data, what happened?
Tom: The idea has not been accepted, but has not been nixed either. They must sell the idea to their member states. The request was to get RT Meteosat data for universities; Eumetsat would have to make an existing experimental program operational, which takes money and staffing. If approved, it would include 3rd generation Meteosat and polar orbiter data. Universities must follow Eumetsat data policy (can't redistribute until after 24 hours).
Yuan: RAMADDA server now has a data handler for radar products. Lately I have not spent much time on RAMADDA side, mostly IDV. IDV 4.1 is out, with a new satellite data chooser that will be in IDV 4.2 release. You can now use the view window to set display area/subset for satellite data. We've also introduced progressive resolution features.
Sean: The new image chooser refines subsetting. McIDAS-V did this but was slow; Yuan has sped it up dramatically.
Sen: For the IDV workshop (short course) at AMS, how will you handle big data that people bring?
Sean: IDV users should be comfortable with subsetting, and should understand the limits of their machine. This is the first thing we cover in workshop sessions.
Yuan: Progressive resolution is currently only in the satellite data chooser; in the future we may add it for gridded data.
John: The TDS steering committee meets once every six months or so, usually before committee meeting. Rich Signell would have given a report, but he's incommunicado.
Kevin: Steering committee notes are encapsulated in the IDV status report. I will ask Jim (Steenburgh) when to set up the next call. There have been some performance improvements, but satellite imagery (especially at high resolutions) seems to take up a lot more RAM than it seems it should.
Yuan: Performance will improve with new satellite image chooser.
Kevin: I've seen strange behavior when near peak RAM allocation, problems releasing memory.
Stefan has seen a similar problem; he uses a third-party app to free memory. This may be Mac-specific.
Yuan: This may get better when we move to Java 7.
Kevin: What is sticking point in move to Java 7? Java3d?
Julien: There seems to be some progress on Java3d. Java3d has been taken over by open source community; we are keeping an eye on this.
Mohan: Why not bundle Java 7 for non-Mac platforms?
Sean: Some bugs appear when IDV runs under Java 7.
Julien: We could do piecemeal builds, as long as we don't introduce code that depends on Java 7.
Kevin: I've been experimenting and can now configure a RAMADDA server to generate imagery from gridded files in a "headless" environment (with a proper X server available).
Kevin: Will IDV eventually remove ISL? Yuan/Julien/Sean: No! Still developing ISL functionality.
Semiannual steering committee meeting; the last call was at end of May 2013. Three committee members currently running RAMADDA.
Kevin: How to get sites listed in the IDV Unidata Community Sites catalog?
Mohan: We should not have to manually update these catalogs...
Sean: ramadda.org may have this information, but we'd have to scrape it somehow.
ACTION: can Jeff/Don push federated RAMADDA sites back out so IDV community sites catalog reflects those who choose to federate?
Last year we introduced IDV/RAMADDA in a data analysis class. For the final presentation, students gave a brief discussion of a previous case. They had some issues with using IDV; it was easier to use nmap2 to get images together in a short time. Marty gave a presentation on using IDV as a case-study tool. How can I better pitch things so students will use IDV?
Stefan: Have them create bundles using four-panel display?
Kevin: Our in-house data is in GEMPAK format. I want to continue pushing IDV. Some of the then-juniors, now-seniors are going forward with IDV. They are seeing now, with the government shutdown, that using web sites is not always an option for generating products. We had an NCL workshop, which was very popular (MATLAB as well). One of my goals is to learn/master Python using Johnny Lin's book.
Kim: I use Python for a broad range of things (no more shell scripting) and for web applications. I use it to read in gridded data using pygrib and PyNIO, along with matplotlib with basemap, and cartopy.
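As a concrete (if toy) illustration of the kind of gridded post-processing moving from shell scripts into Python, here is a minimal numpy sketch; the arrays are synthetic stand-ins for fields a GRIB reader such as pygrib would return:

```python
import numpy as np

# Synthetic stand-ins for 2-D u/v wind components; with real data these
# would come from a GRIB reader (e.g. grb.values in pygrib).
ny, nx = 3, 4
u = np.full((ny, nx), 3.0)  # m/s
v = np.full((ny, nx), 4.0)  # m/s

# Derived field: wind speed, the kind of product formerly assembled
# with shell-driven command pipelines.
speed = np.hypot(u, v)

print(speed.max())  # prints 5.0
```

The same pattern (read, derive, plot or write) replaces a chain of shell scripts with one self-documenting program.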
A lot of activity related to moving the LDM to a new server. The two LDM servers for Level 2 radar are old, so we bought new hardware. In 7 years in our dept we've gone from having hardware in the building and local expertise to consolidated IT resources run by the college. College IT made us move off that system to systems at our supercomputing center. We're moving more and more to Python for real-time WRF runs and generating images, and planning to use it more in classes. The Dean says each dept should have an IT specialist, so I'm optimistic.
Mohan: Purdue is a top-level L2 site, is the college planning to continue this?
Mike B: Yes, we're planning to continue, spending money on hardware.
Tom: Are you now the point of contact for Level 2 data?
MikeB: I think so.
Anne Case-Hanks: I haven't been teaching since I moved up to admin. We have been developing a new weather/forecasting class that uses IDV/GEMPAK. We don't have IT support; it's centralized in the college. Anything not Windows/Mac is harder. I don't think that's going to change. Our ADDE server is down and we need to buy new hardware.
I have a lot of experience with GEMPAK, some experience with IDV, and a bit of NCL. I use the netCDF-Java library to convert Level 2 data to CF-Radial to bring it into Python. Working on climate change and extreme weather.
Since last year we abandoned teaching FORTRAN, now we teach Python. Students appear to be happy about this. I teach using GEMPAK and IDV — I don't like having them use canned bundles, I want them to understand the data. I'm the one doing IT.
We got an equipment award this year; we have the hardware, almost installed. We may be sending support e-mails as we continue. This was a student-led effort: a joint real-time and reanalysis system. We want to bring copies of data sets onto a central server and publish them with RAMADDA.
Ensemble workshop: I'm trying to set up a pilot project to move the 2012 EarthCube discussions forward.
I was at the cyclone workshop in Montreal: Gary Lackmann gave a talk that highlighted using nmap2 to do hand-analysis on the computer.
As UserComm members we have to continue to evangelize the IDV as a useful tool.
Kevin: Until there is widespread adoption of AWIPS II in our community, we may need to support nmap2 for a while.
Mohan: What is "support?" GEMPAK will continue to work, just not be developed actively.
Our old department chair moved on. We have a new school of geosciences; the chair (Jeff Ryan) is supportive of Unidata. We have a new weather lab, so I can take our equipment award off hold. We are buying new equipment.
Steven L. and I are hoping to co-host a regional workshop in Spring 2015. Not sure about a max number of attendees. I've reached out to folks at other universities and they don't use Unidata products, but there was interest. Our AMS chapter is very active, and many members might want to come to a workshop.
I'm on sabbatical so IDV use in classroom has been on hold. I'm using it for research mostly.
The University of Maryland is returning to Unidata! We have a new undergrad director. Students said we need a real-time forecasting lab, with real-time data. We're in the process of setting up the LDM, and also want to get involved with AWIPS II.
Mostly I set up ISL scripts and forget them.
I'm also on sabbatical. I'm on the IDV and RAMADDA steering committees. I had the school build a 16-screen electronic map wall that will eventually display Unidata products. I've submitted an abstract on education and data literacy/analysis for AMS. I want to make a case study for Hurricane Sandy, showcasing what IDV and RAMADDA can do. Looking for ideas about how best to use the technology for undergrad education.
Using GrADS and Python for some ensemble applications; students want to learn GrADS.
Interest in IDV has increased at my school, so we added an intro section for freshman/software. The AMS club asked me to give an IDV tutorial. A lot of students know what IDV is. I'm still using GEMPAK in the junior/senior forecasting lab. I want to convert case studies of Front Range weather in GARP format to IDV. With AWIPS II coming soon, I want to start installing that in my weather lab.
Only six responses to workshop followup survey to date.
ACTION Kevin to send out one more reminder to attendees to fill out survey
Mohan: We want to do some followup activity to use up workshop grant funds. How do we benefit the community more broadly? We can use some funds for regional workshops.
Kevin: Can we devote a small portion of the regional workshops to data issues/usage/citation?
Jennifer: Could you use some money for national meetings AMS/etc?
Mohan: We want to create a Unidata-in-a-box distribution that is easy to use. Have to go beyond just handing out the software. It must be a self-configured, ready-to-go set of services.
Anne: For next workshop, design a tracking mechanism into the proposal.
(Project led by Peter Neilley of The Weather Company and Ben Kyger, NCEP)
Discussion within NOAA/NWS spawned this project. The Environmental Information Services Working Group (EISWG) wrote a whitepaper to the Science Advisory Board/NOAA administration recommending that the NWS ensure data generated by NOAA and NWS be made available.
The proposal is that NCEP install vendor/user computer systems in a community facility co-located with the NCEP supercomputers in Reston, VA, performing analysis/visualization near the data rather than moving large data (e.g., 15-minute model output) out to the community.
Ben Kyger is working with Peter Neilley toward this end. The first step is to install vendor computer systems in Reston. This will bring analysis and visualization close to the data rather than moving the data in real time. Unidata was approached to reach out to the academic community. Logistical details are being worked out. WSI plans to use the data for turbulence research. Unidata will add our own computer as part of this project.
There are logistical issues because contractors run the Reston supercomputing center.
Unidata will have to purchase our own networking (commercial internet); The Weather Company provides rack space. Within the last two weeks Mohan has heard there is another private company interested. Legal details between NOAA and the contractors are being worked out. The agreement will come to UCAR soon (today/next week?).
Unidata needs to entrain some university beta users to take advantage of the data. The NWS would like access to research results ("research to operations").
Mohan, Neilley, and Kyger are presenting at AMS on this project.
Still TBD: what will the business model be for keeping this going beyond experimental stage?
This will not affect push data streams (CONDUIT). The university community could have influence on what model output are made available.
How would such a system work? There would be a network connection to the supercomputing facility's disk storage to copy "whatever data we want" to the co-located equipment. Data will not be flowing in real time to users; processing must be done on site.
John: What if community proposed algorithms/processes to run, like proposing different data streams? For example: feature extraction on a specific data set.
Mike B: Seems to be of limited utility for research — researchers have models running anyway, it's really the private companies who need the speed.
Kevin: I can see interest in getting high resolution GFS output, for example.
Russ S.: If they could do more interesting post-processing, that would be more useful than raw 15-minute full resolution output.
The timetable is unclear; Neilley wants to get going right away. Weather Company equipment may be in place by January. This is supposed to be a six-month experiment. We're not talking about a huge computing-resource commitment ($5-10K).
Mohan: I would like to go to the broader community and ask if this is something that would be useful. NCEP/NWS believe there is a greater need in all sectors for access to this data.
Mike B: I'm uncomfortable getting into a situation where you have to pay to get access to the best data.
ACTION: UPC to draft a document explaining Open Weather and include a proposed catalog.* The committee will review the document and recommend a plan for communicating to the general community.
*Products on NOAA servers (provided 10/22/13)
I like the way IDV is going, but students complain about loading a lot of data. Students load a lot of data at home and complain that loading is slow. Can we limit access to specific data?
(Tom: it would be useful to know what they're trying to do.)
(Sean: progressive disclosure could help with display issues).
Best Time Series in TDS: Jim Steenburgh posted about some jumpiness. (Sean: that is fixed now.) Was there more description in the IDV catalog before the 4.0 upgrade? (John: initial feedback was that categories beyond "Best Time Series" were not useful; we have plans to put them back eventually, but it's low priority.)
I'm concerned that some people don't understand what "Best Time Series" means. We want to match zero-hour data with obs. (Sean: you want a constant forecast hour, not the best time series.) We apparently need more information about these logical data sets. (Sean: perhaps a new logical data set with only analysis fields?) (John: we can document this in the catalog itself.)
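To make the distinction concrete, here is a small pure-Python sketch of the two selection rules as discussed above; the run times, lead times, and labels are invented. A "best time series" takes, for each valid time, the forecast with the smallest lead (i.e. from the most recent run), while matching obs calls for a constant forecast hour (lead 0 from every run):

```python
# Invented example: runs initialized at 0, 6, and 12 UTC, each with
# forecasts at leads 0, 6, and 12 hours.  Values are labels "r<init>+f<lead>".
runs = {0:  {0: "r0+f0",  6: "r0+f6",  12: "r0+f12"},
        6:  {0: "r6+f0",  6: "r6+f6",  12: "r6+f12"},
        12: {0: "r12+f0", 6: "r12+f6", 12: "r12+f12"}}

def best_time_series(runs, valid_times):
    """For each valid time, pick the forecast from the most recent run
    that covers it (i.e. the smallest available lead time)."""
    series = {}
    for vt in valid_times:
        candidates = [(init, vt - init) for init in runs
                      if vt - init in runs[init]]
        init, lead = max(candidates)  # latest init == smallest lead
        series[vt] = runs[init][lead]
    return series

def constant_forecast_hour(runs, lead):
    """One value per run at a fixed lead; lead=0 gives every run's
    analysis, suitable for matching against observations."""
    return {init + lead: fcsts[lead] for init, fcsts in runs.items()}

best = best_time_series(runs, [0, 6, 12, 18, 24])
analyses = constant_forecast_hour(runs, 0)
```

At past analysis times the two coincide; they diverge beyond the latest run (where "best" falls back to longer leads) and whenever a fixed nonzero lead is wanted.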
Will IDV hit WMS-T servers? (support question)
From Michael's demo yesterday: the complexities of getting AWIPS II to run for our students are considerable. D2D is tailored too heavily to the WFOs. Even NCP has too much available. Can we pare down the interface to something that makes sense to students? (A Unidata perspective?)
(Michael: menus are very complicated. The NCP would be primary perspective for researchers/students)
(Kevin: if we had a class in virtual weather service, sure, but otherwise I only see it as a follow-on for nmap2. Unidata support resources should go to the NCP.)
(Michael: NCP documentation is sparse, our docs should correct this.)
Push the progressive disclosure in the IDV — a lot of value there.
Keep developing ISL — keep up with GrADS scripting language.
Happy with where IDV is going.
Can we make multi-radar composite from OU available through IDD?
(Mike B: OU will give you LDM access)
Q3 gridded radar composites.
(Ethan: would it be worth trying to convince them to use netCDF-4?)
(John: better to do it for them and give them the code.)
ACTION talk to OU about using a standard format, making available.
I've used TIGGE output a bit; the slicer/dicer on the ECMWF page is great. The ability to access just the archived data you want is great for researchers. The ECMWF user interface is more user-friendly. (Mohan: there is recent work at NOMADS on the interface.) More generally, it is nice to be able to get portions of archived data.
My goal is to write six labs using IDV.
(Kevin: Interest in sharing teaching resources.)
(Jennifer: Making teaching resources available would be very useful. Can we have a summer position to do this?)
ACTION Create space on RAMADDA server for lab/tutorials/teaching resources.
Ideas about how to push Unidata ideas/software to serve a wider community. OWCS opens the door to private companies; how do we piggyback?
Get teaching resources on RAMADDA.
Taking IDV progressive disclosure to gridded data would be good. Anything Unidata can do to speed up data access. The faster the better.
I'd like to see more Python. There are lots of efforts out there to create met-focused Python packages. Could you access GEMPAK routines from Python? Access to all of these tools helps with the workflow. I can't use IDV to look at 100 years of climate data.
Make it easy for us to do case studies, and get access to older data. It's hard to know where resources are if I'm not getting them off my local LDM. It would be nice to have standard gridded archive of NEXRAD mosaics, for example. Real time data is great, but easy access to historical data would be really useful.
I'm very afraid of AWIPS II. So much effort to recreate 20th century technology.
I'm thinking of getting away from GUI-based methods of creating plots — I need to teach students to program instead. Python looks like the way forward.
I want to catalog the data we have available on our system. The variety of data formats can be a problem. Is there anything on the Unidata pages about how to translate from one data format to another? Do we need to spend some time on a universal translator?
ACTION Put best practices info for making small netCDF files on Unidata web site.
(Russ R: watch for dev blog on netCDF compression.)
Jen is creating an up-to-date list of institutions and contacts. Kevin will create a spreadsheet. Each committee member will take four institutions (two large schools, two small?) and make contact.
Some discussion of how to ask questions. General consensus to have a list of standard questions/topics (perhaps starting from questions from last round of contacts) but to try to keep it more informal — more a conversation than another survey.
Doug to assist Kevin in coming up with a list of sample questions. Some suggestions: