Committee Members:
Kevin Tyle, University at Albany, Chair
Michael Baldwin, Purdue University
Ibrahim Demir, University of Iowa
Kevin Goebbert, Valparaiso University
Steve Lazarus, Florida Institute of Technology
Gretchen Mullendore, University of North Dakota
Sam Ng, Metropolitan State University of Denver
Russ Schumacher, Colorado State University
Student Representative:
Kimberley Hoogewind, Purdue University
NCEP Representatives:
Becky Cosgrove (CONDUIT)
USGS Representative:
Richard Signell
UPC Staff Attending
John Caron
Ethan Davis
Doug Dirks
Ben Domenico
Steve Emmerson
Dennis Heimbigner
Yuan Ho
Ryan May
Terry Mitchell
Jennifer Oxelson
Inken Purvis
Mohan Ramamurthy
Russ Rew
Sheri Ruscetta
Mike Schmidt
Christian Ward-Garrison
Jeff Weber
Tom Yoksas
Josh Young
Date for Fall meeting:
The Fall meeting will be scheduled jointly with the Strategic Advisory Committee. Staff will send out a poll to both committees to identify possible meeting dates. The date will be finalized at the SAC meeting in April.
Review of Action Items
Committee nominations
There will be five openings on the Users Committee this year, so please consider who would make a good committee member and nominate them. The formal call for nominations will go out in April; however, the nomination form is already available.
Highlights of this conversation included consideration of MOOCs and the importance of Python and computer science for meteorology programs. Kevin Goebbert noted that Valparaiso University is also teaching Fortran and Python in its programming courses and is looking at including programming in freshman coursework.
Russ Schumacher noted that a new faculty member at CSU has started a graduate-level programming course in Python. Becky Cosgrove added that NCEP is looking at training staff on Python because more things are moving to that language.
It was noted that the SAC featured some discussion of EarthCube, and that NSF has a new initiative (replacing INSPIRE and SEES) focused on risk reduction and high-impact storms; this makes it likely that GEO will focus more funds on storms, hydrology, and earthquakes. There is also a renewed focus on societal impact and on policy research and applications. The previous SAC meeting featured some discussion of cyberinfrastructure at geoscience agencies, and this will only increase with the release of the Public Access to Research Results (PARR) plans at NOAA, NSF, NASA, and USGS. Steven Lazarus noted that the Journal of Geophysical Research rejected a paper because it did not cite where the data were stored and how they could be accessed.
Mohan's slides.
There was a question for Mohan regarding the cost for using Azure. Mohan noted that in many ways this was a minimal instance; however, the cost for a year would have been about $12k.
Sam Ng: AWIPS II has been a real focus for the last month; he has been using it in class for nowcasting and forecasting. IDV is used more by a colleague. Students have been able to run both IDV and AWIPS II at the same time, as long as they are not loading a lot of data.
Steven Lazarus: Taught a remote sensing class for the first time, which resulted in my first IDV assignment for a class. I sent students out to NCDC to gather data; I didn't want things to be too pre-processed, so that students learned where to get data and how to process and use them. I want to keep using this assignment, but the current archive only stores two years of data.
Yuan noted that everyone in the Unidata community can get up to 5 GB of data a month for up to 60 users from the University of Wisconsin.
Steven added that IDV's lack of calibrated satellite units is a big issue for folks who want to use satellite data. He has access to the SSEC archive data, but you have to use McIDAS-X, since IDV does not interpret the GEMPAK weather symbols. The FIT weather briefing course is almost pure IDV.
Rich Signell: Still working within the IOOS context. The second round of Sandy supplemental funding stood up 18 different data portals; the third round had us working with these 18 portals to make them more searchable. A site on GitHub under the ESIP Federation connects data sets to this infrastructure.
Russ Schumacher: A couple of Unidata folks came up for site visits, with CIRES starting to use more Unidata products as a result. Just got a NASA project funded with Brian Mapes to look at extreme rain events and create IDV bundles that will let you quickly browse the case materials. Teaching synoptic in the fall; last time I used a little IDV, and next time I want to do more.
Gretchen Mullendore: Business as usual, with the LDM running and the weather lab mostly focused on GEMPAK. The team is really happy with the support but would like more documentation; I feel like I'm behind the curve on McIDAS-V and/or -X. Another theme is grants coming in that could use many of these Unidata products, but I don't have the time or money to do the incorporation. Maybe we in the university community could partner with Unidata on some training grants.
Kevin Tyle: We had a recent seminar at our department on big data and atmospheric science, held to build collaboration with a new department of geoinformatics and to get students and faculty in the atmospheric science department familiar with newer tools. I hope to make the presentation available on the University at Albany RAMADDA server. I also created a case study of a past event; some of the folks in the computing college might be able to leverage some of their search expertise on it. Unfortunately, I couldn't talk about Python bundles and Wakari as a useful tool. Rich Signell asked if there was a multi-user Wakari yet; Kevin noted not yet.
Michael Baldwin: I'm teaching a sophomore-level introductory lab that is being taken by non-majors. I've been using NAWIPS and IDV, and trying to use the WRF in single-column mode to see how changing soil moisture affects the fluxes. There are some things where IDV and WRF work well together, and other areas that do not. Fluxes and 2-meter temperature are very nice; however, if you want a vertical profile of temperature, you are not going to be able to get it, or at least I haven't been able to figure it out. For vertical profiles, I use NSHARP and NAWIPS. I have the students do a lot of the calculations using NMAP2; I'm trying to use IDV as much as I can, but I don't know how to do that in IDV. Between the last meeting and now I've had my biggest foray into IDV. I have lots of plans to run all of these great tools, but I've never managed to learn them over the last three years. I'm also working with the state Department of Transportation to get winter weather information to them. They want an index that tells them how bad the weather was, compared between seasons or from location to location. We've developed a database for this project and some tools to work with it, but it is still under development.
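As an aside, a vertical temperature profile can be extracted from WRF output directly in Python. The sketch below is a minimal illustration, assuming standard wrfout NetCDF conventions; the file name and grid-point indices are hypothetical placeholders.

    # Minimal sketch: vertical temperature profile from WRF output.
    # Assumes standard wrfout conventions; file name and indices are
    # hypothetical placeholders.
    import netCDF4

    nc = netCDF4.Dataset("wrfout_d01_2015-04-01_00:00:00")
    t_idx, j, i = 0, 50, 50  # first output time, grid point of interest

    # WRF stores perturbation potential temperature (T) offset by 300 K,
    # and splits pressure into perturbation (P) plus base state (PB), in Pa.
    theta = nc.variables["T"][t_idx, :, j, i] + 300.0
    pres = nc.variables["P"][t_idx, :, j, i] + nc.variables["PB"][t_idx, :, j, i]

    # Convert potential temperature to temperature via Poisson's equation.
    temp = theta * (pres / 1.0e5) ** 0.2854

    for p, tk in zip(pres, temp):
        print("%8.1f hPa  %6.2f K" % (p / 100.0, tk))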
Kevin Goebbert: Not too much to report; we have the CAVE client connected to an EDEX server and hope to do more with it in the future.
Ibrahim Demir: We have AWIPS II running, but we had some security problems and had to move to CentOS. We have two machines running the LDM, with one dedicated to data collection and rainfall data. We are exploring a cloud instance, as well as a move to a university data center with an almost entirely virtual environment. We are also exploring how we could get a system running in the cloud quickly in response to an emergency; a couple of examples were developed for NASA-funded flooding experiments. Things run on a web-based map that lets you see individual grid cells. We are exploring WebGL to access large-scale data.
Kimberley Hoogewind: A lot of Python-related work focused on research and collaboration: building up my Python package, borrowing some Fortran code, and wrapping up a project with SPC on historical storm events. Not focused on teaching too much; primarily trying to finish and find a job.
Within IDV specifically, Jython and VisAD are not widely used, and there is general concern about maintaining VisAD and how IDV will evolve.
There is a need for more time, or at least a decrease in the amount of time needed to learn and use products/services.
Becky's CONDUIT slides.
A summary of the CONDUIT survey results is available here.
Becky's AWIPS II slides.
Jeff’s demonstration was delayed due to technical issues with the projector. The demonstration ultimately took place on Friday.
Sam talked about Metropolitan State University of Denver's experience using AWIPS II in the classroom. His presentation is available here.
Polar-orbiting data are available in the University of Wisconsin archive. There was a brief conversation about area files and point data and the differing challenges the two data types present. Scatterometer data are polar-orbiter data; they, along with GPM and high-resolution imagery, would be nice to work with. Julien Chastang is working on Python tools for calibrated data.
There was consensus for an action to identify which data and variables are available as polar-orbiting products, and then assess community needs and desires for each data or variable type.
A more detailed aspect of this conversation was how to make polar-orbiting data accessible in IDV.
Options considered included a broker service for ADDE. Committee members mentioned that facilitated access to new data sets is nice for class use but great for research; the level of priority should be determined by the level of effort.
Kevin Tyle raised the question of whether this issue overlaps with microwave imagery.
Kevin Goebbert noted that, from the synoptic perspective, we haven't used these products because of the data friction involved in bringing this information in for display and use. The COSMIC data would also fall into this camp. We could turn THREDDS into the front end for a bunch of different servers.
John Caron confirmed that THREDDS could be turned into a broker; however, the problem is that the data are not evenly spaced, and we have to be able to identify both time and space. There was a suggestion to talk to the NASA Giovanni team to facilitate a partnership and provide an example use case for polar-orbiting data.
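To give a flavor of the brokering idea at the client level, the sketch below walks several THREDDS catalogs from Python and presents their datasets in one listing. It is illustrative only, assuming the Siphon library; the catalog URLs are hypothetical placeholders, and this is a client-side view rather than the server-side broker John described.

    # Illustrative only: present datasets from several THREDDS servers in
    # one listing. Assumes the Siphon library; catalog URLs are hypothetical.
    from siphon.catalog import TDSCatalog

    catalog_urls = [
        "http://server-a.example.edu/thredds/catalog.xml",
        "http://server-b.example.edu/thredds/catalog.xml",
    ]

    for url in catalog_urls:
        cat = TDSCatalog(url)
        for name in cat.datasets:  # iterating yields dataset names
            print(url, "->", name)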
Two questions were raised:
The Data Management Resource Center (DMRC) is still under development, but the site was visited both to discuss the DMRC and to view the new website format.
This session was used to brainstorm additional speakers to invite to the upcoming workshop. Committee members may see these additions by accessing the Speaker spreadsheet.
The Users Committee reviewed the nominees and made a selection regarding the 2015 recipient.
Kimberley Hoogewind just wanted to note that she is really excited about all the Python enthusiasm.
Ibrahim Demir noted that at this stage he is focusing on learning more about Unidata tools and packages, and he looks forward to passing what he has discovered back to his community.
Kevin Goebbert would really appreciate more documentation using current approaches as opposed to a 1980s approach. He would like to see more videos or interactive means of getting at documentation, and thinks that would help reduce the friction of adopting new tools.
Michael Baldwin finds that some basic things that should be easy to do aren't. For example, obtaining and plotting surface observations for any period for which you can get reanalysis data should be easy. There are lots of archives, such as Iowa State's, that can be used to make plots, but only in GEMPAK. Satellite data, radar data, and model forecasts in archives are all really difficult to access and work with.
Staff noted that the developers know where the data are and how to work with them; from a user perspective, however, it would be good to have all of that information in one place.
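As one hedged example of working with such an archive outside GEMPAK, the sketch below requests archived ASOS observations from the Iowa State (IEM) web service in Python. The endpoint and query parameters shown are assumptions about that service and should be checked against its documentation.

    # Sketch: fetch archived ASOS temperature observations from the Iowa
    # State (IEM) archive without GEMPAK. The endpoint and parameters are
    # assumptions; consult the IEM documentation for the current interface.
    import urllib.request

    url = ("https://mesonet.agron.iastate.edu/cgi-bin/request/asos.py"
           "?station=ORD&data=tmpf"
           "&year1=2015&month1=1&day1=1&year2=2015&month2=1&day2=2"
           "&tz=Etc/UTC&format=comma")

    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("ascii", errors="replace")

    # Skip comment/header lines; each remaining line is station, time, temp (F).
    for line in text.splitlines():
        if not line or line.startswith("#") or line.startswith("station"):
            continue
        print(line)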
Kevin Tyle noted that the MADIS data are very cool and the mesonet data are astounding, but he doesn't know how to display them efficiently.
Gretchen Mullendore suggested thinking more about time constraints. She added that none of us have time, but students do, so she would like to set up a Unidata visitor program through which we could send a graduate student down for some period of time.
Rich Signell would like more documentation, or an effort to proselytize through GeoData Carpentry; taking it on the road would be huge. It would be cool to have a notebook of the week; for example, the Python for oceanographers blog is very cool. Unidata should think about its core services for the future: we need something that will allow us to process on the server. We have to remember the fundamental data forces.
Steven Lazarus uses IDV in his dynamics class but still finds himself using NSHARP because it has a lot more indices; it would be really nice to have at least comparable capabilities in IDV. The GEMPAK disconnect with IDV and weather symbols is a barrier. He would like to be able to generate a netCDF file directly. The issues with satellite data are also a challenge, and polar-orbiting data are very attractive for use. In addition to MADIS, he would like to look at including ACARS in IDV.
Sam Ng would ultimately like to have an EDEX server in the cloud, or some functionality that would allow students to access AWIPS II at home without a Linux machine.
Kevin Goebbert noted that his IT folks definitely won't move away from dedicated IT labs for students. He would also like access to FIM model data.
Mohan responded that there have been conversations with ESRL about making that happen; it is a question of figuring out how to distribute the data.
Mohan noted that when we had the MS Azure server up we could have done that, but it is a question of funding. We would like to develop a multi-operating-system version, but right now that is out of scope; it may not remain so as we proceed with the AWIPS II recompete.
Mike Baldwin added that Purdue uses something called ThinLinc, which might be an online solution that provides a Linux system.
This discussion focused on the IT requirements and configuration for the Triennial, in particular the needs of the hands-on working sessions of the workshop. The Co-Chairs will discuss these issues with the speakers during the coordination call. The key questions identified included whether this should be hosted locally or in the cloud, and, if hosted in the cloud, whether it should be at a commercial service or at one of the universities. Mike Baldwin mentioned that Purdue may have some resources that could be used for this purpose.
Ryan May provided a brief update regarding Python activities at Unidata. These are not large stand-alone projects; however, several staff have Python-related or Python-dependent projects on the side.
Committee members mentioned the need for a Python tool that delivers the scripting functions associated with GEMPAK. Ryan has a project that pre-dates his time at Unidata, referred to as PyMet, that works in this space. Committee members requested that Ryan make that project more widely available.
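For a flavor of what GEMPAK-style scripting in Python could look like, the sketch below computes wind speed and meteorological wind direction (GEMPAK's SPED and DRCT diagnostics) from u/v components with NumPy. It is illustrative only and does not reflect PyMet's actual interface.

    # Illustrative only: GEMPAK-style diagnostics (SPED/DRCT) in plain
    # Python/NumPy. This does not reflect PyMet's actual interface.
    import numpy as np

    def wind_speed(u, v):
        """Wind speed from u/v components (like GEMPAK SPED)."""
        return np.hypot(u, v)

    def wind_direction(u, v):
        """Meteorological wind direction in degrees (like GEMPAK DRCT)."""
        drct = 90.0 - np.degrees(np.arctan2(-v, -u))
        return np.mod(drct, 360.0)

    u = np.array([10.0, 0.0, -5.0])   # m/s
    v = np.array([0.0, 10.0, -5.0])
    print(wind_speed(u, v))        # [10.   10.    7.07...]
    print(wind_direction(u, v))    # [270. 180.  45.]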
Tom highlighted the potential transition of the GEMPAK scripting products to a cloud instance as a way to begin transitioning some Unidata services to that environment. This was suggested as an initial effort because it is not heavily used by the core academic community and would not cause a significant financial burden for Unidata.
Committee members pointed out that some IDV actions interact with and rely upon those products, so a transition to the cloud could impact a small number of individuals.
Josh Young
Community Services - Unidata
University Corporation for Atmospheric Research
P.O. Box 3000
Boulder, CO 80307-3000
303 497-8646 fax: 303 497-8690