Bill Gallus, Iowa State University, Chair
Dave Dempsey, San Francisco State University
Rob Fovell, University of California, Los Angeles
Brian Mapes, University of Miami
Lynn McMurdie, University of Washington
Mike Piasecki, City College of New York
Dave Santek, University of Wisconsin SSEC
Sepi Yalda, Millersville University
Kevin Tyle, University at Albany
Bernard Grant, NSF
Jeff De La Beaujardière
Greg Byrd, UCAR/COMET
Vanda Grubišić, NCAR/EOL
Steve Worley, NCAR/CISL
Ethan Davis
Mohan Ramamurthy
HRRR is now an operational product distributed through NOAAPORT, so the User Committee discussed which version to serve to the community: the operational product or the ESRL research product. Right now, both are being served; however, if data volume becomes an issue, folks lean toward the operational version.
Python use is growing in the departments, both in classes and through coursework. IDV use is also ramping up in synoptic labs, paired with either RAMADDA or THREDDS. There has certainly been a transition away from GEMPAK over the last few years; IDV, THREDDS, and RAMADDA are the current trend. The major request is for more time to make use of the new data and technologies available. It would be nice if more resources could be focused on the educational component: overworked instructors, students, and others still need materials that help them take on new tools. YouTube instructional videos from staff and community members are helpful.
AWIPS II is proceeding; downloads work, but there are some hardware constraints. You need a video card with at least 1 GB of video memory. The path for AWIPS II looks much better than it did this time last year, with active testing at the universities and also within the private sector.
The user guide provided by Scott Jacobs may not be exactly what this committee requested, but it was very helpful to those of us testing the AWIPS II beta in figuring out what to actually do with it.
The User Committee will be conducting site contacts, since they have not been done in a while; we plan to start this month. We have a set of boilerplate questions, but these are meant to be open-ended phone conversations. I suspect much of what we hear will focus on the rate of technological change and whether sites are running into the same time constraints in learning these tools.
Additionally, Jeff Weber put together a nice presentation on teaching resources. You can now assemble case studies, including data, on the RAMADDA server. The COMET case studies were put up as an example, and we are asking User Committee members to let other folks know about it so it spreads through the community.
2015 is time for the next Triennial Workshop. The Co-Chairs for that workshop gave a brief presentation and noted that the focus will be on big data and leveraging cloud computing. Programmatically, one of our goals is to make this a more hands-on experience than the 2012 event.
Discussion about the Triennial: the week of June 22nd is typically the WRF workshop week; is there a conflict? We have been on the docket since 2012, and the WRF workshop is usually held at Center Green, so it is unlikely to be scheduled that week since an appropriate venue isn't available. There is also a large field project (PECAN) running during June and the first half of July that may affect some folks.
Mohan's full presentation is available here.
Mohan provided an overview of transitioning committee members, award recipients, and staff. This included introducing Sheri Ruscetta, who is responsible for workshop coordination, assists Terry with budgets, and provides travel backup for Ginger. He welcomed Dave Santek and Jeff De La Beaujardière to the Strategic Advisory Committee and noted the new User Committee members. Special thanks were given to Dave Dempsey for his many years of service to the UPC community. A brief history of the Russell DeSouza Award was provided, along with some background on Rich Signell, the 2014 recipient.
As an update on UCAR-wide initiatives there was an overview of the process to develop a new UCAR Strategic plan.
A highlight from this year's equipment awards is that, for the first time in our 30-year history, an award was made to procure services, specifically cloud computing on Amazon EC2 (Embry-Riddle Aeronautical University).
The “Servers at NCEP” project was cancelled as originally designed. There are ongoing discussions about UCAR serving as an intermediary, with a CRADA being explored; however, this is still at a very early stage.
Discussion of the Software Engineering Projects Diagram
As mentioned in the spring meeting report, the government shutdown, along with an unanticipated pre-award audit of our budget for the new funds by the NSF's Cost Analysis and Audit Resolution (CAAR) branch, delayed our core funding by four months and reset our award period of performance to 4/01/14-3/31/19. This delay caused us to draw down most of our reserve funds; fortunately, we were able to weather the period thanks to those reserves and a few fewer FTEs resulting from full and phased retirements. Compared to UCAR as a whole, we have been very successful in pursuing non-core awards, and this has helped us as well.
Bernard's presentation is here.
Highlights of the discussion follow:
The Innovation Corps is focused on turning ideas into products.
A new program called PREEVENTS is centered on understanding the natural processes that produce hazards; the solicitation will come out at the end of FY15 at the earliest.
NOAA is developing a climate resilience toolkit, which is related but independent.
Comments on the transition from Transforming Undergraduate Education in STEM (TUES) to Improving Undergraduate STEM Education (IUSE).
There was a discussion about whether the requests from the Chairman of the House Science Committee for detailed information on NSF grants will have any effect on Unidata. The consensus was that these requests have focused on social science awards and are unlikely to impact Unidata.
Chris’ presentation is available here.
Chris recommended that UPC staff or committee members participate in the Earth Science Data Systems Working Groups (https://earthdata.nasa.gov/esdswg). Activities within these groups include collecting how-to resources and promoting the use of groups and hierarchies within the netCDF-4 enhanced data model.
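For context, the netCDF-4 enhanced model lets related variables be organized hierarchically within a single file, much like directories. A minimal illustration in CDL notation (the layout and names here are hypothetical, not something prescribed by the working groups):

```
netcdf model_output {          // a netCDF-4 "enhanced" model file
  group: forecast {            // groups partition the file into namespaces
    dimensions:
      time = 24 ;
    variables:
      float temperature(time) ;
        temperature:units = "K" ;
  }
  group: analysis {            // a sibling group with its own dimensions
    dimensions:
      time = 1 ;
    variables:
      float temperature(time) ;  // same name as above; no clash across groups
        temperature:units = "K" ;
  }
}
```

A file with this structure can be created from the CDL with `ncgen -k netCDF-4`.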
Jeff’s presentation is available here.
Highlights of discussion:
In response to a question regarding a data repository, Jeff noted that for manuscripts it will be part of the repository created at NODC during Deepwater Horizon. At the data level, NOAA has to decide how to handle it, because there is not enough capacity to ingest and archive all of the NOAA-generated data, let alone funded research.
A follow-up question asked whether the merger of the three archive centers plays into this process; Jeff noted that the changes associated with the merger are at the human-resources level and not focused on the location of data.
The 2nd Big Data Partnership RFI just came out, with an industry day on October 17, 2014; long-term outcomes of such a partnership are still being explored.
A related conversation noted the long partnership with NOAA and NASA on software development. However, there are UPC projects that would support NASA and/or NOAA missions and would benefit from additional resources. This need, along with the need to encourage the use of UPC products within those agencies and their supporting communities, should not be ignored.
Would folks working with EDEX in the cloud incur a significant cost? Yes, but we are trying to assess the exact cost with 12 university users accessing the cloud.
The UPC is currently incurring Amazon EC2 costs but access to Microsoft Azure is through a resource grant.
What is the significance of the ACADIS recompete? They are looking at other coalitions to run this program. It does not signify NSF's displeasure.
The equipment awards discussion mentioned alignment with the 2018 strategic plan, but one award cited alignment with climate studies; is that a misprint? Mohan noted that the UPC encourages climate-focused proposals but hasn't had much success in attracting them.
Discussing the IDV training that has been provided with the WRF-users workshop, committee members wondered if that is a growth opportunity to go where there are clusters of potential users already assembled. Mohan concurred and added that the UPC has been working to have the IDV as a standing item on the agenda at the WRF workshop.
Dave Santek raised the issue of funding infrastructure updates for products like McIDAS-V and the IDV, which rely on VisAD, which itself relies on Java3D, a library with very limited ongoing support. Most funding is for features, so there is an underlying challenge, or question, about how to leverage resources for the foundational updates needed to keep the software viable well into the future.
Further discussion on this topic resulted in agreement that there should be closer collaboration between the UPC and SSEC, and that the IDV steering committee would be an appropriate venue.
Brian Mapes asked what integrating discrete data under the TDS means; are netCDF groups the way to handle that issue? Ethan Davis responded that Discrete Continuous Data is an EarthCube project that is bringing grid data together with point data. Meanwhile, the CF conventions are very strong for grids but have recently expanded to include point data. Jeff De La Beaujardière added that the OGC Observations and Measurements model might be relevant to this effort; WaterML was based on it, and WMO is going to adopt it as a standard for time-series data.
Dave Dempsey inquired what is meant by “enabling desktop” within the Rosetta update. Mohan clarified that right now you don't download pieces to run on your own machine; Rosetta is a service running on a server. The UPC would like to offer it as a standalone package you can download.
There was a follow-up question from the committee about whether this was the same as tabular support in DAP. Staff answered that Rosetta started because Sean Arms had a lot of point observations, and it was a way to generate CF-compliant netCDF files from that data. THREDDS doesn't really serve time-series point data, so as we move into that area we will add that capability by mapping such data into DAP sequences.
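The CF conventions for discrete sampling geometries give tools like Rosetta a target representation for point observations. As a rough sketch (the variable names and station layout here are illustrative, not Rosetta's actual output), a minimal CF-compliant time-series file in CDL might look like:

```
netcdf station_obs {
  dimensions:
    station = 2 ;
    name_strlen = 8 ;
    time = UNLIMITED ;
  variables:
    char station_id(station, name_strlen) ;
      station_id:cf_role = "timeseries_id" ;   // identifies each time series
    float lon(station) ;
      lon:standard_name = "longitude" ;
      lon:units = "degrees_east" ;
    float lat(station) ;
      lat:standard_name = "latitude" ;
      lat:units = "degrees_north" ;
    double time(time) ;
      time:standard_name = "time" ;
      time:units = "hours since 2014-01-01 00:00:00" ;
    float temp(station, time) ;
      temp:standard_name = "air_temperature" ;
      temp:units = "K" ;
      temp:coordinates = "lat lon" ;           // ties obs to station locations

  // global attributes:
    :Conventions = "CF-1.6" ;
    :featureType = "timeSeries" ;              // marks this as point/DSG data
}
```

The `featureType` attribute and the `cf_role = "timeseries_id"` variable are what distinguish such discrete-sampling files from gridded CF data.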
Ward Fisher’s presentation is available here.
Jeff De La Beaujardière noted that this raises a strategic question regarding the IDD, since it changes the direction: people would go to the data, as opposed to the IDD model of pushing data to users. Mohan noted that this is why we are thinking about data-proximate analysis and synthesis; generating products in the cloud enables tablet use and reduces the cost of pulling data down from the cloud. Chris Lynnes noted that this is very similar to the federated Giovanni use case.
A discussion of rewriting the IDV for mobile applications noted that the IDV represents over 16 years and $6M of investment, and a redesign would require significant resources. The data science community has the Data Science Toolkit, which is very useful and might be a good model for what Unidata in a Box could achieve.
Ward Fisher noted that in software development there is a big debate about streaming versus native applications. This particular demonstration uses application streaming because it is very easy to convert; the downsides are that the program is not optimized for mobile and there is a cost associated with streaming data.
Chris Lynnes stated that they have noticed cloud computing is consistently faster and more up to date than their local systems.
Mohan added that studies now show the cost is comparable between cloud and on-site computing. Many things, such as email and calendar services, have moved to the cloud because it is hard to beat the cost compared with running your own service.
Kevin Tyle noted that more and more people are using tablets instead of laptops in class.
Rob Fovell mentioned that many of these “cloud” things that we are talking about are basically outsourcing.
Mohan offered that the real strength is scalability or elasticity.
Jeff De La Beaujardière mentioned that the federal government has issued a cloud-first policy directing agencies to consider the cloud before buying any servers. There is a lot of opportunity for entities like Unidata to provide services on the cloud.
Chris Lynnes suggested that Unidata try to participate in federal agencies working groups on the cloud.
Josh’s presentation is available here.
Committee members expressed interest in guides focused on specific needs such as data management and encouraged staff to produce such a product.
Action: Chris Lynnes requested that UPC staff produce a short-study of requirements for long-tail data.
Josh’s presentation is available here.
Conversation focused on the impetus for these discussions with private-sector entities. UPC staff noted that this is driven primarily by the need to deliver additional services to the user community and the difficulty of securing additional federal funding in a flat-to-decreasing budget period.
The EarthCube conversation was cut short because of the length of other conversations. The main takeaway was to focus on how we can produce synergies out of this activity, e.g., DAP4.
This conversation is based on the white paper that Dave Dempsey provided to the Strategic Advisory Committee. It is available here.
A key piece of flipped learning is making problem-solving collaborative in the classroom; Dave suggests this model for Unidata. One goal should be to consider ways to increase the level of commitment per participant (e.g., asking participants to provide some product or presentation as part of their participation).
Committee members asked how to apply this to training participants who range from knowledgeable users to absolute beginners with, say, the IDV.
Dave noted that an online tutorial could be converted into something with feedback, and completing it could be required prior to in-person participation. This would help narrow the gap between novice, intermediate, and advanced users.
Another challenge is balancing depth versus breadth; this approach is good at providing more depth but won't cover everything. It would be good to gauge the interests of the participants ahead of time.
Mohan noted that the UPC can make this transition in its training, but he would like committee members to identify anything the UPC can do to help them shift to a flipped classroom or learning model.
Tom’s presentation is available here.
The NCAR Strategic Plan was recently completed and is focused on science.
UCAR's plan is focused on the rest of the organization and its mission of supporting science. The draft plan has undergone an extensive review by the various stakeholders; however, comments to Tom are still welcome.
There is an organizational focus on making more scientific data available. OpenSky, NCAR's open repository for scientific collections, is trying to address this need; committee members suggested that this resource be made available for member universities to add collections.
Within UCAR Community Programs there was a broad discussion of the focus on education, data, and climate services, specifically the proliferation of silos both within the community and the organization and the need to foster cohesion.
A 2013 survey of university members resulted in a request for a teaming center: members know the atmospheric science community but need a resource for making contacts within related but distinct fields (e.g., an agricultural economist studying the effects of climate change on crop values). There are some limited funds for this work, as well as for the development of something such as a UCP university partners board. Dave Dempsey of San Francisco State University agreed to participate in such an activity and would be able to represent the needs of the Unidata community.
High-level topics of interest included: