Kevin Tyle, University of Albany, Chair
Michael Baldwin, Purdue University
Martin Baxter, Central Michigan University
Sen Chiao, San Jose State University
Jennifer Collins, University of South Florida
Bart Geerts, University of Wyoming
Steve Lazarus, Florida Institute of Technology
Sam Ng, Metropolitan State University of Denver
Russ Schumacher, Colorado State University
Kimberley Hoogewind, Purdue University
Becky Cosgrove (CONDUIT) (absent, Justin Cooke filled in)
Michelle Mainelli (GEMPAK-NAWIPS/AWIPS II) (remote)
UPC Staff Attending
Ryan May
Josh Young introduced himself in his role as the new Community Services manager. Introductions around the room.
Next Meeting: Per a surprisingly unanimous result in a poll of the committee, the next Users Committee meeting is scheduled for 15-16 September 2014.
The Russell L. DeSouza award ceremony, which has previously been associated with the Spring Users Committee meeting, will now take place during the Fall meeting. Selection of the nominee will still take place in the spring, but moving the award itself to the Fall meeting will give the awardee more time to prepare.
Discussion of Fall 2013 Users Committee Action Items
Previous ACTION 1: Committee members will submit observational data sources amenable to conversion by Rosetta to Sean Arms.
Russ Schumacher has submitted a sample data set to Sean. The committee decided to make this an on-going action, and will continue forwarding suitable data sets.
Previous ACTION 2: Hydrometeorological classification radar composite to be added to IDD. Other desired composites should be sent to Tom Yoksas for consideration.
Michael James reported that the HHC (Hybrid Hydrometeor Classification) radar composite is now being generated on Unidata's Amazon EC2 instance and has been added to the IDD. Display examples are available on the GEMPAK page (High-resolution NEXRAD national composites).
Previous ACTION 3: Request RAMADDA developers to push federated servers back out so IDV community sites catalog will be up-to-date.
Kevin Tyle has spoken with Jeff McWhirter about enabling federation. Federation is configured on individual RAMADDA servers; the process is currently not completely transparent. Until we can get more detailed info, Kevin asks that Sean add Central Michigan University and the University of Miami to the IDV data source catalog (New ACTION 1).
Previous ACTION 4: Kevin Tyle will send one more reminder about 2012 Workshop follow-up survey to workshop participants.
Mohan noted that we must make it clear to NSF that Users workshops are useful. There was limited response to the 2012 Users Workshop surveys; Kevin will work with Josh to try to get additional feedback from attendees. Defer until fall meeting (New ACTION 2).
Previous ACTION 5: Unidata will draft a document explaining Open Weather and Climate Services, and will distribute it to the community to assess interest/concerns, after feedback from Usercomm.
An OWCS Project page is online on the Unidata web site.
Previous ACTION 6: Contact OU to request they make their multi-radar composite available in netCDF.
Ryan, Sean, and Jeff have been pursuing this. The composites now flow from OU to one of our machines at Unidata, where in-house converters aggregate the tiles coming from the OU servers and package them in a useful format.
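The tile-aggregation step can be sketched as follows. This is an illustrative, stdlib-only sketch — the 2x2 tile layout, tile sizes, and the `stitch_tiles` helper are hypothetical, not Unidata's actual converter or OU's actual tiling.

```python
# Illustrative sketch of stitching gridded tiles into one national mosaic.
# The 2x2 layout and tiny tile sizes are hypothetical.

def stitch_tiles(tiles, tile_rows, tile_cols):
    """tiles: dict mapping (row, col) -> 2D list; returns one combined 2D list."""
    tile_h = len(tiles[(0, 0)])
    tile_w = len(tiles[(0, 0)][0])
    mosaic = [[None] * (tile_cols * tile_w) for _ in range(tile_rows * tile_h)]
    for (r, c), tile in tiles.items():
        for i, row in enumerate(tile):
            for j, val in enumerate(row):
                mosaic[r * tile_h + i][c * tile_w + j] = val
    return mosaic

# Four 2x2 tiles, each filled with its own (row, col) label
tiles = {(r, c): [[(r, c)] * 2 for _ in range(2)] for r in range(2) for c in range(2)}
mosaic = stitch_tiles(tiles, 2, 2)
print(mosaic[0][0], mosaic[3][3])
```

A real converter would also have to reconcile tile georeferencing and overlapping edges, which this sketch ignores.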
Previous ACTION 7: Create a space on Unidata RAMADDA server for lab/tutorials/teaching resources.
Jeff has created an example/template for a teaching resource RAMADDA site on Motherlode. Discussion and demo took place Tuesday morning.
Previous ACTION 8: Unidata will publish best practices for creating netCDF files of a manageable size on web site.
Russ has published information on compression in this blog post, and has plans to incorporate the information into official documentation.
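The gist of those best practices can be demonstrated outside netCDF entirely: the lossless deflate compression that netCDF-4 uses does far better on smooth, precision-trimmed geophysical fields than on noisy full-precision values. A stdlib-only illustration (not code from the blog post; the data here is synthetic):

```python
import math, random, struct, zlib

random.seed(0)
n = 10000
# A slowly varying field, trimmed to 3 decimal places (a common best practice)
smooth = [round(math.sin(i / 50.0), 3) for i in range(n)]
# Full-precision random values: essentially incompressible
noise = [random.random() for _ in range(n)]

def compressed_size(values):
    raw = struct.pack("%dd" % len(values), *values)  # pack as 64-bit floats
    return len(zlib.compress(raw, 4))

print(compressed_size(smooth) < compressed_size(noise))  # smooth wins
```

In netCDF-4 itself the same idea is expressed through per-variable compression and chunking settings rather than explicit `zlib` calls.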
The most recent Policy Committee meeting was October 10-11, 2013. At that meeting, the committee decided to rename itself the Strategic Advisory Committee (Stratcom?), to better reflect the committee's activities (providing advice, long-term strategy, and "vision"). The fall meeting was also the last meeting at which Steven Businger served as chair; Bill Gallus begins a new three-year term as chair of the committee with the Spring meeting, which will be May 20-21 in San Francisco.
One topic was the growing sense in the community that Python is an important development in scientific computing and a good way to bring students into the programming community. FORTRAN skills are not necessarily practical outside the WRF community. Kim Hoogewind volunteered that she is involved with a new course in which Python and FORTRAN are taught/used in parallel, with students doing assignments in both languages. The class is currently working on netCDF data access. Kim offered to share the class web site with the committee.
Sen Chiao noted that San Jose State has a class with lots of hands-on exercises using Python code. His experience was that students took computer science courses but didn't use those skills, and ended up forgetting them by senior year. Python seems to be more widely used and less quickly forgotten.
Kevin Tyle wondered about the status of adding SkewT and other relevant plot types to the Python matplotlib package. Ryan is involved in this effort and has some example scripts available.
Rich Signell pointed out that the UK Met Office uses Python extensively, and has a package (Iris, part of the SciTools program) that handles CF-compliant data sets in the way that Unidata's CDM does. There is currently a problem with using UDUNITS in this context. There was also some discussion about the fact that Iris is licensed with the GPL3; Rich suggested getting in touch with the Met Office developers about the issues; they might be amenable to a different license. Rich also noted that Khan Academy has a good Python course.
The Stratcom also had a demonstration of the new adaptive resolution feature in the IDV, which turned into a discussion on making polar orbiting satellite (POS) data available to the Unidata community. Stratcom resolved that:
The committee recommends that Unidata continue to investigate making Polar Orbiting data and visualization tools available to the community. We recommend that the Users committee further define what the community interest is.
Marty Baxter wondered whether the IDV could visualize POS data. Steven Lazarus said that Yuan had helped him display some in the IDV.
Mohan pointed out that it is also possible to use POS data in GEMPAK and McIDAS, and that hyperspectral data can be used in McIDAS-V via the Hydra package. Unidata wants to work with the University of Wisconsin to integrate the Hydra functionality into the IDV. Mohan then asked for a show of hands regarding who currently uses POS data. Kevin and Steven do, but using derived products rather than raw hyperspectral data. The sense of the committee was that access to POS data could be useful, though members were unsure how they would take advantage of it. Committee members agreed to talk with colleagues in their departments who specialize in remote sensing to gauge interest (new ACTION 3). Bart Geerts noted that many people use MODIS and various NASA products, but not in real time, and they may not be looking for a push data service for POS products.
Mohan ended his report by announcing that the committee had chosen Rich Signell as the recipient of the 2014 Russell L. DeSouza award.
Questions and comments following the Director's Report:
Any plans for a 30th year event?
Probably not, as the NSF audit for the 25th anniversary event raised a number of concerns. No more "anniversary" or "celebration" events.
Kevin Tyle notes: I was at an EarthCube user group meeting, and we talked about flipping the 4-1 ratio of managing data to doing research. At that meeting the idea for a wiki page for folks who want to do modeling arose. Can Unidata ease the road to make tools/data available for modelers? Perhaps a user contribution RAMADDA page to document data sources and processes, also maybe a Yelp-type infrastructure to gather community feedback about the value of different methods?
Can you give a brief summary of building blocks projects?
There are three:
I've been using Python, integrating with netCDF, and creating image loops to display forecast reflectivity from HRRR (Ryan helped with this); I can send the link around. I've also been working on some IDV ISL scripts with good success — I have been able to create publication-quality graphics using this method! Some things are missing from the ISL language, but I got some help from Don Murray. I'm interested in advancing the ISL language.
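For those unfamiliar with ISL, the scripts in question look roughly like the sketch below: an XML file that drives the IDV offscreen, loading a saved bundle and writing out an image. The file names here are hypothetical; see the IDV User's Guide for the full tag set.

```xml
<isl offscreen="true">
  <!-- load a saved display bundle, wait for data to load, then capture an image -->
  <bundle file="forecast.xml"/>
  <pause/>
  <image file="forecast.png"/>
</isl>
```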
Tom Yoksas noted that the University of Wisconsin does not like the ISL language, so they are trying to make all ISL features available in jython. Rich Signell pointed out that when generating video loops, if you use multiple processors you can speed things up a lot.
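Rich's point about using multiple processors can be sketched with a worker pool: render each frame of the loop independently and in parallel. The `render_frame` function here is a placeholder for real plotting work; for CPU-bound rendering, `ProcessPoolExecutor` is the drop-in replacement.

```python
from concurrent.futures import ThreadPoolExecutor  # ProcessPoolExecutor for CPU-bound work

def render_frame(hour):
    # Placeholder: real code would plot one forecast hour and save a PNG
    return "frame_%02d.png" % hour

# Render all frames in parallel, preserving order for later assembly into a loop
with ThreadPoolExecutor(max_workers=4) as pool:
    frames = list(pool.map(render_frame, range(6)))
print(frames)
```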
At Metro State we're weaning students/faculty off of GARP; 95% are using the IDV now. We've installed a RAMADDA server but it's not live yet. Students have been complaining about IDV slowness, but I believe it's related to lack of dedicated graphics cards in the machines used in our lab. We do have a hardware refresh coming up, and I hope to get machines with discrete graphics cards. Introducing IDV to Intro to Meteorology students; it is useful to be able to incorporate GIS shapefiles into the IDV workflow.
I am still on sabbatical at NC State. I gave a talk at AMS on how to put together a case study using IDV and RAMADDA; several Unidata folks assisted with that. (I can send that around.) I'm collaborating with NOAA ESRL, using Python to read in GRIB and netCDF files, hosting data on CMU's RAMADDA so students can view them in the IDV. Rich Signell's posts on StackOverflow have been useful.
I am also on sabbatical. I've been using IDV for research purposes, and I used it in some of the work that went into my AMS poster. I'll be teaching a remote sensing class in the Fall, and I'm thinking about using a week at the beginning of the class to spin students up on IDV. Looking forward to using the adaptive resolution stuff.
Our department has gone through some changes in past years. It is more stable now, and has been merged with the Geology Department. Our department is also now associated with the Alliance for Integrated Spatial Technologies (AIST). I recently worked with Sean to get access to some NCDC reanalysis data. Steven Lazarus and I are still interested in hosting a workshop in Florida in Spring 2015.
I've been using IDV and GEMPAK in my regular teaching, and I'm starting to see senior student thesis projects using the IDV. We also set up an AWIPS II EDEX server and two client machines using a Unidata equipment award grant. The AWIPS II learning curve is steep; switching to a different location is not straightforward.
We also received an equipment grant last year. Our students wanted a central repository for real-time and archive data; this is up and running. We are also looking to set up a TDS in addition to the local mounted disk access we created. I participated in a second national ensemble workshop; Josh Hacker, Gretchen Mullendore, Carlos Maltzahn and I created a whitepaper for NSF. I've started using NCSS for point data from model output.
I'm still using GEMPAK and NAWIPS for my senior-level weather forecasting class. I had hoped to upgrade to Python and IDV, but haven't been able to make the switch. Our department hired an IT site specialist to help with internal tech. We need to train him in configuration of the LDM, etc. I did use some Python in a High Performance Computing class — I am just ramping up.
On the Python front: we recently did an IOOS system test, looking at Hurricane Sandy data. The question was can we find all the models producing water levels and all observed data and bring them together? Our system searches standard catalog services, gets OPeNDAP endpoints, all in Python. I will send out some info to the committee.
I've been processing CMIP2 climate data, writing to GRIB2 for ingest into WRF. Sometimes I've used netcdf-java. I've also done some work (alongside a TA) with the IDV in a synoptic lab after GARP went away. I've also been dabbling with Iris and other new Python tools.
We have a community-supported aircraft facility. We've used radar data in the IDV; it has been useful to access data and produce cross-sections. Polar orbiting data would be useful in this context.
Michelle and Justin's slides. (Presented remotely via WebEx.)
Justin Cooke filled in for Becky Cosgrove, reporting on CONDUIT status.
The hardware update will be in place in College Park in May 2014. Once the new hardware is operational, NCEP will add some new datasets, though at first these will not be backed up on NCEP's Boulder systems. At that point NCEP will reevaluate the list of products to be added or removed that was compiled as a result of the 2012 CONDUIT User Survey (see Becky's report from the Fall 2012 Users Committee meeting).
Other CONDUIT data stream questions:
Justin asked if the committee wanted the HRRR 3km grids in CONDUIT.
The HRRR 2.5km grids are available on NOAAPort, but these are a subset of what is available in the model. Weather Forecast Offices use the 2.5km grids.
Russ Schumacher suggested that it would be nice to put the NAM nest output into CONDUIT. The data is available elsewhere, but is not as easy to get as it would be through the IDD mechanism. Kevin suggested that we survey the community to determine whether there is enough interest to warrant inclusion in CONDUIT (New ACTION 4).
Michelle noted that she and Becky missed the October meeting as a result of the federal government shutdown.
AWIPS II 14.2.1 Release is coming in April, including 64-bit RedHat 6 support. Raytheon's AWIPS II contract ends in Q4 of FY2015. 24 WFOs have made the transition to AWIPS II, as have the National Centers (although the Centers still have NAWIPS available). More frequent (~1-2 week) AWIPS II releases will be available during the summer months: NCEP-only updates will also be available.
GEMPAK 7.1 is due out the week of 7 April 2014. An interface allowing GEMPAK to access the AWIPS II EDEX was included in GEMPAK 7 (see GEMPAK_AWIPSDB_Notes.pdf for additional information).
AWIPS II Thin Client testing also begins 7 April. Developers are working on a facility to launch scripts on the server via the thin client interface. Other thin client notes:
There was some discussion about the nature of the AWIPS II thin client — currently it requires a workstation-class Linux machine. (Not exactly what one envisions a thin client to be.) Windows support is hoped for, mobile platforms not currently envisioned.
Staff status reports were made available to the committee prior to the meeting. Committee members had comments or questions on a subset of the projects reported upon:
Michael James asked how many committee members were interested in running only an AWIPS II client (no EDEX). All members said yes. Michael is looking into the feasibility of a community EDEX server. Tom Y noted that the normal CAVE client talks to the EDEX for some things that the thin client does locally. Tom also noted that at the AMS meeting, SAIC was demonstrating AWIPS II running in an Amazon EC2 instance.
Kevin Tyle wondered about the status of the National Centers Perspective (NCP) vs. D2D. The older NCP was not very functional in areas where D2D was. Michael said that NCEP added a lot of functionality in the past year; the 14.2.1 release should solve some of the problems.
Kevin inquired about the NWS Changes mailing list. Josh Young will be taking over the responsibilities of that list.
Yuan said that the IDV 5.0 release is expected in a few weeks to a month. No real reason to rush it out before the end of the current academic term.
Kevin wondered if there had been any progress on negotiations with EUMETSAT regarding access to Meteosat data. Tom Y replied that there has been no "no" issued, which is positive, but nothing from the Director General of EUMETSAT so far.
Kevin wondered: Will added bandwidth on NOAAPort affect ngrid?
Steve E didn't know whether they had tested bandwidth upgrades against the LDM. Tom added that we'll have a testing phase when they switch (August/September 2014 timeframe), and there will be a roughly one-month overlap during which they'll broadcast both old and new data streams. UPC will repurpose the GOES-E dish at UCAR to point at the new NOAAPort satellite.
Tom mentioned that there have been monthly McIDAS-V/IDV meetings, and Yuan is participating. Yuan added that UPC has been urging the McIDAS-V team to adopt the new adaptive resolution feature quickly, which would make it easier to bring the Hydra module into the IDV.
Rich Signell pointed out that it would be a big win for the community to know more about netCDF-4 chunking and compression. Russ has written some nice blog posts, but chunking/compression techniques are not widely known by data providers.
Kevin noted that netCDF-Fortran will begin using CMake, and wondered what that meant. Russ R told the committee that CMake is similar to autoconf, but also works on Windows. We're getting close to being able to build netCDF-Fortran on the Windows platform.
Rich wondered if the same technology could work for UDUNITS; not having a Windows build of UDUNITS slows progress on the Iris package (UK Met Office Python package). Steve E replied that UDUNITS can be built on Windows, but the process is environment-specific. Rich suggested Unidata could make a big contribution by having a repository for "all these binaries."
Kevin wondered if anyone can grab ASCAT data. Michael replied that no one currently can; he is working on getting ASCAT/OSCAT data from NESDIS, with the end result of having the data on the IDD. They (NESDIS) are okay with UPC giving out the data, but they need to set up an LDM (New ACTION 5).
Kevin: How does one display scatterometer data in GEMPAK?
Michael: Use gpmap or gpscat (which Michael created specifically for this data)
Steven L: Can nmap2 display the data directly?
Michael: Yes, but it's specific to the ASCAT-high data format. It's not yet available to universities. (He'll say more on availability next meeting).
Russ S: What's the difference between n0R and dhr?
Michael: dhr is similar to a composite, and is visually similar to the N0Q product. An advantage of the dhr product is that it eliminates many of the artifacts that appear in the N0Q product. There are four new high-res composite products.
Marty B asked if the committee could have a more in-depth presentation about Python activities at the UPC at the Fall meeting (New ACTION 6). Rich suggested setting up a remote meeting with Python developers at the UK Met Office (New ACTION 7).
Russ S noted that SkewT plots are now available in matplotlib, and wondered if they are customizable. Ryan responded that Matplotlib 1.4 (next release) will have some enhancements, but as far as a "proper" SkewT class, he's looking for a place for that to live. He has plans for additional features (moist adiabats, etc.).
Rich noted that Googling returns interesting resources.
Rosetta code is now on Unidata's Github site.
Marty B asks: if we had an on-campus weather station that gives us text-format data, could Rosetta turn it into netCDF? Sean says yes — the idea is to turn the text data into a machine-readable format with useful metadata.
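The kind of transformation Rosetta performs can be sketched in a few lines: raw delimited text becomes records with named variables and units attached. The column names, units, and sample values here are hypothetical, and the sketch stops at a Python data structure — Rosetta itself writes netCDF.

```python
import csv, io

# Hypothetical raw output from a campus weather station
raw = "2014-04-01T00:00,12.3,1013.2\n2014-04-01T01:00,11.8,1013.6\n"

# Hypothetical column metadata of the sort a Rosetta template captures
columns = [
    {"name": "time", "units": "ISO 8601"},
    {"name": "air_temperature", "units": "degC"},
    {"name": "air_pressure", "units": "hPa"},
]

# Attach variable names to each value, row by row
records = [
    {col["name"]: value for col, value in zip(columns, row)}
    for row in csv.reader(io.StringIO(raw))
]
print(records[0])
```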
Kevin asked if anyone had sent examples of Davis Weather Station format files. Sean says no; please send them along.
Kevin: What's up with this big TDS upgrade?
Sean: There were some updates to the NCEP GRIB tables, which we try to use as they are provided. This surprised some folks. The second surprise was that thredds.ucar.edu was down this morning. Before the security breach, we had test and dev servers on the live machine; now the test and dev servers are on Linux machines but the main server is on a Solaris machine. An OS-specific difference caused the main server to run out of file handles. There were also slight changes to URLs for the netCDF Subset Service.
Kevin: The timing (mid-semester) was not particularly good.
Rich: Releasing on a Friday is not great form.
Sean: I have an IDV plugin that allows users to test a change to the TDS server. How long should the community have to test? One week?
Kevin: One week is not enough. Maybe a month.
Ethan D: Who else beyond IDV users do we need to notify?
Mohan: We should review our practices, based on this incident.
Rich: Maybe a special "-Announce" mailing list?
Rich: We could use additional members on the THREDDS steering team.
We have been meeting infrequently. Some are trying to use the CF discrete sampling geometries, and want support for unstructured grids and triangular grids. There is money available from the Sandy supplement: a proposal to do the unstructured grid work outside Unidata but in cooperation with Unidata was floated.
Ethan D: the current committee is heavily weighted toward data providers and developers. User perspectives would be more than welcome...
No committee members (except Rich) are running a TDS currently. Russ S is planning to start one.
Mohan: We should build connection data into TDS so that we know where all the servers are.
Kevin: The last meeting was a few months ago.
Yuan: We recently had a discussion of naming for the adaptive resolution feature.
Mohan gave a very brief update of current status.
Rich: UK Met office is also interested in standards for server-side processing.
Ethan: UCAR and NOAA folks are involved in the Met-Ocean group at OGC.
Mike B: How will this be a benefit to our community? Is it just a mirror of the raw output? It will still need post-processing — how will that be different from what we get from NCEP?
Mohan: NCEP has agreed to put data out in standard form (GRIB2) and in standard lat/lon grid. We must tell NCEP what data we want, what resolution, etc.
(Tim Schneider, Bonny Strong, Don Murray, NOAA)
Tim Schneider gave a quick presentation on the High Impact Weather Prediction Project (http://hiwpp.noaa.gov/). Tim's slides [WILL LINK HERE WHEN THEY ARE AVAILABLE].
Bart G: Will there be some models that you don't go forward with?
Tim: We are evaluating several models; may not incorporate all.
Marty B: Will NCEP ever have the computational/personnel resources to run what comes out of this project?
Russ S: Is this primarily a model improvement project, rather than a forecast improvement project? (Better forecast means ensemble forecast?)
Tim: A bit of both, leaning more toward improving the models.
Marty B: How are you soliciting user input?
Mohan: Are you envisioning providing a platform where trusted partners can use your tools to create their own visualizations?
Tim: Yes, see the NOAA Earth Information System (NEIS)
Do an article about this project for News@Unidata (New ACTION 8).
There was discussion of an interface to load GEMPAK upper air files. Julien asked for feedback on how to configure which levels are included.
Next, Yuan gave a tour of IDV 5.0 features.
Bart G: Under User Preferences, let user range be "adaptive" so you can see variations as you move in (use the data range rather than fixed range).
Kevin: Scroll bar zoom does not use adaptive resolution, right?
Yuan: Yes, as a performance optimization.
Kevin: Can you explain memory usage? The old recommendation was 70% of total memory, now IDV maxes out at around 3GB.
Julien: JVMs are evolving; I encourage you not to look at the memory usage numbers at all. IDV tries to set memory preferences for you.
Yuan: For the 5.0 release we will make adjustments to the defaults.
Kevin: Brendon Hoch raised the ongoing issue of poor video performance on multiple monitors in the Plymouth State University electronic maproom. Based on his interaction with IDV and Mc-V developers, the problem is with Java3D when used on AMD video cards. Is this being addressed at all in the Java3D world?
(Discussion, but general consensus that AMD video hardware is not well supported, and there is really not a solution at this point.)
Kevin: Is there a way to allow IDV to access data using relative in addition to absolute time specifications when loading data from ADDE servers?
Marty B: Archive point data from SSEC ADDE servers is not handled cleanly in IDV. Other formats work properly. This should be a priority to fix. IDV team to investigate this (New ACTION 9).
Yuan: 5.0 formal release in the near term.
Julien: We are planning a screencast to document some of the 5.0 features.
The 2015 Users Workshop will be held the week of June 21, 2015; space is reserved at Center Green for the workshop. Usercomm members are expected to participate not only in the workshop itself but also in planning, beginning with the current Spring 2014 meeting. A subcommittee will also be needed to contact speakers and handle similar tasks.
Mohan: We have been doing this every three years since 1998.
Do we still see the value in holding this workshop?
It's a lot of work, we should be absolutely sure that this activity is truly useful.
We must get funding from NSF specifically for the workshop.
Bart G: Perhaps the same amount of money could be spent for a remote conference every year rather than a physical one every three?
Kevin T: In 2012 there was limited time for one-on-one with developers and speakers. This was a downside.
Jennifer C: In my mind the next workshop should be much more hands-on.
Jeff W: There was not so much take-away in 2012. If we have hands-on activities people will take new knowledge home.
Tom Y: Knowing what to do with satellite data would be useful.
Marty B: Satellite data would be timely with GOES-R coming online.
Rich S: Anyone doing anything with software carpentry (http://software-carpentry.org/)? We could send a small team to software carpentry workshops to evaluate.
Russ S: If it is a hands-on thing, it's an experience that will make you a more efficient researcher or educator — maybe that's the pitch to NSF.
Mohan: Am I sensing that there is broad consensus that these are valuable? (General agreement.)
Steve L: We need to be more meticulous about the outcome we desire. We should have a clear-cut template of expectations for the speakers. Make it very clear what we would like speakers to do.
Jennifer C: Would it be useful to ask presenters to work with developers to revise presentations in advance?
Mohan: Should we extend expectations to attendees as well?
Sen C: Perhaps we can be more selective about attendees — ask everyone the type of questions we ask students. What do you hope to get out of the workshop?
Josh Y: Aim to end up with some kind of cookbook after the workshop?
Steve L: Ask participants to take a questionnaire or read something before they arrive.
Mohan: Dave Dempsey said in a previous meeting that you should expect those who come to the workshop to come prepared.
Marty B: There is precedent for asking participants to prepare (COMET), although it is different for those who are required to attend.
Steve L: This all makes more work for us, but makes us think about what we want the outcomes to be.
Mohan: NSF has asked for documentation that these workshops are useful.
Marty B: How far do we want to go in making a rigorous assessment?
Mohan: NSF just wants to know whether the educators who attended implemented the ideas discussed.
Marty B: Pre/Post survey for comparison?
Figure out how to collect this type of data and do assessment. (New ACTION 2)
Steve L: We could divide up the list of attendees (of the previous workshop) and call them...
Kevin: End goal is for attendees to pass the expertise along.
Marty B: For 2015, fewer speakers, longer sessions.
Mohan: Figure out what you want attendees to take away, and work backward.
Marty B: Would it be reasonable to expect people to apply to attend?
Jeff W, Kevin: Don't want to raise too many barriers to attendance.
Jennifer C: How early can we advertise?
Russ S: How to best appeal to both novice and expert groups (vis-a-vis software carpentry)?
- Cloud computing topics
- Remote sensing / satellite data
- Expand to non-met fields (geology/oceanography/etc) (Remember that funding comes from AGS)
- Software carpentry
Teaching Resources RAMADDA
Jeff W gave a brief demo of the Teaching Resources RAMADDA site. He emphasized that we're trying to gain some conformity between entries, and give some recognition to the individual schools for posting resources.
Sen C: Is there a limit on the amount of storage?
Jeff: Currently we don't have a hard limit on data uploads.
Marty B: So you envision us putting data up there?
Jeff: Yes, IDV bundles are an obvious choice, but other data could be stored too.
Kevin: But you could also point to your own RAMADDA or TDS?
Jeff: I would like to have a centralized copy at first, until the RAMADDA federation is more solid.
Jennifer C: I like the university organization rather than focusing on faculty. What about organizing by themes/types of research?
Jeff: Search is quite effective for finding themes, etc. Whatever organization scheme works for you.
Marty B: My case studies hit ADDE servers. Can we add a data folder?
Sen C: In this way I don't need to run my own RAMADDA server?
Jeff: With luck if people find this useful they'll want to install their own RAMADDAs and federate with Unidata's RAMADDA.
Sen C: Could this link with COMET modules?
Jeff: You could include GCMD keywords in the free text fields for interoperability with other directories like COMET's.
Jeff: We have also considered trying to mint a DOI for each published case study.
Marty B: Can we tie a date to each case study?
Jeff: I'd like to require time and space metadata so people can search by space and time.
Mohan: This can be a launchpad for addressing NSF data management requirements.
Ethan D: DOIs are a promise that we will maintain the data *somewhere* — do we want to get into that?
Mohan: We have options to partner with other data archives.
Jeff: I think of DOIs as an incentive so that authors can get credit for this type of work.
Ethan: Thinking about partners makes sense.
Kevin: How do we upload data? If we don't have accounts, should we talk to Jeff?
Rich S: Do you feel like you get credit for data citations?
Marty B: Not really ... they're nice for the CV but...
Jennifer C: My students are starting to do this, but universities are not yet recognizing data citations.
Marty B: Having DOIs might increase confidence that the resource will not go away.
Mike B: As people cite the data in their papers this might become more valuable.
Site contacts interface
Additional possible standard questions:
Doug D to return to committee with revised topic list, Jen to create Google doc with the list of sites to contact.
Security issue discussion
Mohan: are there RAMADDA vulnerabilities we need to think about? (None known.)
Kevin: Can we remove "change your password" message if the password has been changed? (Yes)
Sen C: Is there any liability issue related to security breach? (No, no private data like SSN or credit card info was compromised.)
Python is really taking off. Thinking about how to integrate Python projects with what Unidata is doing. Not just visualization but data processing. I'm not all that familiar with this, and I'd appreciate a way to get up to speed on how this all fits together.
It would be cool to have a web site where you can do vertical cross sections as a web service (GARP as a web service).
Russ S: I had a student who built something like that but abandoned it. I'll try to find a link and forward it to the committee.
Mohan: It would be a good project to use Python to create such an interface.
I'd like improved documentation for netCDF-4 chunking/compression, including some best practices. It would be nice to be able to access the netCDF-Java ToolsUI directly from Python.
Unstructured grids! It would be nice to be able to drive inquiry through a catalog search that ends up hitting web-based services. What are the best practices for getting info from my datasets into the catalogs? Demos in something simple (like iPython notebook) to show people how things fit together. Maybe Unidata could be a sponsor of scipy?
I see a lack of software engineering / computer science knowledge in the academic community. I think this is hampering our progress.
Can we set up an ensemble in the cloud, maybe using WRF? Get it running on Amazon or wherever, then teach students how to process that data and get results back. That kind of ensemble prediction could allow broader participation. Is this a possible workshop topic: "geoscience data programming?"
We need to create a pool of lectures, demonstrations, etc. that we can share. Met Undergrads need additional computer science training to be prepared for current workplace. Need to build expertise with new computing paradigms.
Mohan: So the goal is to "increase IT and software engineering literacy among geoscientists?"
Marty B: We need a use case example of how cloud computing can benefit us as atmospheric scientists.
Rich S: Wakari is a good example — someone can reproduce your science easily, without installing software, in just a web browser.
For Fall meeting have a live demo/example of how to leverage the cloud with science examples (New ACTION 10)
Can someone write a netcdf-Python book for O'Reilly? (Russ R has an O'Reilly proposal in the works.)
I'd like to see a community ensemble in the cloud. I'm also intrigued by the "real-time research" idea — how can I/we take advantage of things like the Reston project in terms of using more immediate data? To do publication-quality research we need archives of the data — how does this fit into the idea of using "real-time" data from NCEP or elsewhere?
I would like an IDV-lite on a mobile device; this would promote the use of Unidata tools. Creating an app could be a higher priority.
IDV fonts are a stumbling block — can't create a publication-quality image or video currently.
I'd like to see more emphasis on teaching resources and videos. Don't think desktops; big CPUs will not be around in the classroom much longer. We need access via mobile devices. This could also expand the community.
Next spring I am teaching a forecasting class based on work I've been doing. I anticipate using IDV. I'm looking for ways to integrate IDV technologies into the classroom to improve students' knowledge. I'm trying to create some best practices for case studies — how to create them, what format to use, how to get them into RAMADDA, how to use them in the classroom. Are there partnership opportunities for Unidata? I'd be interested in working on this with Unidata — maybe there could be a special session at AMS to figure out how people are using these tools?
Yuan H: Your first request is something like a weather simulator — we need to better understand this workflow.
I'm just starting with Python but want to get into it. I'm still a bit sketchy on cloud computing, so I don't yet feel I understand its capabilities. I would like a RAMADDA tutorial. With so many models out there, students don't know which to look at; it might be good to winnow the model choices (in the IDV).
Rich S did a short demo of Wakari.
Steve Lazarus and Michael Baldwin agreed to serve as co-chairs for the 2015 Users Workshop.