The 2014 Unidata Summer Internship offers graduate students and upper-level undergraduates an opportunity to work with Unidata software engineers and scientists on projects drawn from a wide variety of areas that overlap the atmospheric and computational sciences. Unidata's mission is to support the Earth Science research and education community with innovative data access, analysis, and visualization tools. As an intern, you will use, design, and/or modify existing Unidata software in innovative ways to better support the Unidata community. A detailed list of suggested projects can be found below.
The 2014 program is a 10-week summer position (40 hours per week) running from 27 May to 1 August 2014.
In addition to software projects, as a Unidata intern you will participate in the following activities:
You'll gain professional writing experience by documenting your work, at both the code and user-documentation levels. You'll be encouraged to contribute to the Unidata Developers Blog throughout the summer.
In the last week of your internship, you will have the opportunity to give a Unidata Community Seminar describing your experience and the outcomes of your summer work. You will attend technical staff meetings to report on the status of your project.
As a Unidata intern, you will be part of a larger community of UCAR and NCAR summer interns. Interns are expected to contribute positively to the UCAR/NCAR community and to conduct themselves in a manner appropriate to a professional environment. Interns are expected to fully participate during normal office hours.
Applications for the Unidata Summer Internship are now being accepted. Applications are due by 21 February 2014 and must be submitted through the UCAR Career Portal; see the 2014 Unidata Student Summer Internship (14050) posting for full details.
Part of the application process involves writing a letter of interest describing the work you are interested in doing during the internship. Please indicate which Unidata software package(s) you are interested in using or developing and how your work would positively impact the Unidata community.
Below are a number of possible projects that Unidata developers and scientists have suggested; please feel free to expand on any of them. It is not necessary to restrict yourself to this list, however: if you have your own idea for how to enhance or use a Unidata software package in a way that positively impacts the community, please write it up and submit it.
Work with Unidata developers and others in the Python open source community to improve and expand PyUDL. PyUDL provides access to services offered by the THREDDS Data Server (TDS) and the ADDE server, and it includes basic plotting utilities built on the Numpy/Scipy/Matplotlib libraries.
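To give a feel for the kind of workflow PyUDL aims to streamline, here is a minimal sketch that opens a TDS-served dataset over OPeNDAP with the netCDF4-python package and plots a field with Matplotlib. The URL and variable name are placeholders rather than a real Unidata endpoint, and the sketch does not use PyUDL's actual API.

```python
# A minimal sketch of the workflow a PyUDL-style library streamlines: open a
# dataset served by a THREDDS Data Server over OPeNDAP and plot one field.
# The URL and variable name below are placeholders, not a real endpoint.
from netCDF4 import Dataset
import matplotlib.pyplot as plt

OPENDAP_URL = "http://example.edu/thredds/dodsC/some/model/run.nc"  # hypothetical

ds = Dataset(OPENDAP_URL)                   # netCDF4 opens OPeNDAP URLs directly
temp = ds.variables["Temperature"]          # placeholder variable name
plt.imshow(temp[0, :, :], origin="lower")   # first time step of a (time, y, x) field
plt.colorbar(label=getattr(temp, "units", ""))
plt.title("Sample field from a TDS dataset")
plt.show()
ds.close()
```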
Work with Unidata developers and scientists to develop basic data visualization and analysis applications using HTML5 and JavaScript. Investigate the use of data visualization libraries like D3 and OpenLayers.
Work with the THREDDS developers to extend the set of data access web services available through the TDS.
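As a point of reference for this project, the rough sketch below calls one existing TDS data access service, the NetCDF Subset Service, from Python. The host, dataset path, and variable name are placeholders, and the query parameters simply follow that service's documented conventions.

```python
# Rough sketch of calling an existing TDS data access service (the NetCDF
# Subset Service) to fetch a spatial/temporal subset as a netCDF file.
# The host, dataset path, and variable name are placeholders.
import requests

NCSS_URL = "http://example.edu/thredds/ncss/grid/some/model/run.nc"  # hypothetical

params = {
    "var": "Temperature",              # placeholder variable name
    "north": 50, "south": 25,          # bounding box in degrees
    "east": -60, "west": -130,
    "time_start": "2014-05-27T00:00:00Z",
    "time_end": "2014-08-01T00:00:00Z",
    "accept": "netcdf",                # ask for a netCDF file in response
}

resp = requests.get(NCSS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("subset.nc", "wb") as f:
    f.write(resp.content)              # save the subset for local analysis
```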
Implement a C API for optimizing access to big datasets by setting optimal netCDF-4 chunking (multi-dimensional tiling) and compression parameters. Write tests and demonstration code showing the large performance improvements possible when optimal chunking is used with datasets too large to fit in memory.
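For orientation, the sketch below shows the chunking and compression parameters such an API would help choose, expressed here through the existing netCDF4-python interface rather than the proposed C API. The dimension sizes and chunk shape are arbitrary examples; picking good values for a given access pattern is the heart of the project.

```python
# Illustration of the chunking/compression parameters the proposed C API would
# help choose automatically, written with the existing netCDF4-python package.
# The dimension sizes and chunk shape here are arbitrary examples.
from netCDF4 import Dataset

nc = Dataset("chunked_example.nc", "w", format="NETCDF4")
nc.createDimension("time", None)       # unlimited dimension
nc.createDimension("y", 4096)
nc.createDimension("x", 4096)

nc.createVariable(
    "Temperature", "f4", ("time", "y", "x"),
    zlib=True, complevel=4,            # enable deflate compression
    chunksizes=(1, 512, 512),          # one time step per chunk, 512x512 tiles
)
nc.close()
```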
Work with Unidata developers and community members to investigate the use of Google Earth Engine with earth science datasets important to the Unidata and broader community.
Work with Unidata software engineers and scientists to develop and document a library of IDV bundles for use by the Unidata community.
Work with Unidata software engineers and scientists to develop and document a library of Jython-based routines with an API modeled on those of current Python visualization/analysis packages (e.g., Numpy, Scipy, Matplotlib, Basemap).
Improve the netCDF Fortran 2003 API to exploit new features of the language, such as portable derived types.
Write a few simple netCDF utility tools that sidestep the overloaded options and complexity of the current ncgen and ncdump tools. For example, a tool that generates Python, C, or Java code to read or write a given netCDF file (without using CDL) would be useful.
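A toy sketch of that example follows: it inspects a netCDF file with netCDF4-python and prints Python code that reads each variable. A real tool would also handle groups, attributes, and C or Java output; this only illustrates the basic idea.

```python
# Toy sketch of a code-generating netCDF utility: inspect a file and emit
# Python source that reads each of its variables.
import sys
from netCDF4 import Dataset

def generate_reader(path):
    """Return Python source code that reads every variable in the file at path."""
    ds = Dataset(path)
    lines = ["from netCDF4 import Dataset",
             "",
             "ds = Dataset({0!r})".format(path)]
    for name in ds.variables:
        safe = name.replace("-", "_")   # crude clean-up into a Python identifier
        lines.append("{0} = ds.variables[{1!r}][:]".format(safe, name))
    ds.close()
    return "\n".join(lines)

if __name__ == "__main__":
    print(generate_reader(sys.argv[1]))
```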
Work with the THREDDS developers to add a TDS service to capture specified datasets for a given space and time extent.