Here are a few examples of data requiring more than 32 bits of time precision,
together with the measurement span at which IEEE 64-bit double precision runs out:
Research Field                 Required Precision             Time for 54 bits
--------------                 ------------------             ----------------
1. Radio pulsar timing         0.1 microsecond over years     50 years
2. Astronomical X-ray data     0.01 microsecond over years    5 years*
3. Planet Ephemerides          1 microsecond over decades     500 years
4. Laser/Atomic Spectroscopy   0.1 femtosecond over msec      1 second*
5. Fusion                      1 nanosecond over seconds      1 month
6. Neutrino detection          1 millisecond over decades     500,000 years#
* Already exceeded
# Unlikely to exceed (my opinion)
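For what it is worth, here is a quick back-of-the-envelope check in Python
(my own sketch, assuming the time coordinate is kept as seconds since some
epoch in a single 64-bit double); math.ulp() gives the spacing between
representable doubles at a given magnitude:

import math   # math.ulp requires Python 3.9+

YEAR = 365.25 * 86400.0                    # seconds in a Julian year

# (field, elapsed span in years, required resolution in seconds)
cases = [
    ("Radio pulsar timing",      50.0, 1e-7),   # 0.1 microsecond
    ("Astronomical X-ray data",   5.0, 1e-8),   # 0.01 microsecond
    ("Planet Ephemerides",      500.0, 1e-6),   # 1 microsecond
]

for field, years, required in cases:
    t = years * YEAR                       # elapsed seconds, held in a double
    available = math.ulp(t)                # step size the double can represent
    print(f"{field}: available {available:.2e} s, required {required:.0e} s")

At 50 years a single double can only resolve steps of about 2.4e-7 s, already
coarser than the 0.1 microsecond pulsar timing needs, which is roughly
consistent with the figures in the table above.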
The point is that there needs to be a hierarchy of time levels exceeding 2,
as pack@xxxxxxxxxxxx proposed.
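To make that concrete, here is a rough two-level sketch in Python (the layout
and names are mine, for illustration only, not the scheme actually proposed at
pack@xxxxxxxxxxxx): whole days carried in an integer, seconds-of-day carried
in a double.

import math   # math.ulp requires Python 3.9+

# Hypothetical two-level time stamp: level 1 = whole days (integer, exact),
# level 2 = seconds within the day (a 64-bit double).
day = int(50 * 365.25)            # ~50 years of whole days at level 1
sec = 43200.0 + 1e-7              # level 2: noon plus a 0.1-microsecond offset

# The 0.1-microsecond offset survives, because level 2 only has to resolve
# numbers of order 1e4 seconds rather than 1e9 seconds:
print(math.ulp(sec))                      # ~7e-12 s available at level 2
print(math.ulp(50 * 365.25 * 86400.0))    # ~2.4e-7 s with a single double

# The 0.1-femtosecond spectroscopy case, however, is still far below what
# level 2 can resolve (1e-16 s versus ~7e-12 s), so a third or deeper level
# is needed; two levels alone are not enough.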
I doubt I will respond to the next question (Does anybody know of any
datasets requiring ...?).
I am a new user of netCDF. I was attracted to it because it has no inherent
properties dedicated to a particular discipline, unlike FITS, for example.
I hope the developers keep this discipline-free attribute ranked high as they
decide how to improve a useful system.
=Fred Knight (INTERNET:knight@xxxxxxxxxx)