Andy,
Here are instructions for converting a text data file to a gridded NetCDF file
which the IDV can contour.
The example uses an online source of point surface gravity anomaly data. This assumes
you have Linux with NetCDF and GMT (http://gmt.soest.hawaii.edu/) installed and
working; these packages allow very complete data manipulation.
Get GeoNet gravity data from
http://paces.geo.utep.edu/welcome.shtml, which has long-lat-data column order
suitable for GMT.
See also http://paces.geo.utep.edu/gdrp/Search.aspx for gravity data,
or http://gis.utep.edu/Projects/DataCatalog/tabid/57/Default.aspx
"The new terrain corrected United States gravity database is available through the GeoNet link below. The terrain
corrections were calculated by Mike Webring of the U. S. Geological Survey using a digital elevation model and a
technique based on the approach of Donald Plouff. The reduction of these data will be updated this year with modern
geodetic datums and a higher precision digital elevation model. In the present version, latitude and longitude values
are referenced to NAD27 (North American Datum 27; horizontal datum) and NAD83, elevation values in meter are referenced
to NGVD29 (Vertical datum) and NGVD88. Both of these datums are listed in the dataset download."
In particular, see http://irpsrvgis00.utep.edu/repositorywebsite/, the
"GeoNet Gravity and Magnetic Dataset Repository" of the Pan American Center for Earth
and Environmental Sciences (PACES).
Cut out desired columns of data:
For the old PACES .dat data files, extract the desired columns - the 9th column in
this case, plus columns 1 and 2 (all one line):
cat paces_colorado_gravity.dat | perl -ne
'chomp;$col=9;$s=$col-3;/^([\w\-\.]+)\t([\w\-\.]+)\t([\w\-\.]+\t){$s}([\w\-\.]+)/;print "$1,$2,$4\n";' >
paces_colorado_gravity_Bouger_values.xzy
For the new PACES *.txt data files use (all one line):
cat paces_wyoming_gravity.txt | perl -ne 'chomp;@s=split(/ +/,$_);print "@s[3],@s[5],@s[12]\n";' >
paces_wyoming_gravity_bouger.xyz
to get latitude (col 3), longitude (col 5), and the Bouguer anomaly (col 12).
Check the first few lines of the output to be sure longitude and latitude come out
in the order described below; if not, swap those two fields in the print statement.
The result is a GMT "xyz" file, an ASCII CSV file; longitude and latitude, in that
order, must be the first two output columns.
Remove the first 3 to 6 bad lines, which are relics of the source file's header
lines.
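One way to do this (a sketch only; adjust the line count to match what you actually
see in your file) is to let GNU sed delete those lines in place, e.g. to drop the
first 6 lines:
sed -i '1,6d' paces_wyoming_gravity_bouger.xyz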
Run the GMT "blockmean" program like:
blockmean -R-110/-101/36/42 -I78c -V paces_colorado_gravity_Bouger_values.xzy >
paces_colorado_gravity_Bouger_blockmean.xzy
or
blockmean -R-118/-103/40/49 -I30c -V paces_idaho_montana_wyoming_gravity_bouger.xyz >
paces_idaho_montana_wyoming_gravity_bouger_blockmean.xyz
or
blockmean -R-112/-104/41/45 -I30c -V paces_wyoming_gravity_bouger.xyz >
paces_wyoming_gravity_bouger_blockmean.xyz
Results look like:
blockmean: Given domain implies x_inc = 0.0216867
blockmean: Given domain implies y_inc = 0.0216606
blockmean: W: -110 E: -101 S: 36 N: 42 nx: 416 ny: 278
blockmean: Using a total of 3.53 Mb for all arrays.
blockmean: Working on file paces_colorado_gravity_Bouger_values.xzy
blockmean: Calculating block means
blockmean: N read: 61775 N used: 61775 N outside_area: 0 N cells filled: 21613
Grid the values with the GMT surface program:
surface paces_colorado_gravity_Bouger_blockmean.xzy -R-109/-102/37/41 -I30c
-Gpaces_colorado_gravity_Bouger_values.grd -V
Note that the -R region should cover the same area as, or a smaller area than, the
one given to blockmean, and the -I increment should be the same as, or larger than,
the one used with blockmean.
You can control several parameters of the gridding routine in GMT, such as the
tension; to get good results with your data you need suitable values for the
blockmean and surface control parameters.
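For example (the file names and numbers here are only illustrative), GMT's minmax
program can suggest a -R that just covers your points, and surface's -T option sets
the tension (0 gives the minimum-curvature solution; values toward 1 suppress
overshoots):
minmax -I1 paces_wyoming_gravity_bouger.xyz
surface paces_wyoming_gravity_bouger_blockmean.xyz -R-112/-104/41/45 -I30c -T0.25 -Gpaces_wyoming_gravity_bouger.grd -V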
Results from the surface command above:
surface: Given domain implies x_inc = 0.0216718
surface: Given domain implies y_inc = 0.0216216
surface: W: -109 E: -102 S: 37 N: 41 nx: 323 ny: 185
surface: WARNING: Your grid dimensions are mutually prime.
surface: HINT: Choosing nx = 324, ny = 192 might cut run time by a factor of 107.9177
surface: HINT: Choosing nx = 360, ny = 192 might cut run time by a factor of 103.50099
surface: HINT: Choosing nx = 324, ny = 216 might cut run time by a factor of 101.5696
surface: HINT: Choosing nx = 360, ny = 200 might cut run time by a factor of 98.96357
surface: HINT: Choosing nx = 384, ny = 192 might cut run time by a factor of 98.139326
surface: HINT: Choosing nx = 360, ny = 216 might cut run time by a factor of 92.68345
surface: HINT: Choosing nx = 400, ny = 192 might cut run time by a factor of 91.257045
surface: HINT: Choosing nx = 400, ny = 200 might cut run time by a factor of 89.046667
surface: HINT: Choosing nx = 432, ny = 192 might cut run time by a factor of 87.091478
surface: HINT: Choosing nx = 324, ny = 240 might cut run time by a factor of 86.334161
surface: Minimum value of your dataset x,y,z at:
surface: -106.361 39.2009 -339.29
surface: Maximum value of your dataset x,y,z at:
surface: -102.517 40.9699 -107.975
surface: 496 unusable points were supplied; these will be ignored.
surface: You should have pre-processed the data with block-mean, -median, or -mode.
surface: Grid  Mode  Iteration  Max Change  Conv Limit  Total Iterations
surface:    1     D        250    0.169843   0.0357331               250
surface: Fit info: N data points  N nodes   mean error    rms error  curvature
surface:                   16416    60264  2.75099e-05    0.0202926     3230.9
The ".grd" file made is a NetCDF file.
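If you want to check the grid at this point, the GMT grdinfo program prints its
region, grid spacing, and z range:
grdinfo paces_colorado_gravity_Bouger_values.grd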
To use this NetCDF file in the IDV you need to identify which variables are the
latitude and longitude.
Convert the binary NetCDF file to its ASCII CDL equivalent with the NetCDF
command:
ncdump paces_colorado_gravity_Bouger_values.grd >
paces_colorado_gravity_Bouger_values.cdl
Edit the .cdl file and replace all instances of the variable name "x" with
"lon", and all instances of the variable name "y" with "lat". You may also
rename the field variable "z" to a more informative name, in this case
"bouguer_anomaly".
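As a rough sketch of what that edit looks like (the sizes and attribute lines in
your own ncdump output will differ, and the attributes need the new names too),
lines like
        x = 323 ;
        y = 185 ;
        double x(x) ;
        double y(y) ;
        float z(y, x) ;
become
        lon = 323 ;
        lat = 185 ;
        double lon(lon) ;
        double lat(lat) ;
        float bouguer_anomaly(lat, lon) ;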
Now convert the new CDL file into a ".nc" NetCDF file usable in the IDV with
the NetCDF command:
ncgen -o paces_colorado_gravity_Bouger_values.nc
paces_colorado_gravity_Bouger_values.cdl
This is a normal NetCDF grid file that the IDV can display.
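If you want to confirm the renaming took effect, ncdump -h prints just the header
(dimensions, variables, and attributes) of the new file:
ncdump -h paces_colorado_gravity_Bouger_values.nc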
I have data that consists of latitude, longitude and a value in a text
file. Is it possible for me to generate a display that doesn't show the
locations, but draws contour lines based on this data?
Thanks,
Andy Holland
Air Quality Modeler
URS Corporation
1600 Perimeter Park Drive
Suite 400
Morrisville, NC 27560
--
Stuart Wier
UNAVCO 6350 Nautilus Drive Boulder, CO 80301 (303) 381-7500 x 450
The GEON IDV http://geon.unavco.org
For 4D exploration and display of your geophysics data
The UNAVCO WInSAR Archive http://winsar.unavco.org