From:
Dr. T. V. Ramana Murty
Scientist
National Institute of Oceanography
176, Lawsons Bay Colony
Visakhapatnam - 530 017
Andhra Pradesh, India
Dear Sir,
Recently we downloaded the POM model from the website http://sea-mat.whoi.edu,
which works under the MATLAB environment. It provides many functions; an
example is given below:
1) function [var] = get_point(file, vname, iindex, jindex, kindex, itime)
   file ..... the name of the netCDF file
   vname .... the name of the netCDF variable
   iindex ... i-index of the point
   jindex ... j-index of the point
   etc.
Question No. 1: I have a data file (temp.dat) in the DOS environment. temp.dat
contains temperature data (30 rows and 17 columns). I am reading the same data
in Fortran with the statement: read(1,*)((a(i,j),j=1,17),i=1,30). For models
such as the one above, I have to supply a netCDF file. How can I convert
temp.dat into a netCDF file?
Question No. 2: I have some software written in Fortran. I would like to run
that software in the MATLAB environment. How?
Example:
      program summat
c     sum of the elements of a given matrix
      dimension a(30,17)
      open(1,file='temp.dat',access='sequential')
      open(2,file='temp.out',access='sequential')
      read(1,*) n,m
      do 10 i=1,n
   10 read(1,*) (a(i,j),j=1,m)
      call sum(a,n,m,sum1)
      write(2,*) 'sum of elements of a matrix =',sum1
      stop
      end

      subroutine sum(a,n,m,sum1)
      dimension a(30,17)
      sum1=0.0
      do 10 i=1,n
      do 10 j=1,m
   10 sum1=sum1+a(i,j)
      return
      end
I would like to compile and run the same Fortran program in MATLAB. How?
I would also like to supply the input to subroutine sum as a netCDF file. How?
Note: I am a beginner.
ncdigest wrote:
> ncdigest Tuesday, July 9 2002 Volume 01 : Number 644
>
> Today's Topics:
> Re: Problems creating a large array
> Re: Problems creating a large array
>
> ----------------------------------------------------------------------
>
> Date: Tue, 9 Jul 2002 11:10:24 -0600
> From: "John Caron" <caron@xxxxxxxxxxxxxxxx>
> Subject: Re: Problems creating a large array
>
> ----- Original Message -----
> From: "Mark A Ohrenschall" <Mark.A.Ohrenschall@xxxxxxxx>
> To: <netcdfgroup@xxxxxxxxxxxxxxxx>
> Sent: Monday, July 08, 2002 6:51 PM
> Subject: Problems creating a large array
>
> > Hello,
> >
> > I'm trying to load a 21600 by 43200 array into netCDF -- I succeeded
> > (barely) for a byte array, but am running out of memory for a short
> > array. I'm using the Java API and am using the -Xms and -Xmx parameters
> > to give (or try to give) the required memory to the Java VM:
> >
> > java -cp /home/mao/java:/home/mao/java/netcdf2.jar -Xms2048m -Xmx2048m grid2nc.globe
> > Error occurred during initialization of VM
> > Could not reserve enough space for object heap
> >
> > When I try smaller numbers I can start the VM but I then get an out of
> > memory exception.
> >
> > How can I load such a large array into netCDF?
> >
> > Thanks in advance,
> >
> > Mark
>
> NetCDF is really an API for out-of-memory storage, i.e. disk files. What it
> does is allow you to move data efficiently between disk and memory. So
> instead of moving your entire array into memory, you want to move just
> pieces of it. The art of this kind of programming is to read the right
> amount of data that will fit into memory, and operate on it as much as
> possible, before you have to get the next piece.
>
> No matter how much internal memory you can afford, you will always have
> files bigger than that, so you have to think in terms of subsetting the
> array.
>
> If you absolutely have to have it all in memory, then you have to buy more
> memory. You can try various clever compression schemes, but these are not
> part of NetCDF.
>
> ------------------------------
>
> Date: Tue, 09 Jul 2002 15:06:57 -0600
> From: "Mark A Ohrenschall" <Mark.A.Ohrenschall@xxxxxxxx>
> Subject: Re: Problems creating a large array
>
>
> Great idea, John! By setting the dimension size of latitude to unlimited and
> growing the array row by row, I was able to load the 21600 by 43200 grid into
> netCDF:
>
> [mao@panther dods]$ ncdump -h globe.nc
> netcdf globe {
> dimensions:
>         lat = UNLIMITED ; // (21600 currently)
>         lon = 43200 ;
> variables:
>         float lat(lat) ;
>                 lat:units = "degrees_north" ;
>         float lon(lon) ;
>                 lon:units = "degrees_east" ;
>         short globe(lat, lon) ;
>                 globe:long_name = "GLOBE 30-second DEM" ;
>
> // global attributes:
>                 :title = "GLOBE 30-second DEM" ;
>                 :FYI = "http://www.ngdc.noaa.gov/seg/topo/globe.shtml" ;
>                 :Conventions = "COARDS" ;
> }
>
> Thanks again,
>
> Mark
>
>
> ------------------------------
>
> End of ncdigest V1 #644
> ***********************