Hi,
I've been implementing "filters" for our LaTiS server
(http://lasp.colorado.edu/lisird/tss.html) using the OPeNDAP (DAP2) URL
"function" syntax. For example,
http://lasp.colorado.edu/lisird/tss/american_sunspot_number.csv?&thin(10)&format_time(yyyy-MM-dd)
It's not ideal, but it's a start. DAP2 says very little about functions.
Maybe DAP4 could say more.
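For concreteness, here is a minimal sketch (in Python) of how a query string in that style could be split into (function, arguments) pairs. The parsing rules are illustrative assumptions, not LaTiS's actual grammar:

from urllib.parse import urlparse, unquote
import re

def parse_filters(url):
    """Return (name, args) tuples from '&f(a,b)&g(c)'-style query strings."""
    calls = []
    for part in urlparse(url).query.split('&'):
        m = re.match(r'(\w+)\((.*)\)$', unquote(part).strip())
        if m:
            name, argstr = m.groups()
            args = [a.strip() for a in argstr.split(',')] if argstr else []
            calls.append((name, args))
    return calls

print(parse_filters("http://lasp.colorado.edu/lisird/tss/"
                    "american_sunspot_number.csv?&thin(10)&format_time(yyyy-MM-dd)"))
# [('thin', ['10']), ('format_time', ['yyyy-MM-dd'])]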
I'd also strongly suggest that people think about Functional Programming
paradigms. I suspect that such formalisms will lend themselves to this
problem better than an OO-biased approach would.
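A tiny illustration of the idea, just a sketch using the same filter names as the URL above, treated as pure functions over a stand-in record list:

from functools import reduce

def thin(n):
    # keep every n-th record
    return lambda records: records[::n]

def format_time(fmt):
    # placeholder: a real filter would reformat the time column using fmt
    return lambda records: records

def compose(*filters):
    # left-to-right composition: compose(f, g)(x) == g(f(x))
    return lambda data: reduce(lambda acc, f: f(acc), filters, data)

pipeline = compose(thin(10), format_time("yyyy-MM-dd"))
print(pipeline(list(range(100))))   # every 10th value of a toy dataset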
Doug
On 6/18/12 3:56 PM, Roy Mendelssohn wrote:
But for a useful service, the form and syntax of the URL should be
independent of the mechanism that does the server-side calculations
(which rules out SWAMP). So, for example, both GrADS and F-TDS use the
same format in the URL to say that "this is an expression", but the
expressions themselves are platform-specific. That is not the way to
get overarching services.
Instead, we need agreement on how we signal a server-side function in
the URL request, on the syntax of that function (independent of the
engine underneath), and on a few simple functions to start with (say,
simple data transformations, differencing, and averaging over a
dimension or dimensions). Then the server back end can parse the
request and use Ferret or GrADS or NCO or Python or whatever is
desired, and, as with any good service, the back end could change
without having any effect on the URL or the user.
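For illustration only (the engine names and command templates below are assumptions, not a proposal for the actual grammar), the back-end dispatch might look like:

# map an engine-neutral operation onto whichever back end is installed
ENGINES = {
    "average": {
        "nco":    lambda var, dim, src, dst: ["ncwa", "-a", dim, "-v", var, src, dst],
        "ferret": lambda var, dim, src, dst: ["ferret", "-script", "average.jnl"],  # script generation elided
    },
}

def plan(operation, engine, **kw):
    # translate an engine-neutral request into a concrete back-end command
    return ENGINES[operation][engine](**kw)

print(plan("average", "nco",
           var="tas", dim="time", src="model.nc", dst="tas_mean.nc"))
# ['ncwa', '-a', 'time', '-v', 'tas', 'model.nc', 'tas_mean.nc']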
I know Matthew Arrott at least used to like the approach in Chapter
12 of "Python Scripting for Computational Science", but a lot of that
is the engine underneath. I am more interested in the form of the
URL. Get some agreement on that, and some real implementation could
proceed.
-Roy
On Jun 18, 2012, at 2:37 PM, Russ Rew wrote:
Jeff,
However, we have to keep in mind performance ramifications. It
still takes a long time to move gigabytes of data across a
network. This brings up the importance of moving the computation
to the data, instead of moving the data to the computation. For
some data sets and many use cases, remote access to data works
very well, so things like brokering are tractable. However, for
*big* data sets (e.g., climate model output) we need to come up
with richer mechanisms (like NCO running on local data) to bring
computation to the data.
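As a small illustration (the URL and variable name below are hypothetical), an OPeNDAP client already does a limited form of this by turning an index slice into a server-side constraint, so only the requested slab crosses the network:

from netCDF4 import Dataset

# open a remote dataset via OPeNDAP; only metadata is transferred here
ds = Dataset("http://example.org/thredds/dodsC/climate/model_output.nc")
tas = ds.variables["tas"]

# the slice becomes a constraint expression, so only this slab is moved,
# not the full multi-gigabyte variable
first_timestep = tas[0, :, :]
print(first_timestep.shape)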
See Daniel Wang's SWAMP (the Script Workflow Analysis for
MultiProcessing), built on top of NCO:
https://code.google.com/p/swamp/
--Russ
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue, Pacific Grove, CA 93950-2097
e-mail: Roy.Mendelssohn@xxxxxxxx (Note new e-mail address)
voice: (831)-648-9029  fax: (831)-648-8440  www: http://www.pfeg.noaa.gov/
"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected"
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
_______________________________________________
thredds mailing list
thredds@xxxxxxxxxxxxxxxx
For list information or to unsubscribe, visit: http://www.unidata.ucar.edu/mailing_lists/