I'd also keep in mind that ObjectOutputStream's approach - storing and
retrieving the instance attributes and then re-attaching the class
definition on deserialization - could be sufficient. Databases won't
match the performance of a lean object cache.
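For concreteness, here is a minimal sketch of that round trip using plain JDK
serialization. DatasetMetadata is a hypothetical stand-in for the metadata one
would cache, not part of the NetCDF-Java API; note that on readObject() the JVM
re-attaches the class definition by name, so the class must be on the reader's
classpath:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for the cached metadata (NOT the NetcdfDataset API).
class DatasetMetadata implements Serializable {
    private static final long serialVersionUID = 1L;
    final String variableName;
    final List<String> axisNames;

    DatasetMetadata(String variableName, List<String> axisNames) {
        this.variableName = variableName;
        this.axisNames = axisNames;
    }
}

public class SerializationSketch {

    // Serialize to a byte array and immediately deserialize again;
    // in practice the bytes would go to a file for short-term storage.
    static DatasetMetadata roundTrip(DatasetMetadata original) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(original);
        }
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray()))) {
            return (DatasetMetadata) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        DatasetMetadata copy = roundTrip(
            new DatasetMetadata("sst", Arrays.asList("time", "lat", "lon")));
        System.out.println(copy.variableName + " " + copy.axisNames);
    }
}
```

The attribute values survive the round trip, but only as long as the class
definition itself is unchanged (or compatible under the serialVersionUID rules),
which is why this fits short-term caching better than long-term archiving.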
Nils Hoffmann wrote:
Hi Jon and also to the list,
Jon Blower wrote:
Hi Nick and John (and list),
I was about to write pretty much exactly what Nick just wrote: the
serialization would be for short-term storage, so I don't think there's
a problem there (although I guess some people might try to misuse the
serialization capability if it were provided).
My _guess_ is that there is no need to keep file handles open, and that
the overhead of opening a NetcdfDataset is mostly in reading the
metadata into the relevant structures (Attributes, Axes etc.),
particularly if the dataset is an aggregation (in fact it was
aggregations I had in mind when I suggested the persistent store in the
first place). However, some testing would be needed of course - I don't
really know.
Maybe the metadata could be stored in some way other than
serialization. We have done some analogous experiments with using a
relational database to store metadata, but the problem is that the
schema gets complicated once you start dealing with projections and the
like. Object-relational mapping could help to manage the complexity,
but it won't be trivial to code (e.g. one could use Hibernate over an
embedded database like H2).
Maybe, instead of going through the hassle of setting up an O/R
mapping, you might want to have a look at http://www.db4o.com/, an
object database for Java. It does not require class annotations or XML
mapping descriptions, so it is rather quick to set up, although I must
admit that I haven't tested its performance against any other solution
yet. It also allows the database to be local, e.g. in a single file,
and even tolerates changes to an object's class to some degree.
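For comparison, a minimal sketch of what storing such a metadata object in db4o
could look like. This assumes the db4o jar is on the classpath and uses the
embedded API of the later db4o releases (earlier versions used Db4o.openFile
and set() instead of store()); DatasetMetadata is again a hypothetical plain
class and the file name is illustrative:

```java
import com.db4o.Db4oEmbedded;
import com.db4o.ObjectContainer;
import com.db4o.ObjectSet;

// A hypothetical plain class; db4o needs no annotations or mapping files.
class DatasetMetadata {
    String variableName;
    DatasetMetadata(String variableName) { this.variableName = variableName; }
}

public class Db4oSketch {
    public static void main(String[] args) {
        // The database is just a local file.
        ObjectContainer db = Db4oEmbedded.openFile(
                Db4oEmbedded.newConfiguration(), "metadata.db4o");
        try {
            db.store(new DatasetMetadata("sst"));

            // Query by example: fields left null act as wildcards.
            ObjectSet result = db.queryByExample(new DatasetMetadata("sst"));
            while (result.hasNext()) {
                DatasetMetadata m = (DatasetMetadata) result.next();
                System.out.println(m.variableName);
            }
        } finally {
            db.close();
        }
    }
}
```

Compared with the Hibernate/H2 route, there is no schema and no mapping layer
to maintain, which is the main attraction when the object model (projections,
axes and so on) is complicated.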