JGOFS Data System Overview

Glenn R. Flierl, MIT
James K.B. Bishop, LDEO
David M. Glover, WHOI
Satish Paranjpe, LDEO

Introduction

Large oceanographic programs such as JGOFS (the Joint Global Ocean Flux Study) require data management systems that enable the exchange and synthesis of extremely diverse and widely dispersed data sets. We have developed a distributed, object-based data management system for multidisciplinary, multi-institutional programs. It allows all JGOFS scientists to work with the data without regard for the storage format or the actual location where the data reside. The approach yields a powerful and extensible system (in the sense that data manipulation operations are not predefined) for managing and working with data from large-scale, ongoing field experiments.

In the "object-based" system, user programs obtain data by communicating with a program (the "method") that can interpret the particular database. Because the communication protocol is standard and can be passed over a network, user programs can obtain data from any data object anywhere in the system. Database operations and data transformations are handled by methods that read from one or more data objects, process that information, and write the results to the user program.
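As a concrete illustration, the sketch below shows one way such an object/method pair could look. It is not the actual JGOFS protocol or software; the port number, request syntax, and flat-file layout are assumptions made for the example. The "method" is a small server that interprets one data object (here, a hypothetical tab-separated file) and streams the requested variables to any program that connects, locally or over the network.

# A minimal sketch of the object/method access pattern described above.
# All names (port, request syntax, file layout) are hypothetical and are
# not the actual JGOFS protocol.

import socketserver

DATA_FILE = "bottle_data.tsv"   # hypothetical flat-file data object

class MethodHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Request: one line naming the variables wanted, e.g. "depth temp"
        wanted = self.rfile.readline().decode().split()
        with open(DATA_FILE) as f:
            header = f.readline().rstrip("\n").split("\t")
            cols = [header.index(v) for v in wanted if v in header]
            # Reply: the selected header, then one tab-separated record per line
            self.wfile.write(("\t".join(header[i] for i in cols) + "\n").encode())
            for line in f:
                fields = line.rstrip("\n").split("\t")
                self.wfile.write(("\t".join(fields[i] for i in cols) + "\n").encode())

if __name__ == "__main__":
    # Any program that speaks this simple line protocol can read the object
    # without knowing its storage format or location.
    with socketserver.TCPServer(("", 9990), MethodHandler) as server:
        server.serve_forever()

A user program would connect to the method, send a line such as "depth temp", and read back the selected header and records; the same exchange works whether the data object is on the local machine or on a remote server, and regardless of how the underlying file is formatted.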

Purpose

Permit scientists to use data without concern for storage technique, location, or format
Support networked interchange of data sets
Provide access to the most recent versions of data sets during experiments
Handle multidimensional data
Transmit metadata
Allow extensible data manipulation routines (a sketch follows this list)
Be usable interactively or from programs
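To show what "extensible" means in practice, here is a hedged sketch of a manipulation method. Because operations are not predefined, adding one amounts to writing another small program that reads records from an upstream data object and writes records downstream to the user program. The column name "depth", the cutoff value, and the tab-separated record format are assumptions for the example, not part of the JGOFS specification.

# A hypothetical manipulation method: a filter that keeps only records
# shallower than a cutoff depth. It reads records from an upstream object
# on standard input and writes the surviving records to standard output.

import sys

def shallow_filter(src, dst, max_depth=200.0):
    # Read the header from the upstream object and locate the (assumed) depth column
    header = src.readline().rstrip("\n").split("\t")
    idepth = header.index("depth")
    dst.write("\t".join(header) + "\n")
    # Pass through only the records at or above the cutoff depth
    for line in src:
        fields = line.rstrip("\n").split("\t")
        if float(fields[idepth]) <= max_depth:
            dst.write(line)

if __name__ == "__main__":
    shallow_filter(sys.stdin, sys.stdout)

A filter of this kind could sit between any data object and any user program, or be chained with other methods, mirroring the description above of methods that read from data objects, process the information, and write to the user program.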

More Information

Glenn R. Flierl, glenn@lake.mit.edu