UHDAS refers to a suite of programs and processes developed at the University of Hawaii that perform data acquisition, data processing, and monitoring at sea. In addition, access to documentation and code is provided on the ship’s network. We have tried to make a system that is useful and reliable, easy to operate, and that provides as close to a final dataset as is reasonably automatable, while maintaining the fundamentals necessary to reprocess the data from scratch if necessary.
On the WWW, full documentation can be found here: http://currents.soest.hawaii.edu.
At sea, the same documentation starts here.
UHDAS has four components at sea:
- Data acquisition
- Processing
- Monitoring
- Access (to data and figures)
Data acquisition programs are written in C, and the GUI and supporting code are written in C and Python.
Data acquisition includes:
- a dialog with each of the RDI ADCPs to set parameters and start pinging
- acquisition and timestamping of passive serial inputs

Data collected are:
- binary records (from ADCP ensembles)
- NMEA strings (from serial inputs)

NMEA data recorded usually comprise:
- GGA messages (GPS) from two sources if possible
- gyro heading
- accurate heading (POSMV, Ashtech, Seapath, Mahrs, Phins, ... if available)

Files roll over every two hours. Timestamps are zero-based decimal days (Jan 1, 12:00 UTC is 0.5, not 1.5). In the past, all but the two most recent ASCII files were compressed to save space; in more modern installations, files are left uncompressed. A parsed version of each NMEA string is added to a set of intermediate files (“rbin files”) to stage information for the processing component.
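The zero-based decimal-day convention can be sketched in Python. This is a minimal illustration of the timestamp arithmetic, not the UHDAS code itself; the function name is hypothetical.

```python
from datetime import datetime, timezone

def to_decimal_day(t, year):
    """Zero-based decimal day: Jan 1, 00:00 UTC of `year` maps to 0.0."""
    t0 = datetime(year, 1, 1, tzinfo=timezone.utc)
    return (t - t0).total_seconds() / 86400.0

# Jan 1, 12:00 UTC is day 0.5 (zero-based), not 1.5
noon_jan1 = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
dday = to_decimal_day(noon_jan1, 2024)  # -> 0.5
```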
Processing code is written in C and Python (phasing out a long reliance on Matlab). Final processed output is written as Matlab files and NetCDF files on a regular basis. Processing is done using a CODAS database (Common Ocean Data Access System) as the storage and retrieval system. The suite of programs designed to extract from, manipulate, and write to the database is known as “CODAS ADCP Processing” and has been free, maintained, and in use since the late 1980s. (See the CODAS Processing section for more detail.)
In batch mode, CODAS processing can be applied to single-ping data gathered by UHDAS (or the commercial RDI software “VmDAS”), or to averaged data collected by VmDAS or the original DAS2.48 (used with Narrowband ADCPs from the late 1980s through the 1990s).
At sea, a UHDAS installation acquires data and uses CODAS processing to calculate ocean velocities from ADCP-measured velocities, position, and heading (gyro, corrected to an accurate heading if one is available). The following three levels of processing, combined, are called CODAS Processing:
- average the single-ping data:
  - make sure every ADCP ping has a position and a heading
  - gather the next T seconds of data (e.g. 300 seconds)
  - screen the ADCP data to eliminate bad values (e.g. acoustic interference)
  - average in earth coordinates
  - write to disk
- load measured velocities into the database
- add navigation to the database
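The gather-and-average cycle above can be sketched as follows. This is a simplified stand-in for the real single-ping processing: the function name is hypothetical, and `None` velocities stand in for pings rejected by screening.

```python
from collections import defaultdict

def average_ensembles(pings, T=300.0):
    """Average single-ping (east, north) velocities into T-second ensembles.

    `pings` is an iterable of (t_seconds, u, v); pings whose velocity is
    None represent values rejected by single-ping screening.
    Returns {ensemble_index: (u_mean, v_mean)}.
    """
    bins = defaultdict(list)
    for t, u, v in pings:
        if u is None or v is None:   # screening stand-in: drop bad pings
            continue
        bins[int(t // T)].append((u, v))
    return {k: (sum(u for u, _ in uv) / len(uv),
                sum(v for _, v in uv) / len(uv))
            for k, uv in bins.items()}
```

For example, pings at t = 0 s and t = 100 s fall into ensemble 0, while a ping at t = 400 s falls into ensemble 1 (with T = 300 s).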
The following steps are automated on a ship running UHDAS, but can also be done afterwards with human intervention:
- correct the gyro heading to the accurate heading device (if there is one)
- apply a scale factor if specified (e.g. NB150)
- apply an additional fixed rotation if specified
- edit out bad bins or profiles (e.g. data below the bottom)
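Applying a scale factor and a fixed rotation to an (east, north) velocity amounts to one complex multiplication. A minimal sketch follows; the function name and sign convention are illustrative assumptions, not the CODAS implementation.

```python
import cmath
import math

def apply_calibration(u, v, scale=1.0, rotation_deg=0.0):
    """Scale an (east, north) velocity and rotate it clockwise by
    `rotation_deg` degrees (compass convention: a positive angle
    turns the vector from north toward east)."""
    w = scale * complex(u, v) * cmath.exp(-1j * math.radians(rotation_deg))
    return (w.real, w.imag)
```

A 90-degree clockwise rotation turns a northward velocity (0, 1) into an eastward one (1, 0); a scale factor of 1.5 stretches (2, 0) to (3, 0).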
UHDAS Enhancements to CODAS Processing
UHDAS adds steps to the basic processing at sea by regularly extracting processed, corrected, edited data for scientists to use during the cruise. These data, and the figures generated from them, are available on the ship’s web site.
every 5 minutes:
- get the last 5 minutes of new data
- rotate to earth coordinates using the gyro as the primary heading device
- correct to the “accurate heading device” (if one exists)
- edit single-ping data (for this 5-minute chunk)
- average and write to disk (staging for addition to the CODAS database)
- save the 5-minute chunk of data as a Matlab file (for plotting)
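The correction to the accurate heading device relies on estimating the angular offset between the two heading sources over an interval. A minimal, wraparound-safe sketch is shown below; the function is hypothetical, and the real processing also screens heading quality.

```python
import cmath
import math

def heading_offset(gyro_deg, accurate_deg):
    """Mean angular difference (accurate - gyro) in degrees, computed on
    the unit circle so that wraparound (e.g. 359 vs 1) is handled correctly."""
    total = sum(cmath.exp(1j * math.radians(a - g))
                for g, a in zip(gyro_deg, accurate_deg))
    return math.degrees(cmath.phase(total))
```

For example, gyro headings of 359, 0, and 1 degrees against accurate headings of 1, 2, and 3 degrees yield an offset of 2 degrees, rather than the nonsense a naive arithmetic mean of differences would give across the 360/0 boundary.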
every 15 minutes:
- the CODAS database is updated with the staged averages
- scale factor and fixed rotation are applied if specified
- the averages in the database are edited (looking for bad bins, bad profiles, and the bottom)
- after the CODAS database is updated:
  - the data are extracted and averaged (for plotting)
  - the data are extracted with “every bin, every profile”
  - data are stored as Matlab and netCDF files, accessible via the ship’s web site, Windows shares (Samba), or NFS
  - vector and contour plots of the last 3 days of data are updated, also available on the ship’s web site
Monitoring programs are written in Python and make use of Linux system calls.
Monitoring takes two forms:

A daily email is sent to land with diagnostic information about:
- processes running, disk space, error messages
- data processing status
- heading correction quality
- the last 3 days of heavily averaged data (vector plot)
At sea, the following are available on the ship’s web site:
- most recent 5-minute profiles of all instruments that are pinging
- the last 3 days of data shown as contour and vector plots
- the last half-day of gyro and “accurate heading”
Two fundamental access mechanisms exist:
- ship’s web (usually http://currents)
- figures (direct link at sea)
- most recent ocean velocity profile for each instrument
- 3-day tail with surface velocity vectors
- 3-day tail contour plot (vs time, longitude, or latitude)
- data (direct link at sea)
- all data so far, averaged in thick layers (e.g. 50 m) over 1 hour, for vector plots
- all data so far, averaged in thinner layers (e.g. 10 m) over 15 minutes, for contour plots
Accessible via a shared network drive are:
- an archive (subset) of the web figures
- the same averaged data that are available via the ship’s web site
- “every bin, every profile”, i.e. the highest resolution of the processed data, in Matlab or netCDF format, or the actual CODAS database itself (accessible with Python)
- single-ping ADCP data
The section about accessing ADCP data describes Matlab and Python methods for reading ADCP data.