There are several important steps that have to be performed after each download of newly acquired LADCP data:
Because paths may change from one cruise to another, we advise that you write your own file containing all the commands specific to your computer environment at the beginning of the cruise. Examples of such a file used in previous cruises are given in Previous Cruises. Put yours in /home/science/current_cruise/scripts for future reference.
The software on the acquisition computer automatically copies the data to a backup location if a directory has been specified. We will usually have a backup disk /media/AC_backup (Acquisition Computer backup) with a directory on it for backup. See these instructions for details about formatting a new disk and setting up such a directory.
We use the Unix command rsync to copy directories for archival or backup purposes. We have written a Python script to make the use of rsync less dangerous for simple directory updates. The script makes the contents of the destination directory the same as the contents of the source directory. Use it as:
usage: copydir.py [options] source dest

e.g.  copydir.py source dest
      copydir.py --test source dest
      copydir.py --ssh username source computer:/path
      copydir.py --ssh username computer:/path dest
The program copydir.py is designed to
- copy the contents of source_dir into the directory dest_dir
- only copy files that are new (or changed)
- delete files in dest_dir that no longer exist in source_dir
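As a rough illustration of these mirror semantics (this is a hypothetical pure-Python sketch, not the actual copydir.py, which wraps rsync), the behavior described above looks like:

```python
import filecmp
import os
import shutil

def mirror(src, dest):
    """Make dest an exact mirror of src: copy files that are new or
    changed, and delete files in dest that no longer exist in src
    (the same effect as rsync with --delete)."""
    os.makedirs(dest, exist_ok=True)
    src_names = set(os.listdir(src))
    # delete entries in dest that are gone from src
    for name in set(os.listdir(dest)) - src_names:
        path = os.path.join(dest, name)
        shutil.rmtree(path) if os.path.isdir(path) else os.remove(path)
    for name in src_names:
        s, d = os.path.join(src, name), os.path.join(dest, name)
        if os.path.isdir(s):
            mirror(s, d)
        elif not os.path.exists(d) or not filecmp.cmp(s, d, shallow=False):
            shutil.copy2(s, d)  # copy only new or changed files
```

The real script delegates this work to rsync, which is much faster for large data directories; the sketch only shows what "make dest the same as source" means.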
Type man rsync and rsync --help for more information about rsync. It is useful to know about rsync as a tool for your own use.
The presence or absence of a trailing slash (“/”) after the source directory affects the behavior of the command: with rsync, “source/” copies the contents of source into the destination, while “source” (no slash) creates a directory named source inside the destination. Test before you use it, either way!!
At this stage the LADCP data have been downloaded from the LADCP to the acquisition computer and the LADCP operator has moved from that computer to the processing computer to do the rest of the work.
The first thing to do before processing the LADCP data is to copy the data files from the acquisition computer (probably located in the wet lab) to the processing computer (where you will sit, probably located in the computer lab).
Proceeding from the processing computer, execute:
copydir.py path_to_ladcp_data /home/science/current_cruise/data/ladcp
where path_to_ladcp_data is the full path of the directory containing the raw LADCP data on the acquisition computer. The path path_to_ladcp_data may vary from cruise to cruise. For example, during the cruise I5S_2009:
copydir.py /net/nini.local/home/science/current_cruise_logging /home/science/current_cruise/data/ladcp

# where
#   "nini" was the acquisition computer
#   (known on the network as "nini.local")
#
#   "/net/nini.local" is the path to exported directories on
#   "nini" using the "autofs" protocol
The next step is to download the ancillary data necessary to process the LADCP data. In principle, you could process the data with the LDEO software without any ancillary data (see Section 2.2 in LDEO How-To [PDF]) but that produces processed data of poor quality. At least three ancillary datasets are necessary: CTD time series, CTD vertical profile (2dbar data) and GPS [see more details on ancillary data in LDEO How-To [PDF] and Section Which ancillary data do we need?].
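Before starting the processing it is worth checking that all three ancillary datasets are actually in place. A minimal sketch of such a check (the directory layout is taken from this document's conventions; the "gps" path is an assumption and may differ on your cruise):

```python
import os

# Directory layout used in this document (per-cruise conventions;
# the gps path in particular is an assumption)
CRUISE = "/home/science/current_cruise"
ANCILLARY = {
    "CTD time series":    os.path.join(CRUISE, "data", "ctd_timeseries"),
    "CTD 2dbar profiles": os.path.join(CRUISE, "data", "ctd_2db"),
    "GPS":                os.path.join(CRUISE, "data", "gps"),
}

def missing_ancillary(dirs=ANCILLARY):
    """Return the names of ancillary datasets whose directories are
    absent or empty; all three are needed for good-quality processing."""
    return [name for name, path in dirs.items()
            if not os.path.isdir(path) or not os.listdir(path)]
```

Running `missing_ancillary()` after each download and before processing gives an early warning that a copy step was skipped.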
For some cruises, the GPS data are included in the CTD data. In this case, the LDEO processing needs only the CTD data, but we ask that you download the GPS data separately in case there is a problem with the CTD data.
The SCRIPPS/ODF group provides two kinds of CTD data that can be used to improve the LADCP processing:
- 2dbar “final” profiles – this is the usual CTD product
- 1/2 second (or 1-second) time series of CTD data.
Identify the source locations of these datasets and back them up to the following locations:
copydir.py ctd_2dbar_source /home/science/current_cruise/data/ctd_2db
copydir.py ctd_timeseries_source /home/science/current_cruise/data/ctd_timeseries
Here is an example of the strategy on I5S_2009.
It is possible that these products don’t have the right format to be used directly for the LADCP processing. In this case, they need to be re-formatted. For instance, during A13.5 (2010), we were given the full time series of CTD data, not a 1-s averaged one. The 1-s averaged time series were then produced independently. Look in previous cruises for more information.
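As an illustration of that kind of re-formatting, here is a minimal sketch of averaging a higher-rate time series into 1-second bins (the record format, simple (time_in_seconds, value) pairs, is hypothetical; real CTD files carry several columns per scan):

```python
from collections import defaultdict

def one_second_average(records):
    """Average (time_s, value) samples into 1-second bins, keyed by
    the integer second; returns sorted (second, mean) pairs."""
    bins = defaultdict(list)
    for t, v in records:
        bins[int(t)].append(v)
    return [(sec, sum(vals) / len(vals))
            for sec, vals in sorted(bins.items())]
```

For multi-column CTD scans the same binning is applied column by column; the point is only that the time base of the product handed to the LADCP processing must match what the software expects.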
On most CLIVAR cruises there will be a UHDAS shipboard ADCP system installed (see below). UHDAS records GPS data. If you copy the UHDAS shipboard ADCP data, you do not need to copy GPS data separately.
On some (Scripps) CLIVAR cruises, longitude and latitude are included in the CTD time series. In that case, you do not need to use separate GPS data at all.
On some cruises (NOAA), longitude and latitude are not included in the CTD timeseries, so you need to transform the UHDAS shipboard GPS data into an ASCII format that can then be used for LADCP processing. To do that, use the Matlab routine gps_rbin_to_asc.m located in current_cruise_data/data/gps/. There are four changes to do in this default file and these changes are detailed in the routine itself.
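The actual conversion is done by that Matlab routine. Purely as an illustration of the kind of transformation involved, here is a hypothetical Python sketch that unpacks fixed-size binary GPS records and writes ASCII lines; the record layout (three little-endian doubles: time, longitude, latitude) and the output format are assumptions, not the real rbin format:

```python
import struct

# Hypothetical record layout: time, lon, lat as little-endian doubles
REC = struct.Struct("<3d")

def rbin_to_ascii(binary_path, ascii_path):
    """Unpack fixed-size binary GPS records and write one
    'time lon lat' line per fix (illustrative format only)."""
    with open(binary_path, "rb") as f, open(ascii_path, "w") as out:
        while chunk := f.read(REC.size):
            t, lon, lat = REC.unpack(chunk)
            out.write(f"{t:.1f} {lon:.5f} {lat:.5f}\n")
```

Consult the comments in gps_rbin_to_asc.m itself for the four changes to make and for the format the LADCP processing actually expects.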
If a CLIVAR cruise does not have a UHDAS installation, instructions will be provided in a timely manner based on the specifics of the cruise.
The Univ. Hawaii “currents” group (and hence the LADCP operator) is also responsible for the Shipboard Acoustic Doppler Current Profiler data. The shipboard ADCP data may be acquired by one of two packages:
copydir.py /net/currents/home/data/current_cruise /home/science/current_cruise/data/shipboard_adcp
# if there are two instruments, WH300 and OS75, put the data here:
/home/science/current_cruise/data/shipboard_adcp/WH300
/home/science/current_cruise/data/shipboard_adcp/OS75
Although the LADCP processing can in principle be performed using only the raw LADCP data, the result will be of poor quality, especially with respect to the barotropic component of the flow and the estimated depth. For these reasons, ancillary data are used to constrain the processing.
Two types of CTD data are used: the CTD time series and the CTD vertical profile. The CTD time series helps to correct the estimate of depth. The CTD time series alone is sufficient for the LDEO or UH processing to produce its own CTD vertical profile; if 2dbar data are available, the LDEO software will replace the T,S profiles it produced with the 2dbar profiles, since they are usually of higher quality. GPS data are very important: together with the technique of bottom tracking, they are used principally to constrain the barotropic component of the flow. They are also used to estimate the magnetic declination.
On CLIVAR cruises, Shipboard Acoustic Doppler Current Profilers (SADCP) measure the speed of the currents in the upper 800 m, and the LDEO software can use their data to further constrain the estimate of the barotropic flow. However, SADCP data have not yet been used systematically for processing, and at this stage we do not require that they be used, although the SADCP data do need to be saved (see Data to bring back from the cruise). The E. Firing lab is responsible for processing the SADCP data.
The primary diagnostic program is:
There may be other datasets on the cruise that are not required for LADCP processing and that we are not responsible for. These data can nevertheless provide interesting context for scientific interpretation. Get them if you can; if you cannot, they will be available on the CLIVAR web site later.
The ship’s logging system will have underway data (Meteorological sensors: temperature, wind speed and direction) logged every second or so. These can be useful data to have when interpreting upper ocean data. Find the location of these data (look on the ship’s web site or ask the computer tech on the ship) and make sure you bring these data home too.
Once all necessary data have been downloaded, the raw LADCP data processed, and the figures plotted and saved, the entire logging directory /home/science/current_cruise_logging on the acquisition computer and the processing directory /home/science/current_cruise on the processing computer need to be saved on every other computer and external disks.
For instance, proceeding from the processing computer:
# copy the processing directory to a destination
copydir.py /home/science/current_cruise destination
This would copy the contents of the entire LADCP CLIVAR cruise directory to the destination directory. That directory could reside on an external disk on either (acquisition or processing) computer, as long as there is permission to write.
To back up the entire LADCP CLIVAR cruise directory from the processing computer to the acquisition computer nini (assuming permission to write), use the command below. It makes the entire LADCP CLIVAR cruise directory identical on the two computers, using a combination of ssh and rsync.
copydir.py -essh science /home/science/current_cruise nini:/home/science/current_cruise

# You will be asked for the password of "science" on nini
Depending on the actual context, the exact paths to the other computer and external disks may vary. See examples in Previous Cruises.
It is important to keep a detailed journal that contains information about what the operator has done on the cruise. By keeping track of problems and their solutions, instrument configuration, and anomalous behavior, an operator on a future cruise may avoid a similar mistake or solve a recurring problem.
Important information includes:
- for each cast
- serial number of instruments used
- processing steps done
- configuration used
- anomalies in the cast
- in general
- interesting observations in the data
- changes in software configuration
- changes in instrumentation or cabling
- any odd behavior
- problems and solutions
Examples can be found in previous_cruises on the LADCP processing computer.