2.8.9. Understanding Editing (Dataviewer + CODAS)¶
2.8.9.1. Editing and Resetting¶
Dataviewer.py has two kinds of editing: threshold editing and manual editing. Both append data to ascii files. These files contain the timestamp of each bad profile and information about the flagging.
Manual Editing output files
If you use the Editing tools to manually select bins or profiles, flagging information is appended to *.asc files when you push the “Save Manual Editing” button. (If you close the editing window without pushing “Save Manual Editing”, those edits will not be recorded.)
When you click “Apply Editing” the flagging information in these files is applied to the database, logged, and then the files are deleted.
There are five manual editing buttons:
profile – flag bad profiles
rzap bins – rectangular region selection
lasso – freehand selection
pzap bins – polygon region selection
bottom – identify the bottom (use amplitude to see the bottom)
editing type | ascii file name | file contains ...
---|---|---
profile | hspan_badprf.asc | timestamp of whole profile to flag
rectangle | rect_badbin.asc | timestamp and list of bins to flag
polygon | poly_badbin.asc | timestamp and list of bins to flag
lasso | lasso_badbin.asc | timestamp and list of bins to flag
bottom | zap_bottom.asc | bin identified as ‘bottom’
Threshold Editing output files
- The effects of the threshold editing are shown if you:
  - click “Show” (or right arrow)
  - have Masking set to “all”
- When “Apply Editing” is clicked, the following things happen:
  - the files below are created, with threshold flagging information
  - all *.asc files (these, and the ones above from Manual Editing) are applied to the database
  - all lines from *.asc files are appended to corresponding *.asclog files for later retrieval, if required
  - all *.asc files are deleted
file | file contains ...
---|---
abadbin.asc | timestamp and list of bins to flag
abadprof.asc | timestamp of whole profile to flag
abottom.asc | bin identified as ‘bottom’
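The *.asc bookkeeping described above (append each file’s lines to a matching *.asclog file for later retrieval, then delete the *.asc file) can be sketched in a few lines. This is an illustrative sketch only, not CODAS code; the function name and the `edit_dir` parameter are invented for the example.

```python
import glob
import os

def archive_and_remove_asc(edit_dir):
    """Append each *.asc file's contents to its *.asclog counterpart
    (for later retrieval), then delete the *.asc file."""
    for asc in glob.glob(os.path.join(edit_dir, "*.asc")):
        log = asc + "log"  # e.g. abadbin.asc -> abadbin.asclog
        with open(asc) as src, open(log, "a") as dst:
            dst.write(src.read())
        os.remove(asc)  # the *.asc files are deleted after being archived
```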
Note
If you move to the right without clicking “Apply Editing”, your edits for the present view will be discarded (manual editing *.asc files are deleted, and no threshold editing is applied).
Resetting profiles to minimal editing
If you click “Reset Some Editing”, the profiles you choose are reset to editing based on Percent Good minimum only. Any references to those profiles are deleted from the log files (*.asclog) and from any existing *.asc files. These references must be removed because otherwise the profiles might get flagged again when “Apply Editing” is clicked. It is also possible to reset all the flags for the whole database (instructions are lower on this page).
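The effect of “Reset Some Editing” on a profile can be pictured with a small sketch: clear every flag bit, then re-flag solely on the Percent Good criterion. The function name and the list-of-bins representation are invented for illustration; the decimal flag value of 2 for the percent-good criterion comes from the text below.

```python
LOW_PERCENT_GOOD = 2  # decimal flag value for the percent-good criterion

def reset_to_minimal_editing(percent_good, pg_min=50):
    """Return per-bin flags after a reset: all bits cleared, then
    only the percent-good bit re-applied."""
    return [LOW_PERCENT_GOOD if pg < pg_min else 0 for pg in percent_good]

reset_to_minimal_editing([80, 20, 55], pg_min=50)  # [0, 2, 0]
```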
Note
For best results, ALWAYS click “Apply Editing” before moving on.
2.8.9.2. Details of “Apply Editing”¶
The quick_adcp.py step called --steps2rerun apply_edit performs several steps on the whole database. These are the same steps run by “Apply Editing” in the dataviewer tool (but there only for a limited time range). All steps are performed in the edit/ subdirectory:
dbupdate ../adcpdb/dbname abottom.asc
This step takes any bins identified as the bottom (stored in “abottom.asc”) and marks them in the database.
dbupdate ../adcpdb/dbname abadprf.asc
This step identifies bad profiles in the database by setting the “last good bin” value to -1.
badbin ../adcpdb/dbname abadbin.asc
This step identifies bad bins in the database. A decimal value of 1 is put into “profile_flags” at this stage.
set_lgb ../adcpdb/dbname beamangle
This step takes the bottom (from #1) and makes flags (decimal flag value of 4) below the bottom. It also masks data close to the bottom if the data are subject to side-lobe contamination (depends on beam angle). The default beam angle is 30 degrees.
setflags setflags.tmp
This does two things:
takes the bad profile identifier and gives every bin in that profile a “bad bin” flag (decimal value of 1)
flags all values with percent good below the specified threshold (usually 30% or 50%) using a decimal value of 2 for the profile flag.
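The flag arithmetic in the steps above can be sketched in a few lines. The bit values (1 = bad bin, 2 = low percent good, 4 = below bottom) come from the text; the function and its list-based representation are invented for illustration.

```python
BAD_BIN = 1           # badbin / setflags: manually flagged bin
LOW_PERCENT_GOOD = 2  # setflags: percent good below pg_min
BELOW_BOTTOM = 4      # set_lgb: bin below the detected bottom

def apply_pg_threshold(flags, percent_good, pg_min=50):
    """OR the percent-good bit into every bin whose percent good
    falls below pg_min, leaving any other bits untouched."""
    return [f | LOW_PERCENT_GOOD if pg < pg_min else f
            for f, pg in zip(flags, percent_good)]

apply_pg_threshold([0, 1, 0, 4], [80, 20, 40, 90])  # [0, 3, 2, 4]
```

Note that existing flag bits survive the OR: a bin already flagged as bad (1) that also fails percent good ends up with flag 3.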
Try using showdb to look at a database. The variable PROFILE_FLAGS shows the editing status of a given bin or profile. Bins are flagged in the database with binary bits, depending on why they were flagged. This is a useful way to see whether data have been flagged or not.
binary | decimal | below bottom | percent good | bad bin
---|---|---|---|---
000 | 0 |  |  |
001 | 1 |  |  | bad
010 | 2 |  | bad |
011 | 3 |  | bad | bad
100 | 4 | bad |  |
101 | 5 | bad |  | bad
110 | 6 | bad | bad |
111 | 7 | bad | bad | bad
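The table above is just a three-bit field; a tiny decoder (illustrative only, not part of the CODAS tools) makes the mapping explicit:

```python
def decode_profile_flags(value):
    """Map a PROFILE_FLAGS value (0-7) to the reasons it encodes."""
    reasons = []
    if value & 4:
        reasons.append("below bottom")
    if value & 2:
        reasons.append("percent good")
    if value & 1:
        reasons.append("bad bin")
    return reasons

decode_profile_flags(5)  # ['below bottom', 'bad bin']
decode_profile_flags(0)  # [] -- no flags set; the bin is good
```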
2.8.10. “Unediting” scenarios¶
2.8.10.1. (1) starting over¶
Example: You have a UHDAS dataset in which too much data were edited out.
If you need to start over, you will have to clear all profile flags, remove the ascii files associated with editing, and do the editing from scratch. If all you need to do is add flags, just run dataviewer.py -e and flag the offending data (i.e. add flags to the existing ones). Here are examples of when you might need to return the flags to zero and start over:
- At-sea defaults for Error Velocity are flagging too many points: the dataset looks fine but there are lots of + marks of missing data.
- You are training someone else to process ADCP data and they flagged far too many things as bad. You do not need to redo all their processing, just the editing.
You can use the “Reset Some Editing” button to restore a given segment to minimal flagging (Percent Good only).
Or if you want to start over with a clean slate:
- go to the edit directory
- remove all *.asc files and all *.asclog files
- edit a file called clearflags.tmp:
  - in newer processing it is staged for you
  - if there is not one, copy setflags.tmp to clearflags.tmp and edit as follows:
original (setflags.tmp) new (clearflags.tmp)
--------------------------- ---------------------------
dbname: ../adcpdb/a_demo dbname: ../adcpdb/a_demo
pg_min: 50 pg_min: 50
set_range_bit clear_all_bits
time_ranges: clear_range
all clear_bad_profile
time_ranges:
all
run “setflags clearflags.tmp”
start editing. Be sure to “list” before moving to “next”
2.8.10.2. (2) change percent good¶
You can change the percent good used by quick_adcp.py (e.g. change to 30 instead of 50) by specifying --pgmin 30, but (depending on what version of software produced the dataset) you may have to include that option every time you run quick_adcp.py.
To change percent good once:
- go to the edit directory
- do not remove any files
- copy setflags.tmp to setflags30.tmp
- edit setflags30.tmp as shown below
- run setflags setflags30.tmp
original (setflags.tmp) new (setflags30.tmp)
--------------------------- ---------------------------
dbname: ../adcpdb/a_demo dbname: ../adcpdb/a_demo
pg_min: 50 pg_min: 30
time_ranges: time_ranges:
all all
2.8.10.3. (3) Workhorse: (bugfix) recover a few deep bins¶
“We” (a vigilant user) discovered that bottom blanking was hardwired for 30deg, which is not correct for most Workhorse instruments. This bug only affects data collected when the bottom was in range. Data processed with code prior to May 1, 2009 will have had 15% of the range flagged near the bottom (cos(30deg)), but if the instrument has 20deg beams, the flagging should have been more like 5% (cos(20deg)).
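The percentages quoted above follow from the side-lobe geometry: the contaminated fraction of the range to the bottom is 1 − cos(beam angle). A quick check of the arithmetic (the text quotes rounded approximate figures):

```python
import math

def sidelobe_fraction(beam_angle_deg):
    """Fraction of the range near the bottom subject to side-lobe
    contamination: 1 - cos(beam_angle)."""
    return 1.0 - math.cos(math.radians(beam_angle_deg))

round(sidelobe_fraction(30), 3)  # 0.134 -- the ~15% quoted for 30-degree beams
round(sidelobe_fraction(20), 3)  # 0.06  -- the ~5% quoted for 20-degree beams
```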
To recover the range 85%–95% of the water column depth from such a dataset:
- make sure your executable set_lgb accepts an argument for beam angle; if it does not, follow these instructions to install new programs
- go to the edit directory
- remove all *.asc files
- copy setflags.tmp to clearflags.tmp
- edit clearflags.tmp as shown below
- run “setflags clearflags.tmp”
original (setflags.tmp) new (clearflags.tmp)
--------------------------- ---------------------------
dbname: ../adcpdb/a_demo dbname: ../adcpdb/a_demo
pg_min: 50 pg_min: 50
set_range_bit clear_all_bits
time_ranges: clear_range
all clear_bad_profile
time_ranges:
all
If you have a file called dbinfo.txt in the root processing directory, edit that file and change the beam angle to 20. If you do not have a file like that, you could reprocess the data from scratch or use these instructions to generate the dbinfo file; then you can edit it and change the beam angle to 20.
Then run:
quick_adcp.py --steps2rerun apply_edit --auto