Shipboard ADCP processing workshop
May 14, 15, 16 (2013), WHOI

------ start overview/seminar/brownbag section ------   (day 1: morning)

session                                 hours  audience

(1) general overview of ADCPs,          1.5    general(*): between a department
    CODAS processing, UHDAS                    seminar and a brownbag lunch; in an
                                               auditorium or seminar room, with an
                                               overhead projector

          ==> break for coffee <==        ==> group gets smaller <==

(2) what "postprocessing" a CODAS       1.5    more technical(*); still does not
    dataset involves, what to look             require computers or a network.
    for that indicates a potential             maybe good for users of ADCP data,
    problem, and what is required              and for people who supervise
    to address it                              people who process data

          ==> break for lunch <==         ==> group gets smaller <==

(*) anyone who has actually done any CODAS processing in the past is exempt
    from the requirement that they attend these two; otherwise, to skip them
    they must not ask any questions I already answered in (1) and (2) and
    must still be able to move forward.  They should clear this with me first.

------ start "lab/workshop" section ------   (day 1: afternoon)
       -- requires computers, network, power, and (yes) a projector

session                                 hours  audience

(3) get the virtual machine set up,     1.5    people who want to do anything
    get test datasets in place                 that follows

(4) we all run through                  1.5-2  fundamental introduction to ADCP
    post-processing of a recent                data, terminology, and tools;
    UHDAS dataset together (see                introduction to the tools used on
    the data-access sketch after                CODAS databases (how to
    this schedule)                             access/view data)

================================= (day 2) =================================

(5) introduction to UHDAS single-ping   8      back up and go into more detail.
    processing (i.e. how to get a              this is the meat: anyone who wants
    dataset ready for "postprocessing")        to process UHDAS data should be
                                               here for this.  extra time is
                                               allocated for additional questions,
                                               perhaps special datasets, and more
                                               about tools for looking at
                                               single-ping data and serial data

================================= (day 3) =================================

(6) processing VmDAS data: assess       4+     anyone who thinks they will have
    the VmDAS data, convert it to              to process a VmDAS dataset
    the UHDAS directory structure,
    and apply knowledge from the
    previous steps
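
A note on the "how to access/view data" part of item (4): the lines below are
a minimal sketch of one way to look at post-processed velocities in Python,
assuming the CODAS run has already exported its results to netCDF.  The
filename "os75nb.nc" and the variable names ("u", "v", "depth", "lon", "lat")
are illustrative placeholders, not guaranteed names; inspect your own file
(e.g. with ncdump -h) or use the viewing tools covered in the workshop to see
what it actually contains:

    # Minimal sketch: load velocities from a netCDF file exported by a CODAS
    # post-processing run.  The filename and variable names are placeholders;
    # check your own file (e.g. with `ncdump -h`) for the real ones.
    import numpy as np
    from netCDF4 import Dataset

    nc = Dataset("os75nb.nc")            # hypothetical netCDF export
    try:
        # np.ma.filled turns masked (flagged/missing) cells into NaN
        u = np.ma.filled(nc.variables["u"][:], np.nan)    # eastward velocity
        v = np.ma.filled(nc.variables["v"][:], np.nan)    # northward velocity
        depth = np.ma.filled(nc.variables["depth"][:], np.nan)
        lon = nc.variables["lon"][:]
        lat = nc.variables["lat"][:]
    finally:
        nc.close()

    print("u shape:", u.shape)           # typically (profile, depth_cell)
    speed = np.sqrt(u**2 + v**2)
    # depth-averaged speed for the first few profiles, ignoring bad cells
    print("mean speed (m/s):", np.nanmean(speed, axis=-1)[:5])

The CODAS database itself is a binary format with its own access tools (part
of what sessions (4) and (5) cover); the netCDF export read here is just one
convenient way to get at the final product.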