sum150

Interestingly, the sample command line brings me to my first observation: the raw sumfiles that are available for download from the MSC might not be exactly ready to use "out of the box", at least if you're of the conservative sort (like me). Instead of the out-and-out *.sum files, you may notice above that I'm using something with the *.sum150 extension.
**What's this *.sum150 business about?**

Basically, there's a column in the *.sum files one may download that indicates how hard the interferometer is working when it makes a given measurement. Actually, there are a number of proxies that can be used for this: for example, the number of locks the instrument went through during a 125-second scan sequence can be useful. Ideally, this number is one - the instrument locked once and then never lost lock before completing the data collection. If the fringe tracker gets lost and has to search for fringes, the lock value increments, and this happens for each loss of lock. Anything more than about a dozen is cause for concern.
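A lock-count cut could be scripted the same way as the jitter cut. As a hedged sketch only - which column holds the lock count is an assumption here (column 10 in this mock layout), so check your own sumfile format before using it:

```shell
# Demo data: a tiny mock sumfile. NOTE: the lock count living in
# column 10 is an assumption for illustration, not a documented layout.
printf '%s\n' \
  'a b c 100.1 e f g h i 1  k 0.80' \
  'a b c 100.2 e f g h i 15 k 0.40' > mock.sum

# Keep only scans that locked fewer than a dozen times.
awk '$10 < 12 {print $0}' mock.sum > mock.lowlock
```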

Instead of lock losses, I use an item of telemetry in the data stream called the **jitter**. Effectively this is a characterization of the amount of shift in fringe position from frame to frame. A frame time is usually ~10 milliseconds, sometimes ~20 ms, during which the fringe tracking engine of the interferometer samples the starlight to see if it's sitting on the zero-path-difference (ZPD) point between the two telescopes. The ZPD point can shift because the atmosphere above each telescope is a swirling maelstrom of ickiness, and the effective path through the atmosphere for each telescope can change from frame to frame - thus shifting the ZPD point. For somewhat non-obvious reasons jitter is measured in radians (think phasor diagrams and you'll be on the right path): what the *.sum150 files do is essentially toss any entries that have jitter values above 1.50 radians.

This is an attempt to toss the 'bad data' in an unbiased way: these are scans where the interferometer was simply working too hard and the results may be suspect.

**What's a quick way to generate the *.sum150 and *.spec150 files?**

Well, since we're using Linux, if you want to pretend you're a //real// black belt, use //awk// and //grep//:

//$ awk '$12 < 1.50 {print $0}' 106278.sum > 106278.sum150; awk '$4 {print $4}' 106278.sum150 > tmp150; grep -f tmp150 106278.spec > 106278.spec150//

The above command line takes the //sum// and //spec// files associated with night 106278 and creates their corresponding //sum150// and //spec150// files - any lines with jitter in excess of 1.50 radians are unceremoniously dumped. It works through three commands, separated by semicolons ';': (1) awk checks column 12, the jitter, to see if it is less than 1.50, and if it is, prints the whole line; '>' redirects that output to a new file, //106278.sum150//. (2) The second awk prints the timestamps (column 4) associated with the vetted (jitter < 1.50) lines to a temporary file, //tmp150//. (3) grep compares the timestamps in //tmp150// to entries in //106278.spec// and dumps the matches to //106278.spec150//.

You can generate a batch script that does this for a whole series of sum/spec files by replacing the '106278' above with the individual date stamps. A spreadsheet program like Excel can be useful for this:
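Alternatively, a short shell loop skips the spreadsheet step entirely. This is a sketch under the same assumptions as the one-liner above (jitter in column 12, timestamp in column 4); the two printf lines just fabricate a tiny mock night so the loop has something to chew on:

```shell
# Demo data: a tiny mock night '106278' (jitter in column 12,
# timestamp in column 4, matching the one-liner above).
printf '%s\n' \
  'a b c 10.1 e f g h i j k 0.80' \
  'a b c 10.2 e f g h i j k 1.75' > 106278.sum
printf '%s\n' \
  'x y z 10.1 spectrum-one' \
  'x y z 10.2 spectrum-two' > 106278.spec

# The batch loop: process every *.sum file in the directory.
for f in *.sum; do
  night="${f%.sum}"                                     # strip the .sum suffix
  awk '$12 < 1.50' "$night.sum" > "$night.sum150"       # jitter cut
  awk '{print $4}' "$night.sum150" > tmp150             # vetted timestamps
  grep -F -f tmp150 "$night.spec" > "$night.spec150"    # matching spec lines
  rm -f tmp150
done
```

Using grep's -F flag treats the timestamps as fixed strings rather than regular expressions, so the '.' in a timestamp like 10.1 can't accidentally match unrelated digits.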