Summary of chop-nod analysis telecon of Oct 8, 2008
Participants: Mike, Giles, Lero, Hiroko

_____________________________________________________________

Lero's analysis of M 82

This follows up on earlier work by Lero (pointing, sorting good files from bad), Giles (smoothed tau, sharpinteg, sharpcombine), and then Lero again (chi2, but with v4). See the e-mail to the sharp-software mailing list for details; in overview: about 50 useful files were binned into 6 bins of about ten files each, and sharp_combine v5 was run on them. The reduced chi2 is 2.6 for Q and 1.4 for U, and the peaks in the reduced chi2 maps appear more or less randomly distributed.

We decided that outlier rejection is not necessarily warranted at this time, since it is time-consuming and did not help for L1527.

Giles suggests a possible future course of action: reorganize the six bins so that each has about the same statistical weight, as judged from the typical errors in Q and U over the central portion of each bin's sharp_combine v5 map. This means adding files to some bins by removing them from adjacent bins, and would be an iterative process. The chi2 output would then be more meaningful, and the results could be reorganized into 2 or 3 super-bins and examined for consistency of vectors across the super-bins.

_____________________________________________________________

Mike's work on 6334 and 1333

6334 has been reduced with a 350 micron RGM. Based on the map-wide reduced chi2, the nominal 6-sigma threshold is expected to correspond to an actual 3-sigma threshold. Using the nominal 6-sigma threshold, we have many vectors and an interesting map.

1333: new code was developed to pointing-correct these data. 8 files were discarded because of spikes; the spikes will probably be removed once we have a new RGM. The map-wide reduced chi2 is ~1.8. Based on this, an actual (not nominal) 3-sigma threshold was applied, yielding vectors with P ~ 1% on 4A, P ~ tenths of a percent on 4B, and one in between.

_____________________________________________________________
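As a reference for the M 82 numbers above, the per-pixel reduced chi2 of N bin measurements against their weighted mean can be sketched as below. This is a generic sketch, not the sharp_combine v5 implementation; the function name and the choice of N - 1 degrees of freedom are illustrative assumptions.

```python
def reduced_chi2(values, sigmas):
    """Reduced chi2 of N independent measurements (e.g. Q or U in one
    pixel from N bins) against their inverse-variance-weighted mean.
    Uses N - 1 degrees of freedom, since the mean is fit from the data."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    chi2 = sum(w * (v - mean)**2 for w, v in zip(weights, values))
    return chi2 / (len(values) - 1)
```

A value near 1 indicates the bins agree to within their quoted errors; values well above 1 (like the 2.6 seen for Q) suggest underestimated errors or real inconsistency between bins.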
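Giles's iterative rebalancing idea could look roughly like the following sketch: repeatedly move a boundary file from the heavier of two adjacent bins to the lighter one until the weights stop evening out. This is purely illustrative; the actual weight criterion would come from the Q/U errors in the central portion of each bin's map, and the stopping rule here is an assumption.

```python
def bin_weight(file_variances):
    # Statistical weight of a bin: sum of inverse variances of its files.
    # In practice this would be judged from the typical Q/U errors in the
    # central portion of the bin's combined map.
    return sum(1.0 / v for v in file_variances)

def rebalance(bins, max_iter=50):
    """Even out bin weights by moving files between adjacent bins.
    bins: list of lists of per-file variances, time-ordered within bins."""
    for _ in range(max_iter):
        weights = [bin_weight(b) for b in bins]
        # pick the adjacent pair with the largest weight imbalance
        i = max(range(len(bins) - 1), key=lambda k: abs(weights[k] - weights[k + 1]))
        heavy, light = (i, i + 1) if weights[i] > weights[i + 1] else (i + 1, i)
        if len(bins[heavy]) <= 1:
            break
        # move the boundary file from the heavier bin into the lighter one
        if heavy < light:
            bins[light].insert(0, bins[heavy].pop())
        else:
            bins[light].append(bins[heavy].pop(0))
        new_weights = [bin_weight(b) for b in bins]
        if max(new_weights) - min(new_weights) >= max(weights) - min(weights):
            # the move made the spread no better; undo it and stop
            if heavy < light:
                bins[heavy].append(bins[light].pop(0))
            else:
                bins[heavy].insert(0, bins[light].pop())
            break
    return bins
```

Moving only boundary files keeps each bin a contiguous, time-ordered block, which matches the "adding files to some bins by removing them from adjacent bins" description.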
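The nominal-versus-actual threshold bookkeeping for 6334 and 1333 can be made explicit with a small sketch. This assumes the conversion works by inflating the measurement errors by the square root of the map-wide reduced chi2 (a common practice, but the notes above do not spell it out); the function names are illustrative.

```python
import math

def actual_sigma(nominal_sigma, map_reduced_chi2):
    """Actual significance of a nominal N-sigma detection, after
    inflating the errors by sqrt(map-wide reduced chi2)."""
    return nominal_sigma / math.sqrt(map_reduced_chi2)

def nominal_for_actual(target_actual_sigma, map_reduced_chi2):
    """Nominal threshold needed to achieve a desired actual threshold."""
    return target_actual_sigma * math.sqrt(map_reduced_chi2)
```

Under this assumption, a nominal 6-sigma cut corresponds to an actual 3-sigma cut when the map-wide reduced chi2 is about 4, and for 1333's reduced chi2 of ~1.8 an actual 3-sigma cut corresponds to a nominal threshold of about 4.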