Summary of discussion during chop-nod analysis telecon of Jan 30, 2008
Lero, Mike, John, Giles
_________________________________________________________________

M82 analysis:

We discussed Lero's work on the data from the early April run. Last
Sunday, Lero sent out maps from this run. From a statistical point of
view, this is our best M82 run: many 3-sigma and 4-sigma detections!
But the inferred field is orthogonal to what we were seeing in Jan 2006
and Feb 2007! Note that these are our first > 3-sigma detections from
M82. (They imply a toroidal field, not the vertical field we inferred
from the earlier 2-sigma results of Jan 2006 and Feb 2007.)

In this first analysis, the I map looked bad: the two M82 peaks were
not clear. Following John's advice, Lero divided the data into halves
until locating the offending file. With this file (36784) removed, the
I map looks much better, as we saw in the maps Lero sent out this
morning. Lero also found that eliminating two further files increases
the number of polarization detections, but we decided that we should
not make cuts on that basis.

The reason the bad file is not easy to locate using the I maps from
sharpinteg is that a really bad pixel just ends up looking like a NaN.

We agreed on the following next steps for Lero's analysis:

- changes to implement now:
  use "-bg 10 0" instead of "-bg 5 5"
  use -idl to display the results of the background subtraction
  use i.p. subtraction (see July 3, 2007 memo - last bullet)

- changes to implement soon (in random order):

  1. Explore the background subtraction (e.g., try "-bg 20 0", etc.)
     and pay attention to the results. John notes that the background
     subtraction routines can be used to flag files where the
     background subtraction is not working well. This may mean that the
     applied value of tau was incorrect, or that outliers resulted in a
     bad fit. In such cases, investigate carefully to determine whether
     the file should be removed.
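The divide-in-halves search that John suggested (and that isolated file
36784) can be sketched as below. This is an illustrative sketch, not
part of sharpinteg: the `looks_bad` predicate is a hypothetical
stand-in for re-reducing a subset of files and inspecting the resulting
I map by eye, and it assumes exactly one bad file is present.

```python
def find_bad_file(files, looks_bad):
    """Recursively halve the file list until one offending file remains.

    looks_bad(subset) is a hypothetical stand-in for re-reducing a
    subset of files and checking whether the resulting I map is bad;
    it returns True if the subset contains the offending file.
    Assumes exactly one bad file is present in `files`.
    """
    if len(files) == 1:
        return files[0]
    mid = len(files) // 2
    first_half = files[:mid]
    if looks_bad(first_half):
        return find_bad_file(first_half, looks_bad)
    return find_bad_file(files[mid:], looks_bad)
```

Each re-reduction halves the candidate list, so the bad file is found
in about log2(N) re-reductions rather than N.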
     John has found that when he iterates this file-removal procedure,
     he ends up removing one or more files on each try, which results
     in many files being tossed; so be careful in determining whether a
     file should really be removed.

  2. Use smoothed tau: see the second bullet of Giles' DG Tau posting
     (posted last May) for the syntax. The smoothed tau data for both
     April runs were also posted last May. For reference to the syntax,
     here are some lines cut from the second bullet of Giles' DG Tau
     posting:

     36078_int.fits 0.035 -117.7 91.3
     36079_int.fits 0.035 -117.4 91.1
     36080_int.fits 0.0348 -119 91.8
     36081_int.fits 0.0346 -118.7 92.7
     36082_int.fits 0.0342 -119.1 91.7
     # 36083_int.fits 0.0338 -119.5 92.8
     36084_int.fits 0.0335 -119.2 93.2
     36085_int.fits 0.0333 -118.7 93.8
     36086_int.fits 0.0332 -118.9 93.7
     36087_int.fits 0.0333 -118.8 94.6
     36088_int.fits 0.0336 -118.1 94.7

     (Note that the file name is followed by the smoothed tau value and
     then by the pointing offsets.)

  3. Do the pointing corrections (with Mike's help).

  4. Grade all the I images.

  5. Grade the Q and U images (with Giles' help).

_________________________________________________________________

i.p.:

We discussed the new memo posted to the analysis logbook, which gives
"best-guess" i.p. values for the Jan 2006 (350 micron) run and for the
450 micron runs of Dec 2006 and June 2007. We agreed that these values
are the best we can do for now.

_________________________________________________________________

h:

For the June run, the correct value of h is "6". We should update the
web page, and tell Larry and Hua-bai.

_________________________________________________________________

Reduced-chi-squared procedures:

The code is apparently working; it produces reduced chi-squared values
near unity for Q and U for the Feb 2007 M82 data. We discussed some
next steps:

1. Play with the existing code:
   - Try it on different data sets.
   - Write up some documentation so others can play with it.
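The quantity this code computes can be sketched for one pixel as
follows. This is a generic sketch of a reduced chi-squared for repeated
measurements of a single quantity, not the actual pipeline code; the
weighting convention shown (inverse-variance mean, N - 1 degrees of
freedom) is an assumption for illustration.

```python
import numpy as np

def reduced_chi_squared(values, errors):
    """Reduced chi-squared of N repeated measurements of one quantity.

    Forms the inverse-variance-weighted mean of the measurements x_i
    (errors sigma_i), then returns
        sum(((x_i - mean) / sigma_i)**2) / (N - 1).
    Values near unity mean the quoted errors describe the observed
    scatter well; values well above 1 flag excess scatter.
    """
    x = np.asarray(values, dtype=float)
    s = np.asarray(errors, dtype=float)
    w = 1.0 / s**2
    mean = np.sum(w * x) / np.sum(w)
    return np.sum(((x - mean) / s) ** 2) / (len(x) - 1)
```

For example, three measurements 1.0, 2.0, 3.0 each with error 1.0 give
a reduced chi-squared of exactly 1.0, i.e., scatter consistent with the
quoted errors.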
     Giles is especially interested in looking at the Q and U maps to
     improve the DG Tau analysis; in the process he will be able to do
     "spot checks" of whether these Q and U maps are correct, providing
     an additional useful test.
   - For cases where the reduced chi-squared is > 1, track down the
     offending files by analyzing different subsets of the bins.
   - For cases where the reduced chi-squared is about 1, plot the
     distribution of reduced-chi-squared values and compare it with the
     theoretical prediction, as in Novak & Jarrett 1994. (This step
     will inform step 2 below.)

2. Develop code that can inflate the errors on each pixel (or set of
   nearby pixels) according to the reduced-chi-squared value for that
   pixel (or set of nearby pixels).

_________________________________________________________________

The following items were not discussed on the telecon:

- Mike's plans to make a smoothed tau for November 2007
- linked list-serves for SHARP (Giles action item)
- alternate ideas for telecon - iChat on Mac
- postscript problems on Lero's Mac and on kilauea, and Hiroko's
  investigation of the kilauea problems. In any case, I guess these
  are solved.
_____________________________________________________________
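Returning to item 2 of the reduced-chi-squared procedures: the
error-inflation step could be sketched as below. The scaling by
sqrt(reduced chi-squared), applied only where the reduced chi-squared
exceeds 1, is one plausible convention and an assumption here, not an
agreed implementation.

```python
import numpy as np

def inflate_errors(sigma, chi2_red):
    """Inflate per-pixel errors where the reduced chi-squared exceeds 1.

    sigma and chi2_red are arrays of the same shape (one value per
    pixel, or per set of nearby pixels). Wherever chi2_red > 1, the
    error is scaled by sqrt(chi2_red) so that the inflated errors
    account for the observed excess scatter; pixels with chi2_red <= 1
    are left unchanged. This scaling convention is an assumption made
    for illustration.
    """
    sigma = np.asarray(sigma, dtype=float)
    chi2_red = np.asarray(chi2_red, dtype=float)
    return np.where(chi2_red > 1.0, sigma * np.sqrt(chi2_red), sigma)
```

For example, a pixel with sigma = 1.0 and reduced chi-squared = 4.0
would have its error doubled, while a pixel with reduced chi-squared =
0.25 would keep its original error.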