Following from my previous post about my first session in VS observing, I have another question.
In spectroscopy we are advised to use a ‘cosmetic outliers’ file to clean up extremely hot pixels. If you don't, you can end up with some spurious, very sharp emission lines! I think the 1D summation/binning done in spectroscopy makes it extremely sensitive to hot pixels. I looked for a similar process in photometry but, while AstroImageJ (for example) includes an option to do it, it seems to be deprecated.
I generally eyeball my data to see if there are any strange data points. Looking at the measured errors can also give a clue as to what might be happening, as can having a look at what is happening to the check star at the same time. I’d be very wary of any automated process for cleaning data.
But be careful about being too enthusiastic about deleting data, as the outliers might be real!
Further to Jeremy’s comment: the use of dark and flat frames goes a long way towards cleaning your image of truly hot and cold pixels. However, it cannot deal with cosmic ray hits or satellite trails close to the VS or a comparison. If you see an anomalously faint or bright estimate, take a close look at the corresponding image to see if there is an obvious reason to reject it.
In my case, I throw away perhaps two estimates per thousand. It’s only when processing thousands of images that it becomes really noticeable.
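For anyone new to this, the dark/flat correction mentioned above can be sketched in a few lines of numpy. This is a toy example with made-up frame values (real pipelines build master darks and flats from stacks of many exposures), but it shows why a hot pixel that appears in the dark cancels out of the calibrated frame:

```python
import numpy as np

def calibrate(raw, master_dark, master_flat):
    """Dark-subtract, then divide by the normalised flat field."""
    flat_norm = master_flat / np.mean(master_flat)
    return (raw - master_dark) / flat_norm

# Toy frames: uniform 100-count sky, one hot pixel, one vignetted corner.
flat = np.ones((4, 4))
flat[0, 0] = 0.9                  # 10% vignetting in one corner
dark = np.full((4, 4), 10.0)
dark[1, 1] += 500.0               # hot pixel in the dark...
raw = 100.0 * flat + dark         # ...adds the same counts to the raw frame

cal = calibrate(raw, dark, flat)  # flat field restored, hot pixel cancelled
```

After calibration the frame is uniform again: the hot pixel is subtracted away with the dark, and the vignetted corner is divided out by the flat. As the post says, though, this does nothing for cosmic rays or satellite trails, which were not present in the calibration frames.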
I have done regular processing with darks and flats, so I think I’ll leave the (semi-)automated processing for now. [The cosmetic file is usually generated by applying a threshold to the dark image, looking for outliers.]
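The thresholding idea in the brackets above can be sketched as follows. This is my own rough numpy version of what I assume such a cosmetic file does (the 5-sigma cut and 3x3 median repair are my choices, not anything from a particular package): flag pixels in the master dark that sit far above the typical level, then replace them in a science frame with the median of their unflagged neighbours.

```python
import numpy as np

def hot_pixel_mask(master_dark, n_sigma=5.0):
    """Boolean map of pixels more than n_sigma above the dark's median."""
    med = np.median(master_dark)
    mad = np.median(np.abs(master_dark - med))  # robust spread estimate
    sigma = 1.4826 * mad if mad > 0 else np.std(master_dark)
    return master_dark > med + n_sigma * sigma

def repair(image, mask):
    """Replace flagged pixels with the median of unflagged 3x3 neighbours."""
    fixed = image.copy()
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(y - 1, 0), min(y + 2, image.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, image.shape[1])
        patch = image[y0:y1, x0:x1]
        fixed[y, x] = np.median(patch[~mask[y0:y1, x0:x1]])
    return fixed

# Synthetic example: a quiet dark with one hot pixel at (3, 3).
rng = np.random.default_rng(0)
dark = 10.0 + rng.normal(0.0, 1.0, (8, 8))
dark[3, 3] += 500.0
mask = hot_pixel_mask(dark)

image = np.full((8, 8), 100.0)
image[3, 3] = 900.0               # the hot pixel corrupts the science frame
fixed = repair(image, mask)
```

The point of building the mask from the dark rather than the science frame is exactly the caution raised earlier in the thread: a pixel that is hot in the dark is a detector defect, whereas a bright pixel only in the science frame might be a real star.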
I liked AstroImageJ as it has a simple preview facility which allows you to scroll through the images.