British Astronomical Association
Supporting amateur astronomers since 1890


VSS Campaign to observe U Leo

Jeremy
VSS Campaign to observe U Leo

The 2019 December BAA Journal and the Variable Star Section Circular 182 contain an article about the curious variable star, U Leonis. Although U Leo has been linked to a possible nova in 1885, this is by no means certain and its identity remains something of a mystery.

The VSS has launched a CCD photometry campaign on U Leo during the current observing season with the aim of shedding light on this enigmatic star.

I have recently been alerted by Professor Boris Gänsicke (University of Warwick) that he hopes to get some spectroscopy on U Leo during February 2020 using the Isaac Newton Telescope. This might allow measurement of the radial velocity of the F-type star in the system. It would therefore be helpful to obtain ground-based photometry in support of his observing run from now onwards.

Given this star is rather faint, at mag 17.3, it is a target for CCD observers. It will require long and unfiltered exposures to get a reasonable signal-to-noise ratio.

A chart and sequence for U Leo are available from the AAVSO website. Please submit your observations to the BAA VSS or the AAVSO database. U Leo is located at RA 10 24 03.81, Dec +14 00 25.9 (J2000.0).

The campaign is already running and will continue until the end of the 2020 observing season. Many thanks to all who are already observing this star.

If you have any questions, please contact me.

Jeremy Shears

Director, BAA Variable Star Section

Xilman
I took 100 minutes of

I took 100 minutes of unfiltered 30 s images in the wee hours of 2020-01-20, though the SNR of each is so low that they will probably have to be co-added a few at a time to reach an adequate level. Another run took place on 2020-01-04. These, along with some precision photometry in Johnson V and Sloan r´, still need to be processed. Perhaps these data are too early for the INT run; I will try to take some more in the next few weeks.

A question arises from our earlier discussion in the forum about the use of non-standard sequences for VS observations to be submitted to the BAA-VSS database: there is no standard sequence for unfiltered data. Advice would be welcome, whether here, in the original thread, or off-line via email. Another case without any standard sequence concerns the observations of AT 2019xim reported on my members' page. I have specifically withheld submission of those results for this reason.

Andy Wilson
Unfiltered data

Hi Paul,

In answer to your questions:

You should use the "CV" filter when you take unfiltered photometry and calibrate using V-filter comparison star magnitudes. This filter is standard in both the BAA and AAVSO databases and exists exactly for this purpose.

It is common to co-add images if the SNR is very low, or to take longer individual subs.

It appears the AAVSO have a sequence for AT 2019xim. It is fine and common practice to quote AAVSO sequences when submitting to the BAA Photometry Database when a BAAVSS sequence is not available, or when it is not appropriate for digital photometry; they have far greater resources available for producing sequences. Just quote the chart reference, e.g. "AAVSO X25107PN" or simply "X25107PN".

Best wishes,

Andy

Xilman
Thanks!

OK, CV it is then. The sequence for AT 2019xim must have been added after I took the data, which is not surprising: my observations were completed within 48 hours of the ATel being released.

My SOP for all variables except the very brightest is to take 30-second subs and stack in real time. That way I can estimate the SNR and move on to another object when it is high enough, or when the target is very obviously too faint to be measured with sufficient precision. Stars which can vary by several magnitudes between successive observations can easily turn out to be saturated or invisible if exposure lengths are set purely by dead reckoning.

Jeremy
U Leo

Good to hear you are getting CV data on U Leo, Paul. Hopefully you will be able to continue to get some runs during Feb.

Jeremy

Xilman
Exposure expectations.

"Given this star is rather faint, at mag 17.3, it is a target for CCD observers. It will require long and unfiltered exposures to get a reasonable signal-to-noise ratio."

It is rather tedious sitting here waiting for photons to arrive, so I am trying to get ball-park estimates of what is likely to be achieved. As the autoguider on the 0.4 m Dilworth / unfiltered CCD combination still doesn't work, I restrict each exposure to 30 seconds. The sky tonight is fairly dark but not exceptionally so. The SX 814 has very low dark noise, so the major limit on SNR is sky glow. I have found that averaging 20 subs, for a total exposure time of 10 minutes, gives an SNR of 50 to 55 when U Leo is at an altitude of 60 degrees. It will be worse at lower altitudes, of course, or if the Moon is above the horizon. An SNR of 50 corresponds to a precision of 0.02 magnitudes. The light curve in the VSS Circular suggests that the peak-to-peak amplitude is perhaps 0.1 magnitudes. A period of 3.2 hours corresponds to 192 minutes, so I can hope for at most 19 samples per period at an amplitude precision of 20%. In practice it will be fewer because of download times and the inevitable discarded images from poor tracking.
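[Editorial aside: the back-of-envelope numbers above can be reproduced with a short sketch. This is illustrative code, not part of any campaign toolchain; the single-sub SNR of 11.5 is an assumed value chosen to match the figures quoted, and it uses the sky-limited rule that averaged SNR grows as the square root of the number of subs.]

```python
import math

def stacked_snr(snr_single, n_subs):
    # Sky-glow-limited stacking: signal adds linearly, noise adds in
    # quadrature, so the SNR of an average of N subs grows as sqrt(N).
    return snr_single * math.sqrt(n_subs)

def mag_uncertainty(snr):
    # sigma_mag = 2.5 / (ln 10 * SNR), approximately 1.0857 / SNR magnitudes.
    return 2.5 / (math.log(10) * snr)

# 20 x 30 s subs at an assumed single-sub SNR of ~11.5 gives a stacked
# SNR of ~51, i.e. ~0.02 mag precision, matching the figures above.
```

With a 3.2 h (192 min) period and 10-minute stacks, 192 // 10 = 19 samples per cycle, as stated.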

It is going to take a lot of heavy-duty signal processing to get robust results from my observations alone. With perhaps ten times the data I should be able to get something useful. In other words, we need many more observers.

One unexpected side benefit is that U Leo lies in a rich field of galaxies, including a bright face-on spiral known as 2MASX J10234921+1357083 (it appears on Jeremy's finder chart roughly halfway between U Leo and the bright star in the lower left corner; an edge-on spiral of around 17th magnitude lies to its left). Only a few hints of spiral arms have appeared on a large stack, but a "pretty picture" may yet be possible. The galaxy's magnitudes are g=15.9 and r=15.2, so it shows up on every single sub. There are many more galaxies brighter than 20th magnitude within a few arc minutes of U Leo; they are starting to appear in my data as the stacks get deeper.

Added in edit: since typing the above text the SNR has almost doubled. Presumably the sky has become notably darker for some reason I do not understand. Perhaps the signal processing may be easier than feared.

Andy Wilson
Exposure and magnitude uncertainties

Hi Paul,

In my opinion, an uncertainty of 0.02 magnitudes is excellent on this target. The uncertainties in the CRTS photometry in the VSS Circular phase diagram, and those of other VSS observers, appear to range between roughly 0.03 and 0.15 magnitudes, so you are doing much better.

The lower-SNR observations are still useful, as statistical information can be extracted from many observations, and they confirm U Leo is not doing anything different. Though, as always, the higher the SNR the better, so I am sure your efforts will be greatly appreciated.

Andy

Xilman
Stacking subs

More 30 s captures of U Leo are coming in to add to the hundreds already stored. This is all rather tedious, so I have been thinking about data processing and, in particular, about stacking the images. The SNR of each sub varies between about 10 and about 20, depending on sky transparency, air mass, etc. A desirable SNR might be 30-50, so perhaps 9 subs should be stacked.

So far, so simple. I could just divide each night's observations into consecutive, non-overlapping sub-sequences of length 9, stack them, and use the mid-exposure time as the date of each measurement. However, there seems to be nothing particularly magic about any particular selection of 9 consecutive images. If the first image, say, were deleted by accident, the process would yield much the same output, but would contain a sample of the light curve slightly displaced from that which would have been created from the full set of images.

If this is the case, why not create 9 light curves, each with a temporal displacement from its neighbour set by the cadence of the subs?

Each curve would have no more information than any other, and the sum of them would be just as noisy in intensity values but would have finer temporal sampling. Might not this assist subsequent processing to extract a smoother light curve from the noisy data? If not, what am I missing? To me it appears related to a simple running-average smoothing process. It's late and I am sleepy, so I could well be overlooking something which should be obvious.

Andy Wilson
Using the same images more than once

Hi Paul,

If I have understood what you are proposing, then it should not be done for submissions to the database, as the same image would contribute to multiple measurements/observations.

My understanding is you would first combine images 1-9 to get a good signal, then images 2-10, and so on up to 9-17. This is of course simplified, as you would have many images. Image 9 would be used 9 times, and so the same photons would contribute to 9 observations, and that would be wrong.

You can do this kind of thing in an analysis, as long as what has been done is stated. Basically it is a smoothing function. It just should not be used for observations submitted to the database. This leaves the choice of whether or not to smooth with the researcher.
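[Editorial aside: the two schemes under discussion, non-overlapping blocks (safe for database submission) versus a sliding, running-mean stack (analysis only), can be sketched in NumPy. Illustrative code; the function and array names are my own.]

```python
import numpy as np

def block_stack(times, mags, n=9):
    # Non-overlapping groups of n subs: each image contributes to exactly
    # one measurement, so these points are safe to submit to the database.
    m = (len(mags) // n) * n  # drop any incomplete trailing group
    return (times[:m].reshape(-1, n).mean(axis=1),
            mags[:m].reshape(-1, n).mean(axis=1))

def sliding_stack(times, mags, n=9):
    # Overlapping windows (a running mean): each image contributes to up
    # to n points, so adjacent points share photons.  Analysis only, not
    # for database submission.
    kernel = np.ones(n) / n
    return (np.convolve(times, kernel, mode="valid"),
            np.convolve(mags, kernel, mode="valid"))
```

From 20 subs, `block_stack` yields 2 independent points while `sliding_stack` yields 12 overlapping ones; the intensity noise per point is the same, only the temporal sampling differs.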

Best wishes,

Andy

Xilman
Smoothing etc

Thanks. You have understood my proposal correctly, and I was specifically thinking of the analysis phase. I should have made that clearer.

Unfortunately from the analysis point of view, the database stores only a magnitude and its uncertainty at a particular JD. The last is an annoyance but nothing more, as conversion to the more useful HJD or BJD is a tedious but straightforward computation. The first two destroy some of the information held by the intensity counts of the VS, the sequence stars, the local sky backgrounds and the latter's variance. Further, it does not store data pertaining to other stars in the image, which can be used --- with care --- to give a tighter estimate of the zero-point magnitude (ZPM) and its uncertainty through ensemble photometry. Needless to say, I keep all the raw data around for subsequent re-analysis. Please note that I am not saying that the database structure should be changed!

Perhaps I should submit the raw 30s results with their 0.05 to 0.1 magnitude uncertainties and let others smooth them as they wish for their analysis. I can pretty much guarantee the 3.2-hour variability will not be visible without such smoothing.

Andy Wilson
Data held in the BAA Photometry Database

Hi Paul,

I think I understand what you are saying about the database, but I will explain what is held, to avoid possible confusion for anyone who may be following this thread. Where the observer provides the instrumental magnitudes and uncertainties/errors of the variable and comparison stars, this data is recorded in the database. Thus, if someone wants to recalculate the magnitude using different reference magnitudes, or to exclude one or more of the comparison stars, this is possible. It does not hold information on non-comparison stars (e.g. every star in an image), nor does it hold data on background counts etc. We only require the Julian Date to be submitted, noting that HJD and BJD can be calculated; as a general rule, data that can be calculated from existing data is not stored. The exception to this is the derived magnitude, as different observers could calculate it by different methods, and it is desirable to hold what the observer calculated.
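[Editorial aside: as an illustration only, a submitted record along the lines Andy describes might be sketched as below. The field names are hypothetical and do not reflect the actual BAA database schema.]

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PhotometryRecord:
    # Illustrative only: field names are hypothetical, not the real schema.
    jd: float                # Julian Date (HJD/BJD are derivable, so not stored)
    magnitude: float         # derived magnitude, as calculated by the observer
    uncertainty: float
    filter_band: str         # e.g. "CV" for unfiltered, V-calibrated data
    chart_ref: str           # e.g. "AAVSO X25107PN"
    var_instrumental: Optional[float] = None  # instrumental mag of the variable
    comp_instrumental: dict = field(default_factory=dict)  # comp id -> instrumental mag
```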

As you are doing, it is a good idea to store images including the calibration frames. Then if a recalculation is required, or there is a query from a researcher, it is possible to delve into the detail.

There is no easy or right/wrong answer on whether to submit individual or combined images. Individual images allow researchers to choose how to combine the data (within the limitation of not having the raw images to hand), while an observer who combines them can make life easier for the researcher; the observer will also be most familiar with their setup and images. Personally, I would suggest combining images if your results are dominated by noise.

Cheers,

Andy

Robin Leadbeater
campaign or general use

In this case, since it is a particular campaign with specific goals, I would probably look to the PI for guidance.

In general, though, the processing (including combining) of data for submission to databases where the end use is unknown (and in this case potentially may not be known until all of us are long gone) is an ever-present dilemma. I faced this in real life concerning databases storing vast quantities of process-control and quality time-series data from a continuous process (a paper machine), where, as in astronomy, variations over several orders of magnitude in timescale (from milliseconds to years in that case) are potentially of relevance. Ultimately, storing the data from every exposure and letting the final user make the decision would be ideal (aided perhaps by tools in the database to allow the casual user to view the filtered data). Andy (like our paper-mill IT manager at the time) might baulk at every exposure being measured and stored individually, indefinitely, though! An alternative approach could be to examine the data prior to submission to find the point at which, when combining data, the variation (if any) becomes significant compared with the uncertainty, thus preserving the maximum information while storing the minimum of data.

Cheers

Robin

Andy Wilson
Storing every exposure

There is no problem storing the photometry from every exposure; in fact, I think that is what the majority of observers do. For faint targets, deciding whether and how many sub-frames to combine is always a tricky choice.

Cheers,

Andy

Xilman
Helpful comments.

My thanks to Andy and Robin for their helpful advice. I was aware of what is held in the DB, but that explanation should be useful to others. The suggestion to contact the PI is a good one, and as I will be keeping all the raw (but dark- and flat-calibrated) images, that will remain a possibility for those who may wish to re-analyse my data.

Most people, though, are unlikely to be that finicky and would like an overview of the light curve that is sufficiently good for their purposes without having to spend a great deal of effort. With that in mind, I will stack the images such that the integrated SNR is at least 30, yielding a statistical uncertainty of ~30 mmag, easily good enough to pick up the ±110 mmag variation reported so far. As the database holds an explen field for each entry, I don't see that it matters whether three subs or a dozen are stacked for each individual measurement --- as long as the total duration and effective mid-exposure time are recorded, of course, as they will be.
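[Editorial aside: under the sqrt(N) stacking rule, the "three subs or a dozen" choice above reduces to one line. Illustrative code; the function name is my own.]

```python
import math

def subs_needed(snr_single, snr_target=30.0):
    # Sky-limited stacking: SNR grows as sqrt(N), so reaching snr_target
    # from a given per-sub SNR needs N = ceil((target / single)^2) subs.
    return math.ceil((snr_target / snr_single) ** 2)

# Per-sub SNR of 10 -> 9 subs; per-sub SNR of 20 -> 3 subs,
# consistent with "three subs or a dozen" above.
```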

FWIW, I store essentially every image ever taken, with its metadata, in a PostgreSQL database, because disk space is cheap and I have yet to reach even a terabyte of data. Multiply that by a few hundred observers, especially if they are taking high-frame-rate videos, and I can see it starting to become a problem.

Jeremy
SNR

Paul, I shall be seeking photometry that has the best possible SNR with a suitable time resolution (the two pull in opposite directions!) to refine the ca 3.2 h period. To do this, I will be combining your data with those of other observers to get as long a baseline as possible, so please process your photometry with this in mind. If this is not suitable for the database (bearing in mind Andy's guidance), then please send the data to me directly.

The upshot of all this is that 30 sec exposures are not really long enough, so stacking will help. But the real answer is to get your autoguider online; I wish you luck with that. I have used 120 sec integrations with a C11, but even these are not really long enough. Another observer is using 240 sec integrations.