Forum Replies Created
Dr Paul Leyland, Participant
I had similar problems when using the Pale Moon browser (a Firefox derivative). Logging in again fixed it for me.
Seems strange that three separate browsers running on very different hardware should all have the same issue…
Dr Paul Leyland, Participant
“launching astronauts to the International Space Station on Crew Dragon at 21.32 on May 27th”; “The UK has an ISS pass at 21.20,”; “and the[n] go into garden to wave them on there [sic] way.”
Am I missing something? How do I subsequently wave at the crew 12 minutes before they leave the Earth?
Dr Paul Leyland, Participant
Could it be run as a Zoom meeting?
Dr Paul Leyland, Participant
OK, OK, I can take a hint …
😉
Dr Paul Leyland, Participant
“Is anyone else making different observing plans in the current situation?”
Yup, I’m trying to get hold of a telescope to use until I can get back to La Palma. See the “Telescope wanted” thread.
Plans that either come to naught or half a page of scribbled lines.
Hanging on in quiet desperation is the English way.
The time is gone, the song is over, thought I’d something more to say.
Dr Paul Leyland, Participant
Funny you should say that…
Already checked before you posted. Nothing there but I placed an ad in the Wanted forum.
Dr Paul Leyland, Participant
Unfortunately that one has been sold, so I’m still in the market.
Dr Paul Leyland, Participant
My attention has just been drawn to a paper by Jonathan McDowell, available at https://arxiv.org/abs/2003.07446
One of his references is to a paper by Buffon published in 1777.
Dr Paul Leyland, Participant
Actually, gaseous NO_2 is in dynamic equilibrium with its dimer N_2O_4, aka dinitrogen tetroxide. It is the former which has the characteristic brown colour, the dimer being colourless.
I always knew that A-Level chemistry would come in useful one day.
Dr Paul Leyland, Participant
I may be wrong but I believe the nitrogen-oxygen compound in the lower atmosphere is NO_2 (nitrogen peroxide for old-timers, nitrogen dioxide for those born in the last fifty years) and not N_2O (nitrous oxide / dinitrogen oxide to the IUPAC fans).
Be that as it may, satellite monitoring has shown a truly dramatic decrease in the concentration of nitrogen oxides in mainland China recently. https://earthobservatory.nasa.gov/images/146362/airborne-nitrogen-dioxide-plummets-over-china has more detail. (Incidentally it supports my claim that NO_2 is the compound in question.)
Dr Paul Leyland, Participant
I have been using APT (Aperture Photometry Tool) by Russ Laher for all my VS work. It is free, as in both speech and beer, and platform-neutral because it is written in Java. Russ is on the IPAC team and has produced a very fine program. He is responsive to bug reports and feature requests, a few of which I have made.
APT produces output in its own TBL format (basically TSV with an initial couple of comment lines and another at the end), which it can export to CSV for loading into any standard spreadsheet. One nice feature of APT is that a FITS card can be exported into a CSV column — I use it to record either JD or HJD, according to what is desired for later analysis — as well as the complete FITS header, where it is readily available for subsequent processing.
Converting the CSV into a BAA-VSS TSV-format file is easy enough and I will happily provide my script on request. The script uses ensemble photometry to produce an instrumental zero point magnitude from a list of (magnitude, error) pairs for the sequence members and then propagates errors appropriately to the derived magnitude of the VS.
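The ensemble zero-point step has roughly this shape. This is a minimal Python sketch under my own assumptions, not the actual script: the function names and the (catalogue magnitude, instrumental magnitude, error) input layout are illustrative only.

```python
import math

def ensemble_zero_point(sequence):
    """Inverse-variance-weighted instrumental zero point.

    `sequence` is a list of (catalogue_mag, instrumental_mag, error)
    tuples for the comparison stars. Each star votes for a zero point
    of (catalogue - instrumental), weighted by 1/error^2.
    """
    weights = [1.0 / (err * err) for _, _, err in sequence]
    zps = [cat - inst for cat, inst, _ in sequence]
    wsum = sum(weights)
    zp = sum(w * z for w, z in zip(weights, zps)) / wsum
    zp_err = math.sqrt(1.0 / wsum)  # standard error of the weighted mean
    return zp, zp_err

def vs_magnitude(inst_mag, inst_err, zp, zp_err):
    """Apply the zero point and propagate the errors in quadrature."""
    return inst_mag + zp, math.sqrt(inst_err ** 2 + zp_err ** 2)
```

Using more comparison stars drives the zero-point uncertainty down, which is the point of ensemble photometry over a single comparison star.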
I have another script which takes an AAVSO VSP photometry web page and generates the correct source list and sequence data files for APT and the script noted above respectively. Naturally this script is also freely available.
Dr Paul Leyland, Participant
I’m a frayed knot. I haven’t shaved since the summer of 1976.
Hmm, looks like this could be me. I have changed my appearance a lot in 30 years and I’m not sure I recognise myself any more.
Dr Paul Leyland, Participant
Would you expand, please, on why you question those statements?
Dr Paul Leyland, Participant
Well, my forum avatar is a mugshot of me. Unlikely to be of much use though unless you had a 4d-aware camera at the time. 😉
Dr Paul Leyland, Participant
I attended the meeting of 1991-09-20/22 in Durham so there’s a chance that I may be on some of the photos. Not spotted me yet but there are a number of folks barely visible behind others. Perhaps I am on some of the unpublished work.
Also a nice motorbike ride from Bucks. as I remember.
Dr Paul Leyland, Participant
Alas poor Heather! I knew her, David, a woman of infinite jest, of most excellent fancy.
Heather Couper and I knew each other from our Oxford days. She once gave a talk to OUAS (the Oxford University Astronomical Society) and greeted me with the phrase: “Hello Paul, my old sausage, how are you doing?”. This caused a little surprise in the people near by, one of whom asked me: “What did she just call you?”.
She was often called “Heather Cowpat”, but never in her hearing AFAIK.
Very sad.
Dr Paul Leyland, Participant
“no reason why we can’t regard the first decade of the Common Era as having only 9 years”. I can think of an excellent reason and it is entirely a matter of etymology. “Deca”, from the Greek Δεκα, meaning “ten”.
If you wish to refer to the first few years CE as a nonade please go ahead and do so — I will support you whole-heartedly.
Dr Paul Leyland, Participant
My thanks to Andy and Robin for their helpful advice. I was aware of what is held in the DB, but that explanation should be useful to others. The suggestion to contact the PI is a good one and, as I will be keeping all the raw (but calibrated with darks and flats) images, that will remain a possibility for those who may wish to re-analyse my data.
Most people, though, are unlikely to be that finicky and would like an overview of the light curve which is sufficiently good for their purposes without having to spend a great deal of effort. With that in mind, I will stack the images such that the integrated SNR is at least 30, yielding a statistical uncertainty of ~30 mmag, easily good enough to pick up the ±110 mmag variation reported so far. As the database holds an explen field for each entry, I don’t see that it matters whether three subs or a dozen are stacked for each individual measurement — as long as the total duration and effective mid-exposure time is recorded of course, as it will be.
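For anyone checking the arithmetic: with uncorrelated noise, SNR grows as the square root of the number of stacked subs, and a magnitude error is roughly 2.5/ln(10) ≈ 1.086 divided by SNR. A small hedged sketch (function names are my own):

```python
import math

def subs_needed(target_snr, sub_snr):
    """Subs to stack so the combined SNR reaches the target,
    assuming uncorrelated noise so SNR scales as sqrt(N)."""
    return math.ceil((target_snr / sub_snr) ** 2)

def mag_uncertainty(snr):
    """Approximate 1-sigma magnitude error: (2.5 / ln 10) / SNR."""
    return 2.5 / math.log(10) / snr
```

So subs at SNR 10 need a stack of 9 to reach SNR 30, which in turn corresponds to a statistical uncertainty of about 36 mmag.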
FWIW, I store essentially every image ever taken, with its metadata in a PostgreSQL database, because disk space is cheap and I have yet to reach even a terabyte of data. Multiply that by a few hundred observers, especially if they are taking high frame rate videos, and I can see it starting to become a problem.
Dr Paul Leyland, Participant
Thanks. You understand my proposal and I was specifically thinking of the analysis phase. That should have been made clearer.
Unfortunately from the analysis point of view, the database stores only a magnitude and its uncertainty at a particular JD. The last is an annoyance but nothing more as conversion to the more useful HJD or BJD is a tedious but straightforward computation. The first two destroy some of the information held by the intensity counts of the VS, the sequence stars, local sky backgrounds and the latter’s variance. Further, it does not store the data pertaining to other stars in the image which can be used — with care — to give a tighter estimate of the ZPM and its uncertainty through ensemble photometry. Needless to say, I keep all the raw data around for subsequent re-analysis. Please note that I am not saying that the database structure should be changed!
Perhaps I should submit the raw 30s results with their 0.05 to 0.1 magnitude uncertainties and let others smooth them as they wish for their analysis. I can pretty much guarantee the 3.2-hour variability will not be visible without such smoothing.
Dr Paul Leyland, Participant
More 30s captures of U Leo are coming in to add to the hundreds already stored. This is all rather tedious, so I have been thinking about data processing and, in particular, about stacking the images. The SNR of each sub varies between about 10 and about 20, depending on sky transparency, air mass, etc. A desirable SNR might be 30-50, so perhaps 9 should be stacked.
So far, so simple. I could just divide up each night’s observations into consecutive and non-overlapping sub-sequences of length 9, stack them and use the mid-exposure time as the date of each measurement. However, it seems that there is nothing particularly magic about any particular selection of 9 consecutive images. If the first image, say, was deleted by accident the process would yield much the same output but would contain a sample of the light curve slightly displaced from that which would have been created from the full set of images.
If this is the case, why not create 9 light curves, each with a temporal displacement from its neighbour set by the cadence of the subs?
Each curve would have no more information than any other, and the sum of them would be just as noisy in intensity values but would have more temporal sampling. Might not this assist subsequent processing to extract a smoother light curve from the noisy data? If not, what am I missing? To me it appears related to a simple running-average smoothing process. It’s late and I am sleepy so could well be overlooking something which should be obvious.
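A toy numerical check of that intuition (assuming simple unweighted means on a synthetic sequence, not real photometry) shows that the nine offset stacks, taken together, sample exactly the same values as a width-9 running mean:

```python
def running_mean(xs, w):
    """Running (boxcar) mean of width w, one sample per start index."""
    return [sum(xs[i:i + w]) / w for i in range(len(xs) - w + 1)]

def offset_stacks(xs, w, offset):
    """Non-overlapping stacks of width w whose first stack starts at
    `offset`; returns (start_index, mean) pairs."""
    return [(i, sum(xs[i:i + w]) / w)
            for i in range(offset, len(xs) - w + 1, w)]

# Combine the w temporally displaced stack sequences: every start
# index 0..len(xs)-w appears in exactly one of them, so together they
# reproduce the running mean point for point.
def combined_curves(xs, w):
    out = {}
    for k in range(w):
        for i, v in offset_stacks(xs, w, k):
            out[i] = v
    return out
```

So the nine displaced light curves carry no new intensity information, but their union is precisely the densely sampled running average, which is why the scheme behaves like boxcar smoothing.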