Forum Replies Created
Grant Privett (Participant)
A once-a-month 120 s shot of Gyulbudaghian’s Nebula would be very welcome in the Deep Sky Section, I imagine…
Grant Privett (Participant)
Looked for Mayak with 10×50 bins during the 00:06 pass tonight and saw nothing. The sky was hazy, so the limiting mag was only 6.5–7.0, but nothing was seen at all. I assume it has yet to deploy.
Has anyone had any luck yet?
Grant Privett (Participant)
Am I reading this right? It’s accumulating signal from a bunch of short exposures, stacking them and then displaying the result on a small screen inside an eyepiece? Quite neat, then. Impressive in something so compact.
No different to looking at your laptop screen, of course, but easier to work with.
Grant Privett (Participant)
“Mayak will stay on orbit for one month. After termination of term of use it will be deorbited and burnt in the atmosphere.”
That was on their website. I take that to mean it will exist as a simple cubesat for a while (a month?), collecting data. Then, when that mission is achieved, it will deploy the sail to deorbit. I imagine a month also gives them a chance to get a good handle on how the orbit would decay without the sail, so that after deployment the change becomes more apparent.
Cubesats can be pretty dim, so an up-to-date TLE, good pointing accuracy and a GPS receiver setting your system clock are essentials for a tracking system.
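To make the timing point concrete, here is a minimal Python sketch of the prediction a tracking system has to make, using the Skyfield library. The TLE filename and the site coordinates are placeholders, not anything from the post.

```python
# Minimal sketch: predict a cubesat's current alt/az from a fresh TLE using the
# Skyfield library. "mayak.tle" and the site coordinates are placeholders.
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()

# Read an up-to-date three-line element set (name + two TLE lines) from a file.
with open("mayak.tle") as f:
    name, line1, line2 = [line.strip() for line in f][:3]
sat = EarthSatellite(line1, line2, name, ts)

# Observer site: example latitude/longitude in degrees, elevation in metres.
site = wgs84.latlon(51.0, -1.8, elevation_m=100)

# Topocentric position "now" - this is where the GPS-disciplined clock matters,
# since a few seconds of timing error moves a LEO satellite by tens of km.
t = ts.now()
alt, az, distance = (sat - site).at(t).altaz()
print(f"Alt {alt.degrees:5.1f} deg  Az {az.degrees:5.1f} deg  Range {distance.km:6.0f} km")
```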
Grant Privett (Participant)
It may even be right.
I observed the Canadian CanX-7 satellite, which had a smaller solar sail deployed, and during one pass it became as bright (for a second) as Regulus – perhaps because it was pretty obviously tumbling at that point.
Should be worth a look. Is there a date yet for when it deploys?
Grant Privett (Participant)
An image of C/2017 K2 at mag 18.9 and 15.9 AU from the Sun, as seen through thin cloud (it was supposed to be clear but clouded over while I was setting up – not exactly uncommon round here) with a 10″ RC and a Trius 694 camera.
I additive-stacked 120 s exposures tracked at star rate only. The variable transparency would have played hell with a median stack.
The position is close to the prediction in RA but – if Astrometry.net is right – not quite right in Dec. My SNR was too poor for a clearer assessment.
The frames need to be aligned on the comet’s motion for a better result.
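For anyone wanting to try that alignment, here is a rough Python sketch of an additive stack shifted along the comet’s predicted motion. The filenames, the JD header keyword and the pixel rates are illustrative only, not taken from the post.

```python
# Rough sketch of an additive stack aligned on a comet's predicted motion.
# Assumes frames already calibrated and registered on the stars.
import numpy as np
from astropy.io import fits
from scipy.ndimage import shift

rate_x, rate_y = -3.2, 1.1                          # comet motion in pixels/hour (example)
paths = [f"comet_{i:03d}.fits" for i in range(12)]  # hypothetical filenames

stack, t0 = None, None
for path in paths:
    with fits.open(path) as hdul:
        data = hdul[0].data.astype(np.float64)
        t = hdul[0].header["JD"]                # assumes each header carries a Julian Date
    if stack is None:                           # first frame sets the reference epoch
        stack, t0 = np.zeros_like(data), t
    hours = (t - t0) * 24.0
    # Shift the frame back along the comet's track so the comet stays on the
    # same pixels in every frame; the stars trail instead.
    aligned = shift(data, (-rate_y * hours, -rate_x * hours), order=1, mode="nearest")
    stack += aligned                            # additive (summed) stack, as in the post

fits.writeto("comet_sum.fits", stack, overwrite=True)
```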
Grant Privett (Participant)
Does IRAF allow filename wildcards to avoid the need to name the files in a .dat file?
Grant Privett (Participant)
That sounds like IRAF. I recall trying to fit a high-order polynomial to a dataset and IRAF crashed inelegantly. I had asked for a higher order than the data would permit but, rather than tell me so, it just died instead. It’s very powerful, but assumes that those using it don’t do things that are daft.
It’s worth learning…
Cannot say whether there was a GUI. You would think so after all these years, but professional astronomers are perfectly happy with personalised batch-file processing chains akin to a DOS .bat file. It’s us Windows users who have gone soft 🙂
Grant Privett (Participant)
I was struck by how obvious the 17th mag SN was, despite being seen against the spiral arm.
Taken with a Starlight Trius 694 and an Altair 10″ RC. 8 × 60 s exposures centred on 22:41:39 UT on 26th May 2017.
Grant Privett (Participant)
What have vacuum cleaners got to do with this? 🙂
Grant Privett (Participant)
Sounds really worthwhile and practical. Which university was it with?
Grant Privett (Participant)
Yes, lasers can mess up CCDs – if you dump enough coherent energy into a device, it is no big surprise that it can fail.
I’m not sure what energy density is required, though.
Grant Privett (Participant)
I assume they are worried that a visual observer who happened to be looking down the barrel of a space-based LIDAR laser beam might suffer eye damage if viewing it directly through a large telescope with the naked eye. CCDers would not be under threat.
After all, even if you were using a laser of a wavelength generally deemed “eye safe”, the light grasp of a 12″ could massively increase the number of photons reaching your eye – roughly (300/7)² ≈ 1,800 times more than a 7 mm dark-adapted pupil collects.
EDIT: Should have read the very end of the survey. They are indeed trying to model the risk to ground observers.
Grant Privett (Participant)
That’s a very worthwhile thing to do.
IRAF is used by lots of people to very good effect – especially by professionals. People can save a lot of effort and money this way, but the learning curve can be steep. Did much of the IRAF command-line code get integrated into GUIs?
Could a case be made for doing this under Cygwin? Does IRAF run under that? What are the advantages? I know STARLINK has appeared under Cygwin.
Grant Privett (Participant)
Yep. AstroArt 6 can do photometry, astrometry and relative photometry (batch processing). Look under the Tools options. You have to set up the Star Atlas so it’s looking at the same lump of sky as your image, which can be a little fiddly, but it works well enough. When I was measuring images of PV Cephei it seemed to produce pretty acceptable results – though I cannot say precisely how accurate, as I wasn’t using a V filter.
I find AstroArt very easy to use, but I have been using it since version 3 🙂
Grant Privett (Participant)
Thanks for the sanity check, Callum. I have checked my files: the £15k system I worked with was SWIR, purchased about four years ago, and worked at 12 bits. Didn’t realise they were so much more sanely priced now (though still out of range for us mortals).
Which of the Kites was that price for?
Grant Privett (Participant)
I thought EMCCDs usually clock in at around £15,000, so they are a bit of a niche market. Hope you manage to find a donor.
The QE can be very good – and some go out into the NIR (1.8 µm) – but many are stuck with a maximum integration of 1/25 s, a slightly unpredictable gain control and an 8-bit output format. So for dim targets it’s probably actually cheaper to co-mount two OTAs and use two CCD cameras at the same time (arranged so the start of readout of one initiates the integration of the other).
Nice idea though.
Grant Privett (Participant)
Am curious.
What is it people generally use AIP4WIN for?
Which parts of the image/data process is it seen as essential for?
There are obviously alternatives for much of what it does.
Grant Privett (Participant)
Yep. Defect pixels are those that cannot be relied on: being either too sensitive, too insensitive or just plain wacky in their behaviour. When a chip contains that many pixels it’s not surprising that a few are not perfect.
I also tend to create my defect map using the master dark. I measure the standard deviation and image background (Statistics option if using AstroArt) of the dark and then set the threshold at background + 5 × standard deviation. The Starlight 694 I use doesn’t have many defective pixels and that usually cleans them up. My approach is overkill perhaps, but it’s adaptive and does lend itself to processing automation – I still sometimes use the Starlink CCDPACK image reduction system under Linux.
But if you always use the same camera temperature and binning, then a hardwired threshold derived by experiment should be fine – as Robin has found.
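As a rough illustration of the background + 5 × standard deviation approach above, here is a short Python sketch. The filenames are hypothetical, and sigma-clipped statistics are used as a small refinement so the hot pixels themselves don’t inflate the estimates.

```python
# Sketch of the defect-map idea above: flag any pixel in the master dark that
# sits above background + 5 x standard deviation. Filenames are hypothetical.
import numpy as np
from astropy.io import fits
from astropy.stats import sigma_clipped_stats

dark = fits.getdata("master_dark.fits").astype(np.float64)

# Background and scatter of the dark; sigma clipping keeps the defects
# themselves from inflating the statistics.
mean, median, stddev = sigma_clipped_stats(dark, sigma=3.0)

threshold = median + 5.0 * stddev
defect_mask = dark > threshold               # True where a pixel is unreliable

print(f"Flagged {defect_mask.sum()} of {defect_mask.size} pixels "
      f"({100.0 * defect_mask.mean():.3f}%)")

# Save as 0/1 so later automated processing can apply the mask.
fits.writeto("defect_map.fits", defect_mask.astype(np.uint8), overwrite=True)
```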
Grant Privett (Participant)
It is usually the case that the best way to change an organisation is from within. Why not stand for Council?
Alternatively, write a paper correcting the perceived error or start a thread here.
It’s worth remembering that your suggestion opens the way to the vindictive pedant whose only pleasure is criticism – and astronomy has some of those. We’ve seen it in the past. It’s not fun to watch.