Forum Replies Created
Grant Privett
Participant
An f/8 315mm in a 2.1m dome? That really is tight. Did you add tube counterweights to balance at the midpoint? How frequently did you need to tweak the dome position?
My dome certainly isn’t new, but it is in pretty good condition, so I would prefer to avoid major surgery.
I have a plan for the dome motor – it will probably need a couple of attempts to get it right, but a solution is possible. Finding the time is another matter. The autumn, when the gardening duties slacken off, is probably the best bet.
I need to spend some quality time with a Raspberry Pi and some NEMA 23 motors too.
EDIT: Someone kindly pointed out that the 315mm is a Planewave CDK and comparable in length with a C14. Very nice bit of kit, but a bit out of my budget range. 🙂
Grant Privett
Participant
Not a problem. I made exactly the same mistake myself at one point. 🙂
Grant Privett
Participant
I can certainly see a way of doing it with a couple of stepper motors/drives, a Raspberry Pi and a small amount of dome bodging. The metal bashing is the tough bit. I think I have a plan for that, but I may have to keep making 10-minute manual adjustments until then, which are not fun in the winter.
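In case it helps anyone pondering the same job, the motor-driving side really is modest. Below is a minimal sketch of pulsing a NEMA 23 through a step/dir driver from a Raspberry Pi; the pin numbers, step counts and timings are illustrative assumptions, not a tested dome design.

```python
# A minimal sketch of nudging a dome with a NEMA 23 stepper through a
# step/dir driver from a Raspberry Pi. Pin numbers, step counts and
# pulse timings are illustrative assumptions, not a tested design.
import time
import RPi.GPIO as GPIO

STEP_PIN = 20   # assumed BCM pin wired to the driver's STEP input
DIR_PIN = 21    # assumed BCM pin wired to the driver's DIR input

GPIO.setmode(GPIO.BCM)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def rotate(steps, clockwise=True, pulse_s=0.001):
    """Issue 'steps' pulses to the driver; one pulse is one motor step."""
    GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(pulse_s)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(pulse_s)

try:
    rotate(400, clockwise=True)   # e.g. nudge the dome slit round a little
finally:
    GPIO.cleanup()
```

The dome logic proper – how many steps per degree of dome rotation, and when to nudge to keep the slit ahead of the telescope – would sit on top of something like that.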
I certainly take the point that 11″ would be better. I rather fancied the RASA, but the field of view is perhaps a bit big unless what you want is pretty extended nebulae – which isn’t quite me. I don’t know anyone who makes an 11″ f/4-f/5 Newt, and a 250mm took too long to reach mag 20 for my taste.
More importantly, rats attacked the dome last night and tried to gnaw their way in through my lovingly applied mastic… Ho hum.
Grant Privett
Participant
Small breaks in cloud from Broad Chalke near Salisbury. Got one or two tolerable pics. Very civilised time here: feet up, glasses on and a cup of tea in hand as we sat in deck chairs and gawped.
Grant Privett
Participant
Still going.
Grant Privett
Participant
Not sure why I ended up with 2 copies of the Perl comment. But to clarify, it was in about 1996 when I was maintaining some astronomy code for a PPARC project called… Starlink. Code that is still available for Linux.
Grant Privett
Participant
Have fun with that. I never got over versions 4.99, 5.00 and 5.001 all giving different answers to one script I used. 🙂
Grant Privett
Participant
The code for plate solving under Python is pretty trivial. I’ve got a copy if you ever want it.
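For the curious, a minimal sketch of the sort of thing involved is below: it shells out to a locally installed Astrometry.net solve-field and reads the resulting WCS back with astropy. The file names and plate-scale limits are illustrative assumptions rather than my actual script.

```python
# Minimal plate-solving sketch: run a local Astrometry.net solve-field
# and report the sky position of the frame centre. Paths, file names and
# plate-scale limits are illustrative assumptions.
import subprocess
from astropy.io import fits
from astropy.wcs import WCS

image = "frame.fits"  # hypothetical input frame

# Constraining the plate scale (arcsec/pixel) speeds up the solve considerably.
subprocess.run(
    ["solve-field", image, "--overwrite", "--no-plots",
     "--scale-units", "arcsecperpix", "--scale-low", "1.0", "--scale-high", "3.0"],
    check=True,
)

# solve-field writes a <name>.wcs file containing the fitted WCS header.
wcs = WCS(fits.getheader("frame.wcs"))
with fits.open(image) as hdul:
    ny, nx = hdul[0].data.shape
ra, dec = wcs.all_pix2world(nx / 2, ny / 2, 0)
print(f"Frame centre: RA {ra:.4f} deg, Dec {dec:.4f} deg")
```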
Grant Privett
Participant
I obviously misunderstood your intent. I thought cost and convenience to the user were everything. Several members, in response to your request, suggested a variety of approaches that you felt failed on cost grounds. I suggested an alternative approach that required only the purchase of a simple camera, a stick-on bubble level and a bit more effort with the software driving the camera. No engineering required. That seemed the cheapest and simplest option for supporting the cash-strapped beginner.
Also, I must apologise: I had not realised that when you grabbed a chunk of the image frame you resized it using a method that so badly affected the apparent performance. The dimmer stars are shown pretty poorly. You might want to use a bicubic spline or similar next time. See attached.
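For what it’s worth, most imaging libraries make the interpolation method a one-line choice. A hedged example using OpenCV is below; the file name and scale factor are just placeholders.

```python
# Resizing sketch: nearest-neighbour vs bicubic interpolation.
# The input file and scale factor are illustrative placeholders.
import cv2

img = cv2.imread("star_field.png", cv2.IMREAD_GRAYSCALE)
scale = 0.25  # e.g. shrinking a frame grab to a quarter of its size

# Nearest-neighbour drops or duplicates pixels, so faint stars vanish or block up.
blocky = cv2.resize(img, None, fx=scale, fy=scale, interpolation=cv2.INTER_NEAREST)

# Bicubic interpolation works over a 4x4 neighbourhood and keeps dim stars visible.
smooth = cv2.resize(img, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)

cv2.imwrite("blocky.png", blocky)
cv2.imwrite("smooth.png", smooth)
```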
Yes, the image section you now supply is much nicer and, as it happens, the image solved with astrometry.net, so you could obviate the need for charts.
I look forward to seeing Roger’s design hit the marketplace, but think the commercial mark-up will probably take it above the cost of a new ZWO 120mm or the systems sold by Altair, unless it is made in bulk.
But we’re into vanishing returns here…
Grant Privett
Participant
I think I would be a bit worried by that image. Those are first and second magnitude stars! What sort of exposure are you using there? The camera allegedly has a 60% QE and is only 8-bit, but is good for exposures up to 60s apparently. I’m kind of hoping that’s a 1/25th sec exposure, as I would really expect to see more than that.
On plate solving: it’s the elegant way to solve the problem. A stick-on spirit level is hardly a demanding setup, you don’t need to buy a finder as you just use the main scope, and you don’t need to consult charts either – so it’s a cheaper solution too.
Grant Privett
Participant
Still looking forward to seeing one of the images.
Grant Privett
Participant
I think you will find that the people on here were all beginners once and most did star hopping and remember it well. Some still go that way by choice. I myself didn’t use a GOTO in anger until about 2008 – 37 years since I first had a telescope. While I wanted an LX200 from their introduction onward, mortgages got in the way… An EQ6 eventually proved affordable and is still what I lug outside at night.
My experience suggests that Astrometry.net will generally solve any image containing more than 10 stars with an SNR > 5. I’ve seen lots of frames solve with 8 stars – but it may take longer. So, what size webcam sensor are you thinking of? With a 1 sec exposure through a 100mm aperture (£60 from a car boot sale) and an uncooled Lodestar, I would expect to image stars down to 12th-13th mag; with the 100mm that gave a field of roughly 30 arc minutes. So what field of view are we thinking of for the webcam? Could you post one of his pictures?
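To make the field-of-view question concrete, a quick back-of-envelope sketch is below. The focal length, sensor dimensions and pixel size are made-up illustrative numbers, not a recommendation for any particular camera.

```python
# Back-of-envelope field-of-view sketch for a small camera on a telescope.
# The focal length, sensor size and pixel size below are illustrative
# assumptions, not anyone's actual kit.
import math

focal_length_mm = 500.0               # e.g. a short 100mm-aperture achromat
sensor_w_mm, sensor_h_mm = 4.8, 3.6   # a typical small webcam-class chip
pixel_um = 3.75

# Plate scale in arcsec/pixel: 206265 * pixel size / focal length (same units).
plate_scale = 206.265 * pixel_um / focal_length_mm
fov_w = math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_length_mm))) * 60  # arcmin
fov_h = math.degrees(2 * math.atan(sensor_h_mm / (2 * focal_length_mm))) * 60

print(f"Plate scale: {plate_scale:.2f} arcsec/pixel")
print(f"Field of view: {fov_w:.1f} x {fov_h:.1f} arcmin")
```

With those example numbers you get roughly 33 x 25 arc minutes, i.e. in the same ballpark as the 30 arc minute field mentioned above.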
Also, the idea of using the main scope was mainly for simplification, i.e. no need for a finder scope, and the camera used could also be used for object imaging.
If you had an alt-az mount set up roughly level, or a German equatorial aligned by eye on the pole star, then by recursively taking images and adjusting the scope slow motions (as indicated by the software) in alt/azimuth or RA/Dec, you would move toward the correct position even if the scope was not properly aligned. The better the alignment, the quicker it would succeed, but it would still work in a few iterations (unless your alignment was hugely out – a spirit level for alt/az and the pole star would avoid that).
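As a rough illustration of that loop – capture_frame() and plate_solve() here are hypothetical stand-ins for whatever camera grab and Astrometry.net wrapper are actually used, and the tolerance is an arbitrary figure:

```python
# Sketch of the recursive "image, solve, nudge" loop. capture_frame() and
# plate_solve() are hypothetical stand-ins for the real camera grab and
# Astrometry.net call; the tolerance is an arbitrary illustrative value.
from astropy.coordinates import SkyCoord
import astropy.units as u

def capture_frame():
    """Hypothetical stand-in for the real camera capture."""
    raise NotImplementedError

def plate_solve(frame) -> SkyCoord:
    """Hypothetical stand-in for an Astrometry.net solve of the frame centre."""
    raise NotImplementedError

def goto_by_solving(target: SkyCoord, max_iterations=10, tolerance_arcmin=2.0):
    for _ in range(max_iterations):
        frame = capture_frame()        # grab a short exposure
        centre = plate_solve(frame)    # SkyCoord of the frame centre
        if centre.separation(target) < tolerance_arcmin * u.arcmin:
            print("On target.")
            return True
        # Offset from where we are pointing to where we want to be.
        d_ra, d_dec = centre.spherical_offsets_to(target)
        print(f"Move {d_ra.to_value(u.arcmin):+.1f} arcmin in RA, "
              f"{d_dec.to_value(u.arcmin):+.1f} arcmin in Dec")
        # The user (or a motor) applies the correction, then we image and solve again.
    return False
```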
The lack of blobbiness might be explained by the colour webcam having an integral IR blocker (quite common) but the cost of that is lower camera sensitivity and fewer stars.
Re: eyeballing. You don’t need to know where the telescope is pointing. If you plate solve, the software will know where it’s pointing from the plate solution for the middle pixel of the image. Thus it can tell you in which direction to move the slow motions. You just need to move in roughly the right direction.
From where I’m standing – beyond using a guide camera instead of a normal Zoom/Skype webcam – the issue isn’t so much the equipment as the software to use it. It’s fairly simple though.
Grant Privett
Participant
I’ve just realised I missed something in this.
Why not put the camera straight onto the main telescope, display the image it generates, plate solve via a Cygwin installation of Astrometry.net, and have the software tell you how far to move in RA and Dec to get to the right place? Most Astrometry.net solutions only take 10 secs on my five-year-old laptop, so you could point at the rough location and be there 2-3 minutes later.
If you are not autoguiding, why have the finder guidescope at all? Eyeballing along the tube would give a good starting point.
Grant Privett
Participant
Silly question, but why stay with a small refractor when a Meade 85mm reflector (or similar) could be used instead?
I’ve tried CCDs on cheap small-aperture refractors and the star images are distinctly blobby because the different colours come to focus at different points.
I’ve used a reflector as my autoguider/finder for years – run by my own code – and found the small reflector, with its less blobby stars, more sensitive than the refractor(s) I used before it.
I would also say: avoid colour cameras as far as possible; the Bayer matrix reduces sensitivity a lot.
Intrigued by “Star hopping plan”. I just used a copy of Uranometria…
5 May 2021 at 11:55 am in reply to: Introducing MetroPSF – a program for ensemble photometry #584173
Grant Privett
Participant
I look at which point is furthest from the generated fit and remove that.
Then I recalculate the fit and repeat until a decent regression coefficient is achieved and/or errors are below a threshold and sufficient stars remain.
It’s not an ideal approach and has trouble when there are few stars in the scene, but the results I got suggested it was doing quite a good job – certainly much better than not culling the outliers. It may not make a huge difference generally, but it’s easy to code up and test (a rough sketch is below), so possibly worth a look as a potential refinement.
I had expected stars with the most extreme colour indices to be the data points far from the fit, but while they were not great, they were not always the worst outliers.
I could probably dig out the code if you wanted it. I was using Gaia DR2 and data from a cooled, conventional silicon CCD.
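In the meantime, a rough sketch of the culling loop is below. The thresholds are illustrative assumptions, not the values from my own code or anything in MetroPSF.

```python
# Sketch of recursive outlier rejection on a magnitude calibration fit.
# The thresholds (r^2, residual limit, minimum star count) are
# illustrative assumptions, not the values used in any real pipeline.
import numpy as np
from scipy.stats import linregress

def calibrate(instrumental_mag, catalogue_mag,
              min_r2=0.995, max_resid=0.05, min_stars=10):
    x = np.asarray(instrumental_mag, dtype=float)
    y = np.asarray(catalogue_mag, dtype=float)
    while True:
        fit = linregress(x, y)
        resid = np.abs(y - (fit.slope * x + fit.intercept))
        # Stop when the fit is good enough or too few stars remain.
        if (fit.rvalue ** 2 >= min_r2 and resid.max() <= max_resid) or len(x) <= min_stars:
            return fit, len(x)
        worst = np.argmax(resid)   # drop the single worst outlier, then refit
        x = np.delete(x, worst)
        y = np.delete(y, worst)
```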
4 May 2021 at 10:48 pm in reply to: Introducing MetroPSF – a program for ensemble photometry #584168
Grant Privett
Participant
I think the version I use is a standard conversion of the DAOFIND routine to Python. I use it to provide the positions of the stars and then do photometry on them.
After that I compare my coordinates with those of stars in Gaia DR2 and generate the matches. So I then have measured flux versus catalogue mag.
I then fit a linear regression and recursively remove the outliers. I’m fairly sure I found that more successful than using weightings. I had expected extreme colour index stars to cause problems too but that had a relatively minor impact.
I think in my process any star with a peak brightness >50,000 counts was excluded from the linear regression. As you say, setting a magnitude limit should work, but the count check was easy.
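For anyone following along, the detect/measure/cull stage looks roughly like the sketch below in Python with photutils. The 50,000-count peak cut follows the figure mentioned above; the FWHM, detection threshold and aperture radius are illustrative assumptions rather than MetroPSF’s actual settings.

```python
# Sketch of the detect / measure / cull-saturated stage using photutils.
# FWHM, detection threshold and aperture radius are illustrative
# assumptions; the 50,000-count cut mirrors the figure quoted above.
import numpy as np
from astropy.io import fits
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder
from photutils.aperture import CircularAperture, aperture_photometry

data = fits.getdata("frame.fits").astype(float)   # hypothetical frame
mean, median, std = sigma_clipped_stats(data, sigma=3.0)

finder = DAOStarFinder(fwhm=3.0, threshold=5.0 * std)
sources = finder(data - median)

# Reject anything approaching saturation before it reaches the regression.
sources = sources[sources["peak"] < 50000]

positions = np.transpose((sources["xcentroid"], sources["ycentroid"]))
apertures = CircularAperture(positions, r=5.0)
phot = aperture_photometry(data - median, apertures)
phot["instrumental_mag"] = -2.5 * np.log10(phot["aperture_sum"])
```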
Grant Privett
Participant
Hi there,
Not quite sure I understand what your code does.
Differential photometry using a Poissonian/Gaussian profile, and then using Gaia DR2 to get the magnitude of the reference star?
How does the fitted result differ from what you would get using DAOPHOT?
Also, do you set a value that allows stars that are nearing saturation to be ignored?
Is the star to be measured denoted by hand or by RA/Dec somehow?
It does look like code people would find useful.
Grant Privett
Participant
That’s rather nice. Must have a go at that.
Saturday is the best bet here too. It’s traditional though: we’re past full moon, hence it’s cloudy.
Grant Privett
Participant
It’s being reported that SpaceX have gained approval to drop the altitude of the constellation to reduce the latency of the system. That will make all the satellites brighter – a rough estimate of how much is sketched below.
Some info here:
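To put a rough number on “brighter”: reflected flux scales roughly with the inverse square of the range, so for an overhead pass the change is easy to estimate. The altitudes below are round illustrative figures, not the values actually filed.

```python
# Rough estimate of how much brighter a satellite gets when its orbit is
# lowered. The altitudes are round illustrative numbers, not the actual
# filed values, and an overhead pass is assumed in both cases.
import math

old_altitude_km = 1100.0   # assumed original shell height
new_altitude_km = 550.0    # assumed lowered shell height

# Reflected flux scales roughly as 1/range^2 for an overhead pass.
delta_mag = 5.0 * math.log10(new_altitude_km / old_altitude_km)
print(f"Brightness change: {delta_mag:+.2f} mag (negative = brighter)")
# With these numbers: about -1.5 mag, i.e. roughly 4x brighter.
```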
Grant Privett
Participant
As I recall, the Flyeye guys were making some pretty impressive performance claims when this was first announced. It will be interesting to see if they achieve them.
Oddly, I recall the whole system being described as automatic rather than human assisted, but I could easily be wrong. Human intervention would certainly make high sensitivity performance more attainable.