Forum Replies Created
Grant Privett (Participant)
Still going.
Grant Privett (Participant)
Not sure why I ended up with two copies of the Perl comment. But to clarify: it was around 1996, when I was maintaining some astronomy code for a PPARC project called… Starlink. Code that is still available for Linux.
Grant Privett (Participant)
Have fun with that. I never got over versions 4.99, 5.00 and 5.001 all giving different answers to one script I used. 🙂
Grant Privett (Participant)
The code for plate solving under Python is pretty trivial. I’ve got a copy if you ever want it.
Grant Privett (Participant)
I obviously misunderstood your intent. I thought cost and convenience to the user was everything. Several members, in response to your request, suggested a variety of approaches that you felt failed on cost grounds. I suggested an alternative approach that required only the purchase of a simple camera, a stick-on bubble level and a bit more effort with the software driving the camera. No engineering required. That seemed the cheapest and simplest option for supporting the cash-strapped beginner.
Also, I must apologise, I had not realised that when you grabbed a chunk of the image frame you resized it using a method that so badly affected the apparent performance. The dimmer stars are pretty poorly shown. You might want to use a bicubic spline or similar next time. See attached.
Yes, the image section you now supply is much nicer and, as it happens, the image solved with astrometry.net, so you could obviate the need for charts.
I look forward to seeing Roger‘s design hit the marketplace, but think the commercial mark-up will probably take it above the cost of a new ZWO 120mm or the systems sold by Altair, unless made in bulk.
But we’re into diminishing returns here…
Grant Privett (Participant)
I think I would be a bit worried by that image. Those are first and second magnitude stars! What sort of exposure are you using there? The camera allegedly has a 60% QE and is only 8-bit, but is apparently good for exposures up to 60s. I’m kind of hoping that’s a 1/25th sec exposure, as I would really expect to see more than that.
On plate solving: it’s the elegant way to solve the problem. A stick-on spirit level is hardly a demanding setup, and you don’t need to buy a finder as you just use the main scope, nor do you need to consult charts – so it’s a cheaper solution too.
Grant Privett (Participant)
Still looking forward to seeing one of the images.
Grant Privett (Participant)
I think you will find that the people on here were all beginners once, and most did star hopping and remember it well. Some still go that way by choice. I myself didn’t use a GOTO in anger until about 2008 – 37 years after I first had a telescope. While I wanted an LX200 from their introduction onward, mortgages got in the way… An EQ6 eventually proved affordable and is still what I lug outside at night.
My experience suggests that Astrometry.net will generally solve any image containing more than 10 stars with an SNR > 5. I’ve seen lots of frames solve with 8 stars – but it may take longer. So, what size webcam sensor are you thinking of then? In a 1 sec exposure with a 100mm aperture (£60 from a car boot sale) and an uncooled Lodestar I would expect to image stars down to 12th–13th mag; the 100mm gave an ~30 arcminute field. So what field of view are we thinking of for the webcam? Could you post one of his pictures?
Also, the idea of using the main scope was mainly for simplification, i.e. no need for a finder scope, and the camera used could also be used for object imaging.
If you had an alt-az mount set up roughly level or a German equatorial aligned by eye to the pole star, by recursively taking images and adjusting the scope slow motions (as indicated by software) either in alt/azimuth or RA/Dec – you would move toward the correct position even if the scope was not properly aligned. The better aligned the quicker it would succeed, but it would still work in a few iterations (unless your alignments were hugely out – a spirit level for alt/az and the pole star would avoid that).
The lack of blobbiness might be explained by the colour webcam having an integral IR blocker (quite common) but the cost of that is lower camera sensitivity and fewer stars.
Re: eyeballing. You don’t need to know where the telescope is pointing. If you plate solve, the software will know where it’s pointing from the plate solution for the middle pixel of the image. Thus it can tell you in which direction to move the slow motions. You just need to move in roughly the right direction.
From where I’m standing – beyond using a guide camera instead of a normal Zoom/Skype webcam – the issue isn’t so much the equipment as the software to use it. It’s fairly simple though.
Grant Privett (Participant)
I’ve just realised I missed something in this.
Why not put the camera straight onto the main telescope, display the image it generates, plate solve via a Cygwin installation of Astrometry.net and have the software tell you how far to move in RA and Dec to get to the right place? Most Astrometry.net solutions take only 10 secs on my 5-year-old laptop, so you could point at the rough location and be there 2–3 minutes later.
If you are not autoguiding why have the finder guidescope at all? Eyeballing along the tube would give a good start point.
Grant Privett (Participant)
Silly question, but why stay with a small refractor when a Meade 85mm reflector (or similar) could be used instead?
I’ve tried CCDs on cheap small aperture refractors and the star images are distinctly blobby because the different colours come to different focuses.
I’ve used a reflector as my autoguider/finder for years – run by my own code – and found using a small reflector, with its less blobby stars, more sensitive than when I used the refractor(s).
Would also say: avoid colour cameras as far as possible; the Bayer matrix reduces sensitivity a lot.
Intrigued by “Star hopping plan”. I just used a copy of Uranometria…
5 May 2021 at 11:55 am, in reply to: Introducing MetroPSF – a program for ensemble photometry (#584173)
Grant Privett (Participant)
I look at which point is furthest from the generated fit and remove that.
Then I recalculate the fit and repeat until a decent regression coefficient is achieved and/or errors are below a threshold and sufficient stars remain.
It’s not an ideal approach and it has trouble when there are few stars in the scene, but the results I got suggested it was doing quite a good job – certainly much better than not culling the outliers. It may not make a huge difference generally, but it’s easy to code up and test, so possibly worth a look as a potential refinement.
I had expected that stars with the most extreme colour indices would be causing the data points far from the curve but, while they were not great, they were not always the worst outliers.
Could probably dig out the code if you wanted it. I was using Gaia DR2 and cooled sensor data captured using a normal Silicon CCD.
4 May 2021 at 10:48 pm, in reply to: Introducing MetroPSF – a program for ensemble photometry (#584168)
Grant Privett (Participant)
I think the version I use is a standard conversion of the DAOFIND routine to Python. I use it to provide the positions of the stars and then do photometry on them.
After that I compare my coordinates with those of stars in Gaia DR2 and generate the matches. So I then have measured flux versus catalogue mag.
I then fit a linear regression and recursively remove the outliers. I’m fairly sure I found that more successful than using weightings. I had expected extreme colour index stars to cause problems too but that had a relatively minor impact.
I think in my process any star with a peak brightness >50,000 was excluded from the linear regression. As you say, setting a magnitude limit should work, but the count check was easy.
Grant Privett (Participant)
Hi there,
Not quite sure I understand what your code does.
Differential photometry using a Poissonian/Gaussian profile, and then using Gaia DR2 to get the magnitude of the reference star?
How does the fitted result differ from what you would get using DAOPHOT?
Also, do you set a value that allows stars that are nearing saturation to be ignored?
Is the star to be measured denoted by hand or by RA/Dec somehow?
It does look like code people would find useful.
Grant Privett (Participant)
That’s rather nice. Must have a go at that.
Saturday best bet here too. It’s traditional though: we’re past full Moon, hence it’s cloudy.
Grant Privett (Participant)
It’s being reported that SpaceX have gained approval to drop the altitude of the constellation to reduce the latency of the system. That will make all the satellites brighter.
Some info here:
Grant Privett (Participant)
As I recall, the Flyeye guys were making some pretty impressive performance claims when this was first announced. It will be interesting to see if they achieve them.
Oddly, I recall the whole system being described as automatic rather than human assisted, but I could easily be wrong. Human intervention would certainly make high sensitivity performance more attainable.
Grant Privett (Participant)
Evening Owen,
Thanks for the heads-up. By chance, just a couple of weeks ago I was talking to Martin about his code. I was lending a hand in his attempt to get the Lodestar running happily on Windows. We were having fun trying to get PyUSB and the sxccd code on GitHub working okay. It seemed to work beautifully on Linux but is not always a happy camper on Windows, with problems depending on which USB backend is in use on your system. I think he is taking the ASCOM route (though I may be wrong), which I will be curious to see, but would prefer to avoid.
I must admit that after several problems with different approaches, I may just cheat and write a VB6 32-bit non-GUI executable that I can talk to via environment variables, and set that running to take pictures, telling it to stop / reconfigure as needed. I can then shell/spawn/subprocess that from nearly any language I choose, and it will be okay until Microsoft declares end of life for 32-bit processes in W10 (hopefully at least 10 years hence). Would still rather do it properly, but am having a terrible time getting there… and, frankly, there are things a lot more fun to do.
Clear sky outside … and 99% full Moon. It was ever thus.
Grant Privett (Participant)
Would be nice to see an agenda and list of speakers…
Grant Privett (Participant)
As late as that! I really had not realised.
Grant Privett (Participant)
Got to say that looks fun, and I was seriously tempted, but I have too much going on despite (or perhaps because of) being now semi-retired.
Would suggest that the project is best suited to someone who programs for fun, is familiar with Python (well beyond the “Hello World” stage), has had previous experience automating equipment control and has very good attention to detail.
When you think you have found all the ways a control program can fail, the real world has half a dozen more saved up for a rainy day.