Forum Replies Created
Grant Privett (Participant)
Still looking forward to seeing one of the images.
Grant Privett (Participant)
I think you will find that the people on here were all beginners once, and most did star hopping and remember it well. Some still go that way by choice. I myself didn’t use a GOTO in anger until about 2008 – 37 years after I first had a telescope. While I wanted an LX200 from their introduction onward, mortgages got in the way… An EQ6 eventually proved affordable and is still what I lug outside at night.
My experience suggests that Astrometry.net will generally solve any image containing more than 10 stars with an SNR > 5. I’ve seen lots of frames solve with 8 stars – but it may take longer. So, what size webcam sensor are you thinking of? In a 1 second exposure through a 100mm aperture (£60 from a car boot sale) with an uncooled Lodestar, I would expect to image stars down to 12–13th mag, which with the 100mm gave an ~30 arc minute field. So what field of view are we thinking of for the webcam? Could you post one of his pictures?
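As a rough illustration of that sort of field-of-view estimate, here is a minimal sketch; the 6.4 mm sensor width and 730 mm focal length are assumptions chosen to be roughly Lodestar-plus-100mm-refractor-like, not figures from the post:

```python
import math

def field_of_view_arcmin(sensor_width_mm, focal_length_mm):
    """Approximate field of view, in arc minutes, across one sensor dimension."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))) * 60.0

# Assumed numbers: a ~6.4 mm wide chip behind ~730 mm of focal length
print(f"{field_of_view_arcmin(6.4, 730.0):.0f} arcmin")   # roughly 30 arcmin
```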
Also, the idea of using the main scope was mainly for simplification, i.e. no need for a finder scope, and the camera used could also be used for object imaging.
If you had an alt-az mount set up roughly level, or a German equatorial aligned by eye on the pole star, then by iteratively taking images and adjusting the scope slow motions (as indicated by software) either in alt/azimuth or RA/Dec, you would move toward the correct position even if the scope was not properly aligned. The better the alignment, the quicker it would succeed, but it would still work in a few iterations (unless your alignment was hugely out – a spirit level for alt/az and the pole star would avoid that).
The lack of blobbiness might be explained by the colour webcam having an integral IR blocker (quite common) but the cost of that is lower camera sensitivity and fewer stars.
Re: eyeballing. You don’t need to know where the telescope is pointing. If you plate solve, the software will know where it’s pointing from the plate solution for the middle pixel of the image. Thus it can tell you in which direction to move the slow motions. You just need to move in roughly the right direction.
From where I’m standing – beyond using a guide camera instead of a normal Zoom/Skype webcam – the issue isn’t so much the equipment as the software to use it. It’s fairly simple though.
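A minimal sketch of that solve-and-nudge loop, purely illustrative: plate_solve and take_image below are placeholders for whatever solver and camera interface are actually in use.

```python
def point_at(target_ra, target_dec, take_image, plate_solve,
             tolerance_deg=0.05, max_iterations=10):
    """Iteratively report slow-motion corrections until the solved centre is on target."""
    for _ in range(max_iterations):
        ra, dec = plate_solve(take_image())      # solved RA/Dec of the frame centre, in degrees
        d_ra, d_dec = target_ra - ra, target_dec - dec
        if abs(d_ra) < tolerance_deg and abs(d_dec) < tolerance_deg:
            print("On target.")
            return True
        # Crude: ignores the cos(Dec) foreshortening of RA, but the loop still converges.
        print(f"Move {d_ra:+.2f} deg in RA and {d_dec:+.2f} deg in Dec, then press Enter.")
        input()
    return False
```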
Grant Privett (Participant)
I’ve just realised I missed something in this.
Why not put the camera straight onto the main telescope, display the image it generates, plate solve via a Cygwin installation of Astrometry.net, and have the software tell you how far to move in RA and Dec to get to the right place? Most Astrometry.net solutions only take 10 seconds on my 5-year-old laptop, so you could point at the rough location and be there 2–3 minutes later.
If you are not autoguiding, why have the finder guidescope at all? Eyeballing along the tube would give a good start point.
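For illustration only (not the author's code), the plate-solve step might be scripted like this, assuming a local solve-field installation (e.g. under Cygwin) and astropy; solve-field writes the .wcs file and the IMAGEW/IMAGEH keywords used below.

```python
import subprocess
from astropy.io import fits
from astropy.wcs import WCS

def solve_centre(fits_path):
    """Run Astrometry.net's solve-field and return the RA/Dec (degrees) of the frame centre."""
    subprocess.run(["solve-field", "--overwrite", "--no-plots", fits_path], check=True)
    wcs_path = fits_path.rsplit(".", 1)[0] + ".wcs"      # written alongside the input file
    header = fits.getheader(wcs_path)
    w = WCS(header)
    cx, cy = header["IMAGEW"] / 2.0, header["IMAGEH"] / 2.0
    ra, dec = w.all_pix2world(cx, cy, 1)
    return float(ra), float(dec)
```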
Grant Privett (Participant)
Silly question, but why stay with a small refractor when a Meade 85mm reflector (or similar) could be used instead?
I’ve tried CCDs on cheap small-aperture refractors and the star images are distinctly blobby because the different colours come to focus at different points.
I’ve used a reflector as my autoguider/finder for years – run by my own code – and found the small reflector, with its less blobby stars, more sensitive than the refractor(s) I used before.
I would also say: avoid colour cameras as far as possible, as the Bayer matrix reduces sensitivity a lot.
Intrigued by “Star hopping plan”. I just used a copy of Uranometria…
5 May 2021 at 11:55 am, in reply to: Introducing MetroPSF – a program for ensemble photometry #584173
Grant Privett (Participant)
I look at which point is furthest from the generated fit and remove it.
Then I recalculate the fit and repeat until a decent regression coefficient is achieved and/or errors are below a threshold and sufficient stars remain.
It’s not an ideal approach and has trouble when there are few stars in the scene, but the results I got suggested it was doing quite a good job – certainly much better than not culling the outliers. It may not make a huge difference generally, but it’s easy to code up and test, so it is possibly worth a look as a potential refinement.
I had expected that stars with the most extreme colour indices would be the data points far from the curve, but while they were not great, they were not always the worst outliers.
I could probably dig out the code if you wanted it. I was using Gaia DR2 and cooled-sensor data captured using a normal silicon CCD.
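For what it's worth, a minimal sketch of that fit-cull-refit loop; the thresholds are arbitrary illustrations, not the values actually used.

```python
import numpy as np

def cull_and_fit(instr_mag, cat_mag, min_stars=10, max_resid=0.05):
    """Fit catalogue mag against instrumental mag, repeatedly dropping the worst outlier."""
    x = np.asarray(instr_mag, dtype=float)
    y = np.asarray(cat_mag, dtype=float)
    while True:
        slope, intercept = np.polyfit(x, y, 1)
        resid = np.abs(y - (slope * x + intercept))
        worst = int(np.argmax(resid))
        if resid[worst] < max_resid or len(x) <= min_stars:
            return slope, intercept, len(x)        # stop: fit acceptable, or too few stars left
        x = np.delete(x, worst)                    # cull the single worst point and refit
        y = np.delete(y, worst)
```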
4 May 2021 at 10:48 pm, in reply to: Introducing MetroPSF – a program for ensemble photometry #584168
Grant Privett (Participant)
I think the version I use is a standard conversion of the DAOFIND routine to Python. I use it to provide the positions of the stars and then do photometry on them.
After that I compare my coordinates with those of stars in Gaia DR2 and generate the matches. So I then have measured flux versus catalogue mag.
I then fit a linear regression and iteratively remove the outliers. I’m fairly sure I found that more successful than using weightings. I had expected extreme colour-index stars to cause problems too, but that had a relatively minor impact.
I think in my process any star with a peak brightness >50,000 was excluded from the linear regression. As you say, setting a magnitude limit should work, but the count was easy.
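A sketch of that detection-plus-saturation-cut step using photutils' DAOStarFinder (a Python port of DAOFIND); the detection parameters are illustrative, and only the 50,000 peak limit comes from the post.

```python
from astropy.stats import sigma_clipped_stats
from photutils.detection import DAOStarFinder

def find_usable_stars(image, fwhm=3.0, nsigma=5.0, peak_limit=50000):
    """Detect stars and drop any whose peak count suggests they are nearing saturation."""
    _, median, std = sigma_clipped_stats(image, sigma=3.0)
    finder = DAOStarFinder(fwhm=fwhm, threshold=nsigma * std)
    sources = finder(image - median)               # astropy Table, or None if nothing is found
    if sources is None:
        return None
    return sources[sources["peak"] < peak_limit]   # exclude near-saturated stars from the fit
```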
Grant Privett (Participant)
Hi there,
Not quite sure I understand what your code does.
Differential photometry using a Poissonian/Gaussian profile, and then using Gaia DR2 to get the magnitude of the reference star?
How does the fitted result differ from what you would get using DAOPHOT?
Also, do you set a value that allows stars that are nearing saturation to be ignored?
Is the star to be measured denoted by hand or by RA/Dec somehow?
It does look like code people would find useful.
Grant Privett (Participant)
That’s rather nice. Must have a go at that.
Saturday is the best bet here too. It’s traditional though: we’re past full Moon, hence it’s cloudy.
Grant Privett (Participant)
It’s being reported that SpaceX have gained approval to drop the altitude of the constellation to reduce the latency of the system. That will make all the satellites brighter.
Some info here:
Grant Privett (Participant)
As I recall, the Flyeye guys were making some pretty impressive performance claims when this was first announced. It will be interesting to see if they achieve them.
Oddly, I recall the whole system being described as automatic rather than human assisted, but I could easily be wrong. Human intervention would certainly make high sensitivity performance more attainable.
Grant Privett (Participant)
Evening Owen,
Thanks for the heads-up. By chance, just a couple of weeks ago I was talking to Martin about his code. I was lending a hand on his attempt to get the Lodestar running happily on Windows. We were having fun trying to get PyUSB and the sxccd code on GitHub working okay. It seemed to work beautifully on Linux, but it is not always a happy camper on Windows, with problems depending on which USB backend is in use on your system. I think he is taking the ASCOM route (though I may be wrong), which I will be curious to see but would prefer to avoid.
I must admit that, after several problems with different approaches, I may just cheat and write a VB6 32-bit non-GUI executable that I can talk to via environment variables, and set that running to take pictures, telling it to stop or reconfigure as needed. I can then shell/spawn/subprocess that from nearly any language I choose, and it will be okay until Microsoft decide it is end of life for 32-bit processes in Windows 10 (hopefully at least 10 years hence). I would still rather do it properly, but I am having a terrible time getting there… and, frankly, there are things a lot more fun to do.
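A minimal sketch of that cheat from the Python side, purely illustrative: the executable name and the environment variable names below are hypothetical, not an existing tool.

```python
import os
import subprocess

def start_capture(exposure_s, output_dir):
    """Spawn a hypothetical capture helper, passing settings via environment variables."""
    env = os.environ.copy()
    env["CAPTURE_EXPOSURE_S"] = str(exposure_s)    # names the helper would be written to read
    env["CAPTURE_OUTPUT_DIR"] = output_dir
    # Popen rather than run: Python carries on while the helper loops taking frames.
    return subprocess.Popen(["capture_helper.exe"], env=env)

proc = start_capture(5, r"C:\captures")
# ... later, ask it to stop (here, crudely, by terminating the process)
proc.terminate()
```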
Clear sky outside … and 99% full Moon. It was ever thus.
Grant Privett (Participant)
Would be nice to see an agenda and list of speakers…
Grant Privett (Participant)
As late as that! I really had not realised.
Grant Privett (Participant)
Got to say that looks fun, and I was seriously tempted, but I have too much going on despite (or perhaps because of) now being semi-retired.
Would suggest that the project is best suited to someone who programs for fun, is familiar with Python (well beyond the “Hello World” stage), has had previous experience automating equipment control and has very good attention to detail.
When you think you have found all the ways a control program can fail, the real world has half a dozen more saved up for a rainy day.
Grant Privett (Participant)
Do they overheat in the summer or are they automatically throttled or something?
Grant Privett (Participant)
I’m using a Dell E5430 laptop for similar purposes. It’s got 4x USB (3x USB2 and 1x USB3), is cheap – hence my enthusiasm – and spare parts are readily available. With 8GB it copes with TheSkyX and Python code coexisting – both are a bit memory hungry. I have an SSD in mine, but I worry a bit about how those feel about low temperatures – there was ice on the lid of the laptop on Saturday night. Also, some E5430 variants have an ExpressCard slot so you can easily add two further USB ports.
Alternatively, NUCs look nice and Seeed Studio make some fascinating alternatives.
Grant Privett (Participant)
Thanks for the thought. Rother Valley is who I bought the upgrade kit from, and they have always been helpful to me too.
Grant Privett (Participant)
That looks very hopeful. The EQ6 I have was pretty usable with quite good PE, but it was becoming more prone to nights where there were large spikes superimposed on the PE waveform – dirt in the RA worm, I assume.
Interesting what you say about CMOS. Is it that you need to take fresh darks every night and can’t rely on using a bias frame and a long-exposure dark to generate darks for arbitrary exposure lengths? CMOS sensors are certainly having an impact these days and, with CCD foundries closing around the world, we may not have any choice soon.
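For reference, the bias-plus-scaled-dark approach referred to above is just a linear rescaling of the thermal signal; a minimal sketch, assuming dark current really does scale linearly with exposure time (which is exactly the point in question for some CMOS sensors).

```python
import numpy as np

def synthetic_dark(master_bias, master_dark_long, t_long, t_wanted):
    """Build a dark for exposure t_wanted from a bias and one long dark of exposure t_long."""
    thermal = master_dark_long.astype(float) - master_bias.astype(float)  # thermal signal only
    return master_bias.astype(float) + thermal * (t_wanted / t_long)      # rescale to t_wanted
```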
I must admit that CMOS sensors like the QHY600 have nice-looking specs – it will be interesting to see how the spec and the actuality compare.
Grant Privett (Participant)
Losing one of the externally threaded nuts/slotted inserts that holds the RA worm and bearing in place has a similar impact. 🙂
Three hours into the search for it: floor, table, hall, the rest of the kitchen, and I will be searching the conservatory tomorrow (in case it got stuck to the soles of my shoes). I even checked the rubbish bin in case it got caught up with a greasy tissue. I’m really not enjoying this activity very much so far… 🙂
Grant Privett (Participant)
The nut came off! … Eventually.
It required the metal-band oil filter wrench. Because it’s only meant for nuts of 60mm diameter or greater, I had to introduce a 4mm deep strip of rubber, and it took me leaning very heavily on it to make it move – even then it was reluctant. It doesn’t look especially corroded or glued; a poor thread originally, perhaps. The surface of the nut is slightly damaged, but I use my scopes rather than worship them, so I’m really not fussed.
Anyway, I strongly recommend an oil filter wrench to anyone trying this sort of thing – and also the bearing removal tool thingy (technical engineering term) that Rowan sell.
Mine is an old EQ6, which probably also explains the bad thread on the gear wheel attached to the RA worm. Even after 24 hours doused in WD40 it was still on hard enough that I thought the Allen key would snap…
That said, I’m happy to say I have not seen any of the swarf recorded on some accounts of servicing an EQ6.
So, the mount is now at maximum entropy. Now to try to put it back together with the Rowan upgrade.
There may be whining and gnashing of teeth heard throughout the land.