admin wrote:
PixInsight's stacking and integration modules yield the absolute best (consistent) quality currently on the market. It's worth the license fee for that alone (and even then it is very reasonably priced for all the other tools you get).
Well said, sir.
I think there's definitely room for both it and StarTools in my toolbox. I'm absolutely blown away with PI's DBE tool. It's amazing. But the thing can't correct imperfect stars without 184 lines of machine code, 732 mouse clicks, and a dead chicken swung in a counterclockwise direction (presuming you're north of the equator of course...you Aussies should swing clockwise, obviously).
I think, to be honest, the best description from a USER's point of view I can come up with is this: it certainly makes some complicated things very simple...but it maddeningly makes the simplest of things extremely complicated. Seriously, PixInsight? Crop needs to be a dynamically linked function that requires me to launch an "applet" and link it to an instance of an image?? What? Come on...it's CROP for chrissakes.
Well, looking at the data, the frames are just allowed to drift in a straight line between frames (which is indeed better than nothing!). A preferred pattern would spiral out relative to the first frame, spreading any pixel bias evenly across the frame and leaving a much less distinguishable pattern that is also much easier to noise-reduce.
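For anyone curious what an outward spiral dither pattern might look like in practice, here's a toy sketch. This is purely an illustration of the concept, not how PHD or any guiding software actually computes its offsets; the step size and golden-angle spacing are my own assumptions:

```python
import math

def spiral_dither_offsets(n_frames, step=1.0):
    """Generate (dx, dy) dither offsets in pixels that spiral outward
    from the first frame, so pixel bias is spread evenly across the
    stack instead of trailing along one line.
    Hypothetical sketch; real guiding software adds randomness and
    waits for the mount to settle between moves."""
    offsets = []
    for i in range(n_frames):
        # Radius grows with the square root of the frame index, so
        # successive points stay roughly evenly spaced in area.
        r = step * math.sqrt(i)
        # The golden angle (~137.5 deg) keeps points from lining up.
        theta = i * math.radians(137.5)
        offsets.append((r * math.cos(theta), r * math.sin(theta)))
    return offsets

for dx, dy in spiral_dither_offsets(5):
    print(f"{dx:+.2f}, {dy:+.2f}")
```

The first frame stays at (0, 0) and each later frame lands a little farther out, so fixed-pattern noise never repeats in the same place twice.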
You know what? I was sitting here typing out a long thing about being sure PHD dithered exactly as you're talking about. I'm quite familiar with the concept, and quite understand how and why it should be done a certain way, and so on. And I'm thinking "What the hell is Ivo talking about?? I know this, and that's how PHD does it!"
Then it occurred to me...the last time I was imaging...back in April or May I guess?...I was having some difficulty with my ancient CG5. It was being cranky, and didn't want to settle down on the Dec axis after a dither. So GUESS WHAT I DID!
Good lord, Ivo...I bet I'd have spent WEEKS trying to sort that out if you hadn't described it just as you did. Thanks!
You're totally right though - I tend to push the data in the face of noise - mostly using the Life module and a mask - where other people (probably rightfully) back off. Perhaps too much for some. I don't use the Life module at all when there is ample signal for the object to 'stand on its own two feet' so to speak, unless the image is too 'flat' or there is a busy star field obscuring the object of my interest. As for colors, I always like to show the full range of star temperatures.
That is, I think, why this hobby can be so intriguing sometimes, besides just the fact we're imaging some pretty cool stuff.
There's such a hard core geeky technical side to a lot of it. The physics, geometry, electronics, and so on...it all appeals to my very black/white/on/off/1/0 nature. I do databases for a living, and fly airplanes for fun. In my world, I tend not to think in terms of "eh, about close enough kinda sorta." Either a value does or doesn't satisfy a where clause...either the plane is or isn't flying. There's not a whole lot of room to be guessing.
But then it comes time to process...and it's SOOOOOOOOOOOOO much about personal taste...what "looks right". I think that's why I so often refer over on /r/astrophotography to a goal of acquiring "enjoyable" images rather than "good or bad" ones. We can lose so much of what this OUGHT to be about (imo anyway) by chasing down who did a "better" job of getting this color or that just right.
Yesterday, my wife found a picture on Facebook of a soldier holding a rabbit. The comments were a minefield of vitriol and argument and debate about the "meaning" of the image and so on and so forth...and finally, someone commented "Hey everyone...just shut up and enjoy the bunny."
Yeah...pretty much.
So a couple of days ago, I started adding a feature that lets people bring color into their images in alternative ways. For example: retaining even illumination, but emulating the human visual response to color and brightness (leading to less saturated highlights, so the image looks like 'you are there' with superhuman eyes), or using the luminance stretching history to manipulate highlight saturation (leading to the look you get from most other software). The differences can be quite dramatic. Stay tuned!
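To give a rough feel for the first approach (highlight desaturation that mimics human vision), here's a toy sketch. To be clear, this is NOT StarTools' actual algorithm; the luminance weights are standard Rec. 709 coefficients, and the brightness-dependent blend is my own made-up illustration:

```python
def desaturate_highlights(r, g, b):
    """Toy sketch: blend a pixel toward its own luminance as it
    brightens, mimicking how human vision perceives very bright
    sources as whiter. Inputs and outputs are in [0, 1].
    Hypothetical illustration only - not StarTools' implementation."""
    # Rec. 709 luminance weights (a standard choice, assumed here).
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Desaturation strength grows with brightness; the square is an
    # arbitrary choice so faint pixels keep their full color.
    k = lum ** 2
    mix = lambda c: c + k * (lum - c)
    return mix(r), mix(g), mix(b)
```

A dim star keeps its full color cast, while a near-saturated star core is pulled toward neutral white, which is roughly what your eye would report.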
That should be an EXTREMELY powerful tool.
Perhaps it's time to explore giving StarTools its own integration module?
Perhaps for 1.5 - it needs to be done right though.
Well, to be sure...we know you don't do anything until it's done right.
But I sure as heck think there's room for a piece of software that can manage to do 2 things:
1) Stack data and not completely ruin it
2) Do it without requiring a master's degree in IS and a 73-click, 417-step process.