Donut stars

Questions and answers about processing in StarTools and how to accomplish certain tasks.
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: Donut stars

Post by Mike in Rancho »

Thanks again, Ivo! An interesting discussion and of course upgrade, though I admit I am still trying to wrap my head around things. :? I will keep reading and thinking though.

And cool background on OptiDev. So it's scoring likely activity to determine how to curve things. That kind of explains why we can get a good starting point, with much of the detail on hand right away, when the same result would take multiple (and subjective) iterations of histogram stretching.

I ran the graph out 10 more pixels on either side, and good grief, I still didn't get down to background. Quite interesting just how far diffraction spreads out on a Mag 8 star! But I'm not sure there's any new info here on the plateau.

Stretching Buckets v2.jpg

Unless you were saying that OptiDev was sensing that stellar core (which would be a huge spike even if I had a bigger well, I think), and so made sure it was still distinctly visible, and thus the surrounding shoulder pixels ended up compressed/plateaued as kind of an unfortunate byproduct -- again because it found the spike more worthy of interest?

The maxed pixels you noted were another interesting thing to chase down. I pulled another sub and things were a bit different, with only 3 pixels across overexposing instead of 4, and an overall softer shape. Kind of neat to zoom that deep into linear stars, and it makes me believe there will be a great deal of variability: clouds, transparency, focus drift, and most of all, maybe, seeing. Then of course the stacker will be trying to compute a star centroid based on what it has in front of it for each sub, which could have small variations as well.

Still, even with that sort of averaging I would have expected the results to be closer to 65535 than they were. So I checked another stack, the Horsehead, and even Alnitak wasn't maxed out. What is PI doing? In the end, though, I found the source to be the calibrated subs. It seems that 65535 gets dropped by dark subtraction and then, I suppose, flat division, and afterwards probably gets played with by the possible variations between subs.
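A toy calculation makes this concrete (numbers invented for illustration, assuming plain dark subtraction and flat division):

Code: Select all

# Toy example: a saturated pixel no longer lands on 65535 after calibration.
light = 65535.0  # saturated pixel in the raw sub
dark = 320.0     # master dark value at that pixel (invented)
flat = 1.03      # normalized master flat at that pixel (invented)

calibrated = (light - dark) / flat
print(calibrated)  # ~63315.5 -- already below 65535 before stacking even starts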
dx_ron
Posts: 288
Joined: Fri Apr 16, 2021 3:55 pm

Re: Donut stars

Post by dx_ron »

admin wrote: Mon Mar 25, 2024 12:23 am I'm definitely not proposing we should be doing 16-bit stacking; however, the assumption is that the source material is *multiple subs* of 10, 12, 14 or 16-bit quantized data, where values fell within the range of 0 to 1024, 4096, 16384 or 65536. Stacking multiple subs will - in essence - add precision to this in steps of 1/N for N subs (e.g. 0.5, 0.25, 0.125, 0.0625 for 2, 4, 8, 16 subs, etc.).

While very useful for intermediary calculations and representation, floating point operations and encoding will more readily allow ranges outside of the expected range of (0..unity) and let software (or humans) erroneously interpret out-of-bounds data or encode potentially destabilizing rounding errors, while in the real world, we only captured integers and we know that there is a range beyond which data numbers should not occur (or are not trustworthy).

For example, I believe Ron's dataset encoded numbers/pixels past 1.0, which if 1.0 was taken as unity, should not really occur. In an attempt to establish what "pure white" (unity) is, StarTools assumes the highest number in the dataset is unity (unless it encounters a FITS keyword that says otherwise) - it goes to show how this introduces unnecessary ambiguity. This ambiguity can have real consequences.
Yes, Ivo - the Optidev explanation helps a lot! (not that I'm now able to go out and give a seminar or anything like that...)

I looked closely at the pixels in the core of the brightest star in the M3 stack, and it is clear that the values are normalized to the green channel. Green maxes out at 1.0, but red has a few pixels >1, with a max of 1.00457. I may raise this as a question to the Siril devs.

I will also go back and re-calibrate the subs as 16-bit files to see if that changes the end-result 32-bit fits.
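(For anyone who wants to run the same check on their own stack, something along these lines should do it - a sketch assuming astropy and a 32-bit float RGB FITS; the filename is a placeholder:)

Code: Select all

from astropy.io import fits
import numpy as np

data = fits.getdata("m3_stack.fits")  # placeholder path

# RGB FITS usually loads channel-first via astropy, i.e. shape (3, height, width)
print("per-channel max:", data.reshape(data.shape[0], -1).max(axis=1))
print("pixels > 1.0:", int(np.count_nonzero(data > 1.0)))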
dx_ron
Posts: 288
Joined: Fri Apr 16, 2021 3:55 pm

Re: Donut stars

Post by dx_ron »

The answer is that re-calibrating and registering the subs as 16-bit integer files did indeed resolve the 'greater than 1' issue. Which helps explain why Ivo writes signal processing programs and I sit here and push buttons.

Here, by the way, is the 1.9.565 version of the M3 data, properly calibrated (or at least more properly calibrated 8-) )
M3_220x45s_rgb_50bin_NR_1600.jpg
2h45min of 45s subs at low conversion gain (gain 0 to you ZWOers), at f/7 with the AT130EDT. I'm sure I will add more data - not because that's a particular goal, but because I'm sure most of the clear skies coming my way will be during full moon.

Cropped, Wipe 90% 4px DAF, Optidev with an RoI concentrated on the cluster, Contrast, SVD (there weren't very many properly green-centered stars), binned to 50%, Color, NR
AndyBooth
Posts: 94
Joined: Mon Jan 07, 2013 12:48 pm
Location: NEWARK ON TRENT - ENGLAND

Re: Donut stars

Post by AndyBooth »

That looks really really nice and natural.
You have essentially the same setup as me; what camera are you using, an AA26C?
I've been away from home a few days; can't wait to try the new OptiDev!
decay
Posts: 493
Joined: Sat Apr 10, 2021 12:28 pm
Location: Germany, NRW

Re: Donut stars

Post by decay »

Hi all,

I was not sure about posting this here in Ron's thread, but I think I can add some relevant points. I hope that's OK, Ron.

This is at a very early stage (aka: a quick hack), but I already want to share it.

I wrote a small tool (once again ;-) ) that should help us analyze OptiDev's behavior by reconstructing the global stretch curve. How does it work?

- Open unprocessed stacked image in ST.
- (Optional: BIN, CROP, WIPE)
- Save image as 16-bit TIFF (image 1 / pre stretch).
- Apply final stretch (OptiDev or FilmDev).
- Save image as 16-bit TIFF (image 2 / post stretch).

Now the tool reads both images and pairs up every pixel position. It builds a map from the brightness of each pixel in image 1 to the brightness of the corresponding pixel in image 2. The result is a table containing all brightness levels of image 1 and their corresponding brightness levels in image 2. The output can be written to a CSV file and imported into any software that can draw charts. For the first attempts I used LibreOffice Calc, but later on I will probably use gnuplot.
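In Python, the core of the idea looks something like this (a simplified sketch, not the actual tool; filenames are placeholders and it assumes single-channel 16-bit TIFFs):

Code: Select all

import csv
import numpy as np
from PIL import Image

pre = np.asarray(Image.open("image1_pre_stretch.tif")).ravel()
post = np.asarray(Image.open("image2_post_stretch.tif")).ravel().astype(np.float64)

# Mean output level for every input level that actually occurs in image 1.
counts = np.bincount(pre, minlength=65536)
sums = np.bincount(pre, weights=post, minlength=65536)
levels = np.nonzero(counts)[0]

with open("stretch_curve.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["input", "output"])
    for lvl in levels:
        writer.writerow([int(lvl), sums[lvl] / counts[lvl]])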

My first victim was my recent dataset of M106 (as Ron retracted his dataset of M13). The procedure described above was executed twice: the first run using the old beta version, the second run with Ivo's latest version and its modified stretching algorithm.

And this is how it looks. Dear ladies and gentlemen, for the first time, proudly presenting: OptiDev's stretching curves:
2024-03-27 16_25_52-Unbenannt 1 – LibreOffice Calc.jpg
(blue line is 563 Beta 6, red 565 Beta 7)

A lot of surprises here, at least for me. The new stretch (red) looks nice and smooth, but somehow much more conventional than I had expected. No fancy curves and bends reflecting the distribution of detail in different brightness ranges?

But what about the old version? There are obvious hard kinks. Strange, isn’t it?
2024-03-27 16_33_29-Unbenannt 1 – LibreOffice Calc.jpg
I will have to think again about all this, Ivo’s explanations and our findings. And I will try other stretches and images and examine all this in more detail.

Any thoughts? Is it useful? We could try to find out more about how OptiDev works and compare it to other stretching algorithms, for example what Andy does when using APP. If anyone is interested, I would share this tool, of course.

Best regards, Dietmar.
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: Donut stars

Post by Mike in Rancho »

Hi Dietmar, A for effort once again!

But, I need some help understanding what we are looking at. :think:

You said there's a comparison of pixel mapping per file, resulting in....? :confusion-shrug:

The first sample shows two lines, both post-stretch. How are we seeing the differences applied to pixels in this graph (since all the pixels form a 2D image)? We would almost need a 3D graph then, right?

But even so, if this is akin to the curve line in a histogram graph during stretching, it would be similar to the initial such stretch, and thus the scale of the stretch is such that it would likely be hard to discern the minor fluctuations around regions of detail (handled by OptiDev via deterministic data mining, if I have that right, and by other software via multiple iterative stretches). Maybe? I wonder if zooming in on a narrower region would reveal adjustments away from smooth curving.

Now an actual stretching histogram usually has a 2D plot of the data, which is the total number of pixels in each brightness-level slot, then a 45-degree line (the starting point for linear), and then the "after" curve that shows how that 45-degree linear line gets modified; at the same time the 2D data plot (lump(s)) also gets altered correspondingly. I think. The kind of thing you can see in an HT or GHS display.

Not that I have full comprehension of all this. I get the gist, but I have to think more on the whole "tranches" of data (of which we now have more!), the deterministic expansion or compression of dynamic range applied by OptiDev, and what that expansion/compression of range allocation actually means in terms of a graph to visualize it all. :confusion-shrug:


That said, I've only given quick re-tries to my most "recent" captured data, and the new OD is Christmas in March for my stars, that's for sure! I still haven't figured out how best to do the initial global stretch and then process, given the changes. To be quick, I just raised the levels a bit using a couple of clicks of gamma in OD and lowered shadow linearity. So I'm already messing up the optimal stretch. :lol:


Oh, one last thing, and I'm not sure if your graph tool would show it, but I see you mentioned the two saves are post-Wipe. I am still wondering where things get black-pointed (Wipe, OD, or both), as I think there could be something there that leads to a bit of harshness in appearance (and thus perhaps the "less smooth" result that Andy has talked about compared to other stretching). A similar concept can often be seen when people show samples of flats: they can look very different, while both being "linear" in a way, depending on whether pure brightness levels are mapped to the display view or scaled to something like MIN/MAX.
dx_ron
Posts: 288
Joined: Fri Apr 16, 2021 3:55 pm

Re: Donut stars

Post by dx_ron »

Hi Dietmar - interesting project with the plot, for sure (and absolutely on target for the thread). (I only removed the M13 because my dropbox is nearly full.)

Here's how I'm currently thinking about it (but I could be completely wrong!)

The real action is happening in just the first few points in the lower left, so you're going to have a hard time seeing how Optidev is varying the stretch intensity (the local slope of the in-value vs out-value line). If I scan around a linear image, virtually anything that is not a star has pixel values less than 2000. These pixels have their output values mapped to values more in the 40,000 - 50,000 range, but I'm not sure what the best way would be to visualize the fact that input values of, say, 830-840 get mapped to 41,250 - 43,725 (a wide range - using 'a lot' of the available dynamic range) while values from 950-960 might only get mapped to output values between 45,000 - 45,200 (because Optidev did not score many pixels in the 950-960 range as having high 'busyness' scores).

Each bin in the linear values can have its own slope in its little slice of your curve, although I would expect rather a lot of autocorrelation.
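One way to make that visible with Dietmar's CSV might be to look at the local slope (the gain) per input bin, since the slope is exactly how much output range a given input range receives. A rough sketch, assuming the input/output columns the tool writes:

Code: Select all

import numpy as np

curve = np.genfromtxt("stretch_curve.csv", delimiter=",", names=True)
x, y = curve["input"], curve["output"]

# Local gain: output range allocated per unit of input range.
gain = np.diff(y) / np.diff(x)

for lo, hi in [(830, 840), (950, 960)]:  # the example ranges above
    sel = (x[:-1] >= lo) & (x[:-1] < hi)
    if sel.any():
        print(f"{lo}-{hi}: mean gain {gain[sel].mean():.1f}")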
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: Donut stars

Post by Mike in Rancho »

Yeah, I think that makes sense, Ron. Though, Dietmar, I'm still interested in your app and want to know what the X and Y axes represent!

I chewed on it some, and even (sorry Dietmar!) created a little synthetic image of 10 increasing-brightness bars in Gimp, from black to white, and then played around with the curves transformation while watching the histogram and using a pixel pointer. Interesting to see dynamic range get altered like that when the starting point is so easy to follow, but I'm still not sure how to properly graphically represent OptiDev and its scored buckets. :confusion-shrug:

From a bird's eye perspective of the initial linear state to non-linear stretch, I imagine OptiDev will look like most first astro curves -- a big leap up from the lower left point that arches over and then angles into the top of the highlights. Kind of a necessity of raising the skyfog and everything else. But the magic will be in the details you would have to zoom in tight to see, most likely. And being dynamic, every dataset will end up allocated differently and thus have a different resulting final "curve," no?

I guess of remaining interest might be what leads to what in terms of relative scoring: can it be fooled by some data, and can it be manipulated to taste (other than just bounding it with an RoI)? I'll have to play with OptiDev a bunch more now with these factors in mind.

Other than that though, pretty happy those star plateaus are gone! :D I'm almost not upset it's going to rain all weekend; might have to start digging up old data for reprocessing.
decay
Posts: 493
Joined: Sat Apr 10, 2021 12:28 pm
Location: Germany, NRW

Re: Donut stars

Post by decay »

Mike in Rancho wrote: Wed Mar 27, 2024 4:18 pm But, I need some help understanding what we are looking at. :think:

You said there's a comparison of pixel mapping per file, resulting in....? :confusion-shrug:
It's quite simple, I probably just messed up the explanation. The word 'comparison' I used is simply wrong in this context, sorry. So yes, it's the kind of graph we know from other stretching tools - FITSWork, Gimp, Siril or whatever. The diagonal line from bottom left to top right is the identity transformation: input and output values are identical. After stretching it becomes a curve, representing the global stretch transformation that is applied. The histogram is missing in the screenshots, but I already coded it yesterday evening.

The documentation of Siril's GHS transformation (the link which Andy posted) contains a quite useful explanation of this (first paragraphs).

Having a look at my first diagram: both axes represent brightness levels/values. As ST saves out 16-bit TIFFs, this is a range reaching from 0 up to 65535. The x axis represents the input values - the brightness of any pixel contained in the pre-stretch image. So let's say there is a pixel in the image somewhere in M106, not too dark and not too bright - say a brightness of 1000. Looking up the transformation curve gives a corresponding value on the y axis, which represents the brightness of that pixel in the post-stretch image. In our example this would be about 37500:
2024-03-28 09_06_57-Donut stars - Page 4 - StarTools – Mozilla Firefox.jpg
This single mapping holds for every pixel in our image with the same brightness, and the curve represents the mapping of all brightness values - the whole stretch transformation ... function ... whatever :lol:

So my tool (re-)constructs this curve by looking up (not comparing ?! :roll:) the corresponding pixel values of the pre- and post-stretch images.

As this curve is a full representation of any stretching function and is used by many other tools, my idea was to use it to compare different stretches - ST's, but other tools' as well.
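For example, the reconstructed curve could be re-applied to a different linear image via interpolation (a sketch; filenames are placeholders, single-channel 16-bit TIFF assumed):

Code: Select all

import numpy as np
from PIL import Image

curve = np.genfromtxt("stretch_curve.csv", delimiter=",", names=True)
img = np.asarray(Image.open("other_linear.tif")).astype(np.float64)

# Interpolate between the reconstructed (input, output) samples.
out = np.interp(img, curve["input"], curve["output"]).astype(np.uint16)
Image.fromarray(out).save("other_restretched.tif")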

- to be continued -
decay
Posts: 493
Joined: Sat Apr 10, 2021 12:28 pm
Location: Germany, NRW

Re: Donut stars

Post by decay »

Mike in Rancho wrote: Wed Mar 27, 2024 4:18 pm and thus the scale of the stretch is such that it would likely be hard to discern the minor fluctuations around regions of detail [...]. Maybe? I wonder if zooming in to a more narrow region would reveal adjustments made away from smooth curving.
I don't think so. :think: My example delivered about 3800 x/y mappings. Ivo mentioned 4096 points (old beta) 'to interpolate the constructed curve', but after having seen the blue curve (old beta) I wonder whether it was only about 64 or so. :confusion-shrug:
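(One could even try to count the kinks from the CSV by flagging abrupt slope changes between neighbouring segments - a rough sketch, with an admittedly arbitrary threshold:)

Code: Select all

import numpy as np

curve = np.genfromtxt("stretch_curve.csv", delimiter=",", names=True)
slope = np.diff(curve["output"]) / np.diff(curve["input"])

# A 'kink' is an abrupt slope change; 10x the median change is arbitrary.
change = np.abs(np.diff(slope))
kinks = int(np.count_nonzero(change > 10 * np.median(change)))
print("apparent kinks:", kinks)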
Mike in Rancho wrote: Wed Mar 27, 2024 4:18 pm I am still wondering where things get black-pointed, Wipe or OD or both, as I think there could be something there that leads to a bit of harshness in appearance (and thus perhaps the "less smooth" result that Andy has talked about compared to other stretching).
Yes, that's something which I have in mind, too.
dx_ron wrote: Wed Mar 27, 2024 9:22 pm The real action is happening in just the first few points in the lower left, so you're going to have a hard time seeing how Optidev is varying the stretch intensity (the local slope of the in-value vs out-value line). If I scan around a linear image, virtually anything that is not a star has pixel values less than 2000. These pixels have their output values mapped to values more in the 40,000 - 50,000 range, but I'm not sure what the best way would be to visualize the fact that input values of, say, 830-840 get mapped to 41,250 - 43,725 (a wide range - using 'a lot' of the available dynamic range) while values from 950-960 might only get mapped to output values between 45,000 - 45,200 (because Optidev did not score many pixels in the 950-960 range as having high 'busyness' scores).
You are right, Ron: the real action (except for stars) takes place on the left side, in only a small range. But I think it should still be possible to see, by zooming in or using logarithmic scales, like other tools do. :think:
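Something as simple as a logarithmic x axis might already do it, e.g. with matplotlib (assuming the tool's CSV again):

Code: Select all

import numpy as np
import matplotlib.pyplot as plt

curve = np.genfromtxt("stretch_curve.csv", delimiter=",", names=True)

plt.plot(curve["input"], curve["output"])
plt.xscale("log")  # spreads out the crowded lower-left region
plt.xlabel("input level (linear)")
plt.ylabel("output level (stretched)")
plt.show()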

As I said, this is at an early stage. Thanks for all your feedback. I will maybe use the weekend to rethink things. And I hope my explanations help to clarify how the tool works.

Best regards, Dietmar.