Data set of IC443 in SHO for processing

Questions and answers about processing in StarTools and how to accomplish certain tasks.
riccdavis
Posts: 7
Joined: Tue Jan 25, 2022 11:29 pm

Data set of IC443 in SHO for processing

Post by riccdavis »

Hello everyone

My name is Richard and I’m new on here. I have just returned to imaging after around 17 years absence! My, a lot has changed 🤣

I have purchased StarTools and have been having a go at processing the data acquired (I started late October 2021).

I’m currently imaging with a vintage Pentax SDHF 75 and an ‘old fashioned’ Moravian G3 11000 mono.

I have here some S H O data of IC 443 (The Jellyfish) which I have taken. It's all stacked and calibrated, and I'm trying to get something decent out of it using StarTools, but I'm struggling a bit. My friend, who uses PixInsight, has already produced a very good image from my data using that software, and I am hoping to achieve the same with StarTools.

There is Ha (7:40), O-III (6:20) and S-II (5:20).

https://drive.google.com/drive/folders/ ... sp=sharing

I'm wondering if anyone who is more experienced with the software than I am could have a go at processing it for me (and provide a log file of the sequence, if that is possible?)

Any help and efforts would be most welcome 🤗

Best wishes
Richard
fmeireso
Posts: 384
Joined: Mon Sep 28, 2020 8:46 pm
Location: Belgium

Re: Data set of IC443 in SHO for processing

Post by fmeireso »

I don't have much experience in SHO processing. I went through it rather quickly in StarTools, but tweaked the color a bit in GIMP.
I don't know if that is what you were expecting; if so, I will send the log.
Yelly1 (Medium).jpg
riccdavis
Posts: 7
Joined: Tue Jan 25, 2022 11:29 pm

Re: Data set of IC443 in SHO for processing

Post by riccdavis »

Thanks for the reply. This is what Chris produced with PI.

https://drive.google.com/drive/folders/ ... dpxLkUKc1L
fmeireso
Posts: 384
Joined: Mon Sep 28, 2020 8:46 pm
Location: Belgium

Re: Data set of IC443 in SHO for processing

Post by fmeireso »

Looks good, but that looks more like HOO / bi-color than SHO ...
riccdavis
Posts: 7
Joined: Tue Jan 25, 2022 11:29 pm

Re: Data set of IC443 in SHO for processing

Post by riccdavis »

fmeireso wrote: Wed Jan 26, 2022 1:01 pm Looks good, but that looks more like HOO / bi-color than SHO ...
Your log file would be interesting to me. Thank you 😊
fmeireso
Posts: 384
Joined: Mon Sep 28, 2020 8:46 pm
Location: Belgium

Re: Data set of IC443 in SHO for processing

Post by fmeireso »

https://www.dropbox.com/s/ft5guvuszgr4m ... 3.txt?dl=0

Link to the txt file (log file). I did tweak the color a bit in GIMP, adding more saturation.

Even so, I could not get to the color your friend made of it. It was very nice.

SHO is not exactly my thing ...
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: Data set of IC443 in SHO for processing

Post by Mike in Rancho »

Hi Richard, and welcome! :D

Thanks for posting your data to try. Almost an infinite number of ways to create a narrowband SHO image, particularly with the colors. I went through a mostly normal processing flow to start with. I tried to match up the sample tricolor image you provided, and settled on an OHS mapping as the closest. Normally I would probably use one of the more normal SHO matrices. Still, I couldn't quite come up with a balance that matches what your friend created, so not sure how he managed that. I did turn the main Jellyfish blue, but that being the hydrogen, a lot of the rest of it should go blue also. And yet he came up with red. :?: Maybe I just need to work on the balancing more. Then again, PI is a strange beast.

I didn't want to bleach the stars either, so they still have some color in them, even if it is OHS, and the two large ones were kept under control.

I'll probably try again later in a more typical SHO map. Always fun to play with narrowband!

As with Freddy, if you are interested I can provide a 1.8 log, though with some final tweaking it could get pretty long (when the mask codes are included). Let me know.

Image is crunched down and compressed in order to reasonably fit (or nearly fit) the posting limits.

Richard JF OHS ST8 1C.jpg
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne

Re: Data set of IC443 in SHO for processing

Post by admin »

Hi Richard, and welcome to the forums! :text-welcomewave:
riccdavis wrote: Wed Jan 26, 2022 10:23 am My, a lot has changed 🤣
You're not wrong! :) More on that below.
I’m currently imaging with a vintage Pentax SDHF 75 and an ‘old fashioned’ Moravian G3 11000 mono.
Fortunately, good glass doesn't age. :thumbsup:
I have here some S H O data of IC 443 (The Jellyfish) which I have taken. It’s all stacked and calibrated and I’m trying to get something decent out of it using Startools, but I’m struggling a bit. My friend, who uses Pixinsight has already produced a very good image from my data using that software and I am hoping to achieve the same with Startools.
And this is where not all change, these past 17 years, has been good change.

It appears your friend hasn't been completely truthful (or at least has omitted some crucial details). This image does not appear to have been processed exclusively with PixInsight; rather, it has been significantly altered ("deep faked") by an algorithm based on neural hallucination (the latter may sound ridiculous, but is actually part of AI research nomenclature). In fact, these images are some of the more egregious examples I have seen. Most likely we're looking at the interventions of the Topaz AI suite (which is not part of PI, and has no place in astrophotography), which is trained for the purpose of "enhancing" images of people, buildings, vehicles, nature scenes and animals with plausible (but not real) detail.

As a consequence, sadly, much of the fine detail in the two images presented by your friend is entirely made up and does not originate from your actual dataset. A quick comparison with a high-resolution, non-AI-tainted image (for example this image by Chris Heapy) should readily reveal the discrepancies in the fine details. The neural hallucination algorithms will have created - out of nowhere - many wavy patterns, "hair", "tendrils", "veins" and "textures" that do not truly occur in the object or, indeed, anywhere in outer space. AI-based detail augmentation tends to be most easily detectable around bright stars, where diffraction patterns tend to thoroughly confuse such detail generators.

Yours truly, like the author of PixInsight, vehemently opposes such manipulations, as they quite simply "lie" to your viewers (and to yourself) about what is truly out there. By and large, astrophotographers aspire to practice documentary photography (e.g. documenting what is out there to the best of their abilities), and software like ST and PI exists to facilitate this. Particularly in StarTools, everything is focused on making the most of your actual recorded signal, and on keeping artifacts at bay, for the purpose of conveying reality in the most truthful way.

With that out of the way: the color compositing of your friend's image is, as others pointed out, not according to the Hubble palette (aka SHO palette). The SHO palette is so named because it maps S:H:O to R:G:B (e.g. S-II to red, H-alpha to green, O-III to blue). Depending on the prevalence of the three distinct emissions, this tends to yield the familiar/classic blue/golden coloring, with hints of green in areas of particularly strong H-alpha emission.

While false color is a necessity to be able to show the three different emissions, contrary to popular-ish belief, it is not a case of "anything goes". The coloring is meant to convey to a viewer an accurate representation of emission concentrations in any one spot of your image, across your entire image, independent of location or pixel brightness. As such, compositing of the three bands - for the purpose of coloring - must be done in the linear domain; individually stretching the three bands is a big no-no, as it completely skews hues and saturations, breaking the requirement that the coloring be an accurate prediction of emission concentrations everywhere in your image (incidentally, there is a reason why non-linearly processing/stretching individual channels is similarly not done in terrestrial imaging).
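To illustrate the point with a toy example (illustrative numbers only; this is not StarTools code): take two pixels with identical S:H:O emission ratios, one ten times brighter than the other. Linear per-band weighting leaves both with the same hue, while a non-linear stretch applied to each band makes the apparent hue depend on pixel brightness:

```python
import numpy as np

# Two pixels with identical S:H:O emission ratios, one 10x brighter.
faint = np.array([0.01, 0.02, 0.01])   # S-II, H-alpha, O-III (linear)
bright = faint * 10.0                  # same object, brighter region

def hue_ratios(px):
    """Per-band fraction of the pixel's total signal (a hue proxy)."""
    return px / px.sum()

# Linear per-band weighting preserves the ratios in both pixels:
weights = np.array([1.5, 0.8, 1.2])    # hypothetical channel weights
assert np.allclose(hue_ratios(faint * weights),
                   hue_ratios(bright * weights))

# A non-linear stretch applied per band (asinh here) does not commute
# with brightness, so the apparent hue now varies with pixel intensity:
def stretch(px, softness=0.05):
    return np.arcsinh(px / softness)

hue_drift = not np.allclose(hue_ratios(stretch(faint)),
                            hue_ratios(stretch(bright)))
assert hue_drift  # the stretched pixels no longer share a hue
```

The same drift occurs with any non-linear curve that is applied to the bands individually (per-band black points, histogram stretches, and so on).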

In more practical terms: in properly processed (and calibrated) SHO images, green-ish tints correspond to a relative dominance of H-alpha. Orange corresponds to a relative dominance of H-alpha and S-II, but a relative paucity of O-III. Blue corresponds to a relative dominance of O-III and a relative lack of H-alpha and S-II. Teal points to a relative dominance of H-alpha and O-III, but a relative lack of S-II, and so on, and so forth. In other words, it is important that you and your viewers can trust the coloring; the colors are not just there to be "pretty".
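Those correspondences follow directly from the S→R, H→G, O→B mapping. A tiny sketch (the tint names in the comments are loose descriptions, not a real color model):

```python
import numpy as np

def sho_tint(s2, ha, o3):
    """Map relative linear emission concentrations to an RGB triplet
    under the SHO palette (S-II -> R, H-alpha -> G, O-III -> B)."""
    rgb = np.array([s2, ha, o3], dtype=float)
    return rgb / rgb.max()             # normalise for display

sho_tint(0.2, 1.0, 0.2)   # H-alpha dominant        -> green-ish
sho_tint(0.8, 1.0, 0.1)   # Ha + S-II, little O-III -> orange/golden
sho_tint(0.1, 0.2, 1.0)   # O-III dominant          -> blue
sho_tint(0.1, 1.0, 0.9)   # Ha + O-III, little S-II -> teal-ish
```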

To achieve such an informative/documentary color retention, StarTools processes your detail and coloring separately (yet simultaneously) in one unified workflow. Coloring is respected perfectly linearly, while you can process the detail (luminance) unencumbered non-linearly.

A quick workflow looks as follows;

Compose; load S-II as red, load Ha as green, load O-III as blue. Set the exposure times for the three channels accordingly. StarTools will, behind the scenes, create a properly weighted synthetic luminance (detail) dataset and separate color dataset. You will be processing mostly the detail (in mono), with the coloring popping up here and there. The two aspects - detail and color - are composited once you hit the Color module.
AutoDev; to see what you are working with.
Crop; crop away the stacking artefacts.
Wipe; remove the gradient and bias levels in both luminance and color (I increased the Dark Anomaly Filter just a little bit, as the dataset is quite noisy).
AutoDev; We can now non-linearly stretch the detail in earnest. Pick a Region of Interest (click & drag) that includes the parts of the image that are most important and excludes "empty" background. I also increased the "Ignore Fine Details >" parameter to make AutoDev "blind" to the fine background noise.
Contrast; to taste, I used defaults.
HDR; to taste, I used defaults.
Sharp; to taste, I used defaults.
Decon; the only true way of recovering real detail from your dataset. Deconvolution is able to reverse some of the adverse effects of atmospheric turbulence, as well as imperfections in your optics. I sampled a few stars across your image. See docs on how to operate this module. E.g. to give you an idea;
StarTools_decon.jpg
Color; color and detail are now composited. Start off with the SHO(HST) preset. Use the Bias Reduce sliders to throttle the relative contribution of the bands. To reiterate, this is done entirely in the linear domain, and what you are doing here is simply multiplying (or dividing in the case of Bias Reduce) the signal of an individual band by a specific factor.
Shrink; to taste, I used the defaults + Unglow preset.
Super Structure; Isolate preset (with a smaller Airy Disc setting of ~13% to better match the field of view) and Gamma dialled back to 0.75.
Super Structure; Saturate preset (with a smaller Airy Disc setting of ~13% to better match the field of view) and the saturation dialled back to 100%.
Switch off Tracking and perform noise reduction. I used defaults.
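The two purely linear operations mentioned above (the Compose module's synthetic luminance and the Color module's Bias Reduce) can be sketched roughly as follows. The exposure-proportional weighting and the variable names are assumptions for illustration only (StarTools' internal weighting is not spelled out here), and the thread's integration times are read as h:mm:

```python
import numpy as np

rng = np.random.default_rng(42)
s2, ha, o3 = rng.random((3, 8, 8))     # stand-in linear stacks

# Integration times from the thread, read as h:mm, converted to minutes:
# S-II 5:20, Ha 7:40, O-III 6:20.
t = np.array([5 * 60 + 20, 7 * 60 + 40, 6 * 60 + 20], dtype=float)
w = t / t.sum()                        # exposure-proportional weights

# Synthetic luminance: a linear, exposure-weighted sum of the bands.
luminance = w[0] * s2 + w[1] * ha + w[2] * o3

# Bias Reduce: throttling a band is a single division in the linear
# domain, so relative differences within the band survive exactly.
bias_reduce_s2 = 1.3                   # hypothetical slider value
s2_throttled = s2 / bias_reduce_s2
assert np.allclose(s2_throttled / s2, 1.0 / bias_reduce_s2)
```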

You should end up with something like this;
Image
It may not show the fantastic(al) detail of your friend's image, but it certainly approaches the actual detail (and emission concentrations) that I am familiar with from other SHO datasets.

Hope this helps!
Ivo Jager
StarTools creator and astronomy enthusiast
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: Data set of IC443 in SHO for processing

Post by Mike in Rancho »

Interesting write-up, Ivo, I will have to come back to this one for reference. :thumbsup:

If I may side-track slightly to just your processing here?

Was your Wipe essentially default other than the DAF? Being narrowband, I had hit the NB preset, but also noticed the major noise, and actually ended up at aggressiveness 95 with DAF 3. Too much? It's possible I was trying to do too much with Wipe, instead of in the ensuing AutoDev. Though I did use an ROI, it may not have been optimal, and I also used an IFD of 2.2 and dropped the shadow linearity to 40. I then took off another 10 percent in Contrast.

I also did default HDR and Sharp except for using 200%, but your results pulled out the web-like detail far better than I did. Though my saturation could have overwhelmed that also I suppose.

I believe any of the tricolor matrices, as well as pulling back on a filter (here the Ha), are still legit representations of the concentration? Even though I chose a pretty unusual, mostly blue version. :D I've read the docs and website, but am still trying to wrap my head around how the bands stay relative to each other when the mapping matrices are used. Is it that it is still a tricolor, and the listed combos (for example 70Ha+30OIII) just set the particular color/hue for that element? I know the R, G and B remain tied to their filters, so maybe I'm finally coming around to getting it. Maybe.

I did not think of the two SS runs (or Isolate instead of dim), but will check it out on this data again. I also chose 20% for the Airy parameter, so my PSF grokking still needs some work it seems. Close! But not a precision 13%.

I'll give the dataset another go using your provided general workflow and see how it goes.

Thanks
fmeireso
Posts: 384
Joined: Mon Sep 28, 2020 8:46 pm
Location: Belgium

Re: Data set of IC443 in SHO for processing

Post by fmeireso »

I had a hunch about that ...