Wow, it actually works! This is a very quick result of feeding calibration frame analysis to the Tracking code, which in turn makes that information available to the Denoise module.
The improvement in the example below is slight but noticeable; some more detail emerges from the murk, which suggests this is a viable approach.
Crop from an image processed with the exact same parameters and workflow, except that in one case flat Tracking was enabled.
Will experiment and research some more...
You might be applying flats to your noise reduction and deconvolution soon!
Using calibration frames for Tracking; some early results
Ivo Jager
StarTools creator and astronomy enthusiast
Re: Feeding Calibration frames to Tracking; some early results
And this is with flat frame information fed to both the Decon and Denoise modules:
What is happening here is that Decon can now vary its regularization locally based on signal strength as impacted by the flat frames.
Deep Sky Stacker, for example, performs a normalisation procedure during flat frame calibration that can attenuate rather than boost the signal (and its noise component) in order to even out brightness. This means that Decon can, in places, actually go further than global statistics would otherwise indicate.
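To make the principle concrete, here is a minimal numpy sketch (illustrative only, not the actual StarTools implementation; the function names and the simple 1/flat noise model are assumptions on my part):

```python
# Illustrative sketch only (not StarTools' actual code): derive a per-pixel
# noise-scale map from a normalised master flat and use it to vary the
# regularisation strength of a deconvolution step locally.
import numpy as np

def noise_scale_from_flat(master_flat):
    """Factor by which flat-field division rescales the per-pixel noise.

    Dividing a light by a normalised flat multiplies signal and noise alike
    by 1/flat: areas that were boosted (flat < 1, e.g. vignetted corners)
    end up noisier, areas that were attenuated (flat > 1) end up cleaner.
    """
    flat_norm = master_flat / np.mean(master_flat)
    return 1.0 / flat_norm

def local_regularisation(base_strength, master_flat):
    """Turn one global regularisation strength into a per-pixel map:
    damp deconvolution where the flat boosted the noise, let it push
    harder where the flat (or a DSS-style normalisation) attenuated it."""
    return base_strength * noise_scale_from_flat(master_flat)
```

The same idea applies to any module that can accept a per-pixel strength or threshold instead of a single global value.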
Ivo Jager
StarTools creator and astronomy enthusiast
Re: Using calibration frames for Tracking; some early results
Looks very interesting.
Re: Using calibration frames for Tracking; some early results
So I can understand, is the image with less pixelation the after image please?
Re: Using calibration frames for Tracking; some early results
happy-kat wrote: So I can understand, is the image with less pixelation the after image please?
It's a 400% zoomed crop (hence the pixelation!)
This is another, more recent example (also a 400% zoomed crop, so it too is pixelated):
In this example, it successfully compensated for DSS subduing the signal in the core as a result of its normalisation during flat frame calibration (i.e. the flat-corrected stack was darker in the core than the lights stacked on their own). Knowing this allowed decon to be more aggressive and denoise to back off somewhat, while keeping noise and artefacts perfectly stable elsewhere (not shown; outside the crop).
Shown are:
"naive" processing (no deconvolution, no denoise).
Standard Tracked deconvolution
Standard Tracked deconvolution + Standard Tracked denoise
Enhanced Tracked deconvolution + Enhanced Tracked denoise (everything "YES")
The problem I'm trying to address is that calibration frames can greatly impact signal-to-noise ratios locally in a dataset. The signal "boost" applied in response to, for example, vignetting around the edges is not "free": the boost that evens out the lighting also boosts the noise. That means an increased noise level in the previously darker areas, relative to the (already) fully lit areas of the image.
Similarly, some flat normalisation schemes (such as the one in DSS) actually reduce brightness to match overall brightness levels. That means a decreased noise level relative to other areas of the image. Really, these are just two sides of the same coin.
The main thing is that they cause signal quality - and noise - to vary in different parts of your dataset. The factor by which they vary can be quite significant.
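A toy numpy example shows how significant that variation can be, even when the calibrated frame looks perfectly even (this assumes synthetic data, a noiseless master flat and a pure shot-noise model, so it is only a sketch of the effect):

```python
# Toy demonstration: after flat-field division the brightness looks uniform,
# but the signal-to-noise ratio still varies across the frame, because the
# correction rescales signal and noise by the same per-pixel factor.
import numpy as np

rng = np.random.default_rng(0)

h, w = 256, 256
yy, xx = np.mgrid[0:h, 0:w]
r2 = ((xx - w / 2) ** 2 + (yy - h / 2) ** 2) / (w / 2) ** 2
vignette = 1.0 - 0.4 * np.clip(r2, 0.0, 1.0)   # ~60% brightness in the corners

true_sky = 1000.0                               # electrons, perfectly flat target
light = rng.poisson(true_sky * vignette).astype(float)
flat_norm = vignette / vignette.mean()          # normalised (noiseless) master flat

calibrated = light / flat_norm                  # flat-field correction

centre = calibrated[h//2-16:h//2+16, w//2-16:w//2+16]
corner = calibrated[:32, :32]
print("centre SNR ~", centre.mean() / centre.std())   # ~sqrt(1000), about 32
print("corner SNR ~", corner.mean() / corner.std())   # ~sqrt(600),  about 24
```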
Without knowledge of these varying noise levels, all an algorithm or filter can do is treat every pixel the same across the whole image. The best it can do is choose a good "middle ground" (or, worse, put the onus on you to choose one if it's not smart enough to do so objectively). In practice, this means over-correction, under-correction and, usually, a combination of both in different parts of your image.
In the case of a noise reduction scheme, this means that some areas are noise-reduced too much, while others may not be noise-reduced enough.
In the case of a deconvolution algorithm, this means that some areas develop artefacts, while in others it may leave recoverable detail on the table.
And so on, and so forth.
Calibration Tracking aims to fix this.
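As a simplified sketch of the difference, assuming a per-pixel noise map is available (e.g. derived from the flat as in the earlier sketch) and using a plain Gaussian blend as a stand-in for a real noise reduction filter (so again, not the actual StarTools denoise):

```python
# Simplified sketch: a global strength applies one "middle ground" everywhere,
# while a locally informed strength follows the per-pixel expected noise level.
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_global(img, strength=0.5, sigma=1.5):
    """One strength for the whole image: over-smooths the cleanest areas
    and under-smooths the noisiest ones."""
    smooth = gaussian_filter(img, sigma)
    return (1.0 - strength) * img + strength * smooth

def denoise_local(img, noise_map, sigma=1.5):
    """Strength proportional to the local expected noise (normalised to 0..1):
    backs off where the data is clean, works harder where the flat boosted noise."""
    strength = noise_map / noise_map.max()
    smooth = gaussian_filter(img, sigma)
    return (1.0 - strength) * img + strength * smooth
```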
Ivo Jager
StarTools creator and astronomy enthusiast
Re: Using calibration frames for Tracking; some early results
That is truly remarkable.