binning to speed up processing

Questions and answers about processing in StarTools and how to accomplish certain tasks.
micheleorsini
Posts: 87
Joined: Fri Feb 26, 2016 7:28 pm

binning to speed up processing

Post by micheleorsini »

Hi,
I own a Mac Retina, and unfortunately StarTools is painfully slow on it, in spite of 16 GB of RAM.

The problem is that it is difficult to experiment, because many modules take too much time (even when changing just a few parameters). The only way to understand what changed, and how, is taking screenshots of the app, which is not exactly ideal.

I was wondering how much a strong bin would affect module operations. Here is my idea: take a picture, bin it heavily to reduce its size, run all my tests and operations until I reach a final image I like, then repeat all the StarTools operations from the log on the original image, skipping only the binning.

What I'm expecting is that the final big image will look like the small test one, except for the noise (and some detail), and that the other modules won't change their response too much. Is that correct?
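As a rough illustration of the idea (a hypothetical numpy sketch using 2x2 block averaging; this is not StarTools' actual bin implementation, and `bin2x2` is an invented helper):

```python
import numpy as np

def bin2x2(img):
    """Average non-overlapping 2x2 blocks (software binning).

    Trims odd edges so the shape divides evenly.
    """
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    trimmed = img[:h, :w]
    return trimmed.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A 3400x5200 frame drops to 1700x2600, a quarter of the pixels,
# so per-pixel module work should run roughly 4x faster on it.
full = np.random.rand(3400, 5200).astype(np.float32)
small = bin2x2(full)
print(small.shape)  # (1700, 2600)
```

The test run would then happen on `small`, and the logged operations would be replayed on `full` afterwards.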

Michele
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne
Contact:

Re: binning to speed up processing

Post by admin »

Hi Michele,

It's a good idea in some situations - so good in fact it's already implemented in some modules behind the scenes! :)

Unfortunately it's not always appropriate: depending on the algorithm, vastly different results may arise because detail differs across scales. What's appropriate for a smaller version of the image may not be appropriate for the larger version (exactly as you say), due to differences in detail and noise. The binned "preview" would then not actually be a faithful preview of the unbinned version.

The idea is implemented in some modules (for example Wipe, Contrast, HDR) in cases where a scaled-down version of the image is a sufficiently detailed representation of the full image for computational purposes. But even there, a parameter ("Precision") exists for cases where that scaled-down model is not sufficiently detailed.

Could you give me some indication of the resolution you are processing at? Is your data oversampled while you process it? Which modules in particular do you have to wait for a lot?

Thanks!
Ivo Jager
StarTools creator and astronomy enthusiast
micheleorsini
Posts: 87
Joined: Fri Feb 26, 2016 7:28 pm

Re: binning to speed up processing

Post by micheleorsini »

Thank you for the answer!

The data are from a Canon D700, so frames are roughly 3400x5200 pixels; the scale on my 200mm f/4 Newtonian is 1.10"/pixel, so binning might help reduce noise but shouldn't be necessary (I'm not oversampling, am I?).

About module timings, here are a few observations:

1. The application is slow overall; for example, "keeping" the results of various modules takes more than 10 seconds (actually nearly 20).

2. Some modules need a huge amount of preparation time before you can start experimenting with parameters; for example, Life's Isolate preset takes 20 minutes (it then reacts surprisingly fast to parameter changes). But this is not a big problem, since that part is not interactive.

3. The modules that annoy me most are the ones like Color, HDR and Deconvolution, where (even with a small preview) every change takes approximately 20 seconds to show its result.
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne
Contact:

Re: binning to speed up processing

Post by admin »

micheleorsini wrote:Thank you for the answer!
The data are from a Canon D700, so frames are roughly 3400x5200 pixels; the scale on my 200mm f/4 Newtonian is 1.10"/pixel, so binning might help reduce noise but shouldn't be necessary (I'm not oversampling, am I?).
1.1"/pixel is at the very high end of extremely good seeing conditions (unless you're in the Atacama Desert or on the peak of Mauna Kea :D), while less-than-perfect optics, collimation troubles (and things like your DSLR's anti-aliasing filter) all impart further scattering and diffraction. It is very likely you are oversampling.
That said, it's fairly easy to verify any oversampling. Simply zoom into the image and see whether the "smallest" (i.e. non-overexposed) stars are smeared out over more than 2x2 pixels. If they are, you are very likely oversampling.
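That zoom-in check can also be done numerically. A minimal sketch, assuming you have a small cutout around a faint star as a numpy array (`pixels_above_half_max` is a hypothetical helper, not part of StarTools):

```python
import numpy as np

def pixels_above_half_max(cutout):
    """Count pixels brighter than half the star's peak.

    A critically sampled faint star should light up only a handful
    of pixels; many more suggests oversampling.
    """
    peak = cutout.max()
    return int((cutout > peak / 2).sum())

# Synthetic star: Gaussian with a sigma of 2 pixels, i.e. clearly
# spread over more than a 2x2 block (the oversampled case).
y, x = np.mgrid[-8:9, -8:9]
star = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
print(pixels_above_half_max(star))  # well above 4
```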
About module timings, here are a few observations:
Thanks for this. There is not much I can do, but I can at least explain what's going on.
1. The application is slow overall; for example, "keeping" the results of various modules takes more than 10 seconds (actually nearly 20).
This is the Tracking feature doing its thing, analysing how the noise grain has changed (amongst other things). If you have the opportunity to run StarTools from an SSD or, better still, a RAM drive, these "keep" delays should be greatly reduced.
2. Some modules need a huge amount of preparation time before you can start experimenting with parameters; for example, Life's Isolate preset takes 20 minutes (it then reacts surprisingly fast to parameter changes). But this is not a big problem, since that part is not interactive.
Indeed, some things just take a long time in the interest of fidelity. There are ways to recalculate light diffraction using a fast (but inaccurate) Gaussian kernel; I could perhaps add this.
3. The modules that annoy me most are the ones like Color, HDR and Deconvolution, where (even with a small preview) every change takes approximately 20 seconds to show its result.
Hmmm... it shouldn't take that long...

What are the specs of your machine? RAM, CPU, storage medium?
Ivo Jager
StarTools creator and astronomy enthusiast
micheleorsini
Posts: 87
Joined: Fri Feb 26, 2016 7:28 pm

Re: binning to speed up processing

Post by micheleorsini »

Hi!
First, let me say that, being a programmer myself, I know how hard (and sometimes frustrating) optimizing code is. Respect!

Here are my computer specs

MacBook Pro (Retina, 13-inch, Early 2015)
16 GB 1867 MHz DDR3
3.1 GHz Intel Core i7
SSD disk

That said, the graphics card is not very performant; could this be the cause?

Concerning the resolution, right now I have a horrible FWHM (partially due to guiding, I guess; I'm investigating), but I have to admit I will never reach such seeing values :-(

So I did a test with 50% binning (which, by the way, helps me a lot, living in the suburbs) and found that times are cut considerably, roughly by a factor of 3 (4 for Life: elaboration now takes only 5 minutes), and are no longer an issue. I think I'll go with 50-70% binning for my next images.

thank you again for the support!
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne
Contact:

Re: binning to speed up processing

Post by admin »

micheleorsini wrote:Hi!
First, let me say that, being a programmer myself, I know how hard (and sometimes frustrating) optimizing code is. Respect!

Here are my computer specs

MacBook Pro (Retina, 13-inch, Early 2015)
16 GB 1867 MHz DDR3
3.1 GHz Intel Core i7
SSD disk
Aha! Most laptop CPUs (especially consumer-oriented ones like those Apple uses) are heavily tuned for low power consumption rather than processing speed.
The i7-5557U in your MacBook (the U denotes Intel's ultra-low-power line) is only a dual core (though with Hyper-Threading) and is simply not very fast.
Not much has changed in laptop performance, by the way, for this very reason: battery life matters more in this segment than raw processing power.
This is why the average desktop CPU from 2010/2011 still vastly outperforms your MacBook, and even the latest 2017 models from Apple. This is especially the case in the areas that count for StarTools (which is heavy on memory-bus bandwidth, multi-core use and integer calculations).
That said, the graphics card is not very performant; could this be the cause?
StarTools does not use the GPU at all, for a number of reasons to do with branching complexity and memory-bus requirements. In short, GPUs are great at running simple algorithms very, very fast, but this all falls apart when complex branching needs to be performed (for example, when consulting Tracking information), or when results need to be constantly shuttled between GPU memory and system memory.
Concerning the resolution, right now I have a horrible FWHM (partially due to guiding, I guess; I'm investigating), but I have to admit I will never reach such seeing values :-(
Seeing around 2"-3" is quite normal for us mere mortals on a good night. Add the other factors I mentioned, and oversampling happens very quickly. It's also the reason why DSLRs seem to have much higher resolutions than astro-specific CCD cameras; for astro purposes, it's much better to use the same sensor size for fewer but "bigger", more sensitive pixels, since the atmosphere is likely to be the limiting factor and the extra resolution would go to waste.
So I did a test with 50% binning (which, by the way, helps me a lot, living in the suburbs) and found that times are cut considerably, roughly by a factor of 3 (4 for Life: elaboration now takes only 5 minutes), and are no longer an issue. I think I'll go with 50-70% binning for my next images.
That sounds about right! When reducing an image to 50% on the X and Y axes (0.5 * 0.5 = 0.25), you end up with a quarter of the original image's pixels. It stands to reason that this should yield roughly a (1 / 0.25) = 4x speed boost.
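The same back-of-the-envelope arithmetic for a couple of scale factors (the 0.5 and 0.7 values are just the ones discussed in this thread):

```python
# Scaling each axis by s leaves s*s of the pixels; per-pixel work
# scales the same way, so the expected speedup is roughly 1 / s**2.
for s in (0.5, 0.7):
    pixels_left = s * s
    print(f"scale {s}: {pixels_left:.2f} of the pixels, "
          f"~{1 / pixels_left:.1f}x faster")
```

Modules whose cost is not purely per-pixel (or that are dominated by I/O) will deviate from this estimate, which matches the roughly 3x observed.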
Ivo Jager
StarTools creator and astronomy enthusiast
micheleorsini
Posts: 87
Joined: Fri Feb 26, 2016 7:28 pm

Re: binning to speed up processing

Post by micheleorsini »

Hi,
After our discussion I went and studied my FWHM properly. I used the FWHM results from DSS, and... well, they are quite disappointing: above 6".

After this finding I started improving my guiding, and then I made frames at 30, 60, 120 and 240s to see whether guiding (which, by the way, was not bad, at 2/3 of the imaging resolution) was the cause of the poor FWHM.

Findings: I found no increase of FWHM with exposure time, and now that I'm guiding better, the FWHM is still at the same figures.

So now I think the FWHM comes from a combination of seeing and the DSLR (Bayer matrix, filters...), and my suspicions are confirmed by this CloudyNights post: https://www.cloudynights.com/topic/5942 ... try8144606

Now, after this long premise, here is my question: if I cannot go below, say, 6", and there is no room for improvement with this gear, then I should bin even more, maybe to 50%, without losing *any detail* and almost doubling the SNR. Right?
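The "almost doubling" intuition can be checked with a toy simulation: averaging 2x2 blocks of uncorrelated noise should improve SNR by about sqrt(4) = 2 (a minimal sketch with synthetic data, not real astro frames):

```python
import numpy as np

rng = np.random.default_rng(42)

# Flat signal plus Gaussian noise, as a stand-in for a real frame.
signal, sigma = 100.0, 10.0
frame = signal + rng.normal(0.0, sigma, size=(1000, 1000))

def snr(img):
    """Crude SNR estimate for a flat field: mean over stddev."""
    return img.mean() / img.std()

# 2x2 block averaging (the equivalent of 50% binning on each axis).
binned = frame.reshape(500, 2, 500, 2).mean(axis=(1, 3))
print(snr(binned) / snr(frame))  # close to 2, i.e. sqrt(4)
```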

thank you

Michele
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne
Contact:

Re: binning to speed up processing

Post by admin »

Hi Michele,

You're correct! :thumbsup:

It's fantastic that you've established, for yourself and your own gear, what angular resolution you can expect from your data! (And 6" for real-life scenarios is actually not unexpected.)

Now, binning in StarTools is unique among all other software (to my knowledge) in that it allows fractional binning. This means you can specify an exact SNR-vs-resolution trade-off (other software only allows fixed 2x2, 3x3, 4x4 binning).

This means you can bin exactly as much as you want, without going too far or not far enough.

Quite simply, when binning, zoom in, then reduce resolution until "small" (i.e. non-overexposed!) stars take up 3x3 pixels at most (one pixel for the core, approximately one on each side for the light fall-off). With the light fall-off still spanning multiple pixels, deconvolution can often still enhance detail further.
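Plugging in the numbers from this thread (6" FWHM at 1.10"/pixel; these values are from the posts above, and the 3-pixel target is the rule of thumb just described) gives a quick estimate of the fractional bin to aim for:

```python
# Hypothetical worked example: how far to reduce resolution so a
# 6" FWHM star lands near the ~3-pixel target.
seeing_arcsec = 6.0
scale_arcsec_per_px = 1.1
fwhm_px = seeing_arcsec / scale_arcsec_per_px  # star width in pixels
target_px = 3.0
keep_fraction = target_px / fwhm_px            # fraction of each axis to keep
print(f"FWHM {fwhm_px:.1f} px -> scale axes to ~{keep_fraction:.0%}")
```

Which lands close to the 50% binning already found to work well, in line with the estimate above.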

Hope this helps!
Ivo Jager
StarTools creator and astronomy enthusiast