StarTools 1.8.518 Beta 1 now available

General discussion about StarTools.
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by admin »

firebrand18 wrote: Wed Sep 15, 2021 2:26 pm @Ivo, you hit the nail on the head with "It is perhaps worth mentioning though, that normally Deconvolution is extremely sensitive to noise or artefacts and singularities (such as over-exposing star cores, or hot/dead pixels, etc.)." That is probably the crux of the problem in my case, as I shoot from a Bortle 8 site where noise and other spurious junk makes its way into the subs, even with narrow-band filters (I use L-Extreme).

In a heavy nebula image (currently processing the Tulip) good luck trying to select clean sample stars; never mind WP, I get deformed stars no matter how I play with the settings. Have to resort to synthetic decon in these cases, which still does a pretty nice job with deconvolution. Had much better success with my Glob images using SVDecon with only 1-3 WP's, easily healed out.
Thank you. Improving the results from all algorithms in ST will always be high on the agenda, so it may well be that I can find a solution for more difficult cases. And, of course, if the WPs are the result of bugs (rather than a "misinterpretation" of the data by the algorithms), then these definitely require fixing with high priority before we can make it to beta/release. There is plenty of scope in the algorithms for those bugs to creep in (for example as @Mike in Rancho and @jackbak have shown me).
On the subject of the 64-bit GPU build (any version): I cannot launch it; I assume that's because my Nvidia GTX-1060 6GB on a pretty powerful Intel i7 PC is not supported? If so, any options? I feel left out! :(
Your system is most definitely supported, and should yield a nice speed boost in most cases. You just need drivers that support OpenCL, which you can obtain from Nvidia's website.
jackbak wrote: Wed Sep 15, 2021 7:17 pm I am about to pull the trigger on a moderately priced (for today's bloated prices) used AMD RX 570, as I have been told that AMD drivers are built into Linux. I sure hope this is not a waste of $400 US. I just can't redo my complete computing world for Kubuntu - I do everything from banking to astronomy in Linux; there is no Windows in my world.
Just FYI, $400 US for an RX570 is excessive, even in this climate. Depending on how recent your distribution is, Linux will support most AMD and Nvidia cards. However, you will, at the very least, still need to install the vendor's proprietary drivers, which will enable OpenCL support.
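In case it helps with troubleshooting, here is a minimal sketch (assuming the third-party pyopencl package is installed; it is not part of StarTools) that lists whatever OpenCL platforms and devices the installed drivers expose. If nothing shows up, the GPU builds will not be able to see the card.

```python
# Minimal check of what the OpenCL drivers expose (sketch, not part of StarTools).
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}, "
              f"global memory: {device.global_mem_size // (1024 ** 2)} MiB")
```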

If you are buying second hand, it will be much more cost-effective to go for a card that crypto miners don't want (the RX570 is rather popular with them because it is a fairly low power vs high hash rate card).

I currently run an eight-year-old AMD R9 290X under the latest Linux Mint (it was a little bit of a hassle to get the card working, though). It's about as powerful as an RX570/RX580, but draws a lot of power when processing. If going for an older card, just make sure your power supply can handle the power draw.

A new GTX 1650 Super should also be obtainable for around $400 USD, and it performs better than the RX570.

Hope any of this helps!
Ivo Jager
StarTools creator and astronomy enthusiast
jackbak
Posts: 22
Joined: Sun Jun 14, 2015 3:38 pm

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by jackbak »

Thanks Ivo,

Not pulling triggers yet, just got back from the dentist :shock:

I do have an older Nvidia card that I couldn't get running in two monitor mode (which I need). So I'm thinking AMD.
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by admin »

jackbak wrote: Thu Sep 16, 2021 12:26 am Thanks Ivo,

Not pulling triggers yet, just got back from the dentist :shock:

I do have an older Nvidia card that I couldn't get running in two monitor mode (which I need). So I'm thinking AMD.
Depending on how old it is, give the old Nvidia card another try, and be sure to use the Nvidia proprietary drivers. Older cards are typically somewhat easier to get going in Linux. What's the model?
Ivo Jager
StarTools creator and astronomy enthusiast
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by Mike in Rancho »

admin wrote: Wed Sep 15, 2021 8:41 am
It is perhaps worth mentioning though, that normally Deconvolution is extremely sensitive to noise or artefacts and singularities (such as over-exposing star cores, or hot/dead pixels, etc.). We are quite spoiled in "StarTools land" with the signal evolution tracking engine, but it is perhaps worth remembering that in any other traditional application, deconvolution is generally advised against (for example by the author of PI here and here), unless you have exceptionally clean data at hand. Such is the challenge of implementing this type of signal recovery algorithm.
My inferior data isn't SVD worthy? :( :( :lol:

Well, if it is truly data driven, there are workarounds and I will be experimenting. There's always Heal, if you know where the WP's are. And sometimes just the right choice of sample stars... or even one star... will do the trick. I've also tried less of a stretch in AutoDev (which sometimes works), followed by Restore to linear, Wipe and Decon, in order to then AutoDev the way I wanted and run the other modules again. That's a work in progress, but it has worked once for me so far. One problem is that AutoDev's data-sensing stretch comes out differently when the data has already been deconvolved.

And there's always synthetic too! I pushed the same data to 2.5 pixels and 50x iterations (a train wreck, but just to test) with no funny pixels. Same when I loaded the dataset into 1.7. So it is at least deconvolution worthy, if not SVD worthy. :D

Exposure is always a compromise, and I'm still trying to figure it out. This data was 6-minute subs at ISO 400, 9.5 hours in total, L'eNhance, F/9 ED refractor, and probably Bortle 7-ish. I don't think the brightest stars are completely blown at the outset (not all 255's), but any stretching will max the cores out for sure. And yet I worried about underexposure, since I had no left-side histogram gap and my red and green channels were slightly merging into the bias point. In other words, using one particular benchmark, I did not achieve 10xRN^2 - probably not even 5xRN^2. So I may have been better off expanding the well by using ISO 200 but going to perhaps 8 minutes. The goal here was to try to pick up the Cygnus X-1 shock wave in OIII (I really didn't), and trade off some saturated stars if needed.
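As an aside, just to make that benchmark concrete, here is a rough sketch of the arithmetic behind the "sky signal vs. read noise" rule of thumb; the read noise and sky flux figures below are made-up placeholders, not measurements from this dataset.

```python
# Rule-of-thumb check: a sub is roughly "sky-limited" when the sky background
# contributes at least C * read_noise^2 electrons per pixel. Values are
# hypothetical placeholders, not taken from the dataset discussed above.
read_noise_e = 3.0       # camera read noise in electrons (assumed)
sky_rate_e_per_s = 0.2   # sky background flux in e-/pixel/s (assumed)

for c in (5, 10):
    target_e = c * read_noise_e ** 2        # required sky electrons per sub
    t_min_s = target_e / sky_rate_e_per_s   # minimum sub length in seconds
    print(f"C = {c}: need {target_e:.0f} e- of sky per sub, i.e. subs of >= {t_min_s:.0f} s")
```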

Looking forward to 513 though! :thumbsup: Gosh, I was just thinking of some things too whilst working on an M57 set from a couple of days ago (drizzled, big file). Ever considered a pre/post tweak button in Color? Or a cursor-following zoom in Layer? Just thoughts, nothing critical.

Oh and that big file did take quite a bit of CPU/GPU churning even on my i7 laptop, so I might pick up a second desktop machine that can handle ST 1.8. Will maybe start another thread for that to get recommendations.

:obscene-drinkingcheers:
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by admin »

Mike in Rancho wrote: Thu Sep 16, 2021 12:59 am My inferior data isn't SVD worthy? :( :( :lol:

And there's always synthetic too! I pushed the same data to 2.5 pixels and 50x iterations (a train wreck, but just to test) with no funny pixels. Same when I loaded the dataset into 1.7. So it is at least deconvolution worthy, if not SVD worthy. :D
You're spot-on with the data-driven comment - it's indeed the crux of the "issue". The Spatial Variance aspect is a whole extra layer of complexity and, in essence, relies on the data's "correctness" even more. It now requires the data to be "correct" not just across the entire image, but across local regions as well, in order to capture how the Point Spread Function changes per area. It means that local inconsistencies have a bigger impact, as those inconsistencies cannot just be "averaged out" with estimates from the whole image.
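To illustrate the general idea only (this is a generic tile-based sketch, not StarTools' actual SVDecon implementation), a spatially variant scheme might estimate a separate PSF per region and deconvolve each region with it; any bad pixel or saturated core inside a region then corrupts that region's result directly, because there is no whole-image average to dilute it.

```python
# Generic sketch of spatially variant Richardson-Lucy deconvolution
# (illustrative only; NOT StarTools' SVDecon). Each tile gets its own PSF,
# so local defects degrade that tile's result directly.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=20):
    estimate = np.full_like(observed, observed.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

def spatially_variant_rl(image, psf_for_tile, tile=256, iterations=20):
    # psf_for_tile(y, x) is a caller-supplied local PSF estimate (hypothetical helper)
    out = np.zeros_like(image)
    for y in range(0, image.shape[0], tile):
        for x in range(0, image.shape[1], tile):
            patch = image[y:y + tile, x:x + tile]
            out[y:y + tile, x:x + tile] = richardson_lucy(patch, psf_for_tile(y, x), iterations)
    return out
```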
Exposure is always a compromise, and I'm still trying to figure it out. This data was 6-minute subs at ISO 400, 9.5 hours in total, L'eNhance, F/9 ED refractor, and probably Bortle 7-ish. I don't think the brightest stars are completely blown at the outset (not all 255's), but any stretching will max the cores out for sure. And yet I worried about underexposure, since I had no left-side histogram gap and my red and green channels were slightly merging into the bias point. In other words, using one particular benchmark, I did not achieve 10xRN^2 - probably not even 5xRN^2. So I may have been better off expanding the well by using ISO 200 but going to perhaps 8 minutes. The goal here was to try to pick up the Cygnus X-1 shock wave in OIII (I really didn't), and trade off some saturated stars if needed.
Over-exposure of stars is not something that should be too detrimental (over-exposed star cores are pretty much unavoidable), but it can definitely contribute to processing difficulties. For one, as CCD wells (or the CMOS equivalent) approach saturation, they start "rejecting" photons (a little like squeezing people into a full subway carriage). This means the response becomes non-linear, which "breaks" deconvolution as well.
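A tiny numeric illustration of that last point (made-up numbers): deconvolution assumes the recorded image is a linear blur of the scene, and a saturating response violates that linearity.

```python
# Toy demonstration that saturation breaks linearity: blurring is additive,
# but a clipped sensor response is not. Numbers are made up for illustration.
import numpy as np

full_well = 100.0
a = np.array([60.0, 10.0, 5.0])   # hypothetical pixel signals
b = np.array([70.0, 10.0, 5.0])

clip = lambda x: np.minimum(x, full_well)

print(clip(a + b))         # [100.  20.  10.] -> bright pixel clipped, information lost
print(clip(a) + clip(b))   # [130.  20.  10.] -> differs: response is no longer linear
```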
Looking forward to 513 though! :thumbsup: Gosh, I was just thinking of some things too whilst working on an M57 set from a couple of days ago (drizzled, big file). Ever considered a pre/post tweak button in Color? Or a cursor-following zoom in Layer? Just thoughts, nothing critical.
Stop it! I'm overworked as it is! :lol:
Oh and that big file did take quite a bit of CPU/GPU churning even on my i7 laptop, so I might pick up a second desktop machine that can handle ST 1.8. Will maybe start another thread for that to get recommendations.
:obscene-drinkingcheers:
With the current GPU-weirdness, now is actually a really great time to look at "gaming" laptops.
Some food for thought.
Ivo Jager
StarTools creator and astronomy enthusiast
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by Mike in Rancho »

Thanks Ivo! :bow-yellow:

I'm probably a couple years from a new laptop, but will watch, and also post an alternate thread to get everyone's ideas and more specifics.

1.8 SVD is too good, so I'll find ways to make it work even if my "real world lol" data is lacking. Even the controls are better when just using synthetic or 1-star. And all this testing helped me learn a bit about the mask and deringing, so I'm happy I learned I can tweak the mask for that (non-sample stars, of course).

And thanks, good to know I have to leave even more buffer on star saturation to allow for both stretching and a non-linear zone. The WP's still strike me as odd though - something about them I haven't quite figured out yet. I'll keep watching... :think:
jackbak
Posts: 22
Joined: Sun Jun 14, 2015 3:38 pm

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by jackbak »

Thanks Ivo,

Yeah I may give (operative word is may) the Nvidia GTX 560Ti another go. But it does remind me of yesterday's dental visit.
firebrand18
Posts: 86
Joined: Tue Feb 04, 2020 1:43 pm

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by firebrand18 »

@Ivo, I made updates to my Windows 10 and now the GPU version works; very exciting! Thanks. :thumbsup:
Mike in Rancho
Posts: 1166
Joined: Sun Jun 20, 2021 10:05 pm
Location: Alta Loma, CA

Re: StarTools 1.8.512 public alpha/preview 4 now out

Post by Mike in Rancho »

Don't know if this might have been noted already in earlier threads (or even this one), but I was playing with NBAccent tonight and ran into a little snafu in Wipe. Nothing substantive as far as I'm aware, just procedural.

The pre/post-tweak and before/after toggles don't work right when viewing the NB file - and instead just immediately throw you to the color file.

The toggles do work appropriately in color and luminance.

Also, though I realize that the NB file gets a later crack at a quasi-AutoDev, is it included in the overall Wipe settings chosen? Just curious, because I usually find myself utilizing very different Wipe settings between my RGB datasets and my L'eNhance datasets when done separately.
admin
Site Admin
Posts: 3382
Joined: Thu Dec 02, 2010 10:51 pm
Location: Melbourne

Re: StarTools 1.8.515 public alpha/preview 5 now out

Post by admin »

Hi all,

The latest alpha is now up for download;

* New HDR module (implementing Spatial Contrast and Gamma Limited Local Histogram Optimisation), with more consistent results, better stellar profile protection, better multi-core and GPU utilization and preview support
* Now disabling exposure length level setters in Compose module for compositing modes that do not need/use them
* Tweaked Color Filter Array (aka. "Bayer pattern") artifact filter in Wipe module
* Added 1:1 ratio to Crop module
* Fixed bug in Tracking (affecting decon and mask generation) in CPU versions of the alpha
* Fixed label in Wipe's Luminance/Color/Narrowband button not changing correctly, depending on what datasets are available
* Cleaned up various other glitches in Wipe

Documentation for the new HDR module is here. It is actually simpler to use than the old module; the one caveat is that the initial launch, and every change to the 'Context Size' parameter, will see the module do a lot of precalculations. The upshot is that all other parameters should yield results in near-real-time.
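For anyone curious what "local histogram optimisation" refers to in broad terms, a textbook relative is contrast-limited adaptive histogram equalisation; the sketch below (using scikit-image, purely for illustration and in no way StarTools' actual HDR algorithm) shows how a tile/kernel size plays a role loosely comparable to 'Context Size', and why changing it forces expensive recomputation of per-tile statistics.

```python
# Illustrative only: CLAHE as a textbook relative of local histogram
# optimisation. This is NOT the StarTools HDR implementation.
import numpy as np
from skimage import exposure

rng = np.random.default_rng(0)
image = rng.random((512, 512))  # placeholder image with values in [0, 1]

# kernel_size is loosely analogous to a "context size": changing it means the
# per-tile histograms must be recomputed, which is the expensive step.
result = exposure.equalize_adapthist(image, kernel_size=64, clip_limit=0.01)
print(result.shape, result.min(), result.max())
```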

Have a great weekend! (I really need some downtime for a couple of days to recharge - I'll be back soon to answer questions, etc.!)
Ivo Jager
StarTools creator and astronomy enthusiast