Some updates from my stacker:
I tried writing it in C++, and it's just not going so hot. It's a lot more difficult for me to write correct, fast code in it - I've found that python/scipy is actually
faster than my C++ implementation, because they have spent years optimizing and getting the algorithms just right, whereas my C++ version is not so great. In any case, runtime is 10-20 seconds for ten 1000x1000 images, so it's not too big of a deal, since it's a fully automated process that you can start and then tab away from to do something else.
I did some experimenting with phase correlation, though... I got it working pretty robustly, although all I've tested it on is my own data.
Basically, I run a preprocessing filter over the image and then run it through the FFT/etc. The preprocessing filter extracts the brightish stars and zeroes everything else out. Because everything else is zeroed, a misalignment (matching the wrong star pairs) is very unlikely: even if two stars happen to line up, all the other stars will be paired with zeroes and multiply to zero, so that candidate's score stays near zero. Only at the correct alignment, where stars are paired with stars instead of zeroes, does the result have a high value (and it's therefore easy to find).
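For reference, the FFT part is just the standard cross-power-spectrum trick - a minimal numpy sketch, not my exact code (the epsilon and names are arbitrary):

import numpy as np

def phase_correlate(a, b):
    # Cross-power spectrum: its inverse FFT peaks at the translation between a and b.
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12  # whiten; epsilon guards the zeroed pixels
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shifts past the halfway point wrap around to negative offsets.
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx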
This magical "preprocessing filter" has some issues, though. What I'm doing right now is roughly this:
from scipy.ndimage import percentile_filter

# Crude local background: 10th percentile over a 10px neighborhood.
star_image = image - percentile_filter(image, percentile=10, size=10)
# Zero everything below 1/32 of the brightest star peak.
image[star_image < star_image.max() / 32] = 0
Essentially, the percentile filter does a crude "background extraction" that is sort of like "wipe" in ST, pulling the floor down to zero on a very localized basis (while letting major local anomalies like stars survive). The divide by 32 is sort of like Deep Sky Stacker's "star threshold" variable; I'm looking for a way to make that dynamic without requiring user input, since the optimal value varies with the SNR of the image - one possibility is sketched below.
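For example, a cutoff based on a robust noise estimate might work (just an idea I'd try, not something I've tested - there's still a k constant, but it's noise-relative rather than absolute):

import numpy as np

def auto_threshold(star_image, k=5.0):
    # Estimate the noise floor robustly: MAD scaled to approximate sigma.
    med = np.median(star_image)
    sigma = 1.4826 * np.median(np.abs(star_image - med))
    return med + k * sigma  # keep only pixels k sigma above the floor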
Then I take

image = np.log1p(image)  # log(image + 1)

to compress the dynamic range and give the stars roughly equal weight, so the brightest star doesn't dominate the alignment.
The issues with that, though, are that percentile_filter is slow (on the order of 0.25-0.5 seconds for a 1000x1000 image), that the "32" constant should really be automatic, and that I don't think it's particularly robust - it gives good results for my data, but I'm not sure about anyone else's.
So I was going to take you up on your offer and ask how either the ST auto-star-mask or the wipe module works.
Edit: Actually... taking the FFT, zeroing out a 5px box in each corner (which is where the low-frequency terms live in an unshifted FFT), doing the IFFT, and then thresholding with (image < mean(image) * 8) works pretty well. There's still a nasty constant there that I really don't like, but removing low-frequency data via FFT is a lot more... analytical-y than a percentile filter (and I like analytical mathy stuff), as well as more resilient to fat stars that might be bigger than the filter size.
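In numpy that recipe looks something like this (simplified sketch; note I threshold against the original image's mean, since the filtered image's mean is ~0 once the DC term is zeroed):

import numpy as np

def fft_highpass_stars(image, box=5, k=8.0):
    # Low frequencies sit in the corners of an unshifted 2D FFT;
    # zeroing a small box in each corner strips the smooth background.
    F = np.fft.fft2(image)
    F[:box, :box] = 0
    F[:box, -box:] = 0
    F[-box:, :box] = 0
    F[-box:, -box:] = 0
    hp = np.fft.ifft2(F).real
    hp[hp < image.mean() * k] = 0  # the nasty "8" constant
    return hp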
Double edit: I just went through the pain of implementing a star finder, using just "closest point" as the matching algorithm (so it can't handle much drift - but my test data is very well tracked, so I shoved that off till later), and it looks like it's much, much more accurate than the FFT approach, simply because I'm fitting ~100 Gaussians and averaging rather than relying on a single correlation peak. Looks like I've convinced myself it's better, haha - now time to implement triangle similarity matching. I'll be gone tonight and tomorrow (US time), so I'll probably have a release for you to use in two days or so.
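The matching step boils down to something like this (simplified sketch; the max_dist cutoff and names are arbitrary, and the real code first fits a Gaussian to each star to get sub-pixel centroids):

import numpy as np
from scipy.spatial import cKDTree

def estimate_shift(ref_stars, tgt_stars, max_dist=10.0):
    # ref_stars/tgt_stars: (N, 2) arrays of sub-pixel star centroids.
    tree = cKDTree(ref_stars)
    d, idx = tree.query(tgt_stars, distance_upper_bound=max_dist)
    ok = np.isfinite(d)  # unmatched stars come back with d = inf
    # Average the per-star offsets; many centroids beat one correlation peak.
    return (ref_stars[idx[ok]] - tgt_stars[ok]).mean(axis=0)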
Speaking of which, what OS are you on? I can prebuild the python into a (rather large) binary, but the executables are platform-specific, obviously.