Combining data from different scopes

39 replies · 993 views
Aris Pope avatar
I have data from two different scopes, one from last year and one from this year, with the same camera and filter. I'm having the darndest time combining the data. I have about 5 hours from last year and 3 hours from the other night, and I would love to put it together. I'm having issues with WBPP: it fails during the registration process. I've tried everything: setting the reference frame to auto, setting it for each year, and setting it manually.

I know it can be done since I see all of these collaborations that have different scopes and cameras. How can I combine the data?

Using PixInsight.

Thanks much,

Aris
YingtianZZZ avatar
In this case, I might try integrating them separately and then using the StarAlignment process to re-assemble both. It usually works when I combine two sessions with a large focal length difference (750mm/1480mm from my 107PHQ and C8).
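For anyone wanting to see what "integrate each session separately, then star-align the two masters" boils down to outside of PixInsight, here is a minimal Python sketch using the astroalign and astropy packages. It assumes mono (single-channel) FITS masters, and the file names are made up; inside PixInsight the equivalent step is simply running StarAlignment with one master as the reference and the other as the target.

```python
# Minimal sketch: star-align one master stack onto the other, outside PI.
# Assumes mono FITS masters; the file names below are hypothetical.
import astroalign as aa
import numpy as np
from astropy.io import fits

ref = fits.getdata("master_last_year.fits").astype(np.float64)
tgt = fits.getdata("master_this_year.fits").astype(np.float64)

# Transform the newer master onto the older master's pixel grid.
# 'footprint' is a boolean mask flagging pixels with no valid data
# after the transform (useful for cropping later).
registered, footprint = aa.register(tgt, ref)

fits.writeto("master_this_year_registered.fits", registered, overwrite=True)
```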
Richard Carande avatar
What software are you using? It sounds like PixInsight from the WBPP reference. I do this in Astro Pixel Processor, and it just works, automatically resampling to the reference pixel scale or to another specified scale. I'm not sure how to do it in PI, which I also use occasionally.
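To make concrete what "resampling to the reference pixel scale" involves when two scopes deliver different image scales: if both masters have been plate-solved so their FITS headers carry a WCS, a rough equivalent outside APP is a WCS reprojection. A hedged sketch using the astropy and reproject packages; the file names are assumptions.

```python
# Sketch: reproject one plate-solved master onto the other's WCS and pixel
# grid, roughly what resampling to the reference scale accomplishes.
# File names are hypothetical; both headers must contain a valid WCS.
from astropy.io import fits
from astropy.wcs import WCS
from reproject import reproject_interp

ref_hdu = fits.open("master_ref.fits")[0]
tgt_hdu = fits.open("master_other_scale.fits")[0]

resampled, coverage = reproject_interp(
    tgt_hdu,                       # input image plus its WCS
    WCS(ref_hdu.header),           # output projection: the reference WCS
    shape_out=ref_hdu.data.shape,  # output grid: the reference pixel grid
)

fits.writeto("master_other_resampled.fits", resampled,
             header=ref_hdu.header, overwrite=True)
```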
Oscar avatar
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself) of M63, to make a very long integration image. After registering, I could easily see the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it. I redid my steps, this time trying LinearFit on all of the data stacks before stacking them; no good. Went back again and tried BGN, Gradient Correction, and the Automatic Background Extractor; still didn't help. So I gave up and forgot about it.

Tried different settings in ImageIntegration too: LinearFit rejection vs. no rejection vs. Percentile, Median vs. Average stacking. Nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)

Rodrigo Roesch avatar
I have done that by stacking the data from each scope separately, then registering and combining each stack using Registar.
I have also combined different focal lengths, as in this image:
https://astrob.in/368968/D/
I combined data taken with a DSLR and a 135mm lens with a WO 98mm and an ASI 294MC. Registar is very powerful software for registering any type of image.
Aris Pope avatar
Rodrigo Roesch:
I have done that by staking the data from each scope separately then register and combining  each stack using registar.
I also have combined different FL like in this image
https://astrob.in/368968/D/
I combine data taken with DSLR and 135mm lens with WO 98mm and asi 294mc. Registar is a very powerful software to register any type of image

I have a 294MC as well. My question is: when you use image registration, isn't there a minimum number of images, like four?
Aris Pope avatar
Oscar:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


That's awesome... star alignment gets them registered but does not combine them? So you would have to go and integrate the images?

Rodrigo Roesch avatar
No, you can register two images. I'm not sure if PI has a limit, since I haven't registered that small a number, but Registar or even DSS can register two.
Aris Pope avatar
Richard Carande:
What software are you using?  Sounds like PixInsight from the wbpp reference.  I do this in Astro Pixel Processor, and it just works, automatically resampling to the reference pixel scale, or another specified scale.  Not sure about how to do it in PI, which I also use occasionally.

Pixinsight
Aris Pope avatar
Rodrigo Roesch:
No, you can register two images. Not sure if pi has a limit since I haven’t registered small amount, but Registar or even DSS can register two

I'm sorry, I meant integration... don't you need a minimum of 4 images for integration after registration?
Oscar avatar
Aris Pope:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


That's awesome.. star alignment gets them registered but does not combine them? So you would have to go and integrate the images? 

Correct
Oscar avatar
Aris Pope:
Rodrigo Roesch:
No, you can register two images. Not sure if pi has a limit since I haven’t registered small amount, but Registar or even DSS can register two

I'm sorry I meant integration.. don't you need a minimum of 4 images for integration after registration?

For PI, yeah, the minimum is 3.
Rodrigo Roesch avatar
For integration you need a minimum because the statistics need enough samples to work; some combinations won't run at all. For example, all the pixel rejection methods, such as sigma clipping, won't have a reference. Maybe just average could work.
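That reasoning can be shown in a few lines: with only two registered masters there is nothing for sigma-style rejection to work against, but a plain (optionally exposure-weighted) average is perfectly well defined. A NumPy sketch, assuming both masters are already registered to the same pixel grid; the 5h/3h weights just mirror the exposure times mentioned in the opening post, and the file names are made up.

```python
# Sketch: combine two registered master stacks with an exposure-weighted
# average. No pixel rejection is possible with only two samples per pixel.
import numpy as np
from astropy.io import fits

a = fits.getdata("master_5h_registered.fits").astype(np.float64)  # ~5 h stack
b = fits.getdata("master_3h_registered.fits").astype(np.float64)  # ~3 h stack

weights = np.array([5.0, 3.0])                # proportional to exposure time
combined = (weights[0] * a + weights[1] * b) / weights.sum()

fits.writeto("combined_8h.fits", combined, overwrite=True)
```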
Oscar avatar
tried 2 with Average, no rejection, it gave this error:

"ImageIntegration: Cannot execute instance in the global context.

Reason: This instance of ImageIntegration defines less than three source images."
Rodrigo Roesch avatar
The minimum I have tried is 5, with a comet. Maybe you can try DSS; it is free software.
Oscar avatar
with DSS, you can actually stack just one (although nothing will happen besides debayering and calibration if you want that)
Marc V avatar
With Siril, you can pre-process your images from each night/each camera separately and then register them all together. At the end, you crop a bit and you're done.

The cameras should have a similar rotation so that the images will be more or less in the same orientation; otherwise it will look like the image the OP posted.
Harry Karamitsos avatar
There are a few videos that cover similar scenarios. I used them to figure out how to integrate images from my Skywatcher and Askar. 

This one helped.

https://m.youtube.com/watch?v=GyuJyHk0P2g&t=319s&pp=ygUkcGl4aW5zaWdodCBzdGFja2luZyBtdWx0aXBsZSBuaWdodHMg
Jure Menart avatar
Oscar:
tried 2 with Average, no rejection, it gave this error:

"ImageIntegration: Cannot execute instance in the global context.

Reason: This instance of ImageIntegration defines less than three source images."

If you have only 2 images to integrate, you can use 2 copies of each of the 2 images and integrate those 4.
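For what it's worth, the duplication trick only satisfies ImageIntegration's three-image minimum; with a plain average it reproduces exactly the mean of the two originals, and rejection still has no real outliers to work with. A quick NumPy check of that arithmetic:

```python
# Quick check: averaging {A, A, B, B} is identical to averaging {A, B}.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

assert np.allclose(np.mean([A, B], axis=0), np.mean([A, A, B, B], axis=0))
```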
Álvaro Méndez avatar
Oscar:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


Unless someone else tells you a method that works in the software you already use, you can always try Astro Pixel Processor (it has a one-month free trial) to do this. Its Local Normalization tends to blend these things when combining different stacks. You just add all the images (the minimum is 2) and integrate choosing Local Normalization. Start with 2nd degree, 4 iterations, which usually works, and go up from there if needed. Sometimes I've needed 4th degree, but it usually works great with 2nd. It matches the background intensities and blends the seams. The "multi-band blending" option supposedly helps with these too, but I haven't had any good results with it, although you might play with it as well.
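For readers wondering what this kind of background matching amounts to conceptually: it fits a smooth, low-order surface to the brightness difference between a stack and the reference and removes it. The sketch below is only a rough stand-in for what APP or PixInsight actually implement (real tools also mask stars and work locally and robustly), but it shows the idea with the suggested 2nd-degree polynomial.

```python
# Rough illustration of low-order background matching: fit a 2-D polynomial
# to (target - reference) and subtract it so the large-scale backgrounds
# agree. Not APP's or PI's actual algorithm.
import numpy as np

def match_background(target, reference, degree=2):
    h, w = target.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x = xx.ravel() / w                       # normalised pixel coordinates
    y = yy.ravel() / h
    diff = (target - reference).ravel()

    # Design matrix with every term x**i * y**j for i + j <= degree.
    terms = [(x ** i) * (y ** j)
             for i in range(degree + 1)
             for j in range(degree + 1 - i)]
    A = np.column_stack(terms)

    coeffs, *_ = np.linalg.lstsq(A, diff, rcond=None)
    surface = (A @ coeffs).reshape(h, w)
    return target - surface
```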
Jan Erik Vallestad avatar
Aris Pope:
I know it can be done since I see all of these collaborations that have different scopes and cameras. How can I combine the data?


What kind of errors are you getting? Could you share a log? How different are the scopes in terms of focal length, and how well aligned are the images in terms of rotation/FOV?
Jon Rista avatar
Oscar:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


I think you said you acquired these from different public stacks...

Looking at the result here, I would first question whether each of these stacks was properly flat calibrated. The fields do not appear to be flat or uniform, and edge artifacts don't appear to have been cropped out of several of them either.

If the goal is to create the largest field possible from integrations from disparate sources, I would say that flat calibration, and really calibration in general, becomes critical. Even with good calibration there are often going to be border/edge artifacts due to dithering, and I would think you would want to crop those out first, before trying to combine all the individual registered integrations into a single integration. Flat calibration in particular would need to be done very meticulously to ensure that the field was as clean and flat as possible.

As for blending across the borders... there is bound to be some degree of difference between the individual integrations. You could give LocalNormalization a try and see if it is able to neutralize the differences between the backgrounds and eliminate (or at least minimize) the borders.
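One way to act on the "crop the edge artifacts first" advice when working on the registered masters outside PI is to blank the uncovered borders using the footprint/coverage mask from registration and then combine with a coverage-aware mean, so those edges never contribute. This is only a sketch under that assumption (inside PI you would simply DynamicCrop each master instead); the margin value is arbitrary.

```python
# Sketch: blank the unexposed borders of each registered master using its
# coverage mask, then combine with a NaN-aware mean so trimmed pixels never
# leak into the result. Assumes all masters share one pixel grid already.
import numpy as np

def trim_master(master, valid_mask, margin=16):
    """NaN-out pixels outside the coverage mask, plus a fixed frame border."""
    out = master.astype(np.float64).copy()
    keep = valid_mask.copy()
    keep[:margin, :] = False
    keep[-margin:, :] = False
    keep[:, :margin] = False
    keep[:, -margin:] = False
    out[~keep] = np.nan
    return out

def combine(masters_and_masks):
    cube = np.stack([trim_master(m, v) for m, v in masters_and_masks])
    return np.nanmean(cube, axis=0)   # NaN (trimmed) pixels are ignored
```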
Oscar avatar
Álvaro Méndez:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


Unless someone else tells you a method that works in the software you already use, you can always try Astro Pixel Processor (it has a one month free trial) to do this. The Local Normalization process tends to blend these things when combining different stacks. You just add all the images (the minimum is 2 images) and integrate choosing Local Normalization. Start with 2nd degree, 4 iterations which usually works and go up from there if needed. Sometimes I’ve needed to use 4th degree but it usually works great with the 2nd. It matches the background intensities and blends the seams. The option “multi-band blending” supposedly helps with these but I havent had any good results with it although you might play with it too.

It's a good thing PI has Local Normalization too! Will try PI first.

Jon Rista:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


I think you said you acquired these from different public stacks...

Looking at the result here, I would first question if each of these stacks were properly flat calibrated? The fields do not appear to be flat or uniform, and edge artifacts don't appear to have been cropped out of several of these either.

If the goal is to create the largest field possible from integrations from disparate sources, I would say that flat calibration and really calibration in general, would become critical. Even with good calibration, due to dithering, there is often going to be border/edge artifacts, and I would think you would want to crop those out first, before trying to combine all the individual registered integrations into a single integration. Flat calibration in particular, would need to be very meticulously don to ensure that the field was as clean and flat as possible. 

As for blending across the borders...there is bound to be some degree of difference between the individual integrations. You could give LocalNormalization a try and see if it is able to neutralize the differences between the backgrounds and eliminate (or at least minimize) the borders.

Nope, no flat calibration; none were provided.

Yep, LocalNormalization is my new tactic, and I'll crop out the edges of each stack like you say. Thx.
Tim Ray avatar
Oscar:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


There's a lot of camera rotation within one of the data sets, which could be why PI's WBPP is having difficulty. I have used WBPP to combine various datasets without any difficulty, but each dataset was more "camera angle" stable.

I would follow the advice of others in this instance: register each data set separately, then register (star-align) the two results.

CS, Tim
Oscar avatar
Tim Ray:
Star Alignment works great for registering them

I wonder how to get the background lightness and color to match though, with different data

I tried getting some free data stacks from different people (including myself), of M63, to make a very long integration image, and after registering, easily saw the borders between the frames in the stacked image; it was so noticeable and I could do nothing about it; I redid my steps, this time tried LinearFit on all of the data stacks, stacked them, was no good; went back again, tried BGN, and Gradient Correction, and Automatic BG extractor, still didn't help; so I gave up and forgot about it.

Tried different settings in ImageIntegration too; like using LinearFit rejection vs no rejection vs Percentile, Median vs Average stacking, nothing helped as much as I'd like.

So I'll be listening to this thread.

This is what it looks like; not very appetizing:

(Median stacked, Percentile clipping; autostretched)


alot of camera rotation within one of the data sets. Could be why PI WPPP is having difficulty.  I have used WBPP to combine various datasets without any difficulty. But each dataset was more "camera angle" stable...

I would follow the advice of others in this instance. Register each data set separately then Register(star-align) the two results. 


CS. Tim

thanks a lot for trying to help!

but I didn't use WBPP, and each dataset was already individually stacked by their owners.

and it's more than just 2 individual stacks; it's 8.

I'm gonna play around with LocalNormalization and see what that does.

but thx again.