Problem with M101

38 replies · 964 views
Jerry Gerber:
Over a period of 3 non-consecutive nights I imaged M101.   When I calibrated the subs (each night with its own flats) and integrated them in PixInsight, I immediately noticed that the core of M101 does not look good at all. 

This unstretched image is obviously not color corrected, nor did I try to work on it further, because to my eye the core looks overexposed.  I combined 132 subs.  When I combine the first and second nights' subs, this problem doesn't exist.  When I process just the 3rd night's 43 subs, the problem doesn't exist either.  But when I stack all 132 subs, the core looks wrong. 

Anybody know what is causing this and can it be corrected in software?

Thanks,
Jerry
John Hayes:
Whoa…slow down there Jerry.  

1) You say that this is an unstretched image, but it looks like a stretched, stacked image based on what I can see going on at the edge.  Which is it?

2) Forget the colors and the core.  Your number one problem is that the calibrated data looks completely wrong.  Can you post an original image along with its calibrated version so that we can see what's going on?  Did you blink through your calibrated data to make sure that it was right before aligning?

You've got to first solve the calibration issue before doing anything else.

John
Jerry Gerber:
Hi John,

I did check all the subs using Blink and all looked good.  My mistake, this is a stretched, integrated image.  Please explain what's wrong with the calibrated data?  Do you want to see a single sub and its calibrated version?    The gradients are easy to remove, but I didn't do that because of the core.  Adjusting color and brightness does not fix the core.

I'll post some more images tomorrow…
Dale Penkala:
Hello Jerry,
@John Hayes is correct, you have a calibration issue going on here. If you are an Adam Block Fundamentals member, check out videos #17-19 under "Problem Solving Techniques", "CMOS Darks & Flats". That might help you out here. In the meantime I wouldn't worry about the colors; you need to fix the calibration, then you can work on DBE and SPCC. It's clear you have an over-correction issue going on in the core of your image.

Also, you always want to make sure the calibration is as good as you can get it prior to any ABE or DBE, as your image will always end up better if it's calibrated properly before any post-processing is done.

Dale
Chris White- Overcast Observatory:
It may be a pixel rejection issue. Take a look at your rejection maps when you integrate. You might have data that you are unintentionally rejecting.
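For intuition, here is a toy numpy sketch (not WBPP's actual rejection algorithm, and the pixel values are invented) of how a robust rejection pass can flag an entire session's core pixels when that session's calibration left the core at a different level: the minority session's values look like outliers at every core pixel, and that shows up in the rejection map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stack of one bright "core" pixel across 132 subs: 89 subs from
# nights 1-2 and 43 from night 3, where night 3's calibration left the
# core systematically brighter (hypothetical numbers).
nights_12 = rng.normal(1000, 20, size=89)
night_3 = rng.normal(1600, 20, size=43)
stack = np.concatenate([nights_12, night_3])

def robust_clip(values, sigma=3.0, iters=3):
    """Iterative clipping around the median, using the MAD as a robust
    scale estimate (a stand-in for real rejection algorithms)."""
    mask = np.ones(values.size, dtype=bool)
    for _ in range(iters):
        med = np.median(values[mask])
        mad = np.median(np.abs(values[mask] - med))
        mask = np.abs(values - med) < sigma * 1.4826 * mad
    return mask

kept = robust_clip(stack)
print(kept[:89].sum(), "of 89 kept from nights 1-2")
print(kept[89:].sum(), "of 43 kept from night 3")
```

In this sketch the entire third night's contribution to the core is rejected, which is consistent with the core only looking wrong when all three nights are stacked together.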
Jerry Gerber:
So, how do I fix it?   When I calibrate and integrate the first and 2nd nights' images, no problem.   When I calibrate and integrate the 3rd night's images, no problem.   But when I calibrate and integrate all three nights' images, I get this (a stretched, stacked JPEG):

andrea tasselli:
You should integrate just the 3 nights' master integration frames, that is, just those 3 with no rejection applied. Obviously you need to register those master frames beforehand.
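That workaround can be sketched with numpy, using synthetic stand-ins for the three registered nightly masters (the per-night sub counts and array sizes here are hypothetical; in practice the masters come out of ImageIntegration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for three registered nightly master frames.
masters = [1000.0 + rng.normal(0.0, 5.0, size=(4, 4)) for _ in range(3)]

# Average with no rejection, weighting each master by its sub count
# so every sub contributes equally to the final image.
sub_counts = np.array([45.0, 44.0, 43.0])  # hypothetical: 132 subs total
final = np.average(np.stack(masters), axis=0, weights=sub_counts)

print(final.shape)
```

Weighting by sub count matters because the nightly masters don't carry equal amounts of signal; an unweighted mean would over-count the shortest session.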
Jerry Gerber:
Could this be the problem:

The first two nights' flats were 100 ms exposures; the 3rd night's flats were 50 ms exposures.   I was thinking I could calibrate each night's images with that night's flats, then put all the calibrated lights in one folder and proceed to cosmetic correction, debayering, star alignment and final integration.  But now I am suspecting all the flats must be the same exposure length.  Is this true?

Thanks,
Jerry
andrea tasselli:
I doubt the flats' exposure length bears much responsibility for the outcome, if they are properly done. I find it puzzling, however, that you used different exposures for the flats when the optical train is unchanged. I suppose you used NINA's flat wizard and a flat panel, right? At any rate, you should show us the calibrated results for the third night so we can make a judgement call on the culprit.
Dale Penkala:
I’m not 100% sure how WBPP would determine the difference between, say, session 1 versus session 3, but in APP you would keep the correct flats with that session.
If you are shooting flats with a different configuration than your lights, then that would be your issue, if I’m understanding you correctly.

Dale
Jerry Gerber:
andrea tasselli:
You should integrate just the 3 nights' master integration frames, that is, just those 3 with no rejection applied. Obviously you need to register those master frames beforehand.

Thank you Andrea, I turned off rejection in the final integration and that solved the problem!!

There sure is a lot to learn in this craft...
Jerry Gerber:
Dale Penkala:
I’m not 100% sure how WBPP would determine the difference between, say, session 1 versus session 3, but in APP you would keep the correct flats with that session.
If you are shooting flats with a different configuration than your lights, then that would be your issue, if I’m understanding you correctly.

Dale

Hi Dale,

The flats are taken right after imaging: same camera temperature, same image train, same focus, using a round LED panel.  The 3rd night's flats were taken at 50 ms, whereas the first two nights' flats were shot at 100 ms.    Here's the "final" result; obviously I still have much to learn about image acquisition and processing.   M101 has proven to be the most difficult object I've imaged so far...
Dale Penkala:
Jerry Gerber:
The flats are taken right after imaging: same camera temperature, same image train, same focus, using a round LED panel.  The 3rd night's flats were taken at 50 ms, whereas the first two nights' flats were shot at 100 ms.    Here's the "final" result; obviously I still have much to learn about image acquisition and processing.   M101 has proven to be the most difficult object I've imaged so far...

Glad you got it figured out Jerry. I think you could run some SCNR to help correct your colors in there, but as long as you're happy, that's all that matters.

Dale
Jerry Gerber:
Dale Penkala:
Glad you got it figured out Jerry. I think you could run some SCNR to help correct your colors in there, but as long as you're happy, that's all that matters.

No, I am still not happy with it.   I want to add 4 more hours of integration time and then spend more time processing it.
Jerry Gerber:
I removed some of the green; I think this is an improvement.  I am still going for 4 more hours of acquisition time.
Dale Penkala:
Definitely better, and yes, 4 more hours will definitely help!

Dale
Chris White- Overcast Observatory:
Chris White- Overcast Observatory:
It may be a pixel rejection issue. Take a look at your rejection maps when you integrate. You might have data that you are unintentionally rejecting.



Your test of integrating each night separately and then combining with no rejection suggests this is the issue. 

You can try raising the sigma high value in pixel rejection.  IIRC the default is 3, but since I pre-process manually and don't know how WBPP does it, you might have to look that up yourself.  Raising this value should solve your problem.  Of course you can use the workaround that Andrea suggested, but it's not a great solution if you run into this problem frequently.
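To see what the sigma-high knob does, here is a rough numpy illustration (the real rejection in ImageIntegration is more sophisticated, and the pixel values are invented): a one-sided clip above the median rejects fewer of the brighter session's core values as sigma high is raised.

```python
import numpy as np

rng = np.random.default_rng(2)

# One core pixel across 132 subs, with a modest brightness offset in
# the 43 subs from the third night (hypothetical values).
stack = np.concatenate([rng.normal(1000, 30, size=89),
                        rng.normal(1080, 30, size=43)])

def high_rejected(values, sigma_high):
    """Fraction rejected by a one-sided clip above the median."""
    med = np.median(values)
    scale = values.std()
    return np.mean(values > med + sigma_high * scale)

for sh in (2.0, 3.0, 4.0):
    print(f"sigma_high={sh}: {high_rejected(stack, sh):.1%} rejected")
```

The trade-off is that a higher threshold also lets genuine outliers (satellite trails, cosmic ray hits) survive, which is why fixing the underlying session mismatch is preferable to opening the clip wide.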
Jerry Gerber:
Chris White- Overcast Observatory:
You can try raising the sigma high value in pixel rejection. IIRC the default is 3, but since I pre-process manually and don't know how WBPP does it, you might have to look that up yourself. Raising this value should solve your problem. Of course you can use the workaround that Andrea suggested, but it's not a great solution if you run into this problem frequently.

Hi Chris,

I've actually never run into this problem before, so it definitely perplexed me. Un-checking pixel rejection in the final integration of the calibrated subs solved the problem. 

But I still don't know what caused it in the first place. The only difference between the 3 nights of data acquisition was the length of the flat exposures. 

Thanks, 
Jerry
Chris White- Overcast Observatory:
Jerry Gerber:
I've actually never run into this problem before, so it definitely perplexed me. Un-checking pixel rejection in the final integration of the calibrated subs solved the problem.

But I still don't know what caused it in the first place. The only difference between the 3 nights of data acquisition was the length of the flat exposures.

Without seeing the data it's difficult to guess, but if it is not a common problem for you, I'd just forget about it and write this one off as weird but fixable.  Strange things can also be caused by slight changes to driver settings, or other gremlins.
Dale Penkala:
I still believe different flat lengths are your issue, but that's just IMHO. Anytime I've used different flats in calibration it has caused really bad integration images. Your original image shows an over-correction, and that can easily come from the wrong calibration frames. That's still my thought on it. But as @Chris White- Overcast Observatory says, maybe write this one off, start with a new set of data and move forward.
My personal recommendation: whatever length you shoot your flats at, keep it consistent, and make sure to take flat darks of the same length as well! Consistency is key!

Dale
Chris White- Overcast Observatory:
Dale Penkala:
I still believe different flat lengths are your issue, but that's just IMHO. Anytime I've used different flats in calibration it has caused really bad integration images. Your original image shows an over-correction, and that can easily come from the wrong calibration frames. That's still my thought on it. But as @Chris White- Overcast Observatory says, maybe write this one off, start with a new set of data and move forward.
My personal recommendation: whatever length you shoot your flats at, keep it consistent, and make sure to take flat darks of the same length as well! Consistency is key!

Dale



Dale,

As long as the illumination is on target, the flat length really doesn't matter.  I take sky flats, for example, and as the sky brightens my flats for a filter might range from 15 seconds to 3 seconds on the same run.  But who knows what upsets WBPP.  This is one reason I don't use it.  I like to monitor my data every step of the way through preprocessing.
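This point can be shown with a small numpy sketch (idealized 1-D numbers, not real frames): flat-fielding divides by the flat normalized to its mean, so a well-exposed 50 ms flat and a 100 ms flat correct identically, provided the bias/flat-dark pedestal has been subtracted first.

```python
import numpy as np

# Idealized 1-D vignetting profile: center brighter than the edges.
vignette = np.linspace(1.0, 0.6, 100)
light = 500.0 * vignette                 # flat sky seen through the optics

# Bias-subtracted flats at two exposure lengths: same shape, different scale.
flat_100ms = 20000.0 * vignette
flat_50ms = 10000.0 * vignette

cal_100 = light / (flat_100ms / flat_100ms.mean())
cal_50 = light / (flat_50ms / flat_50ms.mean())
print(np.allclose(cal_100, cal_50))      # exposure scale cancels

# But if a bias/flat-dark pedestal is NOT subtracted, the two flats no
# longer have the same normalized shape, and the correction differs:
pedestal = 500.0
raw_100 = pedestal + 20000.0 * vignette
raw_50 = pedestal + 10000.0 * vignette
print(np.allclose(raw_100 / raw_100.mean(), raw_50 / raw_50.mean()))
```

The second half is also why Dale's advice about matching flat darks matters: once the pedestal is removed, the exposure length drops out of the division entirely.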
Dale Penkala:
Chris White- Overcast Observatory:
As long as the illumination is on target, the flat length really doesn't matter. I take sky flats, for example, and as the sky brightens my flats for a filter might range from 15 seconds to 3 seconds on the same run. But who knows what upsets WBPP. This is one reason I don't use it. I like to monitor my data every step of the way through preprocessing.

Understood Chris; my point is that anytime I have used flats with different lengths, I've had issues. This is using WBPP as well as APP. I would get the same over-correction issue and some of the goofy colors that come with it. I'm not saying you're wrong, Chris; this is just based on my personal experience. I'll also say that consistency is important to calibration.

Dale
Jonny Bravo:
The best way for people to help you is if you upload a completely unprocessed set of a flat, a dark, a light and a bias (or darkFlat) from each night to a file sharing site like Dropbox or Google Drive. You could also upload your masters from WBPP - again, completely unprocessed… just the XISF (or FITS if you use that format).

Looking at the "unprocessed" versions of your image, your flats are not properly correcting the data. That's why there looks to be a hole cut out of the center. Without actually having your data, my initial guess is mismatched darks/biases.

In your "final" image, you've absolutely destroyed the background by clipping it into oblivion, and the colors are way off in both the galaxy and the stars. Gathering 4 more hours isn't going to help you; fixing the problems with the data you've currently got will. Once that's addressed, you'll be starting from a much happier place in your processing :-)
Georg N. Nyman:
Jerry Gerber:
I removed some of the green; I think this is an improvement. I am still going for 4 more hours of acquisition time.

Jerry, I read through all the responses here, and please allow me to also comment on this result: yes, definitely better, but still far off regarding the color rendition. I know this galaxy quite well from my own imaging work, and something is still very wrong: the stars do not show their real color at all, you have an overall color cast towards yellow/green in the stars, and the galaxy's color is also very odd. Yes, there is quite some blue in the arms, but not that blue and not that much.
Without knowing your setup I can't suggest anything reasonable, but I presume you took your raw subs with an OSC camera, right? If yes, did you choose the correct Bayer pattern for processing? And what exposure time did you use for the subs?
If possible, I would like to help you with the colors, but as I said, not knowing what you did, I can only guess.
CS
Georg
Wido's AstroForum:
Hi Jerry,

I took your first stacked picture with the artifact, because it still had some background; in the other pics you clipped the background entirely and changed the colorization of M101. I then processed your picture in PixInsight using the following tools:

- dynamic crop (to get rid of the borders)
- dynamic background extraction (to get rid of light pollution)
- background neutralization and calibration (to align the colors in RGB)
- multiscale linear transform and ACDNR on the background (to smooth the background)
- SCNR (to get rid of the green cast)
- HDR multiscale transform on the galaxy
- Some final curve transformation and color saturation

The artifact is still there, but it looks like you got pretty decent data, congrats!
I tried to tone down the purple/magenta saturation so it is a bit less annoying.
You just need to stretch it a little less so the core stays intact.
Hope this helps!

Cheers!

Wido.