oymd:

Hi Everyone

M42 is driving me nuts!

I am not able to properly blend my long-sub master with the shorter-sub masters without creating HARSH and blotchy boundaries, no matter what settings I try.

Background of image acquisition:

Images were all taken with 2600MC Pro, Gain 100, Offset 20 on Esprit 100ED, -5C

Optolong L-eXtreme filter

Ioptron CEM120EC, guiding at below 0.3”rms

I have my MASTERFLAT and MASTERDARKS for each set of data

All imaging was done over a 1-week period.

The trapezium part of the nebula was blown out even on the 10s subs, so I also imaged 1400 subs at 3 seconds… CRAZY, I know!

1- 300s: 63 subs

2- 180s: 137 subs

3- 60s: 69 subs

4- 30s: 156 subs

5- 10s: 236 subs

6- 3s: 1400 subs

Dithering was done every THREE minutes of imaging time (i.e., after every sub for the 300s and 180s sets). The 3s master has quite a bit of walking noise, though. All other masters are pristine.

Total Integration is about 17 hours
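
(For the record, that is 63×300 + 137×180 + 69×60 + 156×30 + 236×10 + 1400×3 = 58,940 s, i.e. roughly 16.4 hours.)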

Pre-processing:

WBPP with Drizzle, highest quality

I have 6 Masters:

1- 300s

2- 180s

3- 60s

4- 30s

5- 10s

6- 3s

Processing:

The actual data looks clean on each of the masters

1- Each Master file had a Statistics check, and the GREEN channel had the highest mean value

2- Each master was split into RGB

3- LINEARFIT was used on the separate RGB images, with the GREEN channel (the highest mean) as the REFERENCE

4- CHANNELCOMBINATION, and we are back to 6 Masters with LinearFit applied (a rough sketch of what LinearFit does follows)
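
For intuition: LinearFit fits a slope and an offset that map each target channel onto the reference, with outlier rejection, and then applies them. A rough scale-only PixelMath analogue (my sketch only; it ignores the offset term and the rejection, and master_G is just what I am calling the extracted green channel here), applied to the R and B images:

    $T * mean( master_G ) / mean( $T )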

5- BlurXTerminator in Correct Only mode, to fix the stars on each of the masters

6- GRAXPERT to each of the masters

7- SPCC to each master, NARROWBAND mode, 7nm for each channel, with a background preview for neutralization. I also did a separate workflow where I SKIPPED SPCC.

8- STARALIGNMENT to all 6 masters, with the 300s as the REFERENCE.

9- Now I have 6 REGISTERED Masters, saved as XISF 64 bit

10- HDR Composition: All the drama starts here.

  • I did at least 10 different runs

  • Tried using the 5 longest-exposure masters

  • Tried using all 6 masters, including the 3s master which had walking noise

  • Tried the 300s + 10s

  • Tried 180s + 10s… etc.

  • Tried using the longest exposure first, and also the reverse, with the shortest exposure first

  • HDRCOMP settings:

  • Binarize threshold: I hovered with the mouse over the BRIGHTEST, most saturated pixels in the core, which gave an R reading very close to 1.0, more like 0.89–0.95. Multiplied that by the STOCK 0.8 and used that for the THRESHOLD. I even tried 0.5, 0.6, 0.7, 0.9 and 1.0; ALL give the same result. (A sketch of what this threshold does follows the list.)

  • Mask smoothness: stock, and went all the way up to 35, no change

  • Mask growth: from stock all the way up to 10, no change
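
For intuition (my sketch, not the tool's actual code): the binarize threshold effectively builds a replacement mask over the long exposure, roughly this PixelMath on the long master, where 0.72 = 0.9 × 0.8 from the values above:

    iif( $T > 0.72, 1, 0 )

Mask smoothness then blurs that mask, and mask growth dilates it, before the masked pixels are swapped for scaled short-exposure data.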

11- The HDR image looks OK, but the core is completely blown out, even when using 300s + 10s in HDR COMP

12- BLURXTERMINATOR now done, with the PSF value extracted from the Luminance of the HDR image using Script > Image Analysis > FWHMEccentricity and measuring the MEDIAN FWHM

13- NOISEXTERMINATOR: Clicking PREVIEW shows a very HARSH image, with ABRUPT interfaces between various gradients, and I am not able to fine-tune anything.

14- MULTISCALEADAPTIVE STRETCH: Same thing. Clicking PREVIEW shows the same, and fine-tuning is impossible?!

IMAGE ATTACHED

📷 HDR Composition Blotchy.jpg

15- STAREXTERMINATOR: remove stars and work on starless nebula

16- When I then go to HDRMULTISCALETRANSFORM, any setting I use ends up showing a strange effect on the core vs the rest of the nebula, which makes me think the problem lies with the MASKS and COMBINATION done in the HDR Composition step.

IMAGE ATTACHED

📷 HDR Multiscale Transform.jpg

What am I doing wrong? I published one of the end results on Astrobin a few weeks ago, which I think is terrible:

https://app.astrobin.com/u/oymd?i=bzdink#gallery

I have rerun the workflow countless times, but cannot get it right.

Is it the L-eXtreme? I doubt it. That would just cause unnatural or distorted colors, not these harsh mask errors?

Please HELP!!

🙂

Willem Jan Drijfhout:
Do you run HDRComposition on linear images? This tool is designed to work on non-linear images. Stretch each stack for its intended purpose (short exposures for bright areas, long exposures for other areas). Only then blend them together using HDRComposition.
oymd:

Willem Jan Drijfhout · Jan 4, 2026 at 04:52 PM

Do you run HDRComposition on linear images? This tool is designed to work on non-linear images. Stretch each stack for its intended purpose (short exposures for bright areas, long exposures for other areas). Only then blend them together using HDRComposition.

Oh… I never heard that before!

I was under the impression that HDRComp should be run on LINEAR images before doing any stretching?

Are you sure?

Michael Regouski:

HDRComp should be run on linear images. I use it all the time with my FLI KL 4040 camera. One thing I found is that you need to make sure you are using the 24-bit screen transfer functions when screen stretching, or your images will look like what you are showing. I also save my images to disk in 64 bits before loading them into HDRComp. Hopefully that is all it is.
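
For intuition, the reason it has to be linear: HDRComposition conceptually swaps the long exposure's saturated pixels for scaled short-exposure data, roughly like this PixelMath sketch (m300 and m10 are assumed names for registered linear masters, 0.72 is just an illustrative threshold, and the real tool fits the scale factor from pixels well exposed in both images instead of taking the 300/10 exposure ratio, smooths the mask, and rescales the result, which is why 64-bit output matters):

    iif( m300 < 0.72, m300, m10 * 30 )

That multiplication is only meaningful on unstretched, linear data.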

Willem Jan Drijfhout:

Oops, sorry, you are right. I mixed it up with HDRMultiscaleTransform, which works on non-linear.
oymd:

Michael Regouski · Jan 4, 2026 at 05:37 PM

HDRComp should be run on linear images. I use it all the time with my FLI KL 4040 camera. One thing I found is that you need to make sure you are using the 24-bit screen transfer functions when screen stretching, or your images will look like what you are showing. I also save my images to disk in 64 bits before loading them into HDRComp. Hopefully that is all it is.

I have done those two things actually.

24-bit transfer functions have been enabled in PIXINSIGHT's global options, and every image I save is in 64 bits.

Let me clarify. The HDR image itself does not exhibit the issues I complained about.

It’s only when I open NoiseXTerminator or the new MultiscaleAdaptive Stretch and click PREVIEW that I see that terrible blotchy preview image.

The problem is that with that preview image I cannot fine-tune the NoiseXTerminator or MultiscaleAdaptive Stretch settings, as the preview shows nothing usable; it only shows the blotchy, harsh preview image.

oymd:

Willem Jan Drijfhout · Jan 4, 2026 at 05:46 PM

Oops, sorry, you are right. I mixed it up with HDRMultiscaleTransform, which works on non-linear.

The HDRMultiscaleTransform in my case was run on the non-linear image after stretching, and it shows that bizarre artifact on the core.

Mikołaj Wadowski:

I believe HDRComp is supposed to be run as the first step in your workflow. Maybe you could do gradient extraction first, but I would assume the regions bright enough to need HDRComp aren’t really affected by gradients too much.

I just chucked all of my masters in; I didn’t even bother to remove the drizzle weight images. Ran it on default settings and got from this (straight 300s)

📷 image.png

to this (300s + 30s + 10s):

📷 image.png

All settings I tried worked, really. Some had harsher masks but they were usable. My guess is that somehow all of your processing messed with the images enough that HDRComp wouldn’t work (perhaps BlurX “fixing” the overblown stars in the longer exposures).

Regarding the rest of your workflow:

oymd · Jan 4, 2026, 12:03 PM

1- Each Master file had a Statistics check, and the GREEN channel had the highest mean value

2- Each master was split into RGB

3- LINEARFIT was used on the separate RGB images, with the GREEN channel (the highest mean) as the REFERENCE

4- CHANNELCOMBINATION, and we are back to 6 Masters with LinearFit applied

These steps make no difference since you’re doing color calibration later anyway.

oymd · Jan 4, 2026, 12:03 PM

8- STARALIGNMENT to all 6 masters, with the 300s as the REFERENCE.

Weren’t the stacks already aligned after WBPP? Unless you stacked them separately.

oymd · Jan 4, 2026, 12:03 PM

13- NOISEXTERMINATOR: Clicking PREVIEW shows a very HARSH image, with ABRUPT interfaces between various gradients, and I am not able to fine-tune anything.

14- MULTISCALEADAPTIVE STRETCH: Same thing. Clicking PREVIEW shows the same, and fine-tuning is impossible?!

Previews unfortunately don’t support 24-bit STF as far as I’m aware. I don’t think there’s a workaround for all processes, since different tools react differently to transformed data, but you can try either applying the current STF or doing a simple linear stretch / multiplying the image by, say, 10 in PixelMath (see the sketch below). Or just try new settings blindly and see how they look.
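
For example, something like either of these in PixelMath, on a throwaway copy of the image (illustrative values only):

    $T * 10

for the simple linear boost (it clips anything that ends up above 1.0), or

    mtf( 0.05, $T )

for a gentle midtones stretch that never clips. The idea is just to get the data bright enough for the process preview to show something judgeable.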

oymd:

Mikołaj Wadowski · Jan 4, 2026 at 06:54 PM


Thanks for the hints.

Yes, I stacked each exposure set separately, so I had to do star alignment.

Good point on previews not supporting 24-bit! I did not know that. That’s possibly why my previews look hopeless.

Will try your other points.

Thanks.

Michael Regouski:

When I use HDRComp, I run it as the very first step after WBPP, since my camera saves each exposure as a higher-gain image and a lower-gain image at the same time. I save out my masters as 64-bit images and then bring them into HDRComp to combine them.

Tomvp:

I followed this tutorial: https://youtu.be/WMf81H39H6g and ended up with this from only half an hour of subs: https://astrob.in/7854yz/0/ Maybe it helps?

Oh, and the preview of NXT regularly screws up my processing too. It’s a feature of PI we have to live with apparently.

Elmiko:

After you use HDRComposition on your linear masters, stretch the HDRComposition master, then apply HDRMultiscaleTransform. This will resolve the Trapezium well.

oymd:

Elmiko · Jan 4, 2026 at 09:42 PM

After you use HDRComposition on your linear masters, stretch the HDRComposition master, then apply HDRMultiscaleTransform. This will resolve the Trapezium well.

I have attached an image of me running HDRMULTISCALE TRANSFORMATION, and as you can see, it shows a strange effect on the core; the outline looks like it is related to the mask that was applied by HDRComposition.

Jan Erik Vallestad:

You mention that the core was saturated even at 10s subs. I find that quite strange. Could you share screenshots of the core from the various exposure lengths and keep STF turned off?

Nothing is saturated/over-exposed unless it’s 100% white pre-STF.
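
A quick way to verify (my sketch; the 0.999 cutoff is an arbitrary choice, anything very close to 1.0 will do): run this PixelMath on a 10s master with “Create new image” as the destination, and the resulting map shows exactly which core pixels are clipped:

    iif( $T >= 0.999, 1, 0 )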

I would rely less on automated/semi-automated scripts and go back to basics. I still find carefully stretching with histograms/curves, then blending different iterations at the end to perfect it if necessary, to be the best method of processing any image. Instead of using the preview window of each script/process, try making a smaller preview on the image and actually running the process to check its effect. Some of the process previews aren’t very good.
