I'm just wondering what processing techniques you might use to improve my 5hrs on M106? Please feel free to comment or even play with the image:


M106 after 5 hrs integration
Jeremy Phillips FRAS:
I would sit back, stare at it with a glass of wine in hand and tell myself "hey, I did that".
Blaine Gibby:
Also the gradients are either from bad flats or light pollution
Andy Wray:
I'm just wondering what processing techniques you might use to improve my 5hrs on M106? Please feel free to comment or even play with the image:
M106 after 5 hrs integration
I changed so many things over 5 nights and even had my system somehow decide to change my camera offset for no apparent reason which threw my flats and darks all over the place.
Mirosław Stygar:
Andy Wray:
I'm just wondering what processing techniques you might use to improve my 5hrs on M106? Please feel free to comment or even play with the image:
M106 after 5 hrs integration
Any stack?
Hi Andy, I don't know what tools or applications you use, but in general practical terms, with a few references to common processes, here is what I did in about 5 minutes to your 5hr image. It was a bit rushed, but you might be able to use bits here and there in your workflow.
First I separated the stars from the object using Starnet++ in PI.
Stars process:
Remove the blue color cast by desaturating blue or resetting the black point
Check for and remove gradient
Apply noise reduction to star mask
Boost star color and brightness without affecting the background color by using a mask or manipulation of individual stars
Check for new noise
Object process:
Remove the green color cast by desaturating green or applying SCNR in PI; remember to check for oversaturation of magenta
Check for and remove gradient
Apply noise reduction to object image
Additional mild stretch using GHS in PI or other levels / brightness controls
Boost overall color saturation and vibrance of the object without oversaturating the background
Apply mild sharpening
Check for new noise
Recombine the object image with the star mask image
Mild crop of the sides to standard 2:3
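The split-and-recombine flow above ends with putting the stars back, and a common way to do that is a screen blend (in PixelMath terms, ~((~starless)*(~stars))). A minimal numpy sketch of that blend, assuming both images are floats normalised to [0, 1]:

```python
import numpy as np

def screen_blend(starless, stars):
    """Recombine a starless image with its extracted stars.
    Screen blend: 1 - (1 - a) * (1 - b), equivalent to the
    PixelMath expression ~((~starless) * (~stars))."""
    return 1.0 - (1.0 - starless) * (1.0 - stars)

# Toy data: flat 0.1 background plus one bright "star" pixel
starless = np.full((4, 4), 0.1)
stars = np.zeros((4, 4))
stars[2, 2] = 0.8

combined = screen_blend(starless, stars)
# Background pixels stay at 0.1; the star pixel becomes
# 1 - 0.9 * 0.2 = 0.82, brighter than either input alone.
```

Because screen never darkens either input, stretching and saturating the stars and the object separately (as in the steps above) survives the recombination without clipping.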
Hope this helps,
CS's
scott
andrea tasselli:
Just for starters, what have you used to remove the gradients, and where in the processing pipeline?
Andy Wray:
I usually use DBE in the linear stage. In this case I had a few problems as follows:
* Lots of clouds
* Moved from a normal guidescope to an OAG mid-capture ... that meant my optical train got rotated quite significantly to retain the 56mm backfocus
* I didn't tighten the optical train enough when I installed the OAG, so 2/3rds of the way through I had to rotate it and had tilt issues for the previous 2 days
* I didn't capture flats on the nights before I realised I had a rotation problem
* For some reason the camera driver decided to drop my offset from the default of 50 to 10 during the process ... I had never manually set the offset, so my darks didn't match. I had to use PixelMath to subtract the 40-unit offset difference from my master darks to try and compensate. I'm now setting the offset manually rather than leaving it to the driver
All in all, I'm having to rely on ABE/DBE and anything else I can think of to cater for my previous mistakes.
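The pedestal fix described above (darks taken at offset 50, lights at 10) amounts to subtracting a constant from the master dark. A hedged sketch, assuming the offset setting maps linearly to a fixed pedestal in ADU; the ADU-per-offset-unit scale here is a placeholder and is camera dependent:

```python
import numpy as np

OFFSET_DARK = 50           # offset the darks were captured at
OFFSET_LIGHT = 10          # offset the driver silently switched to
ADU_PER_OFFSET_UNIT = 1.0  # assumption: check your camera's scaling

def match_offset(master_dark, offset_dark, offset_light,
                 adu_per_unit=ADU_PER_OFFSET_UNIT):
    """Subtract the pedestal difference so a master dark taken at
    one offset matches lights taken at another; clip at zero so the
    subtraction cannot create negative pixels."""
    pedestal = (offset_dark - offset_light) * adu_per_unit
    return np.clip(master_dark.astype(np.float64) - pedestal, 0.0, None)

dark = np.array([[100.0, 90.0], [45.0, 30.0]])
corrected = match_offset(dark, OFFSET_DARK, OFFSET_LIGHT)
# 100 -> 60, 90 -> 50, 45 -> 5, and 30 clips to 0
```

The clipped pixels are exactly why matching the offset at capture time, as Andy now does, beats compensating for it afterwards.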
andrea tasselli:
Andy Wray:
I usually use DBE in the linear stage. In this case I had a few problems as follows:
* Lots of clouds
* Moved from a normal guidescope to an OAG mid-capture ... that meant my optical train got rotated quite significantly to retain the 56mm backfocus
* I didn't tighten the optical train enough when I installed the OAG, so 2/3rds of the way through I had to rotate it and had tilt issues for the previous 2 days
* I didn't capture flats on the nights before I realised I had a rotation problem
* For some reason the camera driver decided to drop my offset from the default of 50 to 10 during the process ... I had never manually set the offset, so my darks didn't match. I had to use PixelMath to subtract the 40-unit offset difference from my master darks to try and compensate. I'm now setting the offset manually rather than leaving it to the driver
All in all, I'm having to rely on ABE/DBE and anything else I can think of to cater for my previous mistakes.
Let's assume you used DBE in this specific instance. What were the parameters you've used, if you can remember?
Andy Wray:
A tolerance of 3 and a smoothing factor of 0.6; a sample radius of 30 and maybe 7 samples per row. I would then move the samples to make sure they hadn't captured stars or landed over faint areas of the galaxy, and then manually add samples in areas of steep gradient.
andrea tasselli:
Andy Wray:
A tolerance of 3 and a smoothing factor of 0.6; a sample radius of 30 and maybe 7 samples per row. I would then move the samples to make sure they hadn't captured stars or landed over faint areas of the galaxy, and then manually add samples in areas of steep gradient.
3 is way too much; I wouldn't go beyond 1.5. Work on the minimum sample weight instead (e.g. from 0.75 -> 0.1). The smoothing factor is also too high: you need to go down, not up; try 0.125. Sample radius should be a more modest 15-20. Samples per row for an APS-C sized sensor should be at least 66. Remove all samples overlying the galaxy, including the halo.
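For context on what those samples feed, here is a rough illustration (not PixInsight's DBE algorithm) of the underlying idea: fit a smooth model to the background sample points and subtract it. This sketch assumes a simple planar gradient and made-up sample positions; DBE uses far more samples, outlier rejection and smoother spline models.

```python
import numpy as np

def flatten_gradient(image, sample_coords):
    """Fit a plane (1 + x + y) to the background samples by least
    squares, subtract the fitted model, and restore the mean
    background level. Only demonstrates the principle behind
    gradient-removal tools, not any specific implementation."""
    coords = np.array(sample_coords)
    ys, xs = coords[:, 0], coords[:, 1]
    vals = image[ys, xs]
    A = np.column_stack([np.ones(len(xs)), xs, ys])
    coeffs, *_ = np.linalg.lstsq(A, vals, rcond=None)
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    model = coeffs[0] + coeffs[1] * xx + coeffs[2] * yy
    return image - model + model.mean()

# Synthetic frame: flat sky of 100 ADU plus a left-to-right gradient
h, w = 50, 50
yy, xx = np.mgrid[0:h, 0:w]
frame = 100.0 + 0.5 * xx
samples = [(5, 5), (5, 45), (25, 25), (45, 5), (45, 45)]  # (y, x), off-target
flattened = flatten_gradient(frame, samples)
# The planar gradient is removed: the result is flat to numerical precision
```

This is also why sample placement matters so much: any sample sitting on the galaxy or its halo drags the fitted model up and the subtraction eats real signal.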