To start, my apologies for another post on exposure length, but sometimes it's hard to find precisely what you're looking for...
This is sort of a continuation of my previous post on underexposed red pixels, but the factors have changed slightly. I've now modified my camera (LPF2 removal) and am getting a lot more red data. It's much easier to bring the red channel's "skyfog" peak up above the bottom of the histogram, even with an aggressive CLS filter. My question now is: should I be aiming for 0% underexposed red pixels when choosing an appropriate exposure time?
I've been using RawDigger to check the over/underexposure information on my RAW files. With the red channel peak well above the bottom of the histogram, I'm still getting 5-10% underexposed red pixels. Increasing exposure time gradually brings that number down, but at the cost of slightly overexposing my green and blue channels (burning out brighter stars) and increasing my number of rejected subs (I'm shooting unguided, and I have to toss up to a third of my subs once I start shooting over 60 seconds).
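For anyone curious exactly what I'm counting, here's a rough Python sketch of the check (using rawpy and numpy; the "at or below the black level" rule and the margin are just my approximation of what RawDigger flags as underexposed, not its actual definition, and the filename is made up):

```python
import numpy as np
import rawpy

def underexposed_red_fraction(path, margin=0):
    """Fraction of red-filtered photosites at or below the black level."""
    with rawpy.imread(path) as raw:
        data = raw.raw_image_visible.astype(np.int32)
        colors = raw.raw_colors_visible  # CFA index per pixel: 0=R, 1=G, 2=B, 3=G2
        black = np.array(raw.black_level_per_channel)[colors]
        red = colors == 0
        # Treat a red pixel as "underexposed" if it never rose above
        # the black level (plus an optional safety margin in ADU).
        underexposed = data[red] <= (black[red] + margin)
        return underexposed.mean()

print(f"{underexposed_red_fraction('sub_0001.cr2'):.1%} underexposed red pixels")
```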
To sum up, I'm wondering if it's common practice to tolerate a certain amount of underexposed pixels in order to properly expose stars in the green and blue channels. If so, it would definitely give me a greater number of usable subs and improve my star colour. Or is any amount of underexposed pixels so undesirable as to necessitate longer exposures?
Really curious as to how you all handle this. Thanks!
Clear skies,
Mark
PS: To be clear, this is only an issue with shots using a filter. I shoot broadband targets without a filter and have no issues with underexposure.