LRGB exposure times for deep sky - what makes life easy?

Christian Großmann avatar
Hello to you experts,

Last night I was able to collect data on M106. I have always used 300s for my frames, no matter what object I imaged or which LRGB filter I used. Because I saw a lot of images with beautiful star colors beside the main subject, I reduced my exposure time to 180s (ASI294MM, binned 2x2) and took a complete set of LRGB data. When I tried to process the data, I ran into a problem. A look at the histogram shows a very narrow spike at the lower end (of the stacked images, of course). The only difference from my recent images is the narrower spike, which makes sense given the shorter exposure time. Sadly, I was not able to process the combined image properly. It seems the data is so narrow that I cannot get any colors out of it. Every tweak I make results in a white galaxy or white stars. If I push a bit further, I always get strange color casts.

On the other hand, I also took some images of the star clusters NGC884 and NGC869 at an exposure time of 120s per frame (ASI294MM, binned 2x2). There are a lot of stars in the image, and again I am not able to get any decent colors out of it. It is strange, because there should be plenty of color, at least in the dimmer stars. Am I missing something here?

In both cases, the Photometric Color Calibration process in PI shows a white balance graph with nearly all points on the zero mark for both graphs. Is this correct?

So my question is: how do you collect data for your deep sky objects with LRGB filters? Do you expose for the object itself and then take another data set for the stars? I have always noticed in my images that the stars were nearly white. I guess you process the stars separately from the main subject. Is there any way I could rescue my image from last night?

I use PixInsight on a Windows machine, and for the final tweaks I prefer Photoshop.

I need your help, because clear skies are rare these days and I don't want to spend all that time just getting into the ballpark. Maybe you could save me some time…

BTW: narrowband images are another story. I just need some help with the LRGB stuff.

Thanks for your advice…

CS

Christian
Kostas Papageorgiou avatar
Hello!
I don't know your processing workflow, but do you process the RGB separately from the luminance?
Boost the saturation on the RGB, then add the L.
I don't have colorful stars either… but it's a start. I will be watching this post too! smile
andrea tasselli avatar
The trick to making LRGB easy is to have the same image scale and a similar dynamic range. In my case, I don't go beyond a 3:1 exposure ratio between luminance and RGB (per channel).
scarabeaus1 avatar
Hi,

I'm not as advanced in processing as others may be, but I've never taken separate exposures for stars. I usually use 120s, sometimes 180s. I once read that star colors may sometimes come out white, as this depends on many factors that are not controllable, such as air pollution (dust, …), the altitude of the object and so on. Maybe you also had some thin clouds that were not visible but affected the data. What type of scope do you use? Was there dew on the optics? Without seeing your data it is hard to tell.

Greetings
Alex
Christian Großmann avatar
@Kostas Papageorgiou 

I tried your way and processed the RGB version first. Again, all stars are white. So I checked the frames of M106 that I took with the different filters. I randomly opened one image per filter to compare them. They look very similar. So maybe the filter wheel does not switch the filters properly? I would expect more difference, especially between blue (upper left) and red (lower image).

Could this be the case?



The processed image of NGC884 is here:

https://www.astrobin.com/jf0qyz/


@scarabeaus1 
Thanks for the reply. I use an 8" Newt. I think your points are valid, but I have never experienced this in any other image I took. So far, I guess I have a problem with the filter wheel...

Thanks again
scarabeaus1 avatar
You can also check the values of some (not saturated) stars across the different filters. If they are more or less the same, then you might indeed have a problem with the filter wheel.
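This check can be automated: if the wheel never moved, the stacked "R", "G" and "B" masters will be essentially the same image plus noise, so they will correlate almost perfectly. A minimal NumPy sketch of that comparison (the function names and the 0.995 threshold are my own illustrative choices, not a PixInsight feature):

```python
import numpy as np

def channel_similarity(frame_a, frame_b):
    """Pearson correlation between two calibrated frames.
    Values very close to 1.0 across all filter pairs suggest the
    wheel never moved and every 'filter' recorded the same light."""
    a = frame_a.ravel().astype(np.float64)
    b = frame_b.ravel().astype(np.float64)
    return float(np.corrcoef(a, b)[0, 1])

def wheel_looks_stuck(r, g, b, threshold=0.995):
    """Flag the wheel as suspicious if every channel pair is
    almost perfectly correlated."""
    pairs = [(r, g), (r, b), (g, b)]
    return all(channel_similarity(x, y) > threshold for x, y in pairs)
```

Genuinely different filters still correlate somewhat (the scene is the same), but nowhere near as tightly as identical data, so the threshold needs to be strict.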
Kostas Papageorgiou avatar
Christian Großmann:
Could this be the case?


Yes, they look very similar. Check your filter wheel...
James E. avatar
Christian,

I just started on M106, but only have Lum data so far.  I also thought your M106 R-G-B data looked too similar per filter, BUT I did find some LRGB individual channel data posted on galactic-hunter.com, and indeed ALL the filters look exactly alike, just like yours.  So, while the filter wheel may be the problem, it's difficult to tell from your posted M106 images.

If your filter wheel is not physically moving even though the software says it is, then you are likely combining only one filter's worth of data, and your object and stars will be all white.  Your NGC 884 image has ALL white stars - and not just unsaturated, the image is black and white.  Since you have PixInsight, AFTER combining RGB and stretching, just look at the R:G:B values (usually shown on the lower bar) as you mouse around the image - they should differ - or simply zoom in on the background, where you should see colorful noise.

You have probably already tried this, but visually check that the filter wheel is actually moving when the software requests a filter change.  Your filter wheel may be jammed or unresponsive.  I had a ZWO filter wheel that jammed with 1.25" filters, and the CN "fix" at the time was to reverse the filter wheel cover to provide more clearance.

Jim
Christian Großmann avatar
James E.:
I just started on M106, but only have Lum data so far. […]

Hi @James E.,

Thinking back to the start of my imaging session, I also had some problems while creating the flats. The exposure times I usually use for the flats had to be changed. Because I use an adjustable flat panel, I didn't notice a problem, although it was a bit strange. NINA adjusts the brightness of the panel itself, so I took it as it was. Looking at my frames now, I am quite sure the filter wheel may be the problem. I will check once I am home from work. I once had slight problems with the wheel (ZWO) and experienced this behavior before. Because I used to set the exposure times of my flats manually back then, the problem was obvious - but not so yesterday...

I also thought there were small coding holes in the wheel that the controller uses to detect the current position. Maybe that is not the case and I was wrong. Because of that, I assumed the wheel was OK. You always have to check twice ;-)

I have a second, identical filter wheel. I will swap the filters and try the other one. Because the temperature tonight was really low (about -7 °C and lower), this could also be an electrical problem. I have to check...

NGC884 and M106 were taken in the same sequence. That's why the data is similar. I assume the problem was there right from the start.

It's sad, because the conditions were so nice the whole night through. But the time is not wasted if there is something to learn.

CS

Christian
Olaf Fritsche avatar
Maybe I'm thinking too simplistically, but I actually take an extra image for the stars and one for the object in the case of faint objects. I then combine the two. 

I also use an ASI294MM Pro with filters from ZWO. At exposure times beyond 1 min the stars become white; at shorter times they have different colors.

The exposure time for a DSO depends on its brightness. Right now I am collecting data for M82, and I can't use my luminance data with 120 s exposure time, because the center of the galaxy is burnt out white. An exposure time of 30 s, on the other hand, shows structure.
Mike Cranfield avatar
Hi. I am a very new astrophotographer, so I am still very much feeling my way, but I am happy to pitch in with my thoughts.

First off, those stars look nicely shaped - good work there!

You say you use PixInsight. If you open up the unstretched image and run your cursor over the stars, you should be able to see the pixel values you are getting in each of the separate grayscale red, green and blue images. Are these values less than 1.0 for all but perhaps the brightest stars? If most stars are hitting a value of 1.0, or close to it, at their core, then you are overexposing for star colour.

If the pixel value at the core of most stars is significantly less than one then you should be able to work with the data and get some colour.
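That cursor check can also be scripted outside PixInsight. A small NumPy sketch for data normalised to the 0..1 range (the function names and the 0.999 clip level are my own illustrative assumptions):

```python
import numpy as np

def saturated_fraction(channel, clip_level=0.999):
    """Fraction of pixels at (or nearly at) full scale in a
    normalised 0..1 channel. A noticeable fraction inside star
    cores means those stars can only come out white."""
    channel = np.asarray(channel, dtype=np.float64)
    return float(np.mean(channel >= clip_level))

def core_is_blown(r_val, g_val, b_val, clip_level=0.999):
    """True if a star core is clipped in every channel,
    i.e. no colour information survives there."""
    return min(r_val, g_val, b_val) >= clip_level
```

Running `saturated_fraction` on each stacked channel gives a quick, objective measure of how much of the image has lost colour information before any stretching happens.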

In this case the key thing is to be careful with how you stretch your data. It is easy for the stretch to push the values in the heart of the stars very close to 1.0, which again will lead to you losing colour. The arcsinh stretching process can be helpful here in stretching data while keeping colour, although this process is notorious for creating artifacts in stars.
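The core idea behind arcsinh stretching is simple: compute a brightness-dependent scale factor from the luminance and apply the same factor to all three channels, so their ratios (the colour) survive the stretch. A minimal NumPy sketch of that idea (not the PixInsight implementation; the `stretch` parameter is an illustrative knob):

```python
import numpy as np

def arcsinh_stretch(img, stretch=100.0):
    """Arcsinh stretch of a linear 0..1 RGB image.
    Unlike a per-channel power-law stretch, multiplying R, G and B
    by one common luminance-derived factor preserves their ratios,
    which is why this kind of stretch keeps star colour."""
    img = np.asarray(img, dtype=np.float64)
    lum = np.clip(img.mean(axis=-1, keepdims=True), 1e-12, None)
    scale = np.arcsinh(stretch * lum) / (np.arcsinh(stretch) * lum)
    return np.clip(img * scale, 0.0, 1.0)
```

Faint pixels get boosted strongly while a pixel at full luminance is left unchanged, and because every channel of a pixel shares one scale factor, a faint red star stays red instead of washing out to white.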

An alternative to using arcsinh stretch is to use a script I have written called GeneralisedHyperbolicStretch (GHS to its friends!). This gives much more control over the stretch than, for example, histogram transformation. It is available for free download from here: https://github.com/mikec1485/GHS/releases/tag/v1.0

I am working on a version 2 of this script which I hope to release shortly which will add functionality that will allow you to apply the arcsinh process using the GHS equations giving you the best of both worlds!  Stay tuned for that!

Good luck and clear skies.
Christian Großmann avatar
@Mike Cranfield 

Hi,

This sounds very nice and I will give it a try. Because of the season and my observing site, I will have to image clusters and similar objects for a while. This is the best opportunity to work on this kind of processing.

I am not happy with my star shapes so far. I have to adjust my scope from the ground up. The flats show a bright spot that is not centered. The star spikes are basically two spikes close together, which probably comes from poor collimation. I also have the problem that the coma corrector shows elongated stars in opposite corners along the diagonal axis (not in all 4 corners). So, not perfect at all. There is also the problem with my guiding. At very low temperatures, I see elongated stars. The reason may be flexure generated by the guide scope. It seems to be less of a problem in warmer weather. The relatively short exposure time of 120s makes life a bit easier. On the 20 minute frames I usually take for narrowband, this is much more obvious. But I might improve slightly. The focuser of my cheap Newt is not that great. I guess the weight of the camera, the corrector and the huge filter wheel is a bit too much for it.

But I am able to take some beautiful images with my rig. That is motivation enough, except for the nights when things don't work as expected. But who am I telling this...

CS

Christian
Andy Wray avatar
FWIW: I'm using 30 second exposures for luminance and 90 second exposures for RGB. My guess is that you are blowing out your stars (i.e. they are fully saturated). PixInsight will tell you if you are, just by pointing at the centre of them. [Update] I forgot to say that you could just use RGB, forget luminance, and see if that has retained any colour in the stars. The luminance will definitely blow away the colour at multi-minute exposures.
kuechlew avatar
I downloaded your jpeg and ran the statistics tool of PixInsight on it. Obviously, running it on the 32-bit XISF files will provide more accurate data, so you may give that a try. While the channels are certainly similar, there is a small difference which doesn't fit the theory of the stuck filter wheel. Measuring the RGB values of your stars, it looks like even fairly small ones are burned out and can therefore only appear white. Actually, you can see colour halos in the fainter regions of the stars. So the colour is there, and you either overstretched the data or you have to reduce your exposure times further.



Clear skies
Wolfgang
kuechlew avatar
Small correction: while there is strong evidence for blown-out stars, my statement that a stuck filter wheel is ruled out was wrong, since I don't know your integration times, and there could be a small deviation between the channels due to different noise levels.

I want to add that I have used Mike's GHS script to good effect and can wholeheartedly recommend it.

Clear skies 
Wolfgang
Christian Großmann avatar
@kuechlew 

Thanks for the effort and the reply. But I think the difference in the single color channels may result from different images. I used 4 different sets of data that may contain light from the same filter, but they were processed independently, and the stretch may be a bit different. So combining them should result in some slight differences. At least I assume so...

But I agree that I should shorten the exposure time in the future.
Christian Großmann avatar
Andy Wray:
FWIW: I'm using 30 second exposures for luminance and 90 second exposures for RGB. […]

@Andy Wray ,

Someone suggested this before. I tried using only the RGB and there was no difference. The saturated colors of the brighter stars are surely related to an exposure time that was too long. But the tiny stars also show no color, which is a bit unusual to me. At least some of them should contain some difference in the data.

Your choice of exposure times puzzles me a bit. The 30 seconds on the L channel is totally clear. With 90 seconds for each color channel, you are only half a stop away from my setting, and looking at my data, that should also be too much for me. Maybe it is because of the binning I selected. I have to check whether I should go down to binning 1, which is not optimal for my scope. On the other hand, I swapped my filter wheels yesterday and took a set of new flats. The exposure time is now where I expected it to be, and there is a difference in the amount of light that hits the sensor behind each filter. Of course there should be! But that tells me again that I might have a problem with the filter wheel. So if the filters were set correctly, I assume you are right, and 90 seconds should maybe work for my setup, too.

@all

The support from all of you here is really great. I like the fact that people all over the world can join these conversations (and do!). I'd really like to meet some of you guys; we'd have a beer or two and talk about our stuff all day long. That would be great! This community is a place where some of the major problems in this world seem totally pointless. I like that. It's a bit like home...
Christian Großmann avatar
I'd like to give a short status update on the solution to my problem. Tonight I took some more data on M106 with my other filter wheel attached. The weather was problematic and there were clouds the whole time, but I managed to get some frames for each color and combined the data. Now it looks like a color image. One thing I noticed immediately are the color changes in the diffraction spikes. This clearly shows that the filters didn't change during my last session.

So thank you all for your help. I learned some new stuff and I will use your hints in the future…