Psychology of color perception

Alexandre Salvador
What should I aim for while post-processing in astrophotography?

My photo shows a green night sky, which I don't see when I raise my eyes to the sky. So I know that something's wrong with my photo, and I worry about which background subtraction method provides a result that's closer to reality. How should I balance the reds, greens, and blues in the histogram, so that the end result is truer to reality?

Then I read this in Wikipedia:
"All color perception is created by the brain of the observer. As such, no star really has any colour at all. Therefore color is not a fundamental property, it is in the brain of the observer. Stars emit energy at many different wavelengths, and humans may perceive color in stars. Instead of talking about the 'true' color of stars, we must talk about how a particular object appears to a particular observer, in a particular context."

Playing with colour curves, I can get the stars to be any colour I like, over a background that's just as close to black as I see fit.
And given that "all color perception is created by the brain of the observer", then my perception is just as valid as anyone else's.

But what if I'm also concerned that the fruit of my labour contains useful scientific data?
The wavelength of light emitted by objects in the sky can help determine their composition and their speed relative to us, so there must be some truth in colors.
But then, for the science, they also take images in infrared and X-ray, while in my hobby I only deal with the visible part of the spectrum.

I would say, "I aim for photos to look the same as I would see if I was floating in a space suit, looking at the target with my own eyes."
But who has had such an experience, to say if I've got it right or not? And would the person floating in a space suit next to me describe the object the same as I saw it?

So, should I just aim for a result that's pleasing to my sense of aesthetics, and stop worrying about "accuracy"? Or should I keep searching for that special method/software/trick that takes my work even closer to what's really in the sky?
Alan Brunelle
Hi Alexandre,

You don't say what camera or software you are using, but for the camera part I will assume a color camera.  The typical filter pattern on a camera that "shoots" color in one image is Red, Green, Green, Blue in a square 2D tile (it's called a Bayer pattern, a name useful for finding the setting in your software).  The pattern can differ from sensor type to sensor type, but hopefully you get the idea.  Since there are only three primary colors, using just three pixels would make the 2D pattern alternate as you moved across the face of the sensor, so one color is duplicated to keep the tiling consistent.  This allows a simpler readout by the computer that does the processing.

The computer in most point-and-shoot cameras does the color correction automatically before displaying the image on the camera's screen.  (It actually does a lot more than that.)  Astronomical processing software does not apply this correction when it interprets the raw image, so the image will look green because of the doubled green contribution.  You need to make the correction yourself, to your satisfaction.  Luckily, most astrophotography processing software has tools and settings to do this, but you will need to research them for the software you use.
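To see that doubled green contribution in numbers, here is a tiny illustrative sketch (Python with NumPy; a toy sensor, not any particular package's pipeline): a uniform gray scene sampled through an RGGB mosaic collects twice as much green as red or blue, and halving the green total restores the balance.

```python
import numpy as np

# Toy sketch: a uniform gray scene sampled through an RGGB Bayer mosaic.
H, W = 4, 4                      # tiny sensor; H and W even
scene = np.full((H, W), 100.0)   # the same gray light hits every pixel

# RGGB tiling: R at (even,even), G at (even,odd) and (odd,even), B at (odd,odd)
r = scene[0::2, 0::2].sum()
g = scene[0::2, 1::2].sum() + scene[1::2, 0::2].sum()
b = scene[1::2, 1::2].sum()

print(r, g, b)        # green collects twice the total of red or blue
print(r, g / 2, b)    # halving green rebalances a neutral scene
```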

As for how people interpret and display the colors in their images: well, that is a can of worms I will not get into here.  However, again, there are functions within the astro software that can read your image, identify the part of the sky in the photo, identify the objects in the photo, and then apply the correct color to all the visible elements.  Even so, astrophotographers still have the option to alter these colors and their balance, among other image properties.  Again, do some research to decide what you want to do.

Finally, yes, the brain does the interpretation of color, but the fact is that the cones in our retinas have fairly consistent responses to specific ranges of light wavelengths, and that is what yields our color perception.  It is pretty consistent from person to person, hence the human population generally agrees that a red light is red and matches other reds, and so on.  And the color cameras and color monitors we use also give us colors we agree upon.  In fact our cones do not match the Red/Green/Blue of our devices all that well, but the processing in our brains compensates for that.

There are many fine resources in print and online to learn this stuff.
Shawn
Not just in astrophotography: the regular digital photos you take every day on your phone go through a long image-processing pipeline, from the signals the sensor received to the image displayed on your screen. The most important parts in terms of color reproduction are white balance (WB) and the color correction matrix (CCM). WB adjusts the gains of the RGB channels so that the white you see in the real scene appears white on your screen. The CCM mixes the three channels, because the sensor's RGB spectral sensitivities are different from the human retina's sensitivities and different from the spectra of the RGB pixels on your display.

The complication with WB is that the human brain has the ability to see an object as the same color under different lighting conditions (color constancy). For example, a white paper under the sun and under warm white indoor lighting both look white, even though the actual spectra entering your eyes are quite different. So you can't just have a fixed set of channel gains.

To optimize the CCM, you will have to do a bunch of calibrations on sample colors under different lighting conditions. And yes, CCM may also depend on the scene colors and WB.
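A toy sketch of those two steps (Python with NumPy; the gains and matrix values are invented for illustration, not taken from any real camera):

```python
import numpy as np

# Made-up sensor RGB reading for a patch that should be neutral
raw = np.array([0.30, 0.55, 0.25])

# White balance: per-channel gains chosen so the neutral patch comes out equal
wb_gains = raw.mean() / raw               # gray-world-style gains
balanced = raw * wb_gains

# Color correction matrix: mixes channels; each row sums to 1,
# so neutral (equal-channel) colors stay neutral after correction
ccm = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]])
corrected = ccm @ balanced

print(balanced)   # equal channels after WB
print(corrected)  # unchanged here, because the input is neutral
```

Non-neutral colors, of course, do get shifted by the matrix; that is its whole purpose.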

If you have a galaxy in your image and you can adjust the RGB so that the background and the galaxy both average to gray, the star colors should be OK. PixInsight also has a process, PhotometricColorCalibration, that can help with color calibration.
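A quick sketch of the "background averages to gray" idea on synthetic data (a simple gray-world-style neutralization, not PixInsight's photometric method): estimate each channel's background level and scale the channels so those levels match.

```python
import numpy as np

# Synthetic star-free sky patch with a deliberate green cast
rng = np.random.default_rng(0)
img = rng.normal([0.08, 0.16, 0.07], 0.01, size=(100, 100, 3))

bg = np.median(img.reshape(-1, 3), axis=0)   # per-channel background estimate
gains = bg.mean() / bg                       # scale each channel toward the mean
neutral = img * gains

# The background medians are now equal, i.e. the background is gray
print(np.median(neutral.reshape(-1, 3), axis=0))
```

In a real image you would measure the background from a region with no stars or nebulosity, rather than over the whole frame.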

And finally, I think that unless you want to make measurements based on the color, you can always adjust the image to your liking. Astrophotography is half science, half art.
Trace
Dr. Roger Clark has shared a science based perspective on colors in astrophotography which resonates well with me.  It's unfortunate, but the discussion of color often evokes much emotion and controversy.  I like to think of it in terms of what colors are represented by the camera in a daylight image, an image that reminds me of what I see or perceive with my unaided eye.  That's what I hope I'm approximating in my astrophotography, albeit with some liberty taken at times to brighten or add some saturation for the artistic quality without distorting the natural color too far.

astrophotography-color-and-critics

DSLR Processing the missing matrix

CS/Trace
Die Launische Diva
Trace:
Dr. Roger Clark has shared a science based perspective on colors in astrophotography which resonates well with me.  It's unfortunate, but the discussion of color often evokes much emotion and controversy.  I like to think of it in terms of what colors are represented by the camera in a daylight image, an image that reminds me of what I see or perceive with my unaided eye.  That's what I hope I'm approximating in my astrophotography, albeit with some liberty taken at times to brighten or add some saturation for the artistic quality without distorting the natural color too far.

astrophotography-color-and-critics

DSLR Processing the missing matrix

CS/Trace

My humble advice is to take such perspectives with a grain of salt. But I am just a random guy on the internet with a background in physics and photography, and some time spent in a research observatory.
Ruediger
I think we should accept that we basically produce art, not science. Neither our equipment nor our workflows can claim to be even close to what would be required for science. Hence I would suggest you process your images in whatever manner you enjoy. And as always in art: there will be people who like the result and others who dislike it.
As you pointed out, it is a matter of psychology, but age, experience, and even cultural background also matter. Since some of these attributes are mutually exclusive, you will never serve everybody's taste. There is no scientific way to measure art. So do yourself a favor: enjoy yourself 😃

CS
Rüdiger
Giovanni Paglioli
Finally I hear something true! I've spent almost 25 years trying to understand human visual perception, giving many conferences related to astrophotography all around the world. There are many misunderstandings, some still linked to old emulsion photography. We often think about vision the way we think about acquisition with our CCD/CMOS cameras, but the human eye/brain doesn't work that way. The term "true colors" is also a misunderstanding, since it comes from the hardware/software world. In the past, computers were very limited in displaying colors on screen; when technology became able to display three channels (RGB) with 8 bits per channel (256 discrete levels of intensity), we decided that was enough to be called "true color" to the eye. That absolutely does not mean it exactly represents reality! Displays can represent emitted colors (an additive color space) only in a limited way, and our eyes/brain "see" only about 40 shades of gray. We don't measure the RGB channels; our brain is constantly busy hearing, moving, seeing, thinking, etc., so it tries all the time to reduce the load by eliminating everything that is not "important". How does it determine what is important and what is not? On an evolutionary basis! Everything that could be a danger to our survival is important; the rest is less important, or not at all, and is discarded. We say in digital imaging that we are taking pictures, but it is more correct to say that we are measuring quantities; we are doing photometry or spectroscopy. So where is the "truth"? In the data! If we really want to be as close as possible to reality, we just have to read the data and use our powers of abstraction to figure it out. The moment we try to "render" the digital data to an output within the range of our senses, we are adapting it and opening a window to look through; we have to limit and intellectually interpret these otherwise non-human categories.
We normally visualize data on screens, but we could also play these numbers as sounds! They would seem meaningless to us, yet they would be exactly the same content. Is there, then, no common perception? Could anyone represent things freely without being told they're wrong? No… there are common stimuli, and we perceive them in the same way; this is the result of evolution in Darwin's terms. Is the sky blue? No: it scatters light and absorbs lower-energy wavelengths, and we see it as blue; but ionized oxygen is a teal color that contains green, and the night sky as seen from the space station is green in layers, which looks strange to us. We have to think about what we intend to show of our data and render it in the way that most efficiently transmits those concepts to others. We are moved by the visual experience of having seen the same subject so many times, and our brain keeps a picture of that experience so as to get the most satisfying experience from that stimulus. The way to do this is to understand the most common mode of perception and use it, while enhancing the peculiarities we intellectually want to show. It really is an exciting topic of discussion, but it's too long a talk to have here… I've given many workshops on this subject and, if you're interested, I could try to organize one to share our knowledge. Let me know if that could be an option and whether what I've just written has been of any help.

Ciao da Joe!
Trace
Die Launische Diva:
My humble advice is to take such perspectives with a grain of salt. But I am just a random guy on the internet with a background in physics and photography, and some time spent in a research observatory.


Thanks for the gentle advice and sharing.

CS/Trace
Alexandre Salvador
Hello all

While writing my initial post I was worried that maybe I wouldn't be able to correctly convey my thoughts and concerns.
But from your answers I can see that you fully got my point, and that you also share my concerns to some extent.

I took some interesting ideas too.
Alan Brunelle:
However, again, there are functions within the astro software that can read your image, identify the part of the sky in the photo, identify the objects in the photo, and then apply the correct color to all the visible elements.


That's new to me. I'm guessing some software is able to plate solve my image, and use the knowledge of the celestial location to correct the colours. That's certainly scientific enough for me. Will have to look into that. And yes, I'm only using color cameras. I have a stock Canon 800D and a cheap Omegon guide camera.

Also,
Shawn:
If you have a galaxy in your image and you can adjust the RGB so that the background and the galaxy both average to gray, the star colors should be OK. PixInsight also has a process, PhotometricColorCalibration, that can help with color calibration.


That's certainly got my attention. I'm not considering paid software just yet, but that's something I can try with Gimp. I've been afraid to mess with individual R/G/B channels, but now I see that maybe that's not such a "heresy".
Trace:
Dr. Roger Clark has shared a science based perspective on colors in astrophotography which resonates well with me.  It's unfortunate, but the discussion of color often evokes much emotion and controversy.  I like to think of it in terms of what colors are represented by the camera in a daylight image, an image that reminds me of what I see or perceive with my unaided eye.  That's what I hope I'm approximating in my astrophotography, albeit with some liberty taken at times to brighten or add some saturation for the artistic quality without distorting the natural color too far.


I knew I'd be touching a potentially controversial subject - or can of worms - because I see that processing techniques, and the personal choices involved, are central to discussions in internet fora and YouTube channels dedicated to AP. But I'm glad to have had people voice my exact concerns. Also, all replies have shown the utmost courtesy, and thank you for that.
Ruediger:
I think we should accept that we basically produce art, not science. Neither our equipment nor our workflows can claim to be even close to what would be required for science. Hence I would suggest you process your images in whatever manner you enjoy. And as always in art: there will be people who like the result and others who dislike it.
As you pointed out, it is a matter of psychology, but age, experience, and even cultural background also matter. Since some of these attributes are mutually exclusive, you will never serve everybody's taste. There is no scientific way to measure art. So do yourself a favor: enjoy yourself 😃


Indeed, it's not that I'm using my hobby to write my thesis. Thank you very much for your kind words! \o/
Giovanni Paglioli:
So where is the "truth"? In the data! If we really want to be as close as possible to reality, we just have to read the data and use our powers of abstraction to figure it out.


"Stretch the histogram just enough not to lose any data" - or words to that effect - is advice I commonly see in fora and from the YouTubers I follow. Preserve the data, but apply one's judgment to narrow it down to the human band of vision... But your post gave such a meaningful and comprehensive view of this subject that I almost feel bad quoting just one sentence from it!

Thank you all for your high quality and elevated replies. I feel honored to be a part of this community!
Alan Brunelle
Alexandre Salvador:
I took some interesting ideas too
Alan Brunelle:
However, again, there are functions within the astro software that can read your image, identify the part of the sky in the photo, identify the objects in the photo, and then apply the correct color to all the visible elements.

That's new to me.



Alexandre,

I was referring to the Photometric Color Calibration in PixInsight, but I suspect that this function can be found in most astro-dedicated packages.  It relies on the image being plate solved.

In my opinion, there is nothing wrong with someone simply adjusting the individual channels to the liking of their eye, to satisfy themselves.  It can be done to good effect, just as it can be done poorly!  One can use other published images of the target for comparison, or try to get a decent balance on star colors, etc.  Use a well-calibrated monitor when doing this.  You will see the occasional image posted on AstroBin that has a strange color cast; that is likely the result of someone using a poorly calibrated monitor during processing.  Ultimately, what you do depends on what YOU want to achieve.
Arun H
Die Launische Diva:
My humble advice is to take such perspectives with a grain of salt. But I am just a random guy on the internet with a background in physics and photography, and some time spent in a research observatory.

I would completely agree with the advice to take Roger Clark's perspective with a grain of salt. When I was first starting out, his website and programs were helpful, but I rapidly found them to be quite limiting. He tends to be negative about many tools (e.g., PixInsight) other than the ones he uses. But a comparison of the results he gets versus the images generated by the users of the tools he likes to criticize should easily guide you as to whom to believe.
Justin
Alan Brunelle:
Alexandre Salvador:
I took some interesting ideas too
Alan Brunelle:
However, again, there are functions within the astro software that can read your image, identify the part of the sky in the photo, identify the objects in the photo, and then apply the correct color to all the visible elements.

That's new to me.



Alexandre,

I was referring to the Photometric Color Calibration in PixInsight, but I suspect that this function can be found in most astro-dedicated packages.  It relies on the image being plate solved.

In my opinion, there is nothing wrong with someone simply adjusting the individual channels to the liking of their eye, to satisfy themselves.  It can be done to good effect, just as it can be done poorly!  One can use other published images of the target for comparison, or try to get a decent balance on star colors, etc.  Use a well-calibrated monitor when doing this.  You will see the occasional image posted on AstroBin that has a strange color cast; that is likely the result of someone using a poorly calibrated monitor during processing.  Ultimately, what you do depends on what YOU want to achieve.

Just to weigh in on Photometric Color Calibration.
To get an idea of how it works without diving headfirst into the formidable ocean that is PixInsight, you can try a free astro editing program called Siril, available here: https://siril.org
It has a much shallower learning curve than PixInsight (although it is certainly less powerful) but has some very useful features, especially if you're just getting started. One of these is Photometric CC. It does a more "accurate" job of getting the color balance right (including fixing the green-tint issue) than just messing with the RGB channels in Photoshop or Gimp.
I shoot with an unmodded Canon SL2 and the color results I have gotten are great. Using Siril really did improve my image processing, and I think it's a great next step in the already difficult world of processing.

As to the philosophy of processing, from what I've seen there are two schools of thought: one attempts to keep the image as "true" to the scientific data as possible, while the other sees it merely as an expression of the artist and takes liberties with the data accordingly. I believe a healthy balance of the two positions is probably the best place to stand. However, as others have said here, any manipulation of our data as astrophotographers is most likely not very useful scientifically. This means that it falls to the discretion of the astrophotographer to make choices on color saturation and balance to his or her taste. Find what works for you and embrace your "style". This is something I've been trying to pay more attention to, and I've enjoyed experimenting with different looks and finding what I like and don't like. Just enjoy the process of processing and produce work that you can be proud of.
Chris Bailey
Photometric Colour Calibration certainly takes a scientific approach to broadband RGB imaging; narrowband is a whole different ball game. Where I struggle most is with colour gradients. PixInsight now has a raft of tools to deal with moon- or light-pollution-induced gradients, but I still find they sneak through, especially with the extreme manipulation needed to pull out faint objects.