H-beta in narrowband imaging

Nick Grundy:
Hi All, I was wondering if anyone uses an H-beta filter in their narrowband imaging? I feel like I've seen more options from vendors for H-beta lately that seem targeted at imaging, compared to previously only hearing about it for visual use.

https://agenaastro.com/baader-cmos-optimized-5-5nm-h-beta-narrowband-filter-50mm-round-unmounted-fchbn-rd50-2961084.html

Does anyone use this filter regularly? Has it shown significantly different results in final images or processing? It seems like the odd man out in a 7-slot EFW.

CS!
Sean Mc:
From what I've read, the H-beta emission sources are the same objects as the Ha emission, but Ha is many times stronger. If that's the case, it doesn't seem useful to spend time on the weaker H-beta, which is going to illuminate your image in the same places Ha already does. ¯\_(ツ)_/¯
andrea tasselli:
Same source (hydrogen) doesn't mean same location/distribution in space, never mind that the Hb emission is at the opposite end of the visible spectrum.
Brent Newton:
As someone who is primarily interested in accurate color (what that means is itself contentious) but still has to lean on narrowband for better contrast outside of a 2-hour drive to a Bortle 2 site, I just started using the 8.5nm H-beta from Baader. I have no interest in SHO or the typical false-color palettes (and for my own pursuits I don't want to lean on the true-color-adjacent blends like HOO), and I never even unboxed the SII filter I bought with my Antlia 3nm set, so I already had a slot free for Hb. I may have to consider this 5.5nm one, though; it took me several months just to find the 8.5nm in stock.

My experience in finding others using such filters is also limited - most threads on Cloudy Nights or similar immediately derail with responses like "Why bother if Hα already does all the heavy lifting?" I have also heard on similar forums that Hβ would simply appear as a weaker version of a typical Hα shot (as Hβ is roughly 30% the strength of Hα in a gas tube), and so far my single image using it seems to confirm this (it's possible the filter is mounted backwards; I haven't had enough clear skies to test that). But what interests me is attaining accurate color in my hydrogen regions.

For the past few years I have simply done what others do and applied the Hα data in reduced amounts to the blue channel, but this is still guesswork, since the presence of interstellar dust or other sources of emission (OIII) can also skew the color. I typically try to capture RGB from dark skies and at least eyeball the Hα contribution between R and B in post to match the RGB as closely as possible, but again: guesswork. I'm after a more "automatic" way of doing this, like running LinearFit and then applying continuum-isolated Hα/Hβ frames to the broadband RGB without having to micromanage it.
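A minimal sketch of what that "automatic" step could look like, assuming registered, linear, continuum-subtracted narrowband masters and linear RGB channels already loaded as NumPy arrays (the array and function names here are hypothetical illustrations of the idea, not a PixInsight script):

```python
import numpy as np

def fit_scale(narrow, broad):
    """Least-squares factor mapping the narrowband frame onto the line
    contribution already present in the broadband channel (a stand-in
    for what LinearFit does interactively)."""
    n = narrow.ravel() - np.median(narrow)
    b = broad.ravel() - np.median(broad)
    return max(float(np.dot(n, b) / np.dot(n, n)), 0.0)

def inject_hydrogen(r, b, ha, hb, boost=1.0):
    """Re-inject the hydrogen lines into R and B in a measured proportion
    (Ha into red, Hb into blue) instead of an eyeballed one."""
    r_new = r + boost * fit_scale(ha, r) * ha
    b_new = b + boost * fit_scale(hb, b) * hb
    return r_new, b_new
```

With actual Hβ data the blue-channel scaling would come from a measurement rather than from guessing a fraction of Hα, which is the whole point of carrying the extra filter.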

On a similar note, the same also happens with OIII, which is more of a teal green than blue (1.0 green vs 0.57 blue in a gas tube), but scattered light often makes the blue appear stronger in broadband RGB shots of some emission regions, or even white, since OIII-strong regions tend to also be strong in Hα, leading to a bleached color from strength in all three channels.

Basically all I can currently say is that I look forward to more skies for experimentation on this.
Freestar8n:
I was given an H-beta filter, so I tried it on the Tarantula Nebula to get direct measurements of the range of Ha/Hb ratios in the nebulosity. This was 8 years ago, and the result is below:

[Image: Ha + Hb composite of the Tarantula Nebula]
For that image, Ha was mapped to red and Hb was mapped to a corresponding RGB triplet, and the results were added. There was a fairly strong range of ratios across the image, which was backed up by some papers I found. The point I was making is that creating a faux Hb channel as 0.3x the Ha won't be right. Some say the ratio is guaranteed by quantum mechanics, but that isn't true in terms of the ratios you actually receive in the image.
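For context, a compositing step along those lines might look like the following (a rough sketch only; the RGB triplet used for Hb here is my own assumption approximating the blue-cyan hue of 486 nm, not the exact weights used for the image above):

```python
import numpy as np

# Hypothetical registered, linear 2D narrowband masters: ha and hb.
HA_RGB = np.array([1.0, 0.0, 0.0])   # Ha (656 nm) mapped to pure red
HB_RGB = np.array([0.0, 0.3, 1.0])   # assumed triplet for Hb (486 nm), blue-cyan

def ha_hb_composite(ha, hb):
    """Map each narrowband frame to its RGB triplet and add the results,
    so the local color encodes the measured Ha/Hb ratio."""
    rgb = ha[..., None] * HA_RGB + hb[..., None] * HB_RGB
    return np.clip(rgb, 0.0, 1.0)
```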

Aside from demonstrations like this, I don't think there is much value in H-beta for deep-sky imaging. For visual it has a huge advantage: the signal, though fainter than Ha, is much closer to the peak of the eye's response, so it ends up brighter visually.

And if you really want quantitative results it's hard to calibrate the different channels.  You can tell directly from the data that the ratios vary - but it's hard to know exactly what the ratios are without some kind of calibration.  And calibration is hard to do for narrowband.

Frank
Guillermo (Guy) Yanez:
Pat Prokop just posted a new video where he compares the SHO color palette on the Pacman Nebula using standard SHO filters with the SHO/H-beta version. You may want to have a look at it: https://youtu.be/U21wwNqnmcU?si=9PuVpZEX0RPUAV_C

Cheers!
Guy
HR_Maurer:
There could be some small deviations in the Ha/Hb ratio, depending on electron temperature and also on re-absorption processes, as far as I know. However, Hb is more relevant for visual observation.
Christian Großmann:
Hi Nick,

your question is really interesting, and I've been trying to think of an answer for a while now. As far as I know from my physics background, emission lines like Ha, OIII, SII and others appear after electrons in the atoms get hit by enough energy (photons) to shift them to a higher energy level. They can't stay there for long and will fall back down. Because the energy difference between two given levels is always the same, the energy the electron loses is always the same (depending on the levels involved, of course), and that's why we see this energy as a specific wavelength. So far I am quite sure I'm right.

Now comes my interpretation, which I am not sure is right, but at least to me it makes sense:

If I am right, the Ha line is the result of shifting the electron in the hydrogen atom up by one level, which needs the least amount of energy (hence the name alpha). H-beta corresponds to shifting the electron one level higher still, so more energy is needed. That should be the reason why there is so much more Ha light than H-beta: statistically, it's easier (and therefore happens more often) to move the electron by one level than by two, so the Ha light should be much brighter.
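For reference, the wavelengths of the two lines follow from the Rydberg formula for transitions involving the n = 2 level (standard textbook values, not something measured in this thread):

```latex
\frac{1}{\lambda} = R_\mathrm{H}\left(\frac{1}{2^{2}} - \frac{1}{n^{2}}\right),
\qquad R_\mathrm{H} \approx 1.097 \times 10^{7}\,\mathrm{m^{-1}}
```

Plugging in n = 3 gives λ ≈ 656 nm (Ha, red); n = 4 gives λ ≈ 486 nm (Hb, blue-green), which is why the same gas produces light at opposite ends of the visible range.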

It's not the answer to your main question of finding someone who uses H-beta, but it helps to understand what was said before: H-alpha and H-beta should be distributed quite similarly, but the brightness of the two emission lines is different. That may be why people tend to image in Ha rather than H-beta: it's easier to capture, and it's usually found wherever the other line is, too.

I hope, I'm right with my interpretation. Maybe someone could clear things up. It would be nice to know, if I'm right. Otherwise it would be nice to get some more background. I'd love to be corrected.

It's hard to explain this in English, because I don't know the exact words for the physics.

CS

Christian
Sean Mc:
Hb apparently is used for visual observation because our eyes detect that part of the spectrum far better than Ha, even though Ha is a much stronger signal.
wizzlebippi:
Christian Großmann:
If I am right, the Ha line is the result of shifting the electron in the hydrogen atom up by one level, which needs the least amount of energy (hence the name alpha). H-beta corresponds to shifting the electron one level higher still, so more energy is needed. That should be the reason why there is so much more Ha light than H-beta: statistically, it's easier (and therefore happens more often) to move the electron by one level than by two, so the Ha light should be much brighter.

It's not the answer to your main question of finding someone who uses H-beta, but it helps to understand what was said before: H-alpha and H-beta should be distributed quite similarly, but the brightness of the two emission lines is different. That may be why people tend to image in Ha rather than H-beta: it's easier to capture, and it's usually found wherever the other line is, too.

Think through this again.  Hydrogen has to absorb energy to emit light in Ha, and even more to emit Hb.  Without an energy source, this glow will dissipate.  Hb will eventually decay into Ha without external energy input, meaning only the hotter parts of a Hydrogen rich emissive nebula will emit Hb.  The distribution isn't equal.  

Despite its popularity, the SHO "Hubble Palette" is only an approximation, representing 3 of the brighter spectral lines Hubble can isolate and image.  If I remember correctly, Hubble carries about 12 narrowband filters, including both Ha and Hb.
Christian Großmann:
Think through this again. Hydrogen has to absorb energy to emit light in Ha, and even more to emit Hb. Without an energy source, this glow will dissipate. Hb will eventually decay into Ha without external energy input, meaning only the hotter parts of a Hydrogen rich emissive nebula will emit Hb. The distribution isn't equal.


This makes sense, of course. That's why these nebulae are typically connected with something hot in their centers.
Wei-Hao Wang:
The intrinsic Ha/Hb ratios of most nebulas are all close to 3. If you see a large variation in this, it's caused by dust absorption, which is stronger for Hb and weaker for Ha. Indeed, astronomers use the Ha/Hb ratio as an indicator of the amount of dust absorption. So, for example, if you take a wide-field shot that includes M16, M17, and M8 and do careful processing (including flat fielding), you will see that M8 is the bluest while M17 is the reddest. This is basically caused by dust absorption in these nebulas (stronger absorption in M17, much weaker in M8), not the hydrogen itself. Those who image the LMC and SMC will also find that the nebulas there are bluer than Milky Way nebulas. That's basically also caused by dust absorption.

If you ignore the differences caused by dust, Hb basically traces Ha. Hb is weaker than Ha, and the variation in the Ha/Hb ratio will never be as large as the variation in the Ha/OIII or Ha/SII ratio within a nebula. Both of these (weaker signal, and less variation in Ha/Hb) make it hard to get a pretty picture based on Hb.
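As an aside, this is how that ratio (the "Balmer decrement") gets turned into a dust estimate. The sketch below uses standard textbook assumptions (an intrinsic ratio near 2.86 and typical extinction-curve coefficients); it would only be meaningful with properly flux-calibrated channels, which, as noted above, is the hard part for amateur narrowband data:

```python
import math

# Standard textbook assumptions (not measured in this thread):
R_INTRINSIC = 2.86        # Case B intrinsic Ha/Hb ratio at ~10,000 K
K_HB, K_HA = 3.61, 2.53   # typical extinction-curve coefficients at Hb and Ha

def ebv_from_balmer_decrement(ha_flux, hb_flux):
    """Color excess E(B-V) implied by an observed, flux-calibrated Ha/Hb ratio."""
    r_obs = ha_flux / hb_flux
    return 2.5 / (K_HB - K_HA) * math.log10(r_obs / R_INTRINSIC)

# Example: an observed ratio of 4 corresponds to roughly E(B-V) ~ 0.34 mag.
print(round(ebv_from_balmer_decrement(4.0, 1.0), 2))
```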
Bob Lockwood:
Ok, I'm confused. Like Nick, I was wondering if an Hb filter could be used in place of the OIII for the blue channel, but everyone here is comparing it to Ha, the red channel. What I'm seeing is that Hb is so close to the OIII wavelength that it could be used? Why are so many treating it as red, when it's clearly in the blue channel's wavelength range? Far from an expert here, just wondering. I also like the idea of the Hb/O3 filter.

HR_Maurer:
Hi Bob,
if you look at an RGB image of an HII region, the color of the hydrogen tends a bit toward a pinkish or violet tint. This is because beneath the H-alpha emission there is always a certain amount of H-beta emission included. You won't find any H-alpha emission without H-beta and vice versa, and that's why H-beta doesn't add more structural information to an H-alpha image; those emission lines are linked. If you compose a bicolor image from H-alpha and H-beta, you will always obtain a structure without a lot of color variation.
In contrast, if you compose a bicolor from H-alpha and OIII, you will obtain a large amount of color variation, because the oxygen has a different spatial distribution than the hydrogen.
You could combine H-beta with OIII, and for example assign red to H-beta and blue to OIII.
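A minimal sketch of that kind of bicolor mapping, assuming two registered, stretched frames already loaded as arrays (the function and array names are hypothetical; filling green with the average of the two channels is just one common convention, not the only option):

```python
import numpy as np

def bicolor(red_source, blue_source):
    """Simple bicolor composite: one narrowband frame drives red,
    the other drives blue, and green is a 50/50 blend of the two."""
    green = 0.5 * (red_source + blue_source)
    return np.clip(np.dstack([red_source, green, blue_source]), 0.0, 1.0)

# e.g. bicolor(hb, oiii) for the Hb-to-red / OIII-to-blue mapping suggested above,
# or bicolor(ha, oiii) for a classic HOO-style rendering.
```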
Bob Lockwood:
Ok, that makes sense. It's probably also why the Chroma graph shows it in violet, not blue, and teal for the OIII.
Christian Großmann:
Be careful about judging the filters just by their wavelengths. The light is emitted by different elements. Comparing Ha to Hb is one thing, because that light comes from the same atoms. OIII sits near Hb in the visible spectrum, but it originates from a very different element. With narrowband, you basically record the presence of the different elements like hydrogen, oxygen or sulfur, and with the color palettes you use, you basically show their distribution. A really good example is the giant Squid Nebula embedded in the Flying Bat: you could not see the Squid if you recorded data in Ha and/or Hb only, because it only emits light in OIII. It's one of those rare examples where hydrogen isn't even part of the object.
andrea tasselli:
Left is Hb and right is Ha. Draw your own conclusions on whether they are distributed the same (same exposure length):

[Image: Hb (left) and Ha (right) frames of the same field]