How do I choose a camera sensor/barlow combination that will approximate the view given by a particular eyepiece?

10 replies · 110 views
Rick Evans

I know that, for example, a 9mm eyepiece has a focal length of 9mm. Dividing a telescope’s focal length by the eyepiece’s focal length gives the approximate magnification. We don’t think in terms of magnification when using a camera sensor instead of an eyepiece, though.

A camera sensor is placed at the prime focus of a telescope system, so the focal length should be given by aperture × focal ratio. I think that the camera sensor size is inversely related to the image scale produced, and that image resolution is at least in part determined by telescope aperture, sensor pixel size, and pixel count, and that there is a formula for calculating it.
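For reference, the relationships mentioned so far can be put into a few lines of Python (a rough sketch; the example numbers are just illustrations):

```python
# Basic visual-astronomy relationships: focal length from aperture and
# focal ratio, and magnification from the scope/eyepiece focal lengths.

def focal_length_mm(aperture_mm, f_ratio):
    """Telescope focal length = aperture x focal ratio."""
    return aperture_mm * f_ratio

def magnification(scope_fl_mm, eyepiece_fl_mm):
    """Magnification = telescope focal length / eyepiece focal length."""
    return scope_fl_mm / eyepiece_fl_mm

# Example: a 140 mm f/6.7 refractor with a 9 mm eyepiece.
fl = focal_length_mm(140, 6.7)   # 938 mm
m = magnification(fl, 9)         # ~104x
```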

So, with all that as background, I’m just curious to know how to determine which camera sensor characteristics would give me the equivalent detail that I would observe if using an eyepiece of a particular focal length with the same telescope.

Quinn Groessl

There’s probably a math way to figure it out, but I suck at math and like visual things.

astronomy.tools has an FOV calculator. Go there, select the telescope and eyepiece you want to simulate in visual mode, and add it to your view. Then go to the imaging tab and try different cameras with that same telescope until you find one that matches well.

Rick Evans

Quinn Groessl · Dec 31, 2025, 12:47 AM


Thanks. I gave it a try, and it looks like using a 3mm eyepiece with my 140mm f/6.7 refractor gives a very similar view to using a 3x Barlow on that scope with my ASI678MC camera. It is a very useful tool. There probably isn’t a simple equation, or if there were it would be too complicated, so this seems like a practical way to make the comparison.

Rick

Tony Gondola

In very basic terms, it would come down to field of view.

I’m not sure about “which camera sensor characteristics would give me the equivalent detail that I would observe”. The capabilities and sensitivity of a camera versus the eye-brain combination are so different that the question is almost meaningless. In my biased opinion, the camera will always show you more.

andrea tasselli
For any eyepiece, the true FOV is the nominal (apparent) FOV divided by the magnification. Because that nominal FOV figure is often absent, or the actual position of the field stop may be in doubt, the best way to retrieve it is to let a star drift across the field with the mount’s tracking switched off and measure the time it takes. From that time and the star’s angular drift rate in RA (the sidereal rate of 15.04″/s, scaled by the cosine of its declination) you get the exact true FOV, and from that the nominal one, which can then be used with different scope/eyepiece combinations or with a Barlow.
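The drift-timing arithmetic can be sketched in Python (the 15.04″/s sidereal rate is a standard value; the drift time and magnification below are illustrative, not from this thread):

```python
import math

def true_fov_arcmin(drift_seconds, declination_deg):
    """True field of view from a drift timing: a star at declination d
    crosses the field at the sidereal rate 15.04"/s scaled by cos(d)."""
    arcsec = drift_seconds * 15.04 * math.cos(math.radians(declination_deg))
    return arcsec / 60.0

def apparent_fov_deg(true_fov_arcmin_val, magnification):
    """Nominal (apparent) FOV in degrees = true FOV x magnification."""
    return true_fov_arcmin_val * magnification / 60.0

# Example: a star on the celestial equator takes 120 s to cross at 100x.
tfov = true_fov_arcmin(120, 0.0)    # ~30.1 arcmin true field
afov = apparent_fov_deg(tfov, 100)  # ~50 deg apparent field
```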
John Hayes

Rick Evans · Dec 31, 2025, 12:27 AM


Let me see if I can clarify a few things for you. Optically, the question you are asking is a little like, “If apples are red and oranges are citrus, how much do grapes weigh?”

Optical magnification is a measure of how much angles are magnified. So, a 1000 mm focal length telescope equipped with an eyepiece that has a 20 mm focal length will magnify angles by 50 times. It is the angular magnification that makes things look bigger (or, if you like, closer). Angular magnification has nothing to do with the field of view that you see through the telescope. The field of view is determined by whatever limits the off-axis rays in the focal plane. In an eyepiece, that is called the “field stop,” and you can easily see it if you look into the back side of the eyepiece.

When you remove the eyepiece and place a sensor in the focal plane, the size of the sensor determines the field of view. Many sensors are larger than the field stop in most eyepieces, which means that a camera will see a bigger patch of sky than you can with your eye. (There are, of course, exceptions to this when you select a smaller sensor size.)

The sharpness of the image that you can see with your eye is determined by the resolving power of your eye, which is typically about one minute of arc for most folks with 20/20 vision. If you work that backwards through the telescope, the resolution of the system is typically about 1/m in arc-minutes, where ‘m’ is the magnification. This is not an unlimited value because things like optical aberrations, atmospheric seeing, and the diffraction of light place an upper bound on what can be resolved.

When you put a camera on a telescope, the angular resolution is determined by the spacing of the pixels in the sensor divided by the focal length. Here again, optical aberrations, atmospheric seeing, and the diffraction of light all place the same upper bound on what can be resolved.

So tell me the aperture of your telescope, the focal length and what eyepiece you are using and it’s easy to compute the pixel spacing required to match what you see visually. Here’s an example:

Telescope: 200 mm, F/10, FL = 2000 mm

Eyepiece: 20 mm, m = 100x, visual resolution ≈ 1′/100 = 0.01′ = 0.6″

Sensor resolution = arctan(p/f), so p = f × tan(sensor resolution)

p = pixel spacing = 5.82 microns (spacing needed to match the eye’s 1′ resolution at 100x)

Just be aware that typical atmospheric seeing might limit even the visual resolution to around 2” so this system would be over-sampled unless the seeing were spectacularly good. Most modern CMOS sensors have a pixel spacing of 3.76 microns so this sort of sampling is pretty easy to achieve. NOTE: I’ve been pretty loose with terminology here. Sensor sampling is not the same as resolution but that’s a completely different discussion.
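The worked example above can be checked in a few lines of Python (same numbers as in the example; only the function names are mine):

```python
import math

ARCSEC_PER_RAD = 206265.0

def matching_pixel_um(focal_length_mm, eyepiece_fl_mm, eye_res_arcmin=1.0):
    """Pixel spacing whose angular subtense equals the eye's resolution
    worked back through the telescope (eye resolution / magnification)."""
    m = focal_length_mm / eyepiece_fl_mm          # magnification
    res_arcsec = eye_res_arcmin * 60.0 / m        # on-sky resolution
    res_rad = res_arcsec / ARCSEC_PER_RAD
    return focal_length_mm * math.tan(res_rad) * 1000.0  # mm -> microns

# 200 mm f/10 scope (FL = 2000 mm) with a 20 mm eyepiece (100x):
p = matching_pixel_um(2000, 20)  # ~5.82 microns
```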

John

David Jones

John Hayes · Dec 31, 2025 at 09:50 PM


John - always giving some of the most useful and complete scientific and mathematical answers to these sorts of questions. Thank you, sir.

Rick Evans

After reflecting on all this a bit, it occurs to me that maybe the simplest answer to my question is to just hold my cell phone up to the eyepiece and take a photo.

Rick

John Hayes

Rick Evans · Dec 31, 2025, 10:32 PM


Rick,

That may work but depending on the brightness of the object, you may not get a very good result. The autofocus system in your cell phone may also “fight” to achieve the right focus. Go ahead and try it. That’s how we learn; but there is a reason that virtually none of the images that you see on Astrobin are taken with cell phones.

John

Rick Evans

Yes, very true. The results would not approach lucky imaging, but they might capture the image scale well enough to give a better comparison between the eyepiece view and the subsequent lucky-imaging run. It would help pin down what the image scale of the eyepiece view had been.

Rick

John Nedelcu

There are several things you’ve touched on here. Without going into the maths too much, the FoV that a camera and telescope combination gives is a function of the telescope’s focal length and the sensor size:

FoV [arcmin] = 3436 × sensor dimension [mm] / focal length [mm]

You will also have to keep in mind the practical resolving power (R) of your scope (its Dawes limit):

R ["] = 116 / aperture [mm]

You may be thinking of the crop factor applied by camera sensors when compared to the 35mm full frame (APS-C being 1.5x or 1.6x, Micro 4/3 being 2x). For AP, the FoV calculation above is more suitable. Alternatively, you can use online tools such as this: https://astronomy.tools/calculators/field_of_view/
These will do the above calculations and show you what the expected FoV is.

Definitions:

  • FoV: How much of the sky your setup can capture in one frame.

  • Resolving power: The smallest detail your telescope can theoretically separate. In practice this is limited by atmospheric conditions to around 2-3 arcseconds.

  • Image Scale: How much sky each pixel on your camera represents (we aim for 1-3 arcsec/pixel because of atmospheric conditions). The theoretically ideal sampling is an image scale of exactly half the Dawes limit (also referred to as Nyquist sampling).

  • The Dawes limit was derived empirically, not theoretically. The theoretical maximum resolving power of a telescope is given by the Rayleigh Limit = 138/Diameter [mm].
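The definitions above can be put into numbers with a short Python sketch (the camera and scope figures in the example are illustrative assumptions, not specs from this thread):

```python
def fov_arcmin(sensor_mm, focal_length_mm):
    """FoV along one sensor dimension, small-angle approximation:
    FoV [arcmin] = 3436 x sensor size [mm] / focal length [mm]."""
    return 3436.0 * sensor_mm / focal_length_mm

def dawes_limit_arcsec(aperture_mm):
    """Empirical Dawes limit: R ["] = 116 / D [mm]."""
    return 116.0 / aperture_mm

def image_scale_arcsec_per_px(pixel_um, focal_length_mm):
    """Image scale ["/px] = 206.265 x pixel size [um] / focal length [mm]."""
    return 206.265 * pixel_um / focal_length_mm

# Example: 940 mm focal length, 140 mm aperture, 3.76 um pixels,
# sensor 23.5 mm wide (illustrative numbers).
fov = fov_arcmin(23.5, 940)                   # ~85.9 arcmin
dawes = dawes_limit_arcsec(140)               # ~0.83 arcsec
scale = image_scale_arcsec_per_px(3.76, 940)  # ~0.83 arcsec/px
```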
