How do you measure final resolution?

Participants: Tony Gondola, John Hayes, andrea tasselli, Wei-Hao Wang, Arun H
35 replies · 934 views
Tony Gondola
I think we all agree that, between the effects of drizzle and all the other image-improvement techniques in common use, final star FWHM is a useless measurement of overall resolution. So how do you measure the real resolution of a fully processed deep-sky image? Do you calculate a hard floor from optical theory and say that anything beyond it isn't real detail, or do you go in, measure the finest details in the image, and accept what you find? I would really love to know if there is a process that everyone can accept.
Kevin Morefield
I would think that the initial FWHM of your masters (specifically the Luminance) would be the answer.  Subsequent processes are just revealing the information you captured. 

Kevin
andrea tasselli
Get the MTF of your scope+image combo and that would tell all there is to know in terms of resolution. FWHM is only a relative measure.
Tony Gondola
Kevin Morefield:
I would think that the initial FWHM of your masters (specifically the Luminance) would be the answer.  Subsequent processes are just revealing the information you captured. 

Kevin

I'm not sure that's valid, because what I think you're saying is that you can never increase the actual resolution beyond that of the stacked, unprocessed image. I know that in lunar and planetary imaging that's not true. Why would it be true in DSO imaging? The resolution in, say, Chris Go's images of Jupiter never seems to be questioned.
Tony Gondola
andrea tasselli:
Get the MTF of your scope+image combo and that would tell all there is to know in terms of resolution. FWHM is only a relative measure.

Sure, as an indication of potential contrast it would be but how would you measure an actual image that's not of a resolution chart?
John Hayes
Tony Gondola:
I think we all agree that, between the effects of drizzle and all the other image-improvement techniques in common use, final star FWHM is a useless measurement of overall resolution. So how do you measure the real resolution of a fully processed deep-sky image? Do you calculate a hard floor from optical theory and say that anything beyond it isn't real detail, or do you go in, measure the finest details in the image, and accept what you find? I would really love to know if there is a process that everyone can accept.

Tony,
Before declaring FWHM a “useless measurement” of resolution, it’s a good idea to understand what “resolution” is. Optical resolution is defined as the ability to separate two “very small” features. The easy part is that a “very small” feature means one small enough to be defined by the point spread function of the system, which is a star image. The difficulty is in agreeing on what “to separate” means, and there are a number of different definitions. The Rayleigh criterion puts the peak of one Airy pattern directly over the first minimum of the adjacent Airy pattern, which produces a small dip between the two peaks. The Sparrow criterion moves the two PSF patterns close enough that the dip just disappears. The Dawes criterion was determined experimentally, from what a group of sharp-eyed observers could separate (for roughly equally bright stars), and it turns out to be pretty close to the Sparrow criterion. It is generally agreed that resolution is determined using two equally bright stars.
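[For concreteness, the Rayleigh criterion works out to a specific number for a given system; a quick sketch, where the 200 mm aperture and 550 nm wavelength are assumed values chosen purely for illustration:]

```python
import math

# Rayleigh criterion: the peak of one Airy pattern falls on the first
# minimum of its neighbor, at an angle theta = 1.22 * lambda / D.
# Aperture and wavelength below are assumed, illustrative values.
wavelength = 550e-9        # meters (green light)
aperture = 0.200           # meters (a hypothetical 200 mm telescope)

theta_rad = 1.22 * wavelength / aperture
theta_arcsec = math.degrees(theta_rad) * 3600.0
print(f"Rayleigh limit: {theta_arcsec:.2f} arcsec")   # about 0.69"
```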

The bottom line is that FWHM is a direct measure of image resolution. The PSF in an image is not an Airy function; it is the time-averaged point spread function of the optical system formed by telescope + atmosphere + sensor, which is very closely approximated by a Moffat function. The closest that two points can get before they merge into a single oblong point is equal to the FWHM. That is the resolution of the image.
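[The Moffat model is easy to play with numerically; a minimal sketch that checks the standard Moffat FWHM formula against the profile itself. The alpha and beta values are arbitrary, chosen only for illustration:]

```python
import numpy as np

# Moffat profile, a common model for the long-exposure stellar PSF
# (telescope + atmosphere + sensor). Parameters are illustrative.
alpha, beta = 2.0, 2.5

def moffat(r):
    return (1.0 + (r / alpha) ** 2) ** (-beta)

# Analytic FWHM of a Moffat profile
fwhm_analytic = 2 * alpha * np.sqrt(2 ** (1 / beta) - 1)

# Numerical check: locate the half-maximum radius on a fine grid
r = np.linspace(0, 10, 100001)
half_r = r[np.argmin(np.abs(moffat(r) - 0.5))]
print(fwhm_analytic, 2 * half_r)   # the two values agree
```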

John
Tony Gondola
I was hoping you'd weigh in on this. It sounds like the same game that visual observers have been playing for years. My question then becomes: if processing can improve your ability to separate two point sources, is that an actual increase in the resolution of the image? It sounds like it would be, by the above definition, or am I missing something?
Kevin Morefield
Tony Gondola:
Kevin Morefield:
I would think that the initial FWHM of your masters (specifically the Luminance) would be the answer.  Subsequent processes are just revealing the information you captured. 

Kevin

I'm not sure that's valid, because what I think you're saying is that you can never increase the actual resolution beyond that of the stacked, unprocessed image. I know that in lunar and planetary imaging that's not true. Why would it be true in DSO imaging? The resolution in, say, Chris Go's images of Jupiter never seems to be questioned.

What I'm saying is that the information Christopher Go captured was already in the selected set of data he produced that master with. The steps AutoStakkert! goes through to break the image into sectors, select the sharpest sections, and reassemble the image are part of the stacking process. Once that master is produced, you would measure it for a definition of resolution. I'm not sure what you would measure on planetary images, since there generally aren't stars, but the detail you captured is all there before running wavelets, BlurX, or other sharpening or deconvolution.

I believe all of the post-stacking processes are applying contrast at various scales but not increasing resolution. I'd be interested to see if @John Hayes feels that's an apt description.

Kevin
Tony Gondola
Kevin Morefield:
Tony Gondola:
Kevin Morefield:
I would think that the initial FWHM of your masters (specifically the Luminance) would be the answer.  Subsequent processes are just revealing the information you captured. 

Kevin

I'm not sure that's valid, because what I think you're saying is that you can never increase the actual resolution beyond that of the stacked, unprocessed image. I know that in lunar and planetary imaging that's not true. Why would it be true in DSO imaging? The resolution in, say, Chris Go's images of Jupiter never seems to be questioned.

What I'm saying is that the information Christopher Go captured was already in the selected set of data he produced that master with. The steps AutoStakkert! goes through to break the image into sectors, select the sharpest sections, and reassemble the image are part of the stacking process. Once that master is produced, you would measure it for a definition of resolution. I'm not sure what you would measure on planetary images, since there generally aren't stars, but the detail you captured is all there before running wavelets, BlurX, or other sharpening or deconvolution.

I believe all of the post-stacking processes are applying contrast at various scales but not increasing resolution. I'd be interested to see if @John Hayes feels that's an apt description.

Kevin

I agree and that's really the heart of my question.
andrea tasselli
Tony Gondola:
Sure, as an indication of potential contrast it would be but how would you measure an actual image that's not of a resolution chart?


Because resolution means nothing unless it is properly specified in terms of contrast, and an MTF chart will give you exactly that: the potential contrast at any available spatial frequency. See here (I made it so that it is relevant to your system):


You can play around with it here: MTF Analyzer – RC Astro
andrea tasselli
Tony Gondola:
I'm not sure that's valid, because what I think you're saying is that you can never increase the actual resolution beyond that of the stacked, unprocessed image. I know that in lunar and planetary imaging that's not true. Why would it be true in DSO imaging? The resolution in, say, Chris Go's images of Jupiter never seems to be questioned.


As a self-declared expert in the field, I can assure you that isn't true. What you do is increase the contrast using wavelet decomposition, but otherwise the spatial resolution is in agreement with both the maximum instrumental spatial resolution that the optical system can achieve (at the time) and the diffraction limit.
Tony Gondola
andrea tasselli:
Tony Gondola:
Sure, as an indication of potential contrast it would be but how would you measure an actual image that's not of a resolution chart?


Because resolution means nothing unless it is properly specified in terms of contrast, and an MTF chart will give you exactly that: the potential contrast at any available spatial frequency. See here (I made it so that it is relevant to your system):


You can play around with it here: MTF Analyzer – RC Astro

That's quite good, as it is very much in line with what I measure as the limit in a good-quality single sub, about 1.4".
John Hayes
There's a lot of stuff flying around here so let me try to sort it out.
Tony Gondola:
My question then becomes: if processing can improve your ability to separate two point sources, is that an actual increase in the resolution of the image? It sounds like it would be, by the above definition, or am I missing something?


Deconvolution is a valid way to extract a "sharper" image and if it is done properly, that always results in more image resolution.  Deconvolution decreases FWHM, which means that you can more clearly resolve close point sources.   You can easily demonstrate this by looking at FWHM before and after running BXT.
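[This before/after effect can be shown in a toy model. The 1-D sketch below blurs a point source with a Gaussian PSF, runs Richardson-Lucy iterations (a classic deconvolution algorithm, standing in here for any "true" deconvolution; BXT itself works differently), and compares the measured FWHM before and after. All parameters are made up for the demo:]

```python
import numpy as np

# Toy 1-D demonstration: deconvolution reduces the measured FWHM.
x = np.arange(-50, 51, dtype=float)

def gaussian(x, fwhm):
    sigma = fwhm / 2.3548
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def measure_fwhm(profile, x):
    half = profile.max() / 2
    above = x[profile >= half]
    return above.max() - above.min()

psf = gaussian(x, fwhm=8.0)
star = gaussian(x, fwhm=8.0)     # observed star = the PSF itself
star = star / star.max()

# Richardson-Lucy iterations (noise-free toy case, flat starting guess)
psf_flip = psf[::-1]
estimate = np.ones_like(star)
for _ in range(100):
    conv = np.convolve(estimate, psf, mode="same")
    ratio = star / np.maximum(conv, 1e-12)
    estimate *= np.convolve(ratio, psf_flip, mode="same")

print(measure_fwhm(star, x),
      measure_fwhm(estimate / estimate.max(), x))  # after < before
```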
Kevin Morefield:
What I'm saying is that the information Christopher Go captured was already in the selected set of data he produced that master with. The steps AutoStakkert! goes through to break the image into sectors, select the sharpest sections, and reassemble the image are part of the stacking process. Once that master is produced, you would measure it for a definition of resolution. I'm not sure what you would measure on planetary images, since there generally aren't stars, but the detail you captured is all there before running wavelets, BlurX, or other sharpening or deconvolution.

I believe all of the post-stacking processes are applying contrast at various scales but not increasing resolution. I'd be interested to see if @John Hayes feels that's an apt description.

I pretty much agree, Kevin. AutoStakkert! uses a piece-wise method of extracting the very best images from a stack without relying specifically on any PSF data. That means that the actual PSF may vary a bit over the field, but that is normal for almost any optical system. The thing to remember is that the image is formed by convolving the object with the PSF of the system. In the case of a planetary image we generally don't see an image of a point source, but if we did, it would be very small, and it's that small size that dictates the ultimate resolution, and hence sharpness, that you get out of the image. Without a point source, it's hard to measure anything to put a number on it. It might be possible to train a neural net to estimate the PSF from a planetary image--sort of the reverse of what BXT does; but although it's interesting, that's not a very useful thing to do.
andrea tasselli:
Get the MTF of your scope+image combo and that would tell all there is to know in terms of resolution. FWHM is only a relative measure.

As you may know, I love physical optics, but MTF is not the same as image resolution. The MTF is the modulus of the OTF. In transform space, the transform of the image is simply given by the OTF times the transform of the irradiance distribution of the object. The MTF and PTF show how the amplitude and phase of each spatial frequency are transferred through the system, but they are not a direct measure of spatial resolution. Spatial resolution is defined by the ability of an optical system to separate two close point sources. A system with a "terrible" MTF (say, due to defocus) will always produce a worse image than one with an ideal MTF, so MTF affects resolution; but that's not how resolution is determined.
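[The relationship between these quantities can be sketched numerically: the OTF is the Fourier transform of the PSF, the MTF its modulus, and the PTF its phase. A minimal 1-D sketch, where the Gaussian PSF and its width are illustrative assumptions:]

```python
import numpy as np

# PSF -> OTF -> MTF/PTF, in one dimension.
x = np.arange(-64, 64, dtype=float)
sigma = 3.0
psf = np.exp(-0.5 * (x / sigma) ** 2)
psf /= psf.sum()                          # normalize so OTF(0) = 1

otf = np.fft.fft(np.fft.ifftshift(psf))   # shift the PSF peak to index 0
mtf = np.abs(otf)                         # modulus of the OTF
ptf = np.angle(otf)                       # phase of the OTF

print(mtf[0])   # 1.0 at zero spatial frequency
```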


John
andrea tasselli
I'd have to say that, insofar as extended objects are concerned, the resolution as given by FWHM values is only a partial approximation of the actual capability to resolve details other than point sources. I'd also disagree that resolution is uniquely defined by the ability to separate point sources, although this is one possibility. Besides, in all practical respects (at least for direct imaging systems) the modulus of the OTF is what matters, not the phase information.
John Hayes
andrea tasselli:
I'd have to say that, insofar as extended objects are concerned, the resolution as given by FWHM values is only a partial approximation of the actual capability to resolve details other than point sources. I'd also disagree that resolution is uniquely defined by the ability to separate point sources, although this is one possibility. Besides, in all practical respects (at least for direct imaging systems) the modulus of the OTF is what matters, not the phase information.

OK. First off, although we generally don't care about the PTF, when there's a phase reversal the MTF contrast passes through zero, so phase can have a big effect...as shown below. The phase is also very important when it comes to actually using the OTF to compute anything--not to mention that it's what you measure with an interferometer. MTF is good as a diagnostic tool, but it's only half of the information contained in the OTF.
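[A 1-D toy model of such a contrast reversal, using a top-hat PSF as a stand-in for heavy defocus (an assumption for illustration only): its OTF is a sinc-like function whose real part goes negative past the first zero, which pure MTF (the modulus) cannot show.]

```python
import numpy as np

# Top-hat PSF: the OTF (a Dirichlet/sinc-like kernel) changes sign,
# i.e. contrast reverses, while the MTF merely touches zero.
n = 256
psf = np.zeros(n)
psf[:11] = 1.0                 # 11-sample top-hat...
psf = np.roll(psf, -5)         # ...recentered on index 0 (circularly)
psf /= psf.sum()               # normalize so OTF(0) = 1

otf = np.fft.fft(psf).real     # symmetric PSF -> the OTF is real
print(otf.min())               # negative: a contrast (phase) reversal
```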



Second, the commonly accepted definition of resolution is a measure of how close you can get two equally bright point sources and still be able to tell that there are two points. If you are going to disagree with how everyone else defines optical resolution, please tell us how you define it. And to be clear: I'm not asking what factors affect resolution. The question is: how are you defining resolution in a measurable way?

John
andrea tasselli
The standard definition is in cycles per unit length or cycles per radian, AFAIK. How that applies to astronomical imagery is, however, debatable, and I don't think the average FWHM value would define it at all frequencies and in all locations.
Wei-Hao Wang
I hope people here can first realize that "resolution" has a few widely accepted definitions in professional astronomy, and that they are not arbitrary. Why? It would sound silly if one day an amateur astronomer claimed that his small backyard telescope has a resolution higher than that claimed by professional astronomers for a 2 m telescope at a good astronomical site. Such a silly thing can happen if the amateur side is too arbitrary in how they (we) use the term "resolution."

In professional astronomy, the most commonly used measure to describe image resolution is stellar FWHM, both for resolution dominated by diffraction and for resolution that's dominated by seeing.  For diffraction, the Rayleigh limit is also often used, but FWHM of the central Airy disk is equally valid.  

Then, for FWHM measurement, it is widely accepted that it should only be measured before any sharpening or deconvolution, at least in the optical wavelength regime. Post-processing (sharpening, deconvolution, whatever) can help reveal details close to or even beyond the resolution limit, and this is definitely allowed in professional astronomy. However, since different researchers can apply different post-processing to the same dataset, we don't consider FWHM (or anything else measured after such sharpening) an objective way to describe the resolution of that dataset. For example, one might arbitrarily tune up the deconvolution parameters to give a FWHM that's 10x smaller than the FWHM before deconvolution, but this is likely to create a lot of image artifacts. How do you balance the aggressiveness of the deconvolution against the amount of artifacts? That balance is a personal choice, and the choice shouldn't affect the "resolution" of a dataset.

So, short answer: FWHM before sharpening/deconvolution.
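[The conventional measurement described above can be sketched as a fit to an unsharpened star profile. The synthetic noisy Gaussian star below stands in for a real cutout, and all of its parameters are made up; here sigma is recovered from a parabola fit to log(flux), one common trick, rather than from any particular pipeline:]

```python
import numpy as np

# Estimate a star's FWHM from (unsharpened) data: for a Gaussian star,
# log(flux) is a parabola, so fit one over the bright core.
rng = np.random.default_rng(1)
x = np.arange(-20, 21, dtype=float)
true_fwhm = 6.0
sigma_true = true_fwhm / (2 * np.sqrt(2 * np.log(2)))
star = np.exp(-0.5 * (x / sigma_true) ** 2) + rng.normal(0, 0.005, x.size)

core = np.abs(x) <= 4
coeffs = np.polyfit(x[core], np.log(star[core]), 2)   # a*x^2 + b*x + c
sigma_fit = np.sqrt(-1.0 / (2.0 * coeffs[0]))         # a = -1/(2 sigma^2)
fwhm = 2 * np.sqrt(2 * np.log(2)) * sigma_fit
print(f"measured FWHM: {fwhm:.2f} pixels")
```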
Tony Gondola
Wei-Hao Wang:
I hope people here can first realize that "resolution" has a few widely accepted definitions in professional astronomy, and that they are not arbitrary. Why? It would sound silly if one day an amateur astronomer claimed that his small backyard telescope has a resolution higher than that claimed by professional astronomers for a 2 m telescope at a good astronomical site. Such a silly thing can happen if the amateur side is too arbitrary in how they (we) use the term "resolution."

In professional astronomy, the most commonly used measure to describe image resolution is stellar FWHM, both for resolution dominated by diffraction and for resolution that's dominated by seeing.  For diffraction, the Rayleigh limit is also often used, but FWHM of the central Airy disk is equally valid.  

Then, for FWHM measurement, it is widely accepted that it should only be measured before any sharpening or deconvolution, at least in the optical wavelength regime. Post-processing (sharpening, deconvolution, whatever) can help reveal details close to or even beyond the resolution limit, and this is definitely allowed in professional astronomy. However, since different researchers can apply different post-processing to the same dataset, we don't consider FWHM (or anything else measured after such sharpening) an objective way to describe the resolution of that dataset. For example, one might arbitrarily tune up the deconvolution parameters to give a FWHM that's 10x smaller than the FWHM before deconvolution, but this is likely to create a lot of image artifacts. How do you balance the aggressiveness of the deconvolution against the amount of artifacts? That balance is a personal choice, and the choice shouldn't affect the "resolution" of a dataset.

So, short answer: FWHM before sharpening/deconvolution.

That might be a good scientific solution but it leaves me wanting as far as describing a final image is concerned.
John Hayes
andrea tasselli:
The standard definition is in cycles per unit length or cycles per radian, AFAIK. How that applies to astronomical imagery is, however, debatable, and I don't think the average FWHM value would define it at all frequencies and in all locations.

I'm sorry, Andrea, but that is not how optical resolution is defined--or measured. You are merely listing units of spatial frequency (in either object space or image space). You are correct that it's not unusual for FWHM values to vary over the field, and that means that the resolution also varies over the field. A good example occurs with a comatic field, which produces the best image quality at the center, falling off as you move outward. Astigmatic imaging may produce better resolution in one direction than the other, depending on how the astigmatism is balanced against defocus.

Optical resolution is defined by the minimum detectable separation between two point sources. The figures below are some calculations that I did a number of years ago for an article, showing both the Rayleigh and Sparrow criteria for a perfect optical system. Obviously there is some wiggle room in deciding which criterion works best for detecting two close stars, and that's why there are three criteria that generally get tossed around. For astronomical telescopes, the optics, the atmosphere, and the sensor all contribute to the ability to clearly separate two close stars. Under the atmosphere we aren't dealing with Airy patterns; we are dealing with Moffat functions, and FWHM, by its very definition, is a "Sparrow" criterion for separating two stars.
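[The Sparrow criterion for a Moffat PSF can be located numerically: scan the separation of two equal profiles until the curvature at the midpoint changes sign, i.e. until a central dip first appears. The Moffat parameters below are arbitrary illustrative values:]

```python
import numpy as np

# Find the Sparrow separation for two equal Moffat profiles.
alpha, beta = 2.0, 2.5
fwhm = 2 * alpha * np.sqrt(2 ** (1 / beta) - 1)

def moffat(r):
    return (1.0 + (r / alpha) ** 2) ** (-beta)

def curvature_at_midpoint(sep, eps=1e-4):
    # finite-difference second derivative of the combined profile at x = 0
    vals = [moffat(x - sep / 2) + moffat(x + sep / 2)
            for x in (-eps, 0.0, eps)]
    return (vals[0] - 2 * vals[1] + vals[2]) / eps ** 2

seps = np.linspace(0.2 * fwhm, 1.5 * fwhm, 1301)
curv = np.array([curvature_at_midpoint(s) for s in seps])
sparrow = seps[np.argmax(curv > 0)]   # first separation with a central dip
print(f"Sparrow separation is about {sparrow / fwhm:.2f} x FWHM")
```

For this particular beta the dip appears somewhat below one FWHM of separation, which is consistent with FWHM acting as a Sparrow-like merge scale.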



 

John
John Hayes
Wei-Hao Wang:
So, short answer: FWHM before sharpening/deconvolution.


I agree with everything you said...in the raw image.  However, deconvolution alters the width of the PSF in the image and that improves spatial resolution in the image.  That's why we do it!  The caveat is that it has to be a "true" deconvolution process.  Simply shrinking the diameter of the stars doesn't count.

John
John Hayes
Tony Gondola:
That might be a good scientific solution but it leaves me wanting as far as describing a final image is concerned.


That might mean that you aren't asking the right question...


John
Wei-Hao Wang
Tony Gondola:
That might be a good scientific solution but it leaves me wanting as far as describing a final image is concerned.

I think you can still say "my final image has a FWHM of xxxx."  People can always think there is some connection between this FWHM and some kind of image sharpness.  Just don't call this "resolution."
Wei-Hao Wang
John Hayes:
I agree with everything you said...in the raw image.  However, deconvolution alters the width of the PSF in the image and that improves spatial resolution in the image.  That's why we do it!  The caveat is that it has to be a "true" deconvolution process.  Simply shrinking the diameter of the stars doesn't count.

John

Hi John,

Strictly speaking, "true" deconvolution doesn't exist for data with finite sampling and noise.  For such data, a unique mathematical solution for deconvolution does not exist.  We only have approximate deconvolution.  Because of this, astronomers don't often quote the FWHM or whatever limit after deconvolution as "resolution." But I think it's definitely true that a good deconvolution algorithm allows us to see something beyond the Rayleigh limit or seeing limit, at least on the high S/N part of the image.
Tony Gondola
Wei-Hao Wang:
Tony Gondola:
That might be a good scientific solution but it leaves me wanting as far as describing a final image is concerned.

I think you can still say "my final image has a FWHM of xxxx."  People can always think there is some connection between this FWHM and some kind of image sharpness.  Just don't call this "resolution."

I think that's as close as I'm going to get, and I'm certainly comfortable with it, as it does describe at least a certain aspect of the image after processing--something you can measure and compare.
John Hayes
Wei-Hao Wang:
John Hayes:
I agree with everything you said...in the raw image.  However, deconvolution alters the width of the PSF in the image and that improves spatial resolution in the image.  That's why we do it!  The caveat is that it has to be a "true" deconvolution process.  Simply shrinking the diameter of the stars doesn't count.

John

Hi John,

Strictly speaking, "true" deconvolution doesn't exist for data with finite sampling and noise.  For such data, a unique mathematical solution for deconvolution does not exist.  We only have approximate deconvolution.  Because of this, astronomers don't often quote the FWHM or whatever limit after deconvolution as "resolution," although it's definitely true that a good deconvolution algorithm allows us to see something beyond the Rayleigh limit or seeing limit, at least on the high S/N part of the image.

Wei-Hao,
I agree that there is no analytic solution to deconvolution that fully recovers the irradiance distribution of the object. Instead we use algorithms that do "partial deconvolution" to recover an image that has more detail and higher spatial resolution. The algorithms used by HST and JWST iterate against an error function to produce a "best guess" that minimizes potential errors. BXT uses a different strategy that produces a similar (I'd argue better) result. Neural-net solutions are typically far less sensitive to noise than iterative methods. Regardless, whenever you partially deconvolve an image, you are producing a result that is representative of a narrower point response. I can understand that there might be a convention to refer only to the FWHM of the original data; but if a deconvolution algorithm has been applied, that number won't apply to the processed data.

John