Stacked sensors

7 replies · 167 views
Tony Gondola avatar

I wonder if the new stacked image sensors that are coming out will make it possible to image and guide with the same chip. No more OAGs or separate guide scopes. A full-frame sensor can achieve a readout rate of 20 fps with this technology, so it seems possible. With the right software it would be transparent to the user. The total integration time could be summed in camera while a short-duration guide frame is taken every few seconds. The guide exposures could then be rolled back into the internal summing that makes up the total sub, so the total sub length really hasn't changed. Am I missing something, or is this the future?
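The summing scheme described above can be sketched as a toy numerical model (this is purely illustrative, not any real camera's firmware or API): split one long sub into many short reads, hand each short read to the guiding loop, and sum them back so the combined frame carries the same total signal as a single long exposure.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_sub(total_s=300.0, guide_interval_s=3.0, flux=5.0, shape=(8, 8)):
    """Toy model of in-camera summing: one 'total_s' sub is built from
    short reads taken every 'guide_interval_s' seconds, so each read can
    double as a guide frame without losing integration time.
    'flux' is signal in electrons/pixel/second (made-up number)."""
    n_reads = int(total_s / guide_interval_s)
    summed = np.zeros(shape)
    for _ in range(n_reads):
        # each short read collects Poisson photon counts for its interval
        guide_frame = rng.poisson(flux * guide_interval_s, shape).astype(float)
        # ...guide_frame would be handed to the guiding loop here...
        summed += guide_frame  # rolled back into the running sub
    return summed

sub = simulate_sub()
# mean signal per pixel is close to flux * total_s, as for one long exposure
print(sub.mean())
```

The point of the sketch is that the summed frame is statistically the same exposure you would have gotten without the intermediate reads (ignoring the extra read noise each short read would add on a real sensor, which is the practical catch).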

Spacey avatar

If you mean reading data out while still exposing, that would require additional and substantial readout and memory logic on the CMOS sensor, which reduces performance. This is a good topic for a discussion with AI to get some insights into the evolution of Sony's CMOS technology.

Robin Bosshard avatar

Wouldn’t we hit the DUO dilemma again? With NB filters, the short guiding frames will struggle to get good SNR (if you can even manage to keep them short)…

Tony Gondola avatar

Spacey · Mar 24, 2026, 12:41 AM

If you mean reading data out while still exposing, that would require additional and substantial readout and memory logic on the CMOS sensor, which reduces performance. This is a good topic for a discussion with AI to get some insights into the evolution of Sony's CMOS technology.

No, you need to read up on what a stacked sensor does. The main advantage is very fast readout: even a full-frame sensor can manage 20 fps. In consumer cameras, where the technology is currently deployed, it allows much faster autofocus and does away with rolling-shutter effects. The rest is just imagining what this technology might make possible on the astro camera side.

Tony Gondola avatar

Robin Bosshard · Mar 24, 2026, 01:04 AM

Wouldn’t we hit the DUO dilemma again? With NB filters, the short guiding frames will struggle to get good SNR (if you can even manage to keep them short)…

Yes, you certainly would, but if you can fold the data from the guide frames into the total for the sub, it really wouldn't matter. In other words, the guide frames could be as long as you need them to be without impacting integration time.
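The narrowband SNR problem Robin raises can be put in rough numbers with a back-of-envelope guide-star SNR model (all the flux and noise figures below are hypothetical, chosen only to illustrate the scaling):

```python
import math

def star_snr(t_s, star_eps, sky_eps, read_noise_e, n_pix=9):
    """Approximate SNR of a guide star measured in an n_pix-pixel aperture:
    signal / sqrt(signal + sky + read noise), all in electrons.
    star_eps / sky_eps are electrons per second (per star / per pixel)."""
    signal = star_eps * t_s
    noise = math.sqrt(signal + n_pix * (sky_eps * t_s + read_noise_e ** 2))
    return signal / noise

# Hypothetical star fluxes: faint through a 3 nm filter vs. broadband.
for label, star_eps in (("3 nm NB", 2.0), ("broadband", 120.0)):
    for t in (1.0, 5.0, 30.0):
        snr = star_snr(t, star_eps, sky_eps=0.2, read_noise_e=2.0)
        print(f"{label:9s} t={t:5.1f}s  SNR={snr:6.1f}")
```

With these made-up numbers, the narrowband star needs tens of seconds to reach the SNR a broadband star gives in one second, which is exactly why being able to fold long guide frames back into the sub, rather than throwing that integration time away, would matter.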

Alex Nicholas avatar

It’s been tried in a sense in the past by Starlight Xpress, using interlaced pixel row readout (yes, I’m aware this is not the exact same thing - but, you know, elephant that is a perfect sphere in a vacuum etc.)

At the end of the day - if you’re imaging and guiding with the same camera, regardless of the technology used to achieve it, your issue is going to be guiding through filters…

I’ve done it with 3nm filters and an F/10 SCT with an SBIG ST-10XME in 2010–2011 or thereabouts… You CAN… but you don’t want to if there is a better way… Spoiler Alert! There is.

My primary question is this. What problem would this solve that isn’t better solved by an OAG, or OnAG?

None.

Change for the sake of change is wasteful, and sure, somewhere between the lines, innovation exists… But honestly, guiding through your filters is sub-par, so any form of dual-sensor/stacked-sensor/interlaced-readout guiding is going to be sub-par.

Spacey avatar

Tony Gondola · Mar 24, 2026, 02:49 AM

Spacey · Mar 24, 2026, 12:41 AM

If you mean reading data out while still exposing, that would require additional and substantial readout and memory logic on the CMOS sensor, which reduces performance. This is a good topic for a discussion with AI to get some insights into the evolution of Sony's CMOS technology.

No, you need to read up on what a stacked sensor does. The main advantage is very fast readout: even a full-frame sensor can manage 20 fps. In consumer cameras, where the technology is currently deployed, it allows much faster autofocus and does away with rolling-shutter effects. The rest is just imagining what this technology might make possible on the astro camera side.

I’m going to tell you that you need to read up on what a stacked sensor is, and what it is not.

Tony Gondola avatar

Alex Nicholas · Mar 24, 2026, 03:33 AM

It’s been tried in a sense in the past by Starlight Xpress, using interlaced pixel row readout (yes, I’m aware this is not the exact same thing - but, you know, elephant that is a perfect sphere in a vacuum etc.)

At the end of the day - if you’re imaging and guiding with the same camera, regardless of the technology used to achieve it, your issue is going to be guiding through filters…

I’ve done it with 3nm filters and an F/10 SCT with an SBIG ST-10XME in 2010–2011 or thereabouts… You CAN… but you don’t want to if there is a better way… Spoiler Alert! There is.

My primary question is this. What problem would this solve that isn’t better solved by an OAG, or OnAG?

None.

Change for the sake of change is wasteful, and sure, somewhere between the lines, innovation exists… But honestly, guiding through your filters is sub-par, so any form of dual-sensor/stacked-sensor/interlaced-readout guiding is going to be sub-par.

The major advantage, to my mind, is the ability to use the entire sensor for guiding. That would resolve the filter issue to some degree, simply because more and brighter stars would be available.
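Picking a guide star from the whole sensor could look something like this toy routine (a sketch only, not any real guiding software): find the brightest pixel anywhere in the frame, then refine its position with an intensity-weighted centroid in a small box around it.

```python
import numpy as np

def brightest_star_centroid(frame, box=5):
    """Locate the brightest pixel in the full frame, then refine its
    position with an intensity-weighted centroid in a box around it."""
    y0, x0 = np.unravel_index(np.argmax(frame), frame.shape)
    h = box // 2
    ylo, xlo = max(y0 - h, 0), max(x0 - h, 0)
    patch = frame[ylo:y0 + h + 1, xlo:x0 + h + 1].astype(float)
    patch = patch - patch.min()  # crude local background subtraction
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    total = patch.sum()
    cy = (ys * patch).sum() / total + ylo
    cx = (xs * patch).sum() / total + xlo
    return cy, cx

# Synthetic frame: one star as a bright cross of pixels at (40, 60).
frame = np.zeros((100, 100))
frame[40, 60] = 100.0
frame[39, 60] = frame[41, 60] = frame[40, 59] = frame[40, 61] = 50.0
print(brightest_star_centroid(frame))  # close to (40.0, 60.0)
```

With the whole frame to choose from, the routine is free to lock onto whichever star survives the filter best, which is the advantage being argued for here.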
