I wonder if the new stacked image sensors that are coming out will make it possible to image and guide with the same chip. No more OAGs or separate guide scopes. A full-frame sensor can achieve a readout rate of 20 fps with this technology, so it seems possible. With the right software it would be transparent to the user. The total integration time could be summed in camera while a short-duration guide frame is taken every few seconds. Those guide exposures could then be rolled back into the internal summing that makes up the total sub, so the effective sub length really hasn't changed. Am I missing something, or is this the future?
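The summing scheme described above can be sketched in a few lines. This is purely a hypothetical illustration, not any real camera's firmware: the frame rate, sub length, guide interval, and the `read_frame`/`centroid` helpers are all made-up stand-ins. The key idea is that every short readout is added to the accumulating sub, and every few seconds one of those same readouts is also handed to the guiding loop, so no photons are sacrificed to guiding.

```python
import numpy as np

FPS = 20                 # assumed stacked-sensor readout rate
SUB_LENGTH_S = 120       # desired total sub length in seconds
GUIDE_INTERVAL_S = 2     # pull a guide frame every few seconds

def read_frame(rng):
    """Stand-in for one short sensor readout (simulated star field)."""
    frame = rng.poisson(5.0, size=(64, 64)).astype(np.float64)
    frame[30:34, 30:34] += 50.0  # a bright 'star' for the guider to centroid
    return frame

def centroid(frame):
    """Intensity-weighted centroid: a crude guide-star position estimate."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return (ys * frame).sum() / total, (xs * frame).sum() / total

rng = np.random.default_rng(0)
accum = np.zeros((64, 64))
n_frames = FPS * SUB_LENGTH_S
for i in range(n_frames):
    frame = read_frame(rng)
    accum += frame                       # every readout is summed into the sub
    if i % (FPS * GUIDE_INTERVAL_S) == 0:
        y, x = centroid(frame)           # the same frame doubles as a guide exposure
        # (here the y, x offset would be sent to the mount's guiding loop)

# the finished sub is the sum of all short readouts: guiding cost no exposure time
```

Whether a real camera would do this on-chip, in an FPGA, or in driver software is an open question, but the accounting is the same: the guide frames are a free by-product of readouts that were happening anyway.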