Hi folks,
Yesterday I was processing a new set of M101 data. I thought I could try adding data from one year ago to take advantage of the additional signal. But, to my surprise, the data did not match. The focal lengths were different, but I followed my usual routine with Astro Pixel Processor, which always works with data taken on nebulae (though with a few months between the shots, not a year or more).
The result was as follows: the stars were matched, but the galaxy was not; it had moved. Data on the galaxy from this year and from one year ago show a drift. So my question is: I guess this means the galaxy is in a different position relative to the stars in our Milky Way than it was one year ago. Does this mean there is a time limit beyond which you cannot match two images, because the local star field and the background object have drifted apart?
I had not thought about it, but of course it makes sense: over time, objects change their relative positions (and even their shapes).
Maybe this is common knowledge among astrophotographers, but I had never thought of this possibility.
If so, how long, on average, does it take for an object to show this drift? I suppose it is more evident at longer focal lengths, and probably more so in galaxies, which are bigger masses, than in nebulae, which are lighter and more ethereal; a nebula's relation to the star field is also tighter, since nebulae sit in the same galaxy as the stars, our Milky Way.
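For anyone who wants to put rough numbers on it, here is a small sketch of the kind of yearly motion it would take to show up as a one-pixel shift. The pixel size and focal length below are made-up example values, not my actual setup; the only real formula is the standard plate-scale relation (206.265 × pixel size in µm ÷ focal length in mm):

def plate_scale(pixel_size_um, focal_length_mm):
    """Image scale in arcsec/pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_size_um / focal_length_mm

# Hypothetical setup: 3.76 um pixels on a 1000 mm focal length telescope.
scale = plate_scale(3.76, 1000.0)   # ~0.78 arcsec/pixel
print(f"Plate scale: {scale:.2f} arcsec/px")

# An object would need to move this many arcseconds per year
# to drift by one pixel between two sessions taken a year apart:
print(f"Drift needed for 1 px/yr: {scale:.2f} arcsec/yr")

Longer focal lengths or smaller pixels shrink the plate scale, so the same angular drift covers more pixels, which matches my guess that the effect would be more evident at higher focal lengths.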
I bring up this topic because I think it might be of interest to all.