ASI2600 File Size Management

Linwood Ferguson · Jerry Yesavage · vercastro · Stuart Taylor · John Hayes
39 replies · 2.1k views
Ed Litoborski
So I just upgraded from the ASI533MC Pro to the 2600MC Pro, and just to be able to run WBPP on the first-light image I had to run out and get a 5TB external HD. I only had a 1TB drive.
Question is: going forward, what can I expect?  What do you all use?  Any file size long term storage tips and tricks?
Thanks in advance and clear skies.
MikeY_Astro
I use an expandable NAS to store the data. At 50MB per image, it adds up quick. For my workstation, I have a 1TB SSD for OS and apps, and a 1TB SSD for images that I'm currently processing. Once I'm done processing, I move the data to the NAS.
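MikeY's 50 MB-per-sub figure makes the growth rate easy to estimate. A back-of-envelope Python sketch (the sub count and clear-night figures are illustrative assumptions, not anyone's actual cadence):

```python
# Rough storage growth for an ASI2600-class camera (~50 MB per sub).
# All inputs are assumptions -- tune them for your own sessions.
SUB_SIZE_MB = 50        # one unbinned 16-bit sub from a ~26 MP sensor
SUBS_PER_NIGHT = 120    # e.g. 120 x 180 s = 6 hours of lights
NIGHTS_PER_MONTH = 8    # clear nights actually imaged

monthly_gb = SUB_SIZE_MB * SUBS_PER_NIGHT * NIGHTS_PER_MONTH / 1000
yearly_tb = monthly_gb * 12 / 1000
print(f"~{monthly_gb:.0f} GB/month, ~{yearly_tb:.2f} TB/year before calibration frames")
```

Add calibration libraries and a second target per night and this climbs quickly toward the multi-TB-per-year figures quoted later in the thread.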
Stuart Taylor
I also have a 2600 and, as Mikey says, the unbinned files are 50MB or so each. I keep the most recent datasets on my laptop until they are processed. Then I archive off all the raw subs, flats, etc. to a Seagate 5TB USB 3.0 external drive.
Dave Erickson
I operate remotely, using ASI6200 cameras; subframe files are 120 MB. For each camera I have 1TB local storage on the respective control computer, and a 10TB NAS at the remote site for backup. For processing I have a 14TB working drive and a larger rack-mounted NAS-mirror drive for archival storage.
Linwood Ferguson
Buy more disk. 

I use a large working area, for all current projects, and then offload the subs for finished projects to two separate USB drives.

But fundamentally, like video, this is a hobby that takes large amounts of disk.  It's just part of the expense.  There is no magic.
Vivian Budnik
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian
Linwood Ferguson
Vivian Budnik:
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian

That's only viable for people with really fast internet connections (including up). While I'm 300 Mbps or so down, I'm only 11 Mbps up. That takes a LONG time for anything large. Though the US is so far behind other places in internet speeds, that may be more of a US thing.
Quinn Groessl
Vivian Budnik:
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian

The problem I have with that is the cost. I could put together a NAS with about 10TB of storage for about $525. From what I can see, Dropbox costs $16.58/month for only 3TB of storage, so it would only take 2.5ish years to break even.

It's the same reason I bought PixInsight instead of going with Photoshop: $260 once vs $21/month for the rest of time. Only 12 months to break even on that one.
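Quinn's break-even arithmetic generalizes to any one-time purchase vs. subscription comparison. A small sketch using the figures from the post:

```python
# One-time purchase vs. recurring subscription: months until the
# up-front option becomes cheaper. Figures are the ones quoted in the post.
def breakeven_months(one_time_cost: float, monthly_cost: float) -> float:
    return one_time_cost / monthly_cost

nas_vs_dropbox = breakeven_months(525.00, 16.58)   # 10 TB NAS vs 3 TB Dropbox tier
pi_vs_ps = breakeven_months(260.00, 21.00)         # PixInsight vs Photoshop
print(f"NAS pays off after ~{nas_vs_dropbox:.0f} months, PixInsight after ~{pi_vs_ps:.0f}")
```

Note the NAS comparison is conservative: the $525 build holds over three times the 3TB Dropbox tier it is being measured against.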
Dan Kusz
Hello Ed,

In 2022 I had just over 320 hours of integration, plus all the associated calibration frames for each image. Using my 2600MM, that has taken up 1.8TB of storage space. I use a 5TB HD and have 2 others for backup. I generally use only one hard drive per year, so for 2023 I have a fresh 5TB drive ready to go. It might be a bit much, but I have had HD failures and lost thousands of frames, so I am extra cautious with my files. I may start using a cloud service as backup for ultimate redundancy.

CS
Dan.
vercastro
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease data usage.
Stuart Taylor
vercastro:
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease data usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?
vercastro
Stuart Taylor:
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease data usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?

Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be a negative depending on your workflow) and they take slightly more CPU horsepower to decompress.
TurtleCat
I aggressively cull subs to keep file size down but they all zip very nicely. Usually a 20-40% reduction in size so periodically I'll just zip up the raw files for a project I'm basically done working on. I'm also not someone who keeps files around forever so I'll delete stuff that's hopeless, lol.
Linwood Ferguson
<deleted>  (Sorry, I was mentioning .xisf before noting it was already discussed)
Ola Skarpen SkyEyE
Dropbox
Stuart Taylor
vercastro:
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease data usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?

Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be negative depending on your workflow) and they take slightly more CPU horsepower to decompress.

Great! Thanks. I only use PixInsight, so that sounds good!
Monty Chandler
The good news is 16TB drives are cheap. As a photog I have many of these. I use EMC Retrospect to keep a backup of all the data as well.
John Hayes
Dave Erickson:
I operate remotely, using ASI6200 cameras; subframe files are 120 MB. For each camera I have 1TB local storage on the respective control computer, and a 10TB NAS at the remote site for backup. For processing I have a 14TB working drive and a larger rack-mounted NAS-mirror drive for archival storage.

Like Dave, I use a NAS system to back up the 120MB files from my remote system in Chile. It is set up to be fully automatic, so the data just appears on my NAS drive as it is taken. That one scope typically generates 5-8 GB/day, and with nearly 300 clear nights/yr that requires a capacity of 1.5-2.5 TB/yr. I'm about to install a second scope, which will double the storage requirements.

NAS systems are FAR less expensive than using something like Google Drive or Dropbox. My system currently has 4x12 TB drives configured as SHR + RAID5 to give 36 TB of storage. I have an additional, identical system offsite for backup. I use these NAS systems only for image data, so they should last roughly 18 years with one scope and maybe 10 years with two. The system is expandable if I need more space. Here's the system that I use: https://www.synology.com/en-us/products/DS923+. With four Seagate IronWolf 12 TB drives this system cost around $1500. You'll also want to install a good quality UPS, and in my case I had to upgrade my internet service to increase the upload speed to make it easier to download data remotely. I may eventually upgrade my home internet to fiber to get symmetric data speeds (same upload and download speeds), but that's in the future. It's not super cheap, but it's a one-time investment that lasts for a decade; if you look at what it would cost to save that much data on Google Drive or Dropbox over that period of time, it's no contest.
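John's capacity and lifetime figures are easy to re-derive for a different rig; a quick sketch using the numbers from his post:

```python
# NAS lifetime at a given acquisition rate, using the post's figures:
# 5-8 GB per clear night, ~300 clear nights/yr, 36 TB usable (SHR + RAID5).
gb_per_night = (5 + 8) / 2      # midpoint of the quoted range
clear_nights_per_year = 300
usable_tb = 36

tb_per_year = gb_per_night * clear_nights_per_year / 1000
years_of_capacity = usable_tb / tb_per_year
print(f"~{tb_per_year:.2f} TB/yr -> ~{years_of_capacity:.0f} years for one scope")
```

A second scope doubles the data rate and roughly halves the lifetime, consistent with the "maybe 10 years with two scopes" figure above.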

One other thing about both Google Drive and Dropbox is that both services have dropped their backup services and replaced them with "sync" services. A sync service is not suitable as a pure data backup, since it keeps the target drive synchronized with the source drive. That means that if you erase data on the observatory drive, it gets erased on the synchronized drive--and you lose it all! That's bad. NAS systems can be configured the same way -or- they can be configured as a true backup system, which is what you want! The other problem with both Google Drive and Dropbox is that it's hard to download a folder with all of the ~6 GB of data taken in a single night. There are ways to do it, but it's common for the download to stop and need to be restarted. It is NOT a smooth process.


-John
Linwood Ferguson
Incidentally, one space management technique I don't see mentioned:  get where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.
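At a fixed total integration time, storage scales inversely with sub length. A quick sketch of Linwood's example, using the ~50 MB sub size quoted earlier in the thread:

```python
# Same total integration, different sub lengths: fewer, longer subs
# means proportionally less disk. Assumes ~50 MB per sub (ASI2600-class).
SUB_SIZE_MB = 50
TOTAL_S = 200 * 60          # 12000 s of integration, as in the example

for exposure_s in (60, 120, 240):
    subs = TOTAL_S // exposure_s
    gb = subs * SUB_SIZE_MB / 1000
    print(f"{subs:>4} x {exposure_s:>3} s -> {gb:4.1f} GB")
```

Quadrupling the exposure length cuts the per-project storage to a quarter, provided guiding and sky brightness allow it.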
John Hayes
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned:  get where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

Agreed.  This is one strategy, but the other that I've suggested to SGP is to allow sub-frame imaging.  This is easily done in MaximDL and a few other programs, but it is notably lacking in SGP, which I use for my 20" in Chile.  I often work on small targets that don't require the full field, and I could save on both storage space and bandwidth if I could simply specify a cropped region of the sensor that I want to work with.  That capability would allow lucky imaging on very small targets even with a remote system.  It will be interesting to see how long it takes the guys at SGP to get around to it.

John
jewzaam
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned:  get where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

For this reason I shoot at gain 0 for broadband filters. Better dynamic range and longer exposures. As long as you can swamp the noise, why not?
Linwood Ferguson
John Hayes:
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned:  get where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

Agreed.  This is one strategy but the other that I've suggested to SGP is to allow sub-frame imaging.  This is easily done in MaximDL and a few other programs but it is notably lacking in SGP, which I use for my 20" in Chile.  I often work on small targets that don't require the full field and I could save on both storage space and bandwidth requirement if I could simply specify a cropped region of the sensor that I want to work with.  That capability would allow lucky imaging on very small targets even with a remote system.  It will be interesting to see how long it takes for the guy at SGP to get around to it.

John

NINA has added some support for that, though another alternative with programs like PixInsight (with good batch processing) is that you can do a batch crop on all your images after the fact.  That doesn't save transfer time or space on the imaging computer, but for long-term storage it does help.

A downside of subframes or crops is that, if done pre-calibration, you need similarly sized flats and darks and such.
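The saving from a sensor crop scales with pixel area, so small targets waste most of the frame. A sketch with purely illustrative dimensions (not any specific camera's spec):

```python
# Fraction of per-sub storage saved by keeping only a region of interest.
# Sensor and crop dimensions are illustrative, not a real camera's spec.
full_w, full_h = 9600, 6400     # roughly full-frame-class pixel counts
crop_w, crop_h = 2400, 1600     # small target in a fraction of the field

kept = (crop_w * crop_h) / (full_w * full_h)
print(f"Crop keeps {100 * kept:.2f}% of the pixels -> {100 * (1 - kept):.2f}% less data per sub")
```

A quarter of each dimension keeps only a sixteenth of the pixels, which is why sub-frame readout is so attractive for bandwidth-limited remote setups.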
John Hayes
Linwood Ferguson:
NINA has added some support for that, though another alternative with programs like Pixinsight (with good batch processing) is you can do a batch crop on all your images after the fact.  That doesn't save transfer speed or space on the imaging computer, but for long term storage it does help.

A downside of subframe or crops is if done pre-calibration you need similarly sized flats and darks and such.

I use SkyGuard for guiding, and when I set up my system NINA didn't support it.  Post-acquisition cropping is certainly a possibility, but it doesn't solve how much data would be required in the local data buffer with really short exposures, or how long it would take to transfer all that data, and I would run into the daily max data limit set by the observatory.  Flats and darks could easily be cropped, but they are also easily retaken, so I don't consider that to be a problem.  The actual programming required to add this capability to SGP is pretty trivial compared to a lot of things already in that software.  The real problem is that they have a thousand users all screaming for their own features or bug fixes, so it could take a long time for them to ever get around to adding this capability.

John
Matteo Beretta
I work on a local 4TB SSD. Then, once processed, I move everything to an external 14TB HDD. I keep masters and the final image on the internal SSD.
The 14TB HDD has multiple backups on a NAS, which has a slower connection but more space (about 36TB).
Paolo
vercastro:
Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be negative depending on your workflow) and they take slightly more CPU horsepower to decompress.

This was an amazing tip, thank you very much!!
I wasn't aware of this feature. It drastically decreased the amount of space required for my QHY268M (-60% for calibration frames, -42% for lights).