Compressing and copying large files on Windows Server?
Solution 1
qpress gives fast compression with low CPU load; I used it for a 130 GB SQL Server 2008 backup dump. It compressed to a 34 GB file at ~35 MiB/s, which I think is impressive.
Solution 2
If your utilities are choking on the size of the backup file, just make more backup files; each will be smaller. You can tell the SQL Server BACKUP command to use multiple files. Microsoft calls this a "striped" backup set. Here's a simple example:
backup database foo to disk='c:\foo.01.bak', disk='c:\foo.02.bak', disk='c:\foo.03.bak'
You can specify as many files as you would care to. Full documentation on the backup command is here.
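One thing to remember with a striped set: the files are only useful together. A restore has to list every stripe of the set, so a restore of the hypothetical set above would look like:

```sql
-- All stripes of the set must be supplied together (hypothetical paths).
restore database foo from disk='c:\foo.01.bak', disk='c:\foo.02.bak', disk='c:\foo.03.bak'
```

If any one file is missing, SQL Server will refuse to restore, so copy all of the stripes to the test box.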
Alternatively, if you have some money to burn, you can use CA's Lightspeed or a similar product. It installs on the server, but it compresses all server backups, which results in shorter backup times, something DBAs and sysadmins usually like. CA provides a distributable command-line utility to decompress a backup that uses their format, so you can send the backup file anywhere.
Solution 3
Could you use rsync, without the ssh? I don't see where encryption is a requirement, and rsync runs okay on Windows.
How about gzip through netcat? The file(s) should just go through, without running into the problems of the intermediate steps.
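The netcat idea in shell form: a minimal sketch, assuming a netcat build on both boxes, with a hypothetical host name and port. Since the two-machine hop can't run here, a plain pipe stands in for nc to show the stream survives the round trip:

```shell
# Real usage spans two machines (hypothetical host "testbox", port 9000):
#   receiver:  nc -l -p 9000 | gzip -d > db.bak
#   sender:    gzip -c db.bak | nc testbox 9000
# Below, a plain pipe stands in for the netcat hop so the idea can be checked locally.
printf 'pretend this is a 100 GB .bak file\n' > /tmp/db.bak
gzip -c /tmp/db.bak | gzip -d > /tmp/db.roundtrip.bak
cmp -s /tmp/db.bak /tmp/db.roundtrip.bak && echo "stream round trip OK"
```

Because gzip streams, nothing ever has to exist on disk as one giant compressed file, which neatly sidesteps the size limits the archivers are hitting.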
Just thinking out loud...sorry I don't have something definitive.
Solution 4
Once you've compressed the db, could you consider using robocopy to copy the file over?
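A sketch of the copy step, reusing the paths from the question and a hypothetical share on the test box; /Z makes the copy restartable so a dropped link resumes instead of starting over, and /R and /W tame the retry behavior:

```
robocopy H:\dbatmp\zip \\testbox\backups db-*.7z /Z /R:5 /W:30
```

Restartable mode is slower than a straight copy, but on a multi-hour transfer of a 100+ GB file it's usually worth it.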
Aaron
Updated on September 17, 2022

Comments
-
Aaron over 1 year
I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail.
The database server is running Windows Server 2003 R2 Enterprise with 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. The files are SQL Server 2005 backup files between 100 GB and 250 GB.
The pipe is not the fastest and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including:
- gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
- bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
- WinRAR 3.90
- 7-Zip 4.65 (7za.exe)
I've attempted to use WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups.
I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either; no matter which tool I've tried, it ends up running up against the size of the file.
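For reference, the split-then-compress approach can be sketched like this, with GNU split and gzip, a tiny stand-in file, and hypothetical names (a real chunk size would be something like -b 2000m). Concatenated gzip members decompress back into one stream, which is what makes the rejoin work:

```shell
# Stand-in for a 100-250 GB backup file (hypothetical names throughout).
printf 'stand-in for a 100 GB backup\n' > /tmp/big.bak
# Split into fixed-size chunks: big.bak.part.aa, big.bak.part.ab, ...
split -b 8 /tmp/big.bak /tmp/big.bak.part.
# Compress each chunk independently.
for part in /tmp/big.bak.part.*; do gzip -f "$part"; done
# Rejoin: concatenated gzip members decompress as one stream.
cat /tmp/big.bak.part.*.gz | gzip -d > /tmp/big.rejoined.bak
cmp -s /tmp/big.bak /tmp/big.rejoined.bak && echo "rejoin OK"
```

The shell glob returns the .gz parts in lexicographic order, which matches split's suffix order, so the rejoin preserves byte order.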
Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately.
For example, this is how 7-Zip dies:
```
H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03

Scanning

Creating archive h:\dbatmp\zip\db-20100419_1228.7z

Compressing  db-20100419_1228.bak

System error:
Unspecified error
```
-
Chris_K about 14 years: What exactly is the error or issue you're seeing? Yes, I see the 7z example above, but you mention that other tools have failed as well. Is it more like the "can't copy big files in Windows 2003" type of issue? Are you seeing memory usage go up when copying?
-
John Gardeniers about 14 years: When you say installing an SSH server is not an option, does that apply to both ends or just the remote?
-
Aaron about 14 years: @Chris_K: No, that's the thing: memory usage is relatively tame. It depends on the tool, but they use between 250 MB and 1 GB. Lots of RAM left on this particular box. @John: Unfortunately, it's not an option on either end. I could push for it on the remote, but it'd be a hard sell.
-
tony roth about 13 years: is H: a local disk?
-
tony roth about 13 years: and nothing in the event log that corresponds to this?
-
TomTom almost 12 years: What does the event log say? This sounds like it may be an access problem or a corrupt file system, and NOT related to 7-Zip etc.
-
Philip about 14 years: +1, 7-Zip + RoboCopy
-
Aaron about 14 years: With all due respect, did you even read the question? I haven't been able to compress with 7-Zip, or any tool, yet. See my question for the list of compression tools I have tried. Compressing and then copying with robocopy is usually what I do, and I'd love to do it now, but 7-Zip fails during the compression.
-
Aaron about 14 years: Good to hear how other folks do it, but I'd be hard pressed to get approval for changes that big. :P Thanks!
-
Aaron about 14 years: Please read my question. That's what I typically do, but WinRAR, 7-Zip, gzip, etc. have all failed me. I wish the files were just 4 GB; I'd do just that. That strategy has worked fine for 50 GB backup files, but these are much larger, 100-250 GB: big enough that 7-Zip and WinRAR choke.
-
Aaron about 14 years: No encryption is fine with me. Honestly, I haven't tried rsync yet for this particular problem. I haven't had the best luck using rsync over SMB shares; in my experience it's been quite a bit slower and less robust than xcopy or robocopy. I'll give it a try, though; at this point the only other option I have is physically shipping drives around, so I'm willing to try anything! I hadn't thought of netcat plus gzip; it's an idea worth exploring at least. No need to apologize, you've provided the best ideas so far! :)
-
Aaron about 14 years: I'll take a look, thanks! I prefer the command line, but I can deal with a GUI; I don't need to automate these jobs, it's just an occasional ad-hoc thing I need to do.