I learned about a really handy pair of command-line tools today that you can use on Linux, Unix, or FreeBSD to split any large file into pieces of whatever size you like. These two commands, split and cat, work together: split breaks a large file up for easy transfer or storage, and cat joins the pieces back into a single file.
If you’ve got a small maximum file size on your E-Mail attachments, or your backups take up slightly more than a DVD or CD, then you can definitely use this. And it’s really, really simple.
First, split the file using the command “split”, specifying how big you want each chunk to be.
# split -b 1024m <filename>
This will split your file into as many parts as needed, in 1024MB chunks, with names like xaa, xab, xac, and so on. You can then burn these files to a disc or E-Mail them (though if you’re E-Mailing them, it’s better to specify a smaller chunk size; for example, you’d use “-b 1m” for 1MB chunks).
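As an example (the ISO filename here is just made up), here’s how you might split a disc image into 1MB pieces for E-Mail, passing a prefix as the last argument so the pieces get a more recognizable name than xaa:

# split -b 1m backup.iso backup.iso.part.

That produces backup.iso.part.aa, backup.iso.part.ab, and so on, instead of the default xaa, xab names.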
Keep in mind that you’ll need every one of the pieces in order to put the file back together again. If even one of them is missing, this operation will not work. But that makes sense, doesn’t it? :)
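An easy way to catch a missing or corrupted piece is to record a checksum of the original file before splitting it, then check the reassembled copy against it later. This assumes md5sum is installed (it ships with GNU coreutils on most Linux distributions), and the filename is again just an example:

# md5sum backup.iso > backup.iso.md5
# md5sum -c backup.iso.md5

The second command is the one to run after transferring and reassembling the file; it prints “backup.iso: OK” if everything matches.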
Once you’ve split your files, you can easily put them back together again using the command “cat”.
This is even simpler.
# cat xa* > <filename>
This command tells cat to concatenate every file whose name starts with “xa” and write the result to one new file. Because the shell expands the wildcard in sorted order, the pieces are joined back together in the right sequence.
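If you’d rather not rely on a wildcard at all, you can also name the pieces explicitly; the result is the same (these are just the default names split produces):

# cat xaa xab xac > <filename>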
Many zip, rar, and other archiving tools will let you split an archive across multiple volumes, but the benefit of this approach is that you can split any file you like, even MP3 or JPG files. Split to your heart’s content!
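Because split happily reads from standard input when you give it “-” as the filename, you can even combine it with tar and skip the intermediate file entirely, backing up a whole directory straight into DVD-sized pieces (the directory and prefix names here are just examples):

# tar czf - /home/me/photos | split -b 1024m - photos.tar.gz.part.
# cat photos.tar.gz.part.* > photos.tar.gz

The first command archives and compresses the directory, splitting the stream as it goes; the second stitches the pieces back into a normal .tar.gz you can extract with tar.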
I think this is better:
cat `echo source_part_a* | sort` > target_file_name
Isn’t the shell going to sort that glob anyway before cat ever sees it?
It always has for me. Maybe some other shell or an obscure distribution has different rules… but I doubt it.
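For what it’s worth, a quick way to check is to echo the same wildcard and see what order the names come out in:

# echo xa*

POSIX shells (bash, dash, zsh, and so on) expand wildcards in sorted order, so the extra sort is harmless but unnecessary.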