Sometimes I make games. Sometimes I make websites. This is my blog.
I just want to back up my Minecraft world files. All told, it’s about 70 GiB in size. It shouldn’t be this hard.
First, let’s talk about the Windows client. I had the files on my removable hard drive and the Windows Google Drive client on my first laptop. I wanted to install the client on my second laptop and sync the same set of files no matter which laptop the hard drive was plugged into.
This means that, even though I had the whole thing, up-to-date, and sitting on my drive, I was not allowed to continue syncing from there. Instead, I had to create a brand new folder and re-download all of my files.
As soon as the Google Drive client was done installing into the new folder, I moved all the files back into my proper Google Drive folder and began the syncing process. After letting it run overnight, I woke up to find it duplicating all of my files. Instead of comparing against the files that were already there, it went ahead on its own, re-downloaded everything, and appended (1) to each file name.
I want to download all my files from my Google Drive. This seems like it would be a straightforward process. Unfortunately, it is not allowed. A Google Drive customer cannot download more than 2 GB at one time without installing the Google Drive Windows client. And even if they could, the files would be automatically converted to some crazy old-school MS Office formats. It's either that, or skip the download.
This is basic stuff, guys… We figured this out before I was born in 1983. Unfortunately, the most basic use cases are beyond Google Drive. One folder, copied into another, should overwrite whichever file names are already there. Instead, Google Drive simply duplicates the files in the new folder. A user cannot ever merge two folders together.
With the Windows Google Drive client, there is no way to set a speed limit on uploads or downloads. That means that whenever a file uploads, it’ll saturate your connection with no ability to override (save pausing syncing entirely), bringing your Internet to a complete halt.
Any reasonable cloud drive provider has this figured out pretty early. Google Drive is years old.
The Google Drive web interface looks nice, but it's really slow and doesn't cover the basic use cases that Windows Explorer does (or Dolphin, or Konqueror, or Midnight Commander, etc.), such as cut/paste. If I've got 70 GiB of files that I need to upload, SFTP is a proven and stable means. No need to reinvent the wheel.
This method would also work great in place of having to install the Google Drive Windows client to download folders over 2GB in size.
There is much more to tell. This is just the start.
Instead, I’m going to take a look at some other backup-and-share solutions. Any suggestions?
It appears that, as much as I want it to be, git is just not the right tool for this job. Instead, I’ve picked up a “100GB” account at Google Drive and will share them there. It’s almost as good and offers the ability to download old revisions as well. I’ve released the world files under the Creative Commons Attribution-NonCommercial 4.0 International License.
Ask and ye shall receive. I’m still going to keep my files on Google Drive because I’ve already invested the time to put them there. But the next time I need to do this kind of thing, I’ll be looking at Put.io.
Here is the original post for history’s sake:
Hellblade Mobs, my Minecraft server, has been in operation since November 2010. For the first 7 months, it was just me and a few friends, white-listed. One world, no hMod, no Bukkit, no plugins.
On May 24, 2011, that all changed: we went public.
Since then, thousands of players have come and gone, and the server still maintains a healthy buzz. We’ve got hundreds (if not thousands) of memories invested in these blocks, spread across eight worlds. It would be horrendous (and likely fatal for the server) if something happened and we lost it all.
Being a developer with a very crummy Internet connection (10 Mbps down, 1 Mbps up), I've always been nervous about this situation. The world files are gigabytes in size and downloading them takes forever, not to mention the act of uploading them to a backup service like Dropbox. Every 6 months or so, I get so nervous that I begin the laborious task of FTPing my files down to my external drive and making a copy of them on an old PC in the hopes that I never have to use it. Halfway through the life of the server, we switched hosts to the fabulous Nuclear Fallout service (referral link!) and it took forever to do.
There has to be a better way!
A recent event with griefers has once again raised the idea of making the server files public, which is something I've always wanted to do. I thought through the logistics: if only there were some service I could upload new copies of the worlds to over time, uploading just the diffs, with the files available for public download whether one wanted an old copy or just a tarball of the newest one… Instantly I thought: why can't I just put the worlds on GitHub?
Turns out: I can. Git supposedly isn't designed to handle repositories gigabytes in size, but the only apparent effect is that things slow down a bit. Compared to the previous situation, I'll take a 15-minute "add" command with no complaints, thank you very much.
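For the curious, the backup loop I'm describing is nothing exotic. Here's a minimal sketch — the paths, identity, and commit message are made up, and a dummy file stands in for the real multi-gigabyte region data:

```shell
set -e
REPO="$(mktemp -d)"                          # stand-in for the real backup repo
cd "$REPO"
git init -q
git config user.email "backup@example.com"   # hypothetical identity
git config user.name "Backup Bot"
# In practice you'd FTP/rsync the server's world folders in here first;
# a dummy file stands in for the real region data:
mkdir world
echo "region data" > world/r.0.0.mca
git add .                                    # this is the slow step on real worlds
git commit -q -m "World backup $(date +%F)"
git rev-list --count HEAD                    # prints 1: one snapshot so far
```

Each later backup is just another add/commit on top, so only the changed region files cost anything to store.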
If you’re into Minecraft, you’re more than welcome to come play with us! Connect to minecraft.xandorus.com.
Being a web developer, I usually use several different computers and operating systems over the lifetime of any project. Personally, I have five computers plus one server: a Vista PC, a virtualized Windows 7 installation, my main Mandriva Linux desktop, an Eee 701 PC with Eeebuntu, a Mandriva Linux laptop, and a FreeBSD development server.
Moving files from one computer to the next used to be a time-consuming and ultimately prohibitive process. If I wanted to, say, take a break from working on my PC and work at the Red Brick Cafe for a few hours, I'd have to download my work files to a USB memory card, then export the MySQL database and transfer that to the card as well.
Or, I could burn a CD. Of course, how does one get the updated files back off the laptop and onto the PC when arriving back at home? This arduous process meant that freedom of choice in the work environment was severely hampered and often more trouble than it was worth. But not any more.
Enter Dropbox (note: referral URL!).
Dropbox is a free service that is basically a shared folder in the cloud. It makes sharing files among computers, whether Mac, Linux, or Windows, as easy as drag and drop. And I really mean that. I love things that speed up my work processes, because the less time I spend in administration mode, the more time I can spend accomplishing tasks in programming mode. Dropbox exemplifies this manifesto.
Any file you put in the Dropbox folder on one computer will instantly be available on any computer you install Dropbox on. Even better, revisions are kept, so if you make a mistake with a file and don't have backups, you can pull the file in question from the archives to restore it. What makes Dropbox different from any other revision or archiving setup is that this is all done without any administration by the user. Literally, if you drag a file into the folder, all of this is done for you. No committing changes, no crazy hoops to jump through.
Oh, and the 2GB storage starter account is completely free. It’s the one I use daily. I don’t even think I’ve hit 25% capacity yet.
Take a look at Dropbox at http://www.dropbox.com/
I learned about a really handy command-line tool today that you can use in Linux, Unix, or FreeBSD to split any large file to any size that you like. These two commands, split and cat, can work together to split your files for easy transfer or storage. Then, you can concatenate them back together again to make one file.
If you've got a small maximum file size on your E-Mail attachments, or your backups take up slightly more than a DVD or CD, then you can definitely use this. And, it's really, really simple.
First, split the file using the command “split”, specifying how many megabytes you want to split your file into.
# split -b 1024m <filename>
This will split your file into as many parts as needed, in 1024 MB chunks, typically with names starting at xaa, then xab, xac, and so on. You can then burn these files to a disc or E-Mail them (though if you're E-Mailing, it might be better to specify a smaller chunk size; for example, "-b 1m" for 1 MB chunks).
Keep in mind that once you’ve split your files, in order to put them back together again you’ll need all of the files. If you’re missing one of the files, this operation will not work. But that makes sense, doesn’t it? :)
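Here's a quick demo of what split actually produces. It uses a tiny 5 KB scratch file and 2 KB chunks so it runs instantly (the file names here are just for the demo):

```shell
set -e
cd "$(mktemp -d)"                     # work in a throwaway directory
# Make a 5 KB test file of random data:
dd if=/dev/urandom of=backup.tar bs=1024 count=5 2>/dev/null
split -b 2048 backup.tar              # 2 KB chunks: xaa, xab, xac
ls xa*                                # lists the three pieces
```

Note how the last piece (xac) only holds the leftover 1 KB — split never pads the final chunk.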
Once you’ve split your files you can easily put them back together again using the command “cat”.
This is even simpler.
# cat xa* > <filename>
This command tells cat to concatenate all the files starting with “xa” and put them into one new file.
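The whole round trip fits in a few lines. This sketch splits a small random file, glues it back together, and uses cmp to prove the rebuilt copy is byte-for-byte identical (file names are made up for the demo):

```shell
set -e
cd "$(mktemp -d)"
dd if=/dev/urandom of=original.bin bs=1024 count=5 2>/dev/null
split -b 2048 original.bin            # produces xaa, xab, xac
cat xa* > rebuilt.bin                 # the shell expands xa* in sorted order
cmp original.bin rebuilt.bin && echo "files match"
```

The sorted glob expansion is why cat works here: split names its pieces so that alphabetical order is also the correct concatenation order.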
Many zip, rar, or other archiving tools will allow you to split the file you’re zipping across multiple volumes, but the benefit of this is that you can split any file you like, even MP3 or jpg files. Split to your heart’s content!