Feb 24, 2014

My idea was to set up a script running on Google’s servers that would automatically fetch files from my site and back them up to Google Drive, giving me a free off-site backup. Unsurprisingly, it doesn’t work, due to limits imposed by Google, even though I did manage to cheat the 10MB limit on UrlFetchApp and file creation. It now fails either because its execution takes too long, or because it writes to Drive too many times in a short period.

Some basics first.

What is Google Apps Script? It’s a JavaScript-based platform for making Google’s cloud do stuff for you. Basically, you can write web apps that run on Google’s servers and use Google products like Maps or Calendar.

What is Google Drive? (Really?) It’s a site where people can store files. 15GB for free. Like Dropbox.

There are limits, though, and they are set pretty low. Understandable, as it would otherwise be easy to abuse. For example, the maximum size that can be fetched from a URL is 10MB. The maximum size of a file created via scripting is also 10MB. Not really useful for backups.

Right. On to the actual script and why it (still) doesn’t work.

To be clear, I’m talking about random binary files bigger than 10MB.

First, there is a very simple way to save a URL to Drive: fetch it with UrlFetchApp, get the result as a Blob, and create a file, passing the Blob as a parameter. That’s it; it takes two lines of code and does the right thing. The problem is the 10MB limit. Less, even, since the 10MB limit includes the HTTP headers.
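For reference, the simple approach is a sketch like this (the function name is mine; `UrlFetchApp.fetch` and `DriveApp.createFile` are the standard Apps Script calls):

```javascript
// Apps Script sketch of the two-line approach. Only works when the
// HTTP response, headers included, stays under the 10MB fetch limit.
function saveUrlToDrive(url) {
  var blob = UrlFetchApp.fetch(url).getBlob(); // fails above ~10MB
  return DriveApp.createFile(blob);            // also capped at 10MB
}
```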

Creating a file bigger than 10MB is also not possible.

There is a way to append to a file, but that method expects a string as a parameter. Passing it the contents of a binary file results in garbled, useless output. Still, this was the only way I found to defeat the 10MB-per-file limit.
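To see why binary data gets mangled while base64 survives, here is the failure mode in plain JavaScript (Node’s Buffer stands in for the raw bytes here; the same logic applies to Apps Script’s string-only append):

```javascript
// Why appending binary data as a string garbles it: coercing raw
// bytes to text is lossy, while a base64 round trip is exact.
const bytes = Buffer.from([0x00, 0xff, 0x89, 0x50, 0x4e, 0x47]);

// Lossy: decode the bytes as UTF-8 text, then re-encode. Byte
// sequences that aren't valid UTF-8 are replaced with U+FFFD.
const viaString = Buffer.from(bytes.toString('utf8'), 'utf8');
console.log(viaString.equals(bytes)); // false: the bytes were mangled

// Safe: base64 is plain ASCII, so it survives any string-only API.
const viaBase64 = Buffer.from(bytes.toString('base64'), 'base64');
console.log(viaBase64.equals(bytes)); // true: exact round trip
```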

So the solution seemed to be to read a base64-encoded file in chunks and reconstruct it on Google Drive. That’s what the script does, using the HTTP Range header. (The file could also be split on the server and fetched as multiple files, but that came out the same in my tests.)

At first I tried to read a standard bzip2 archive of about 25MB, base64-encode it in the script, and concatenate the result. Unfortunately the encoding took too long, and the script ended with an error saying it exceeded the maximum execution time.

So I pre-encoded the file on my server using openssl, which produced a 32MB file. Reading that file in chunks of 4MB (because the maximum length of a string seems to be 5MB) resulted in errors saying the script is accessing Drive too many times in a short period. So no go there, either. I did manage to get a 14MB file onto Google Drive, but that really isn’t too useful.
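The chunking itself is simple arithmetic. This sketch (names are mine, not the original script’s) shows the Range headers that splitting the 32MB encoded file into 4MB pieces produces:

```javascript
// Compute HTTP Range header values for fetching a file in fixed-size
// chunks; each value would go into UrlFetchApp's headers option.
function rangeHeaders(totalBytes, chunkBytes) {
  var ranges = [];
  for (var start = 0; start < totalBytes; start += chunkBytes) {
    var end = Math.min(start + chunkBytes, totalBytes) - 1; // inclusive
    ranges.push('bytes=' + start + '-' + end);
  }
  return ranges;
}

// A 32MB pre-encoded file in 4MB chunks needs 8 fetches; each
// response body stays under the ~5MB string-length ceiling.
var headers = rangeHeaders(32 * 1024 * 1024, 4 * 1024 * 1024);
console.log(headers.length); // 8
console.log(headers[0]);     // "bytes=0-4194303"
```

A nice side effect of 4MB chunks: 4194304 is divisible by 4, so each chunk of the base64 file ends on a clean 4-character encoding boundary.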

Anyway, the Google Apps Script is pasted below. My tests were done on a free account; maybe it works on a paid account, or maybe the limits will be relaxed at some point. Maybe there is a simpler solution and Google can probably be tricked, but it really isn’t worth the effort; there are plenty of other ways to solve this particular problem.

  One Response to “Backup files on Google Drive using Google Apps Script”

  1. Hi! Can you please tell me what the other ways are to upload a file directly to Drive?
    I know about ctrl.org/save but I want to write my own script.
