Rsync with Google Drive
DISCLAIMER: Uploading to a cloud storage system implies that you trust the maintainers of that system, and everyone in between, not to mess with/read your data. If this is a concern for you but you want to upload to Google Drive anyway, please consider using some of the encryption methods mentioned here (or another encryption method of your choosing): http://how-to.linuxcareer.com/using-openssl-to-encrypt-messages-and-files (Thanks to mostlyharmless for this info.)
UPDATE Feb. 12 2016: The gsync project has had some problems with show-stopping bugs, so you might want to try using rclone, instead. This Linux user has had good experiences with rclone. It’s worth noting that, although this HowTo is Google Drive-specific, rclone works with about a dozen cloud storage services, including Amazon S3 and Dropbox. —DaneM
First, get rclone from http://rclone.org. Download the appropriate file for your operating system and architecture. Decompress the file and either run it from the directory it creates, or copy it to somewhere in your PATH, like «/usr/local/bin/», so you can run it from any location on the command line. Like gsync, this is a command-line program, so you will need to either run it from the terminal or write a shell script (BASH/SH recommended) to make it do what you want via a double-click. If you choose to do the latter, remember to make your shell script executable with «chmod +x», or by using your preferred desktop environment’s Properties dialogue.
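For example, a typical install on a 64-bit Linux system might look like the following sketch (the archive name is the generic one from rclone.org; substitute the file you actually downloaded):

# Unpack the downloaded archive and enter the directory it creates
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64
# Copy the binary somewhere in your PATH and make sure it is executable
sudo cp rclone /usr/local/bin/
sudo chmod +x /usr/local/bin/rclone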
When you run «rclone config», you will get to pick a name and type for your remote storage medium (referred to as «drive» in this HowTo), and it will ask you to do a few steps to authenticate with your Google account. Follow the instructions! If you run into trouble, see the rclone documentation for Google Drive: http://rclone.org/drive/
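Once configured, you can sanity-check the remote before trusting it with a backup; «listremotes» and «lsd» are standard rclone subcommands:

rclone listremotes   # should print «drive:» (or whatever name you chose)
rclone lsd drive:    # lists the top-level folders in your Google Drive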
The most basic (but probably not very desirable) way to put all the stuff from one directory on your computer onto Google Drive, overwriting older versions, is to do this:
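Assuming you named your remote «drive», it is of this form («/path/to/source/» is a placeholder for your own directory):

rclone sync /path/to/source/ drive: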
This will upload everything once, then only upload changes later on (like rsync does). However, you probably don’t want to execute the command in exactly this way, unless you want all your data to be dumped into the root folder of the remote location (like a tar bomb). So, instead, format the command like this:
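Again, «/path/to/source/» stands in for your own directory, and «MyBackupFolder» is the example folder name used below:

rclone sync -v /path/to/source/ drive:MyBackupFolder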
This will put all the stuff from «/path/to/source/» into the Google Drive folder called «MyBackupFolder», creating subdirectories as needed. «-v» is optional; it provides you with information about what the program is doing while it executes.
Note the trailing slash on the source but not the destination (see below, in the gsync section).
If you want to put everything from the remote drive onto your computer, erasing old files on your local machine and replacing them with newer ones from the remote source, just put the source (the remote storage location) first in the command, and put the destination (your local machine) second. (THIS IS DANGEROUS, AND WILL ERASE YOUR LOCAL DATA!) Also, you can transfer data from one place on remote storage to another place on remote storage using the same method: «rclone sync [options] SOURCE DESTINATION». Refer to the documentation for instructions specific to each cloud.
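For example (the local path and the second remote, «otherdrive», are hypothetical; remember that «sync» deletes destination files that are absent from the source):

# Download: the remote is the source, the local machine is the destination
rclone sync -v drive:MyBackupFolder /path/to/restore
# Remote-to-remote copy between two configured remotes
rclone sync -v drive:MyBackupFolder otherdrive:MirrorFolder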
Finally, if you want to back up a whole lot of directories in an automated fashion, write out a bunch of these commands, one for each directory to back up, and put them into a text file starting with «#!/bin/bash» or «#!/bin/sh» on the first line. (All of your rclone commands belong on lines AFTER the first, each on its own line.) Then save the file, make it executable (as above), and run it from the command line or double-click it in a GUI environment (like KDE, Gnome, XFCE, etc.).
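A minimal sketch of such a script (the directory names are hypothetical; use your own):

#!/bin/bash
# Back up several local directories to matching folders on the remote.
rclone sync -v /home/user/Documents/ drive:Documents
rclone sync -v /home/user/Pictures/ drive:Pictures
rclone sync -v /home/user/Projects/ drive:Projects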
GSYNC INSTRUCTIONS (No longer recommended.)
It took me a while to find a good, simple, reliable way to back up my stuff to Google Drive using rsync, so I’ve decided to share my method with the good people at LQ. In short, the answer is to use «gsync» (NOT «grsync», which is different and broken/incomplete). It supports (so far as I can tell) ALL the same options as rsync (glee!), and lets you do it with Google Drive! You can upload to, and download from, GD in this way by picking which to use as the SOURCE/DESTINATION folder.
Do not neglect the bit about authenticating!! If you do, none of the gsync stuff below will work!
For reference, here’s the command I use to back up my stuff between my LOCAL hard drives, from «/mnt/DATA/» to «/mnt/DATA2»:
sudo rsync -c -r -t -p -o -g -v --progress --delete -l -s /mnt/DATA/ /mnt/DATA2
You can check what the options do using «man rsync». I don’t always use the «-c» option, since it’s slow (but more thorough for checking the data). This command will delete files from the destination that are missing from the source, and overwrite duplicates. Use with care! Note the trailing slash on the source folder; this is important! A trailing slash on the source folder means to copy the stuff IN the folder into the destination (resulting in «/mnt/DATA2/filesandstuff»). No slash means to copy the folder itself into the destination (which would result in «/mnt/DATA2/DATA/filesandstuff», which is probably not what you want). The destination folder ignores trailing slashes. (Thanks to suicidaleggroll for this clarification.)
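To make the trailing-slash rule concrete (using «-a» here as a common shorthand for the archive options):

rsync -a /mnt/DATA/ /mnt/DATA2   # contents of DATA land directly in /mnt/DATA2
rsync -a /mnt/DATA /mnt/DATA2    # creates /mnt/DATA2/DATA and copies into that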
The «gsync» version is this—from «/mnt/DATA/IMPORTANTSTUFF» to «drive://IMPORTANTSTUFF»:
sudo gsync -c -r -t -p -o -g -v --progress --delete -l -s /mnt/DATA/IMPORTANTSTUFF/ drive://IMPORTANTSTUFF
Please note that you should probably not upload an entire 1TB+ drive to GD unless you have, and want to use up, all that storage space on the cloud. Therefore, I’ve specified the subdirectory «/mnt/DATA/IMPORTANTSTUFF» to represent the important files/folders that I absolutely have to have backed up remotely. You’ll need to run a separate command for each folder (including subdirectories) that you want to upload in this fashion; make sure to change both the source and destination in the command when you do. (I haven’t yet figured out how to do them all as a batch job, short of writing a script for it; one such script is sketched below.) Also, I use root (sudo) for this and the rsync command because it helps manage permissions properly; but if you’re certain that the current user/login owns all the files involved, you don’t need it (and probably shouldn’t use it, as a general security/safety precaution).
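One way to batch the uploads is a simple loop over the source folders; this is a sketch with hypothetical directory names, using the same gsync options as above («-c» omitted for speed, as noted):

#!/bin/bash
# Back up each listed subdirectory of /mnt/DATA to a same-named folder on Google Drive.
for dir in IMPORTANTSTUFF PHOTOS DOCUMENTS; do
    sudo gsync -r -t -p -o -g -v --progress --delete -l -s "/mnt/DATA/$dir/" "drive://$dir"
done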
Finally, if you want to be able to walk away from it and know how long it actually took when you come back, you can prepend «time» to the gsync or rsync command, like so:
sudo time gsync -c -r -t -p -o -g -v --progress --delete -l -s /mnt/DATA/IMPORTANTSTUFF/ drive://IMPORTANTSTUFF
If you would like to automate this process using a desktop-clickable script, and are having trouble getting it to work with sudo, check out Bash Tips: Acquire root access from within a BASH script using sudo.
How to rsync my Google Drive with a local folder?
Since the built-in Nautilus Google Drive integration is so super slow, I was wondering if there is a way to have my Google Drive contents automatically downloaded to a local folder and then, when I make a change, have it automatically sync. Basically, I would like to create a /home/user/Google-Drive folder to sync with my existing Google Drive, so that all my Drive files would be locally on my machine.
I’ve used Insync for this purpose for years without any problems. It’s a one-time $40 purchase (two-week free trial) and worth the investment. Alternatively, there’s a google-drive-ocamlfuse package available in a third-party PPA that works reasonably well.
2 Answers
Apart from a closed-source proprietary solution, rclone may be a good bet.
- You can mount a network drive with rclone (see the mount example after this list). The extent of caching can be tuned, and as such it may work better than the built-in Nautilus Google Drive.
- If that still does not cut it, you could effectively mirror with a local copy using rclone, which is a dedicated tool for this, or even rsync. The automation part, however, would need to be scripted (a sketch follows this list). On log-in, you could automatically synchronize the local copy. Google Drive could then be updated every now and then, e.g. through a command scheduled in a cron job, or perhaps using inotify: this can be set up to watch files or a directory tree for changes, and trigger commands when that happens. The limitation of such an approach is that only one user can be working with the network drive this way, to avoid conflicts.
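For illustration, the two approaches might look like this (paths, cache mode, and schedule are assumptions, not recommendations):

# Option 1: mount the remote as an ordinary folder; caching behaviour is tunable
rclone mount drive: /home/user/Google-Drive --vfs-cache-mode full --daemon
# Option 2: keep a real local mirror instead of a mount; pull on log-in...
rclone sync drive: /home/user/Google-Drive
# ...and push local changes back periodically, e.g. from a crontab entry:
# */30 * * * * rclone sync /home/user/Google-Drive drive: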
There is a dearth of simple, workable solutions of late, since the deprecation of some from the past, but an alternative does exist in the form of ExpanDrive for Linux.
As in the case of Insync, it is a paid solution (similar cost) with a one-week free trial, but the cost after that is not onerous. Indeed, if you elect not to have upgrades, the application remains free, with a limit of 20 minutes per session in use. Access to your data always remains, whether you pay for the licence and/or support/updates or not.
The app loads an icon into the system tray and will handle multiple online accounts such as Box, Google Drive, OneDrive, Amazon S3, Dropbox, etc. This is superior to other solutions which require navigation to Other Locations in Nautilus, and is very similar to the way the Dropbox desktop app performs (although you can still navigate to Other Locations if required to access your files).
Basically, I would like to create a /home/user/Google-Drive where to sync my existing Google Drive.
As indicated, ExpanDrive creates a folder in your home folder, /home/user/ExpanDrive, with a separate folder in there for each online account. You can simply specify that folder when saving files and they will be synced to your Google Drive automatically (as is the case with other online accounts).
Installing ExpanDrive on Ubuntu is very simple. Download the .deb file from the link to your local drive, click on it, and follow the prompts to install it to your system.
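If you prefer the command line, the equivalent is a one-liner (the filename here is a placeholder; use the one you actually downloaded):

# apt installs the local .deb and resolves its dependencies automatically
sudo apt install ./ExpanDrive.deb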
The latest version was released on 16 June 2021, so it's bang up to date.
As this is potentially a paid-for solution (if you so choose, as explained) and not FOSS, just to be clear: I have no association with this package or the developers in any form, other than as a user.