
Back Up Google Drive with CrashPlan

Today, as I put another spreadsheet with Store Locator Plus and Enhanced Results features onto Google Drive, I realized something.  If Google Drive crashes (very unlikely, but stranger things have happened), I don’t have a backup copy of ANY of my Google Drive documents.  There are a lot of things out there that I really don’t want to re-create.  It is even more important now that I am using Google Drive to store spreadsheets that are an integral part of my WordPress plugin documentation.

While I’m thinking of it, let’s go down that road for a moment.

Using Google for WordPress Tables

A few days ago I wanted to start building an “add-on pack comparison” for the site.  It helps me organize my thoughts on which features belong in which plugins, reminds me where I put those features, and also educates the consumer on which plugin they may be interested in.  I decided a wide table with side-by-side columns for each plugin was the best option.  Since it is not a true price comparison, I needed a flexible grid display.

I tried a number of the table plugins in the WordPress extensions directory.  Unfortunately a large number of those plugins were defunct, many not updated in years.  The few that were updated were adequate, but too hard to wrestle into looking just the way I wanted, requiring extensive CSS updates and HTML man-handling to behave as I desired.  Sure, some of them, like TablePress, had options to make those efforts easier, but still not effortless.

Then I stumbled across a post that discussed inserting a Google spreadsheet in the middle of a page.  You create a spreadsheet, format it how you like, and then publish it to the web.  Select the “embed code” option and get the unique iframe tag to put on your site.  While I was leery of the iframe idea, it worked beautifully.  Now I can format the colors, fonts, and column data exactly how I want, with the ease of updating a Google spreadsheet.  It is far easier to click a cell, pick a color, and see the background change than to tweak CSS all day.

If you are trying to put tables in WordPress you may want to check that out.  Create the spreadsheet in Google, go to “publish to web”, publish, go to “get embed code”, copy the iframe HTML snippet, and paste it into your page or post (in text mode).  Tweak the width and height parameters to fit your site.  Done.
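For reference, the snippet Google hands you looks roughly like the line below.  The src URL here is a made-up placeholder, so use the one from your own “get embed code” dialog; width and height are the attributes to tweak:

<iframe src="https://docs.google.com/spreadsheets/YOUR-PUBLISHED-SHEET/pubhtml?widget=true&amp;headers=false" width="700" height="400"></iframe>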

embedded google spreadsheet

Check off “auto-republish” and every time you make a change it will be reflected on the website within a few hours (or you can force a manual republish if you need it faster).

Nice.

Backing Up Google Drive

So back to the backup issue.  I have a lot of documents, some integral to my public site, on Google Drive.  I NEED to back those up.  How did I do it?  It turns out it was easier than I thought.

First, I run CrashPlan as my backup service.  MUCH better than Mozy, which is overpriced, IMO.  A MILLION TIMES better than Carbonite, which is slow as heck, throttles the computer, has horrible restore times, and even worse support response times.  In fact, if you are considering backup, my only key recommendation is: do NOT use Carbonite.  There is a reason you hear about them all the time: they are hiding a poor design and poor service behind a huge marketing budget.

Second, getting the Google Drive content to the CrashPlan backup.  Easy.  Install Google Drive for PC.  When you log into Google Drive there is a subtle link in the left sidebar for this app.  It is a program that gets installed on your local computer and creates a folder that is the “local sync” of your Google Drive content.  You can select which folders you want to keep in sync.  I just let it sync the whole thing since I have plenty of space on the 1TB drive in my notebook computer.

Now I have a local copy (first stage backup) of everything on Google Drive.   Even better, if I create something in that folder OR on Google Drive it will be auto-replicated on both sides.   That makes for a good first stage backup strategy.

Finally, since this folder goes under your user directory by default, CrashPlan should automatically notice the new content and mark it for backup, which it did on my system.  If it does not, you can manually add the Google sync folder to the backup plan.

crash plan and google drive

Easy.

I like easy.

UPDATE : NOT SO EASY…

Henry Houh contacted me last week about an issue with this type of backup.  It turns out CrashPlan will not back up ANYTHING by default when using this configuration.  Why?  CrashPlan runs as the user “SYSTEM”, not as your normal login user.  Google Drive runs as you.

In my case the Google Drive folder was created with “Full Control” permission for me but NO permissions for the system user “System”.

The fix?

Go to the Google Drive folder, not the shortcut.

Right-click and select properties.

Select the security tab.

Click the Edit button.

google drive properties

Type “System” in the add user box.

Click “Check Names”.

Click OK.

Click on System in the list of users.

Check off “Full Control” under the allow column.

Click OK.

Click OK.

google drive with system properties

Now your Google Drive content will be backed up to CrashPlan.
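If you prefer the command line over clicking through dialogs, the same permission grant can be done from an elevated Command Prompt.  This is only a sketch; the path below assumes the default sync location under your user profile, so adjust it to wherever your Google Drive folder actually lives:

REM Grant the SYSTEM account full control of the Google Drive folder and everything in it.
REM (OI)(CI) = inherit to files and subfolders, F = full control, /T = apply to existing contents too.
icacls "C:\Users\YourName\Google Drive" /grant "SYSTEM:(OI)(CI)F" /T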


Moving Sites with VaultPress


I am going to be moving the main www.charlestonsw.com site soon and decided to try a new process.  I recently subscribed to VaultPress backups and thought I’d give that a try as a simple way to move the main WordPress site.  It should carry over all the WordPress directory files as well as the database settings.  My supporting apps living outside the WordPress world will need to be brought over separately.

VaultPress has a new feature where you can restore your backup to an alternate location using an FTP/FTPS login that you provide.    Overall the process went smoothly with a few caveats.   VaultPress will likely resolve these issues soon, so be sure to check with them for any feature changes.   The issues I ran into and their fixes are noted here:

Home & SiteURL

The home and siteurl variables in the newly restored wp_options table of WordPress were wrong.  They were changed to my new site (wp.charlestonsw.com), which was my temporary placeholder; however, the leading http:// was dropped from both.  I prefer the MySQL command line, so I went there and ran this command:

mysql> update wp_options set option_value='http://wp.charlestonsw.com' where option_name='home' or option_name='siteurl';
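A quick follow-up query confirms both values now carry the leading http://:

mysql> select option_name, option_value from wp_options where option_name in ('home','siteurl');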

.htaccess File

The VaultPress backup does not grab hidden files, like .htaccess.  I guess they assume this is already set up on the new site; however, the new site’s permalink rules may not match the old site’s.  IMO this should be part of the backup/restore set.  There are two ways to fix this.  You can grab your old .htaccess file “by hand” and restore it to the new server.  The other option is to look at the permalink setting in your old admin panel, then go to the new admin panel, select something DIFFERENT, save, then set it back.  This will re-create the necessary file, assuming your directory permissions are correct.  Doing that is usually a good idea when you change site URLs anyway, as it clears the cache for a variety of plugins and internal “gears” of WordPress.
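If you go the “by hand” route, it helps to know what the stock rules look like.  The sketch below simply writes the default WordPress rewrite block from the shell; it will not include any custom rules your old .htaccess carried, so compare it against the original before relying on it:

# Write the default WordPress rewrite rules into .htaccess
# (run from the WordPress root directory on the new server)
cat > .htaccess <<'EOF'
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
EOF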

 

Thus far those are the only two “gotchas” I’ve found.  Of course I need to make sure I get my other databases and scripts moved, but I can use typical scp or rsync commands to pull server-to-server and a mysqldump/restore.  Luckily the other items are my internal reports only, so it should cause fewer headaches if something goes missing for a day or two.
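For anyone following along, those server-to-server moves look roughly like this; the hostnames, paths, and database names are placeholders for illustration only:

# On the new server: pull the non-WordPress scripts straight from the old server
rsync -avz olduser@old.example.com:/var/www/reports/ /var/www/reports/

# On the old server: dump the supporting database and copy it to the new server
mysqldump -u dbuser -p reports_db > reports_db.sql
scp reports_db.sql newuser@new.example.com:/tmp/

# On the new server: create the database if needed, then load the dump
mysql -u dbuser -p -e 'create database if not exists reports_db'
mysql -u dbuser -p reports_db < /tmp/reports_db.sql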

 

 


Backing Up WordPress

Backing up your website is one of the most important ways you can protect your business from extended downtime.  Servers crash.  Hackers hack.

When you select a backup solution you want to make certain you not only back up the source files but the database as well.  Without the database your WordPress site is nothing but a shell.  Most hosting backups will only back up the files on your server.  Better hosting backups will know about any SQL database, including the WordPress MySQL database, IF it was created through the hosting control panel.  Databases created from the command line are hit-or-miss, depending on the sophistication of the backup system.

Another issue is the scheduling of backups and off-site storage.  Most shared hosts do not have scheduling software.  That means manually backing up your site, which means eventually you forget or are too busy and backups do not get done.  Many shared hosts also do not back up to an “off server” location.  That means your backup file is living in a directory right next to your main site.  The server crashes and your backup is toast.

While there are a number of backup plugins available for WordPress, I have found the better solutions are always a paid option.  They have an interest in keeping the system working.  In the past 3 years I’ve used at least 4 different free or “backup to your own Amazon S3 account” (where you only pay Amazon storage fees) solutions.  Every one either broke, couldn’t handle certain failures, failed restores, or… in the most recent case, with UpdraftPlus, filled up the log file with 39GB of incorrect parameter warnings.

At the end of the day I find myself once again going back to the folks at Automattic and their Jetpack series of add-ons.  I am now using VaultPress and it is by far the most refined UX (user experience) of all those I’ve used.  Yes, there is still room for improvement on the user interface, but it is pretty darn solid (and a lot better than most of my own plugin interfaces).  The backups from Jetpack get shot out over the ether and into the “WordPress cloud”, the myriad of servers and redundant systems that run the public WordPress blog “megaverse”.  I’ve not seen that service go offline even once in the past 3 years (unlike Amazon S3), so I have some faith in the backup being there when I really need it.  Of course I will still do my monthly full backup and pull it to the 2TB USB 3 drive I have sitting right here so I can get my hands on it, but for the daily course of business I will be trusting VaultPress from Automattic to take care of my site.

Thus far $15/month is well worth the peace of mind and hopefully avoids the 39GB error logs and “suddenly unavailable” issues I’ve seen in too many other plugins.   Backup is too important to “go cheap”.

 


Carbonite Problems

About 2 years ago I switched my preferred backup service provider from Mozy to Carbonite.  This weekend I am contemplating going back to Mozy, despite their 100%+ price increase, or finding another provider.  The reason?  Carbonite customer service is deplorable and their backup solution is a detriment to productivity.

Customer Service

In the past year I’ve had reason to contact customer service three times.  All three times my question was not answered properly and customer service was basically clueless.   Not surprising since they appear to be an outsourced call center with zero clue about the actual product.  It is blatantly obvious they are following a support script, and a poorly written one at that.

In one example I wanted to set my backup to not run 9–5 M–F but keep running on the weekend.  This was possible prior to upgrading to the latest version.  I told the agent I had looked at the “backup schedule” option in the control panel and only saw a way to pause the application for one time slot, 7 days a week.  There is no longer an obvious way to have a different schedule on the weekend.  The response?  A scripted answer on which menu options to click to get to the “pause my backup” schedule in the admin panel.  Great, thanks “Joe” from India!  Totally useless.  My mistake was trying to follow that up with “thanks, I already knew that, what I am asking is…” and re-explaining the issue in simpler terms.  This time I received a two-line response, “follow the backup scheduling instructions.  I hope I provided good customer service, please rate me.”, along with a ratings link.  WTF.

Service Issue : Throttling

Another reason I am thinking of nixing Carbonite is the lame backup speed.  I am on a 50M/10M line with NO contention.  I routinely get 45M+/8M+ speed tests.  However, Carbonite backup speed is ALWAYS throttled to around 128K to 512K (on a good day).  Even a steady 512K would be an improvement.  As such I need to keep my system online 24×7 to ensure my 2GB VMware disk blocks are backed up.  Sadly, a 4GB file takes 16h+ to back up and thus I NEVER have a complete backup.  This is the WORST situation, as my VMware development images are the most critical item to push to backup and they are never complete.

Service Issue : VMware Contention

The final push to make the move to another backup solution is the obvious contention with VMware.  Disable ALL Carbonite services and everything works great.  Suspend a VMware Workstation image and shut down the computer and you are out the door in less than 3 minutes.  Enable Carbonite and you might as well go play darts for 15 minutes or more.  Every time, without fail, suspending, terminating, or otherwise exiting VMware Workstation updates part of a 2GB disk file and Carbonite freaks-the-hell-out.  The entire system becomes non-responsive and essentially locks up.  I had this issue on my high-end Asus laptop and now it is happening on my high-end HP laptop.

Every time I disable Carbonite, including the background services, the problem is completely non-existent.  Carbonite’s response the last time I brought this up was “this is a problem with VMware, contact them”.  Really?  It didn’t happen with Mozy, EVER.  It doesn’t happen if I turn off Carbonite.

It is quite obvious Carbonite is not managing their disk change monitoring very well.  I’m not sure exactly what is going on, but why it takes 15m+ to detect a simple file change and mark it for backup is beyond me.  They appear to be either scanning every BIT in the file for a true differential backup (I’d be SHOCKED if they are that sophisticated) or re-scanning the entire drive because of one file change (more likely).  Either way it sucks, kills productivity, and delays my ability to exit the office and get home to my family.

Summary

I’m tired of the lame excuses, slow backup times, and disruption to my daily routine from what should be a simple backup service.  I’m on the hunt once again for a viable backup service.  Mozy?  Tomahawk?  What else is out there?  What do you use?  Why?  Have some hints?  Share!

 


Backing Up A Linux Directory To The Cloud

We use Amazon S3 to back up a myriad of directories and data dumps from our local development and public live servers.  The storage is cheap, easily accessible, and sits in a remote third-party location with decent resilience.  The storage is secure unless you share your bucket information and key files with a third party.

In this article we explore the task of backing up a Linux directory via the command line to an S3 bucket.   This article assumes you’ve signed up for Amazon Web Services (AWS) and have S3 capabilities enabled on your account.  That can all be done via the simple web interface at Amazon.

Step 1 : Get s3tools Installed

The easiest way to interface with Amazon from the command line is to install the open source s3tools toolkit.  You can get it from http://www.s3tools.org/.  If you are on a Red Hat based distribution you can create the yum repo file and simply do a yum install.  For all other distributions you’ll need to fetch the source and install it (really just running python setup.py install) after you download it.
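A rough sketch of those two install paths (exact package and repo file names may vary with the distribution and s3cmd release):

# Red Hat style distro: with the s3tools repo file from s3tools.org dropped into /etc/yum.repos.d/
yum install s3cmd

# Any other distro: unpack the source download from s3tools.org and run the installer
tar xzf s3cmd-*.tar.gz
cd s3cmd-*/
python setup.py install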

Once you have s3cmd installed you will need to configure it.  Run the following command (note: you will need your access key and secret key from your Amazon AWS account):
s3cmd --configure

Step 2 : Create A Simple Backup Script

Go to the directory you wish to backup and create the following script named backthisup.sh:

#!/bin/sh
SITENAME='mysite'
# Create a tarzip of the directory
echo 'Making tarzip of this directory...'
tar cvz --exclude backup.tgz -f backup.tgz ./*
# Make the s3 bucket (ignored if already there)
echo 'Create bucket if it is not there...'
s3cmd mb s3://backup.$SITENAME
# Put that tarzip we just made on s3
echo 'Storing files on s3...'
s3cmd put backup.tgz s3://backup.$SITENAME

Note that this is a simple backup script.  It tarzips the current directory and then pushes it to the s3 bucket.  This is good for a quick backup but not the best solution for ongoing repeated backups.  The reason is that most of the time you will want to perform a differential backup, only putting the stuff that is changed or newly created into the s3 bucket. AWS charges you for every put and get operation and for bandwidth.  Granted the fees are low, but every penny counts.

Next Steps : Differential Backups

If you don’t want to push every file to the server each time you run the script, you can do a differential backup.  This is easily accomplished with s3tools by using the sync command instead of put.  We leave the details to a future article.
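As a teaser, here is a minimal sketch assuming the same bucket naming as the script above; see the s3cmd documentation for the full set of sync options:

# Differential upload: only send files that are new or changed since the last run
# (--delete-removed also drops files from the bucket that no longer exist locally)
s3cmd sync --delete-removed --exclude 'backup.tgz' ./ s3://backup.mysite/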