UPDATE: There was an error in one of the steps: the file /etc/ppp/options does not need to be edited, but /etc/ppp/pptpd-options does. The steps below are now correct.

Like many Canadians, I am jealous of Americans and their ability to watch Hulu or listen to free music on Pandora. Both services claim they are working on making their sites available to the rest of the world, but I don't like waiting.

The way these sites figure out that you are not in the US is by your IP address. I don't know of any way to get an American IP address on my computer at home, but it just so happens that I have several Cloud Servers located in the US, which of course have American IP addresses.

I had heard of people outside the US using the proxy server or VPN server method, but I had no idea it was so easy to set up. If you already have a cloud server up and running, you could literally have it working in about 5 minutes. Setting it up from scratch should take about 10 minutes.

Below are the steps I followed to set up an Ubuntu-based VPN server that lets me access these coveted American sites from either my Mac or PC.

I use Rackspace Cloud Servers for all my cloud server accounts, but any VPS or dedicated server provider will work, provided their servers are located in the US. I used Ubuntu 10.04, but any version of Ubuntu should work.

Connect to your server via SSH and start typing commands
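For example, if your server's public IP address were 203.0.113.10 (a placeholder; substitute your own), you would connect with:

ssh root@203.0.113.10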

If you just created a new Rackspace Cloud Server, you'll want to change your password.

passwd

Next update the package list and upgrade any packages that need updating.

apt-get update
apt-get upgrade

Now install the PPTP server package.

apt-get install pptpd

Specify the local and remote IP addresses. The defaults below should work unless your local network is already using 192.168.123.0

nano /etc/pptpd.conf

Add these lines (or uncomment and modify existing ones)

localip 192.168.123.1
remoteip 192.168.123.234-238,192.168.123.245

Create a user account to connect to your server

nano /etc/ppp/chap-secrets

Add a user to the file in the following format. The four fields are the username, the PPP server name (pptpd by default), the password, and the allowed client IP addresses; a * allows connections from any IP:

username pptpd password *

For example:

john pptpd abc123 *

would create a user named john with the password abc123.

Now restart the pptpd service

/etc/init.d/pptpd restart

You should now be able to connect to your server via PPTP, but you won't be able to access any websites outside your server without a few more steps.
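If you want to test the connection from a Linux client, the pptp-linux package includes a pptpsetup helper. A quick sketch, run on the client, assuming the placeholder server IP 203.0.113.10 and the john/abc123 account created above:

apt-get install pptp-linux
pptpsetup --create myvpn --server 203.0.113.10 --username john --password abc123 --encrypt --start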

Set up DNS servers in the PPTP server options

nano /etc/ppp/pptpd-options

Uncomment and change the two lines starting with ms-dns. This tells connecting clients to make their DNS requests via OpenDNS:

ms-dns 208.67.222.222
ms-dns 208.67.220.220

Open the system configuration file and set up IP forwarding

nano /etc/sysctl.conf

Uncomment the following line

net.ipv4.ip_forward=1

To make the system configuration changes take effect:

sysctl -p
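To verify that forwarding is now on, you can read the flag back; it should print 1:

cat /proc/sys/net/ipv4/ip_forward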

Finally, edit rc.local to add the NAT rules that give your VPN clients access to the internet

nano /etc/rc.local

Add these two lines above the exit 0 line in this file:

/sbin/iptables -t nat -A POSTROUTING -s 192.168.123.0/24 -o eth0 -j MASQUERADE
/sbin/iptables -I FORWARD -p tcp --syn -i ppp+ -j TCPMSS --set-mss 1356
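The first rule masquerades VPN traffic out the public interface; the second clamps the TCP MSS on new connections to avoid fragmentation problems over the tunnel. Since rc.local only runs at boot, you can apply the rules immediately by running the same two commands by hand (assuming eth0 is your public interface, as above):

/sbin/iptables -t nat -A POSTROUTING -s 192.168.123.0/24 -o eth0 -j MASQUERADE
/sbin/iptables -I FORWARD -p tcp --syn -i ppp+ -j TCPMSS --set-mss 1356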

The server is done. You can connect to it using any PPTP client.

No Cloud Files

Rackspace Cloud Files was down for about an hour today. This had no effect on the connected CDN, but it meant that none of my sites which use the Cloud Files API would work. I contacted support, who told me there was a problem with the Cloud Files servers and that they would be posting the outage on their Rackspace Cloud Files Status blog shortly. Cloud Files must have been down for at least 15 minutes before they posted anything. I wish they would post to their status blog as soon as they have identified a problem; that way, people like me wouldn't have to tie up their support channels with a question that could easily be answered on the status page.

New Features?

While checking my Cloud Files account I noticed references to options that will enable users to move backup images to Cloud Files. I couldn't actually find the buttons that would allow the move, but the interface now shows image locations and claims there is a move button.

To create an On-Demand image, click the New Image button below. Images located in Cloud Files will remain even after deleting their parent server. Images located With Server will be deleted if you destroy their parent server. To move an image to Cloud Files, click the Move link in the table below.

I’ve been waiting for this feature for a long time. Storing my servers’ backups in Cloud Files means I can create a new server, try something for a few hours, back it up and then delete my server. Then a few days later I can load that saved server image from Cloud Files and continue where I left off. At the moment, as soon as I delete a server it’s gone for good.

I am still a big fan of Rackspace's Cloud services, and I am eagerly awaiting the ability to store my backups in Cloud Files.

I have just finished implementing Rackspace Cloud Files as the storage method for user uploaded product files on my Just1Registry online wedding registry service. I am very pleased with the results. Having the photos stored on a CDN has greatly improved page load time for my users and lessens the load on my servers.

Since the switch to Cloud Files worked so well, I thought that it would be great to use Cloud Files to store regular MySQL database dumps of my production server. I use Rackspace Cloud Servers for most of my sites, and yes, I know they have full image backups available (which I do use), but I wanted another layer of backup. More precisely, a simpler one. The only data on my servers that needs protecting is the database. The code is safe in my Springloops Subversion repository, and as I said previously, user uploaded product photos are tucked inside Cloud Files. So it seems a waste to have to restore my server to a new server instance just to grab a copy of my database.

My quest for Rackspace Cloud Files based backup solutions started where all my quests start . . . I asked Google.

I came across this article which put me on the right track, but it wasn’t the full solution I was looking for. After a little more research (specifically learning more about Duplicity), I put my own bash script together and was dumping my database to Cloud Files in no time.

The following is a step-by-step procedure to set up a script on Ubuntu 9.10 that will dump a MySQL database and upload the dump to Rackspace Cloud Files.

Things you’ll need:

  1. Rackspace has released source code for their API in multiple languages. In this instance we need the Python libraries. You can download python-cloudfiles here from GitHub or use wget and the link I provide later on.
  2. Duplicity makes our job so much easier – it’s a beautifully simple tool. You could download it here but I’d recommend doing an apt-get.
  3. Get your Rackspace Cloud API key from your Rackspace Cloud account. If you don’t have an account . . .  get one!
  4. If you’re backing up a MySQL database you might want to make sure MySQL is installed on your system 🙂

The first step is to log in to your server. You will need root access for most of the following steps.

Next, install Duplicity.

sudo apt-get install duplicity

Next, download python-cloudfiles from GitHub, extract the files, and run the setup script. Note that GitHub names the tarball after the commit hash, so the exact file name you get may differ from the one shown below.

wget http://github.com/rackspace/python-cloudfiles/tarball/1.7.0
tar -xzf rackspace-python-cloudfiles-8a1e850.tar.gz
cd rackspace-python-cloudfiles-8a1e850
python setup.py install
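To confirm the library installed correctly, a quick sanity check (Ubuntu 9.10 ships Python 2, so the print statement works):

python -c "import cloudfiles; print 'python-cloudfiles is installed'"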

At this point, we are ready to create the script that will perform the MySQL dump and the Cloud Files upload. You will need to have created a container in your Cloud Files account, and you will also need your Cloud Files API key and username.

Create the following script. Note that the Cloud Files portion of this script was taken from here.

sudo nano /root/backup-mysql.sh

Contents of backup-mysql.sh

#!/bin/bash

# name of the database to dump, and a MySQL user and password with access to it
MYSQL_DB="mydatabase"
MYSQL_USER="username"
MYSQL_PASS="password"

# create the output file name from the database name, date and time
OUTPUT_PATH="/backup/mysql"
NOW=$(date +"%Y-%m-%d")
FILE=${MYSQL_DB}.$NOW-$(date +"%H-%M-%S").sql.gz

# Cloud Files settings; replace the quoted placeholders with your own values
CLOUDFILES_CONTAINER="mysql-backup"
export CLOUDFILES_USERNAME="your_cloudfiles_username"
export CLOUDFILES_APIKEY="your_api_key"
# passphrase Duplicity will use to encrypt the backup
export PASSPHRASE="your_backup_passphrase"

# make sure the dump directory exists
mkdir -p ${OUTPUT_PATH}

# dump the database and gzip it
mysqldump -u ${MYSQL_USER} -p${MYSQL_PASS} ${MYSQL_DB} | gzip -9 > ${OUTPUT_PATH}/${FILE}

# back up the dump directory to the Cloud Files container
duplicity ${OUTPUT_PATH} cf+http://${CLOUDFILES_CONTAINER}

Give the script execute permissions:

sudo chmod +x /root/backup-mysql.sh
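Before scheduling it, it's worth running the script once by hand and then asking Duplicity for the status of the backup chain with its collection-status action. A quick sketch, assuming the mysql-backup container from the script above (the CLOUDFILES_USERNAME, CLOUDFILES_APIKEY and PASSPHRASE variables must be exported in your shell for the second command):

sudo /root/backup-mysql.sh
duplicity collection-status cf+http://mysql-backup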

Create a cron job

sudo nano /etc/crontab

Add this line to /etc/crontab

15 * * * * root /root/backup-mysql.sh >/dev/null 2>&1

Since I already have daily backups, I have set up my script to run every hour. This may be a problem in the future, but it seems to be fine for now. Note: the reason I am sending the output of the backup script to /dev/null is that even on successful backups Duplicity generates output, which is automatically emailed to me when the job runs. I dislike getting hourly emails.

The script could use some work, but it does its job at the moment.

To restore files from Cloud Files, use the following script:

#!/bin/bash

# path to restore files to
OUTPUT_PATH="/backup/mysql"

# Cloud Files settings; use the same values as in the backup script
CLOUDFILES_CONTAINER="mysql-backup"
export CLOUDFILES_USERNAME="your_cloudfiles_username"
export CLOUDFILES_APIKEY="your_api_key"
# the same passphrase the backups were encrypted with
export PASSPHRASE="your_backup_passphrase"

# restore the latest backup from the Cloud Files container
# (Duplicity refuses to overwrite an existing directory; add --force if needed)
duplicity cf+http://${CLOUDFILES_CONTAINER} ${OUTPUT_PATH}
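Duplicity can also restore from a point in time with its -t option, and a restored dump can be loaded straight back into MySQL. A sketch using the placeholder names from the scripts above (the actual dump file name will follow the date pattern the backup script generates):

# restore the backups as they existed three days ago into a fresh directory
duplicity -t 3D cf+http://mysql-backup /backup/mysql-restore
# load a restored dump back into the database
gunzip < /backup/mysql-restore/mydatabase.2010-05-01-02-15-00.sql.gz | mysql -u username -p mydatabase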

The duplicity command has many powerful options; you can view the Duplicity man page here.
