No Cloud Files

Rackspace Cloud Files was down for about an hour today. This had no effect on the connected CDN, but it meant that none of my sites that use the Cloud Files API would work. I contacted support, who told me there was a problem with the Cloud Files servers and that they would be posting the outage on the Rackspace Cloud Files Status blog shortly. Cloud Files must have been down for at least 15 minutes before they posted anything. I wish they would post to their status blog as soon as they identify a problem; that way, people like me wouldn’t have to tie up their support channels with questions that could easily be answered on the status page.

New Features?

While checking my Cloud Files account I noticed that there seem to be references to options which will enable users to move backup images to Cloud Files. I couldn’t actually find the buttons that would allow the move, but the interface shows image locations now and claims there is a move button.

To create an On-Demand image, click the New Image button below. Images located in Cloud Files will remain even after deleting their parent server. Images located With Server will be deleted if you destroy their parent server. To move an image to Cloud Files, click the Move link in the table below.

I’ve been waiting for this feature for a long time. Storing my servers’ backups in Cloud Files means I can create a new server, try something for a few hours, back it up and then delete my server. Then a few days later I can load that saved server image from Cloud Files and continue where I left off. At the moment, as soon as I delete a server it’s gone for good.

I’m still a big fan of Rackspace’s Cloud services, and I’m eagerly awaiting the ability to store my backups in Cloud Files.

I can’t say I learned a lot at school, but I did pick up a few tips that helped me polish up my self-taught coding technique.

During my study of software engineering, my instructors always stressed two things: modular design and documentation. Simple, yet excellent concepts that will save you–and anyone after you–hours and hours of time. I’ve been applying these principles to all of my code ever since I learned about them, but I’ve only recently begun to understand just how important they really are.

I’ve been working with some very unmodular and undocumented code recently, and I’ve begun to develop an immense dislike for the original author. I’m supposed to add simple little features here and there. By themselves, the features are small–around 10 to 100 lines each–and would take maybe 10 minutes each to write. Adding them to a well-documented, modular project might take an hour or two, including some testing. But incorporating these tiny features into an existing code base that is unmodular and undocumented takes 10 to 20 times that. Believe me, I know from experience. I face the constant urge to rewrite the entire code base from scratch. I know it would take longer to rewrite everything, but it would be much less painful.

Logical code is easy to follow and easy to change. If something is a little strange, then you fall back on the documentation to figure out why. But when the code is an absolute pile of garbage with no documentation, looking like it was written by a first-time coding high schooler, you spend most of your time trying to figure out why the previous author did what they did, how they did it, and why they were ever allowed to touch a computer keyboard in the first place.

The problem increases exponentially when dealing with web projects, especially ones coded in PHP. Modular design in web projects has always been possible but somewhat awkward, considering that up until PHP 5 we had very poor object support. Only in the last few years have there been any decent frameworks that employ an MVC structure. My favourite, of course, is Zend Framework. Before Zend Framework, I had been working on an MVC-like PHP framework of my own to help alleviate the cumbersome structure that even my PHP projects sometimes took on. After trying Zend Framework, I never touched my own framework again.

Having come to realize the power of a good PHP framework and the importance of documentation and modularized code, it just makes me want to cry–and wish horrible things upon the original author whose code is making my job so much more difficult than it should be.

After reading a great comment on my blog post 11 benefits of having a prosthetic eye covering, I decided to check out Paul’s site. I rather enjoyed reading the post, Adapting to Monocular Vision. It would have been really helpful to have read this immediately after I lost sight in my right eye. I would have been more prepared by knowing what to expect. I can relate to almost everything mentioned, which shouldn’t be surprising considering it was written for people like me. It’s nice to know I’m not alone.

The most frustrating thing for me was putting ketchup on hot dogs. It’s incredibly difficult to line up the ketchup bottle with the hot dog. It took me three attempts: first time I missed the plate, second and third time I missed the bun. My sister was watching and I could tell she was quite amused by the way she was rolling around on the floor laughing hysterically.

The other cool/scary thing is stairs. Certain staircases can actually look flat when you’re looking down them. If there are no shadows on the stairs, or they have some sort of patterned tile, I can’t tell exactly where each step begins and ends. Needless to say, I always use the railing, at least to start. Going up stairs isn’t usually a problem; falling up the stairs is much less painful than falling down them.

The worst thing is bashing my head or shoulder on things because I don’t see things on my right side. It’s very painful and worst of all, you don’t see it coming–literally. There have been several door frames I wanted to punish for attacking me for no good reason.

I’m very curious if my pool playing abilities have improved due to having one eye. I’m pretty sure they haven’t. I thought my mini-golf skills had improved, but my two-eyed wife can still beat me.

The tip about sitting at the table in restaurants was interesting. It’s very annoying to talk to someone sitting beside you at a table, because you literally have to turn your entire head to see them. Of course, this looks rather strange to the rest of the people at the table. My preferred seat is the right-most corner of the table, preferably on the opposite side from where the server approaches.

The other fun thing is not noticing someone approach. For some reason, I set up my home office so that my right side is facing the door. Not good. My wife scares the living daylights out of me, albeit unintentionally, on a regular basis. I have offered to buy her tap shoes but she declines.

I’ve been using good old lefty for almost five years now and I’m getting pretty used to it. I can now successfully put ketchup on a hot dog, and I can pour liquids into a glass 95% of the time. Bashing my head on things is not fun and getting foreign objects trapped underneath my prosthesis is incredibly painful, but all in all, I don’t mind having one eye. Could be worse, I could have no eyes at all. 🙂

When I lived in Hamilton, I used Mountain Cable (now Shaw) as my Internet Service Provider. The service was excellent: very fast, with no usage limits. I could download hundreds of gigabytes per month and no one cared. I have since moved to Kitchener, where I was introduced, like it or not, to the wonderful world of Rogers. I should note that I dislike Bell and anything Bell-related, and that I’ve never had a positive experience with DSL. So Rogers high speed it was.

Unlimited for only $50 (formerly $25) more per month? No thanks

I signed up for the Rogers Extreme plan which includes 10 Mbps down, 1 Mbps up, and 95 GB of usage. I was a little concerned about the internet usage limit, but I wasn’t worried because I also noticed that additional usage was billed at $1.50/GB to a maximum of $25. $25 more for unlimited internet? Sounds good. I generally was under the 95 GB mark, but if I went over I made sure I went way over to justify the extra charge.

Unfortunately, at some point without my knowing it, Rogers increased the maximum additional usage charge to $50, which means my total internet bill could be over $110. That’s too high for me. I noticed Rogers had two faster plans available: Extreme Plus for $69 (25 Mbps down/1 Mbps up/125 GB) and Ultimate for $99 (50 Mbps/2 Mbps/175 GB). I figured I would just upgrade to Extreme Plus and pay $10 for an extra 30 GB–that should be enough, I thought. I phoned up Rogers and upgraded my plan. I was a little unhappy to be told I would need a new modem to take advantage of the faster speed, but I was more concerned with the extra $7 per month I’d have to pay just to rent the thing. Also, the new modem was actually a Wireless N gateway. That’s nice, but I already have a Wireless N access point and use pfSense as my router/firewall. I was not impressed, but I figured the extra GB and 25 Mbps down were worth the inconvenience.

The swapping of the modem

The next day, I went to the Rogers store and swapped my modem. I also decided to purchase the new modem instead of renting it–renting would have paid for it after about 28 months at $7. I took my new modem home and set it up. The first thing I did was log in using the MSO credentials to disable the gateway nonsense. I’m very glad they let me do that, although it’s a shame I just paid $199 for features I’ll never use. It was really easy to set up. Once the gateway features were disabled, pfSense was able to get my public IP from the modem instead of the 192.168.0.10 address it had previously been assigned. I then excitedly went to speedtest.net and ran a speed test (see below).

I’m about 15 Mbps short…?

…hmm, that’s nice, I thought, but I’m paying for 25 Mbps. I ran it again, and again, then tried the official Rogers speed test, a bunch of other ISP speed tests, and even the Speedtest app on my iPhone. Each showed very similar results; none would go over the 10 Mbps barrier. I knew it was incredibly unlikely that a busy network would give such consistent results, and being so close to 10 Mbps couldn’t just be a coincidence, so I called technical support. I was actually going to use the live chat support, but when I read the disclaimer about closing any windows on your computer that you didn’t want the tech support agent to see, I immediately picked up the phone. I’m probably more qualified than most of the tech support at Rogers, so there’s not a chance I’d let some newbie touch my precious Mac–assuming they even support Macs.

I had to resort to calling tech support

Tech support was helpful and figured out my problem right away. The genius who upgraded my account the day before hadn’t properly checked whether DOCSIS 3 had been rolled out in my area yet. It hadn’t. The tech support guy said it should arrive in a few months but couldn’t promise anything. Wonderful. I asked if I could keep the Extreme Plus plan at the old speed, but with the increased usage limit of 125 GB. He said I couldn’t officially be switched to the new plan until DOCSIS 3 was in my area, so I was stuck with my 95 GB limit. Although I was unhappy about this, he was nice and friendly and answered all my questions fully. He then transferred me to customer service so I could downgrade my account.

I must say the customer service representative I spoke to was one of the best I’ve ever encountered. She was very friendly and helpful, and instead of the typical silence one experiences while the representative types away on a computer, she told me exactly what she was doing and why. I greatly appreciated her friendly approach. To compensate me for my inconvenience, Rogers gave me a $5 goodwill credit–which will barely dent my additional usage fees for the month, but it’s better than nothing. Anyway, I’m on the right plan now and will probably try upgrading to Extreme Plus again when it’s available in my area. Apparently I’ll know when that happens, because I will be “bombarded with marketing” according to my friendly Rogers customer service representative. I don’t doubt it.

Disappointed but content

Overall, I am happy with my Rogers service. It’s always super fast (even at 10 Mbps), my 3rd party VOIP works perfectly, and I’ve always had a good experience with Rogers customer service. I just wish the sales guys would not sell services to customers that they can’t use. Oh, and it would be great if Rogers offered a cheap DOCSIS 3 compatible modem with no gateway functions.

Version Control Can Be Dangerous

Version control for a website sounds like a great idea. Everyone wants to be able to make risk-free changes to their website. It’s a great feeling to know that at any given time you can revert to previous versions of files. With a VCS (Version Control System) like Subversion, you get a nice central location for all your code, a web interface to browse your files and, well, it’s a great thing to brag about to all your techie friends. The only problem is that it can seriously screw up your web development workflow if the implementation of such a system is not thought out carefully.

One Dilly of a Pickle

The biggest potential hurdle in setting up a VCS for web development is the web server. Unless you’re an HTML-only web developer (hopefully they don’t even exist anymore), you either have a special development web server set up or you test your scripts on your live web server. Once you’ve figured out how to check your files in and out of your VCS, you’ll realize that you need to get those files onto your web server. If you don’t have root access to your server, you might be in trouble. At this point you’ll want to stop everything and re-evaluate how a VCS fits into your web development workflow. I’ve had to re-evaluate my processes a few times; what follows describes what I’ve learned from the various workflows I have experimented with, as well as a detailed description of my current web development workflow.

Remote Web and Subversion Server Accessed by Samba Share…

…a theoretically simple setup, but in practice it turned out to be a nightmare. I set up a basic LAMP box and installed Subversion on it. I also set up a Samba share and created Apache virtual hosts mapped to those shares. The theory was that I would check out files from the Subversion server to a network drive on my local machine mapped to the Samba share. While working on files I could just hit CTRL+S and see my changes immediately on the remote server, and when I was happy with the changes I would commit them. This setup almost worked–except for the checking-in part. I kept getting file permission errors whenever I tried to commit, because the Windows Subversion client didn’t have permission to write to the Samba share. I spent quite a while trying to figure out what was going on, but I decided that if it takes this much effort to get it working, then no thanks. I don’t like my development to be put on hold because I can’t save my work properly.

This method had one additional disadvantage: by using Samba shares, I was limited to local development unless I wanted to use a VPN or SSH tunnels. It was time to move on.

Remote Web and Subversion Server with Files Checked in to Test

This method involves having Subversion installed on your web server and having it automatically export your project to your web directory on every commit. This works pretty well, and your code is safe because it’s checked in to your server constantly. The only problem is that checking in a file every time you make a change gets very annoying very quickly. There’s also the post-commit script that has to be configured properly, and if you have multiple projects you’ll have to account for that in your post-commit script. In my opinion, it’s not worth the trouble.
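For the curious, a minimal post-commit hook along these lines might look like the following sketch. The repository layout and web root are hypothetical–one project per repository with its code under trunk/–and the web directory would need to be writable by the user Subversion runs as:

```sh
#!/bin/sh
# post-commit hook: Subversion invokes this with the repository path
# and the revision number that was just committed.
REPOS="$1"
REV="$2"

# Hypothetical web root for this project -- adjust for your own server.
WEB_ROOT="/var/www/myproject"

# Export (rather than checkout) so no .svn metadata lands in the web root.
/usr/bin/svn export --force --quiet "file://$REPOS/trunk" "$WEB_ROOT"
```

With multiple projects in one repository, the hook would also have to check which paths changed (svnlook dirs-changed can tell you) before deciding what to export–exactly the extra bookkeeping I mentioned above.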

How I Do It

- I use three machines for development: an iMac running OS X 10.6 and a ThinkPad running Windows 7 at home, plus a PC running Vista at work.
- On each machine, I use Zend Studio 7.1 as my IDE and Zend Server CE 5.0 as my local web server.
- Each of my projects gets its own repository on my Subversion service hosted by Springloops. In most cases, each project has two deployment servers set up: one for production and one for staging.
- Zend Studio handles all my Subversion tasks, although I sometimes use TortoiseSVN (Windows) or a terminal (Mac or Linux) for other non-code files.
- I set up Apache virtual hosts pointing to each project’s working copy, located in my Zend Studio workspace folder.
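As an illustration, one of those virtual hosts might look something like this (the hostname and workspace path are made up, and the access-control syntax is Apache 2.2-era):

```apache
# Illustrative only: hostname and workspace path are hypothetical.
<VirtualHost *:80>
    ServerName myproject.local
    DocumentRoot "/Users/me/Zend/workspaces/DefaultWorkspace/myproject/public"
    <Directory "/Users/me/Zend/workspaces/DefaultWorkspace/myproject/public">
        AllowOverride All
        # Apache 2.2-style access control
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
```

Pointing DocumentRoot straight at the working copy is what lets a save in the IDE show up immediately in the browser, with no check-in or export step in between.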

My Typical Coding Session

My typical coding session would go like this:

1. Update the local working copy of my project in Zend Studio.
2. Write some code and test on my local development web server (Zend Server CE).
3. When I’m happy with those changes, commit them in Zend Studio; Springloops then automatically deploys the code to my staging server.
4. Once the site has been fully tested, log in to my Springloops account and manually deploy the site to my production server.

But What About Databases?

You may have noticed I haven’t mentioned databases. Because of the potentially catastrophic issues that could occur from mixing staging and production databases, I keep them separate. I use both local and remote MySQL servers, depending on the type of project, and I manually synchronize them when necessary using phpMyAdmin.
