Moved again!

I swear, I seem to move hosts more often than I blog these days. Oh well, this time I've moved away from AWS to a combination of Laravel Forge and Digital Ocean.

There were a number of reasons for this, but largely it was to consolidate the hosting of various hack apps alongside my blog. I had apps hosted on AWS, Pagodabox and Fortrabbit. Lots of these were just little hack apps that I wasn't really happy to pay to host. Recent changes to Fortrabbit and Pagoda meant I could no longer host there for free, so I decided to throw them all onto a Digital Ocean server powered by Forge.

I've used Digital Ocean at work before and it's a great service, but I can't really be bothered to do server admin on hack stuff, and Forge makes that whole process a breeze. It took about 30 minutes, but in that time I'd purchased and set up both the Forge and Digital Ocean accounts, migrated this blog from AWS, and moved the first of my hack apps across. With the quick deploy feature on Forge I'm able to deploy updates to my apps via commits to GitHub. This gives me all the power and convenience of Pagoda and Fortrabbit deployments.

Pricing wise, I'm able to host my blog and all these other little sites for around $20pcm total ($10pcm for each service). This is roughly what I was paying for my AWS account too, but I've got much more flexibility here. I never liked AWS much, and their console is horrible. It took me almost as long as the rest of the work just to work out how to terminate the EC2 instance. Safe to say I'm not sad to see them go.

In terms of Pagodabox and Fortrabbit, things are a bit different. Both have been really handy services for me, and I still use Fortrabbit for a couple of things at work. Pagodabox is in the middle of a big infrastructure update, and good luck to them with that. They're just not suited for this purpose, but I'm sure I'll use them both again.


Composer on Pagoda Box

Composer recently switched from using GitHub's HTTP interface to using the GitHub API to load dependencies. This caused issues on Pagoda Box (and probably elsewhere) as the number of unauthenticated API requests stacked up. For me this was causing deployments to be aborted, as GitHub blocked the further requests my application needed to fulfil its Composer requirements. The solution is to avoid the GitHub API and install from source instead. It turns out this is actually very simple.

If you have something like this in your boxfile:

php composer.phar install

then you need to update it to force installation from source:

php composer.phar install --prefer-source
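Alternatively, if you'd rather keep using the GitHub API, Composer can authenticate its API requests with a GitHub OAuth token, which lifts the unauthenticated rate limit. A sketch of the relevant fragment in a global Composer config.json (the token value is a placeholder you'd generate in your GitHub account settings):

```json
{
    "config": {
        "github-oauth": {
            "github.com": "YOUR_OAUTH_TOKEN"
        }
    }
}
```

With the token in place, `php composer.phar install` can carry on hitting the API without tripping the rate limit.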


Integrating offline documentation into my workflow with Dash

People like to compare programming languages, or their favourite frameworks. Everyone tends to have a favourite, and the subsequent discussions (arguments/flame wars) rarely add any value. Lots of non-PHP programmers delight in the various "PHP sucks!" articles; sometimes they're right, but often they miss the point. PHP can suck, but so can any programming language when used incorrectly.

That said, I'm pretty fond of PHP. It's been my programming language of choice for the last 10 years, give or take. I happen to like it, and I tend to think there's a lot to like about it. If I were to distill this down to a killer feature though (as people tend to do), I'd suggest it's nothing in the language itself. For me the killer feature of PHP has always been the documentation. The PHP docs rock. They look awful (hopefully not for much longer), but they add an incredible amount of value to the language, and the comments are usually excellent too. I've lost count of the number of times the answer I was looking for, whatever the question, was in either the documentation itself or the user-contributed notes beneath it. If you're a PHP dev then you will have found this too.

The only problem with the PHP docs is how much I rely on them at times, and that they require an active internet connection. Admittedly, times when the internet is flaky or non-existent are few and far between, but they do happen: travelling on trains, planes (or automobiles), for example. Recently, though, I've found a solution to this issue: a great Mac app called Dash.

[Screenshot: launching Dash from Alfred]

Dash describes itself as "… an API Documentation Browser and Code Snippet Manager. Dash stores snippets of code and searches offline documentation". If you want a searchable copy of the PHP docs available offline, then Dash is for you. Of course, Dash is not limited to PHP; there are docsets for most languages and libraries you can think of. This is already powerful, but recently I've picked up on a couple more features in Dash that make it a real killer for me, and now, whether I have internet or not, I tend to use Dash for all documentation reading. One of these is the hookup it has with Alfred, another excellent Mac app. I can launch Alfred with the shortcut ⌥+SPACE, and by typing "dash SOMETHING" the docs are immediately searched for that item, for example "dash array_intersect".

[Screenshot: a function definition file within Netbeans]

This is pretty sweet. It's quick, it's offline, and somehow it feels distraction-free, being outside of the browser. This definitely scratched an itch for me, and I wondered if Dash could do anything more that I would find useful. It turns out it could.

I'm a big fan of the Netbeans IDE for all my PHP development. Yeah, I know all the cool kids are using Sublime Text with its infinite customisation, but I like Netbeans. I've been using it for years and I'm happy with it. One thing I've never liked so much, though, is the way it handles documentation for core PHP functions and libraries. Hovering over a function will display the PHPDoc for that code, whereas ⌘+Click will take you to the function. This works great for custom code but not so great for the core language code. I want to see this documentation on the PHP site; the Netbeans method of displaying it within a function definition file just doesn't cut it.

Fortunately I've found a method of integrating Dash here too, using the "Look up in Dash" System Service. I've been a Mac user for about 3 years and I've always been a little confused by services; this is the first time I've made any use of them. Through a System Service I've been able to connect a shortcut in Netbeans to Dash. I've chosen ⌘+⇧+D. By highlighting a piece of code and using that shortcut, Dash is launched and immediately searches for that code. Of course you can use any shortcut you like, but this works for me. Pretty simple in the end, and now I have Dash linked into my workflow quite effectively.

A new home for my blog

It seems like every other post on this blog has been about a new location for my blog. For the last couple of years I've been happily hosting my site with the same provider. They've provided a great service, but I just felt like I needed a change. Predominantly this was because I was doing some investigation into cloud hosting solutions for work and realised I'd never had a proper play with Amazon's EC2, the backbone of much of the "cloud" infrastructure on the web. So here I am, in a new home for the foreseeable future. I'm also attempting to blog more frequently. Hopefully both things will go well.

Buying presents and the trick to being thoughtful

Like most people (men, more often than not) I used to really struggle whenever it came time to buy a present for a loved one. Whether it was Christmas or a birthday, I would inevitably end up flustered and forced down the route of the unimaginative book or chocolates. There were even times when I totally forgot the birthdays of people who mean everything to me (that's an altogether sadder story though, so maybe I'll leave it for another time). This was clearly not on, but a couple of neat little tricks have helped me break this cycle of being a bad son, boyfriend and friend.

The first you'd think is a total no-brainer, but it was something I'd never really done. I'm not great with dates; I could remember a few key ones, but the birthdays of all my friends and family were beyond me. These are all now stored as repeating birthday events in iCal, which in turn is synced to Google Calendar. Don't just rely on Facebook to remind you about birthdays. I check my calendar most days for one thing or another, so this way I can see all important events as they approach and plan ahead. I do most of my shopping on the internet, so it's always vital to leave that all-important shipping time.

It's all well and good getting prior notice that you need to buy a present, but that's pretty meaningless if you're still devoid of inspiration. This is where a very simple lifehack that a good friend told me about has really changed my present-buying behaviour: keep a list. Often, one of the most frustrating things about a lack of present inspiration was that nagging feeling that the person in question had mentioned something they needed, but you could no longer remember what they'd said. This is where my super simple lifehack comes in. I have a list within the Notes app on my iPhone, subdivided by each important person in my life that I tend to buy presents for. Each time I'm with one of these people and they mention something they'd like, or have lost, or are interested in, I add it to their list. When present-buying time comes around I look to the list, and invariably there is something there to choose from, or at least to give me a theme or a starting point. As a result I've managed to come up with some pretty kick-ass presents in the last few years.

I guess this may make me sound a little cold and lacking in heart (a bit OCD also, maybe?), and perhaps the title of this post would have been more honest as "The trick to appearing thoughtful". Then again, maybe this little lifehack means I am actually thoughtful; you can decide that for yourself. All I know is that it works for me.

While we're on the subject of presents, whoever perpetuated the myth that alcohol is a thoughtless and pointless present clearly did not really enjoy a drink. If in doubt what to buy for me, buy me booze.

Using multitail for monitoring multiple log files

Like many developers, my job tends to include a number of low-level sysadmin tasks. I generally have a terminal open most of the day for one thing or another, whether working locally or SSH'ed into one of our remote servers. Once an app is in production it's really handy to keep an eye on the server logs to see what's happening and be able to respond proactively to errors as they occur. Multitail is a great tool I found for monitoring multiple log files at the same time, helping to keep all of this monitoring in a single window.

In simple terms, multitail allows you to monitor multiple files simultaneously. In my case this is almost always the Apache error_log file, but it could be access logs, FTP logs or anything really.

A simple use of multitail could be:

multitail \
  -l "ssh root@REMOTE.IP.1 tail -f /usr/local/apache/logs/error_log" \
  -l "ssh root@REMOTE.IP.2 tail -f /usr/local/apache/logs/error_log"

One of the most powerful features in multitail is the ability to add exceptions based on regular expression patterns. These allow you to filter out any errors you're less interested in; for example, if you're monitoring a log for PHP errors you may not care about 404s. This leads to a more advanced multitail usage like the following, which includes named windows and splits multitail into vertical columns:

multitail -du -C -s 2 \
  -Ev "does not exist" -Ev "filter this" -Ev "dont show this" \
  -t WindowName1 -l "ssh root@REMOTE.IP.1 tail -f /usr/local/apache/logs/error_log" \
  -t WindowName2 -l "ssh root@REMOTE.IP.2 tail -f /usr/local/apache/logs/error_log" \
  -t WindowName3 -l "ssh root@REMOTE.IP.3 tail -f /usr/local/apache/logs/error_log" \
  -t WindowName4 -l "ssh root@REMOTE.IP.4 tail -f /usr/local/apache/logs/error_log"

Installation of multitail is really simple if you're using Homebrew: simply "brew install multitail" and you're ready to go.
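The -Ev exceptions behave like an inverted grep over the incoming log lines. As a rough illustration of what they filter out (the log lines below are made up for the example):

```shell
# simulate multitail's -Ev exception filtering with grep -v:
# lines matching the pattern are dropped, everything else passes through
printf 'PHP Fatal error: boom\nFile does not exist: /favicon.ico\nPHP Warning: careful\n' \
  | grep -v "does not exist"
# prints the two PHP lines, dropping the 404-style noise
```

A pattern passed to -Ev simply keeps matching lines out of that multitail window in the same way.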

Removing duplicate rows in MySQL

It's often the case that you find application issues only at the stage they become problematic. MySQL is one of the most common sources of these for me, whether it's something as simple as a missing index or something far more fundamental in your schema. A recent issue I came across had been caused by some far-from-perfect code that updated elements within an ecommerce CMS over an API connection. A table that should realistically have no more than 10,000 rows had grown to over 4 million, causing an almost inevitable slowdown in all interactions with that table. Looking at the table, there was a huge amount of data duplication: rows tended to be duplicated on everything but the primary key. The question was how to remove this duplicate data without running a long PHP or shell script against the production database. The answer was surprisingly simple, one of those times where a single SQL command is all that's needed.

ALTER IGNORE TABLE `table_with_duplicates` 
ADD UNIQUE INDEX `remove_duplicates` (`col_1`, `col_2`,  `col_3`);

An explanation of how this works can be seen on the MySQL site:

IGNORE is a MySQL extension to standard SQL. It controls how ALTER TABLE works if there are duplicates on unique keys in the new table or if warnings occur when strict mode is enabled. If IGNORE is not specified, the copy is aborted and rolled back if duplicate-key errors occur. If IGNORE is specified, only the first row is used of rows with duplicates on a unique key. The other conflicting rows are deleted. Incorrect values are truncated to the closest matching acceptable value.
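Before running the ALTER against production, it can be reassuring to estimate how many rows will go. One hypothetical way to eyeball "duplicated on all but the primary key" is to export the table to CSV and compare rows with the key column stripped (the sample data and column layout below are made up):

```shell
# rows.csv: primary key in column 1, the remaining columns hold the data;
# rows 1 and 2 are duplicates on everything except the key
printf '1,a,b\n2,a,b\n3,c,d\n' > /tmp/rows.csv

# strip the key column, then show value combinations that occur more than once
cut -d, -f2- /tmp/rows.csv | sort | uniq -d   # prints: a,b
```

Any line printed by uniq -d is a set of column values the unique index would collapse down to a single row.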

Working with Zend Tool in multiple dev environments

On any given Zend Framework project I can be working in 2 or 3 locations – my work PC, home PC or my MacBook. My source code will always be in Subversion and I usually work on a development server before pushing completed work to the production server. In this kind of environment I’ve never been too sure where exactly I should work with Zend_Tool.

As I see it, there are two options:

  • Set up to work locally with Zend_Tool on each dev environment and then push to the dev server from there, checking in the Zend_Tool manifest etc with each Zend_Tool usage.
  • Use Zend_Tool directly on the dev server and then download each addition/alteration to then push into SVN.

I would be inclined to say the most reliable way would be the multiple Zend_Tool setup, but I'd be interested to hear if people can think of any potential issues with this, or reasons why I should make a different choice.

n.b. I originally posted this as a question on Stack Overflow – feel free to drop in over there and answer the question.

Simple introduction to using Oauth with Zend_Service_Twitter

The Zend Framework is slowly changing the way I develop websites. I say slowly, as the documentation for much of the framework is sadly lacking: it is overly complex at times, and at other times short on required detail or just out of date. In putting together a Twitter-based application recently, I came across such an occasion with regard to Twitter's recent move to turn off basic authentication. Hopefully this might help someone else too.

If you're looking to build a Twitter application using the Zend Framework, there is some half-decent information on the Zend Framework site in the section dealing with OAuth. However, it was lacking in some detail, and the documentation for Zend_Service_Twitter was also not particularly useful, as it has not been updated with examples of how to use OAuth. What I've put together is a really simple example of the process you could follow to make this work. The beauty of the Zend Framework is that there are always a number of ways to achieve the same task, so this is by no means the "right" way to do it; that being said, it works for me.

First of all, I added the configs I would need to my config file, application.ini:

oauth_consumer.callbackUrl = "";
oauth_consumer.siteUrl = "";
oauth_consumer.consumerKey = "MY_CONSUMER_KEY";
oauth_consumer.consumerSecret = "MY_CONSUMER_SECRET";

It's probably worth noting that I use sessions to persist a few values in this example; they are set up like this within my TwitterController.php file:

class TwitterController extends Zend_Controller_Action
{
    protected $session;

    public function init()
    {
        $this->session = new Zend_Session_Namespace('Default');
        // etc..
    }
}

Given the multitude of options for how your Zend application might be put together, I won't tell you where the rest of the code should be placed, as that's completely dependent on your application.

First of all, get your request token and then redirect the user to Twitter so they can grant access to your application:

// within TwitterController::authAction
$config = $this->getInvokeArg('bootstrap')->getOption('oauth_consumer');
$consumer = new Zend_Oauth_Consumer($config);
$token = $consumer->getRequestToken();
$this->session->request_token = serialize($token);
$consumer->redirect(); // send the user to Twitter to authorise the app

Following authorisation at Twitter, the user will be returned to your callback URL, identified in your config file as oauth_consumer.callbackUrl. Using a combination of your request token and the response from Twitter, the unique user access token is generated with the getAccessToken method of Zend_Oauth_Consumer.

// within TwitterController::callbackAction
$config = $this->getInvokeArg('bootstrap')->getOption('oauth_consumer');
$consumer = new Zend_Oauth_Consumer($config);
$access_token = $consumer->getAccessToken($this->_request->getQuery(), unserialize($this->session->request_token));

That was all pretty simple; you now have an access token that you can store for all later usage. It was unclear from the documentation what the next step should be, though, as all the examples of Zend_Service_Twitter used basic authentication. Looking at the source code, I noticed that for a valid signature you have to pass the Zend_Service_Twitter constructor an array of options that includes the config variables from your original Zend_Oauth_Consumer request, as well as your access token and Twitter screen name:

$token = unserialize($user->getUserToken()); // retrieve the stored access token
$configs = $this->getInvokeArg('bootstrap')->getOption('oauth_consumer');
$twitter = new Zend_Service_Twitter(array(
    'username' => $token->screen_name,
    'accessToken' => $token,
    'consumerKey' => $configs['consumerKey'],
    'consumerSecret' => $configs['consumerSecret'],
    'callbackUrl' => $configs['callbackUrl'],
));
$response = $twitter->account->verifyCredentials();

That's it: you are now able to make use of all the methods within Zend_Service_Twitter. Happy tweeting!

Moving from Media Temple to Linode

This weekend I finally found the time and motivation to do something I probably should have done a couple of years ago: I started moving my sites away from Media Temple's Grid Server. If you're reading this post, then I guess the migration to my shiny new Linode VPS has been successful. I was very tempted to write an in-depth discussion of why I left Media Temple and why I chose Linode, but I decided not to for a few reasons:

  • This could just be my opinion – feel free to do your own research
  • Media Temple are likely too busy schmoozing at industry parties to care
  • It's sunny outside and I can smell BBQ
  • I've already exhausted too many hours trying to make my Grid Server usable; I don't intend to waste any more time even thinking about them
  • I'm a geek with a shiny new toy that doesn't suck, and I intend to play with it now