Tower Storm and my 30-second, 2-button deployment process

2 years ago I read an amazing book, The Lean Startup, which made me completely rethink how I developed applications and websites. In the past I’d make a change to an application, spend 30 minutes going through the main features, and release. Inevitably a few days later customers would come back saying x, y and z were all broken. So I’d fix them and re-release, and now 3 other completely different features would be broken. This happened consistently, but with limited time and money I thought it was impossible to improve the process.

Today I can make a change to Tower Storm and within 30 seconds have it live online, with very little (soon to be no) manual testing and no old bugs coming back to bite me. In this post I want to show you how I’ve done it.

Automated Testing

The first step is to eliminate the 30+ minutes of manual testing I had to do after every change. There is absolutely no way you can quickly release and iterate on your app without either:

a. An army of testers who are always able to detect any bug and will happily retest everything in your application upon every single change
b. Automated tests that find regressions and issues in your application for you

Since you’re running a lean startup here, I don’t think you’ve got thousands of dollars to burn on dedicated testers, so let’s explore the world of automated testing.

Automated testing is where you write code that tests the functions of your application to determine whether they’re doing what they should. For example, if you have a function that should remove every comma from a string and you want to ensure it works for a variety of different strings, you might create a test like so:

var assert = require('assert');

function testRemoveComma() {
    var sentence = "this,is,a,sentence";
    var expectedSentence = "thisisasentence";
    var actualSentence = removeComma(sentence);
    assert.equal(actualSentence, expectedSentence);
}

In this JavaScript example we first create a sentence with commas in it, then we specify what we expect back from our function. Then we call the function and check that what we got back matches what we expected.

This example is what is known as a “unit test”: a test that checks a single function by giving it inputs and checking its outputs, and that doesn’t do other tasks such as connecting to your database or reading files. It should check one function only. If that function calls other functions, you need to use a technique called ‘mocking’ so that they don’t really get called. I’ll go into more detail on how to create unit tests and mock objects in a variety of languages in a later post.
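To give a rough idea of what mocking looks like, here is a minimal hand-rolled fake (libraries like Sinon do this properly; the saveUser function and fake database below are made up for illustration):

var assert = require('assert');

// The function under test calls a collaborator (the database) that we
// don't want to really call in a unit test.
function saveUser(user, db) {
    db.insert('users', user);
    return user.name + " saved";
}

function testSaveUser() {
    var inserted = [];
    // A fake database that just records what it was asked to insert.
    var fakeDb = { insert: function(table, row) { inserted.push(row); } };
    var result = saveUser({ name: "tim" }, fakeDb);
    assert.equal(result, "tim saved");
    assert.equal(inserted.length, 1); // the collaborator was called exactly once
}

The real function would talk to a real database; the test swaps in a stand-in, so it stays fast and only tests the one function’s logic.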

To start unit testing you’ll need a library to run these tests. Generally there are one or two good testing libraries for most languages. For JavaScript I’d recommend Mocha for Node.js testing or Jasmine for client side testing; for Java, JUnit with Mockito is awesome; and for PHP, PHPUnit works well.

Unit tests are the simplest, fastest and often most fragile tests. They aren’t the best for ensuring your app is bug free but they are perfect for eliminating existing bugs and ensuring they never occur again.

The thing I love about unit tests is that, because they are so fast and easy to write, you can use a process known as test driven development. This is where you write the unit tests for your code before you write the code itself. So in the remove comma example above we would write an empty removeComma function, then write the above test and run it only to see it fail; then we implement removeComma for real, run the test again, and when it passes we know our code is working.
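Here’s what that cycle looks like on the removeComma example (just a sketch of the workflow):

var assert = require('assert');

// 1. Start with an empty stub so the test can run at all:
function removeComma(sentence) {
    return sentence; // deliberately does nothing yet
}

// 2. Run testRemoveComma() from above - it fails, which proves the test
//    actually tests something.
// 3. Now write the real implementation:
//    function removeComma(sentence) { return sentence.replace(/,/g, ""); }
// 4. Run the test again - it passes, so the function does what it should.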

When you do test driven development constantly you can save hours by not needing to run your app after each code change. You simply test then code, test then code, and at the end you run your app and, because every function works as it should, your app should (in theory) work perfectly first go. It’s amazing when you get into this flow, because if you’re building a large app you can continue to code for hours and really get into the zone without having to recompile everything and boot up your app to see if it’s working as it should.

Better testing with Integration and Acceptance tests

After you’ve mastered the art of the unit test, there are even more powerful tests that will allow you to deploy your application without even running it and know that all functionality is working.

You do this by creating integration and acceptance tests. Unlike unit tests, integration and acceptance tests exercise your app in a real environment with database and network calls. Integration tests are similar to unit tests in that they run one function or a group of functions in order and check that they are all working as they should. The difference is that integration tests run the code as if a real user were calling the function: if the function creates records in the database, the integration test really does that, and if your function calls an external service, the integration test really calls it too.

Here’s an example of a PHP integration test in ZenTester:

/**
 * @test
 */
function Checklogin() {
    $this->ci->load->library('userManager');

    $random_key = self::$random_key;
    $reg_data = ut_get_registration_data($random_key);

    //logout first (check_login function in controller does this too).
    $this->assertEquals($this->ci->user->logout(), true, "logging out before logging in");
    $this->assertEquals(is_array($this->ci->usermanager->register_user($reg_data, 0)), true, "Registering User With All information");
    $this->assertEquals($this->ci->user->login($reg_data['email'], $reg_data['password']), true, "logging in to this user");
    $user_id = $this->ci->user->get_user_id();
    $this->assertEquals($this->ci->user->is_valid_user(), true, "Checking that after login we are a valid user.");
    $this->assertEquals($this->ci->user->logout(), true, "Testing logging out");
    $this->assertEquals($this->ci->user->is_valid_user(), false, "Checking that we are not a valid user after logging out.");

    ut_erase_user($user_id);
}

In this integration test we first generate registration data for a new user with the helper function ut_get_registration_data. Then we register and log in as that user. After logging in we ensure the user has successfully logged in and is valid. Then we log out and check that this also worked. Finally, the user is deleted at the end.

In this case we create and clean up all our data so the database isn’t littered with test data. The downside of always deleting your data at the end of the test is that it can be hard to track down why an integration test is failing, because you can’t see what was created and what wasn’t. At Wotif we don’t clean up our data at the end of each test; instead we re-use the test data on every run and delete old data at the beginning of each test. This way you don’t add much test data to the database while still being able to figure out what went wrong when a test fails.

Acceptance tests are another level of abstraction: they use your app from a user’s perspective, loading pages, clicking on links etc. and ensuring that what is shown to the user after performing specific actions is correct. They are often done with tools such as Selenium or cURL. At Wotif we’ve been using Cucumber-JVM to run Selenium on a remote box, which loads up our app, tests that all the main features are working from a user’s perspective, and reports if anything is broken. These are then run automatically by TeamCity every time we push a change.
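To give a flavour of what an acceptance test looks like, here’s a rough sketch using Node’s selenium-webdriver package (the page URL and element names are invented for illustration):

var webdriver = require('selenium-webdriver');
var By = webdriver.By;
var until = webdriver.until;

// Drive a real browser through the login flow, just like a user would.
var driver = new webdriver.Builder().forBrowser('firefox').build();

driver.get('http://localhost:3000/login');
driver.findElement(By.name('email')).sendKeys('test@example.com');
driver.findElement(By.name('password')).sendKeys('secret');
driver.findElement(By.css('button[type=submit]')).click();

// If the dashboard never shows up, the test fails.
driver.wait(until.titleContains('Dashboard'), 5000);
driver.quit();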


Using GruntJS to build your assets 

Grunt is the second most amazing part of the deployment process. It takes the application and builds it so it’s ready to upload. It currently does all of the following (the Grunt plugin used for each item is in brackets):

  • Bumps the game version number (grunt-bump)
  • Checks the JavaScript for any errors (lint)
  • Cleans the build directory (grunt-contrib-clean)
  • Copies all the correct files to the build directory (grunt-contrib-copy)
  • Turns all the CoffeeScript into JavaScript (grunt-contrib-coffee)
  • Builds the core game JavaScript into a single js file (grunt-exec, which runs impact.js’s build tool, which is coded in PHP)
  • Uglifies (minifies) the JavaScript along with all external libraries and config files into a single game.min.js file (grunt-contrib-uglify)
  • Compiles all the LESS CSS files (grunt-contrib-less)
  • Minifies the CSS and puts it into one file (grunt-contrib-cssmin)
  • Compiles the Jade files into HTML (grunt-contrib-jade)
  • Uploads all the compressed and compiled assets to a new folder on Amazon S3; the folder name is the current game version number (grunt-s3)

It’s a complicated process, yet Grunt handles most of these tasks with very little configuration and can do all of this in under 30 seconds.
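Here’s a rough sketch of how those steps chain together in a Gruntfile (the plugin loading is abbreviated and the task list is illustrative, not my exact config):

// Gruntfile.js
module.exports = function(grunt) {
    // Each plugin from the list above is loaded like this:
    grunt.loadNpmTasks('grunt-contrib-clean');
    grunt.loadNpmTasks('grunt-contrib-uglify');
    // ...and so on for the rest.

    grunt.initConfig({
        // per-task configuration goes here (see the s3 example below)
    });

    // One top-level task runs the whole pipeline in order:
    grunt.registerTask('build', [
        'bump', 'clean', 'copy', 'coffee', 'exec',
        'uglify', 'less', 'cssmin', 'jade', 's3'
    ]);
};

With this in place the whole build is a single grunt build from the command line.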

The assets are uploaded to a new Amazon S3 folder named after the build’s version number, so assets are never overwritten and users who are still playing the game are not interrupted. You can do this by setting the variable pkg to your package.json file, then using <%= pkg.version %> in your s3 upload script. My s3 task looks like this:

grunt.initConfig({
    bump: {},
    pkg: grunt.file.readJSON('package.json'), 
    s3: {
      bucket: 'towerstorm',
      access: 'public-read',

      // Files to be uploaded.
      upload: [        
        {
          src: 'build/public/js/lobby.min.js',
          dest: 'game-server/<%= pkg.version %>/js/lobby.min.js',
          gzip: true
        },
        {
          src: 'build/public/css/lobby.min.css',
          dest: 'game-server/<%= pkg.version %>/css/lobby.min.css',
          gzip: true
        }
      ]
    }
});

If you’re using grunt-bump to auto-bump the version number with every build, you’ll also need to modify the grunt-bump/tasks/bump.js file and add the following line to the bottom of the grunt.registerTask function, so that after the version is bumped the pkg variable is set to the latest version:

grunt.config.set("pkg", grunt.file.readJSON("package.json"));

The game code simply loads the assets for its current version number, so even if people start games after this build process is done they will load the old game assets; it’s only when the new version is deployed and Node.js is restarted that the new assets are loaded. This way the server code and game client code are always in sync. Lastly, versioning the assets ensures that users’ browsers don’t cache old assets, which could cause errors if gameplay changes are introduced yet clients are loading an old cached version of the game.
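A minimal sketch of the idea server-side (the bucket URL here is illustrative, not the real Tower Storm one):

var pkg = require('./package.json');
var CDN_BASE = 'https://s3.amazonaws.com/towerstorm/game-server';

// Every asset URL is pinned to the version this server process was
// started with, so a running game never sees a newer build's files.
function assetUrl(path) {
    return CDN_BASE + '/' + pkg.version + '/' + path;
}

// assetUrl('js/lobby.min.js')
// => "https://s3.amazonaws.com/towerstorm/game-server/1.2.3/js/lobby.min.js"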

All the Tower Storm servers are hosted on Amazon EC2, and in the future I’m looking to implement a system where, with each new version, a bunch of new servers are spawned with the new game client and assets. Whenever players start new games they are started on the new servers only, and the old servers stay alive just until their last game is finished, then they are powered down. This will allow us to continually release new versions of Tower Storm without ever having ‘patch downtime’.

Continuous Integration

The third step is to take this unit testing and asset building and automate it with a dedicated server that runs everything in a server-like environment. This way, if you have a team of developers, they don’t each have to set up Grunt, acceptance tests and a full build environment on their own machines. Instead, every time they commit a change, the continuous integration server downloads the new code from git, compiles it using Grunt, and runs all the tests, either against a custom private server setup or on its own machine using its own browser or a headless browser like PhantomJS.

I haven’t yet set up a continuous integration server for Tower Storm, as I’m currently the only developer and it was easier to set everything up locally (especially in these very early stages), but I’ll definitely be setting one up soon. At Wotif we’ve tried Jenkins, Bamboo and TeamCity, and all were good in some ways and bad in others. I prefer the layout and feel of Bamboo the most, though this is largely personal preference, as other members of our team prefer TeamCity’s layout. Jenkins is probably the least liked in usability and layout, but it is completely free and comes with tons of plugins for almost every task you can think of, so if that’s what you’re looking for then it’ll work well for you.

Automated cmd files and the 2-button deploy process

To tie all these testing, running and deploying scripts together I’ve created a few command files (yes, I run Windows 8, although I use Ubuntu at Wotif and the Tower Storm servers run Linux) that make things even easier. Here’s what they do:

commitAndPush.cmd – Opens the TortoiseGit (my favourite Git GUI by far) commit screen, then pushes the code after you’ve committed your changes. It looks like so:

call E:\apps\scripts\tgit.cmd commit C:\coding\node\towerstorm\GameServer
call git push --all --progress  BitBucket
pause


The tgit.cmd file it references is a hook that makes TortoiseGit run any command from the command line. Its contents are:

"C:\Program Files\TortoiseGit\bin\TortoiseGitProc.exe" /command:%1 /path:%2

devenv.cmd – Runs the game locally using node-dev, which auto-restarts it whenever a change is made, and also runs test.cmd, explained next:

set NODE_ENV=development
start cmd.exe /K "cd .. && call ./node_modules/.bin/node-dev server.coffee"
start cmd.exe /K "cd .. && call scripts/test unit"

test.cmd – This loads a cmd prompt that automatically runs all the unit tests using Mocha and re-runs them whenever a file changes. It scans the test directory for all CoffeeScript files and runs them:

setlocal EnableDelayedExpansion
IF "%1" == "" (
  SET files=C:\cygwin\bin\find test -name "*.coffee"
) ELSE (
  SET files=C:\cygwin\bin\find test\%1 -name "*.coffee"
)

FOR /F %%i IN (' %files% ') DO SET tests=!tests! %%i 
.\node_modules\.bin\mocha --watch --reporter min --slow 10 --globals $ --compilers coffee:coffee-script --require coffee-script test\_helper %tests%
pause

I run these scripts by binding them to the macro keys on my Logitech G15 keyboard (which I bought mainly because it had these keys). I have the dev environment setup bound to one key, Grunt bound to another, and commit-and-push bound to a third. This way I can start developing with one key press and deploy a new version of Tower Storm using just 2 buttons :)

Hope this was informative enough and if you have any questions or are confused about any steps let me know.

Sublime Text Navigation History Plugin

Ever wanted to be able to jump between files or functions in sublime text and easily jump back to where you were before? Well worry no longer :)

Over the past 6 months I’ve been using a navigation script from http://www.sublimetext.com/forum/viewtopic.php?f=5&t=2738 to make jumping around code faster. For some reason no one had put it into Package Control, so I added it today. The pull request is pending, and once it’s merged you can search for “Navigation history” to find it.

I also made one slight improvement so that when you use Sublime Text 3’s new goto functionality you can easily get back to exactly where you were before.

The github repository is here: https://github.com/timjrobinson/SublimeNavigationHistory

Enjoy, and if you encounter any bugs or problems please create an issue for it on Github.

Multiplayer online Tower Defense incoming :)

For the past few months I’ve been working on a browser based multiplayer tower defense game called Tower Storm, because it’s something I’ve been wanting to build for years.

I had originally planned for it to be PvE, where players work together to kill monsters, similar to Azure Tower Defense. However, we soon realised that while PvE is easier for most people to understand, it doesn’t have enough replayability without constantly adding new content.

So I’m focusing on PvP first, in the style of Wintermaul Wars, because I absolutely love its gameplay.

I’ve been play testing with friends on the home LAN for a while, and hopefully will have a version online soon.

How to stop NodeJS Mocha unit testing console beeping

Thought I’d create this post to save others the 20 minutes I wasted wondering why NodeJS and Mocha tests were making my console bleep over and over.

The problem isn’t with Node; it’s with the Windows console, which beeps whenever console.error output is sent to it (I believe).

You can disable the bleeping by following the instructions here:

http://superuser.com/questions/10575/turning-off-the-cmd-window-beep-sound

The instructions are also below in case this link is ever broken in the future:

 The Windows command line command “net stop beep” will turn off the beeping, and “net start beep” will turn on the beeping.

And how to disable it permanently:

  1. Right-click My Computer and select Manage.
  2. Expand System Tools and select Device Manager.
  3. From the View menu, select Show hidden devices.
  4. Expand Non-Plug and Play Drivers.
  5. Right-click Beep, and select Properties.
  6. Select the Drivers tab.
  7. Click Stop. You can also change the start-up type to Disabled so the beep service never starts.

I was flicking through the JB Hi-Fi magazine while eating breakfast this morning and was seriously amazed that it’s now 2012 and Apple is still the only company that understands how to market technology.

Here’s a hint: 95% of people don’t care about the specs of their phone or tablet (and they didn’t care about laptop specs either). Almost every non-geek I’ve ever talked to about computers has no idea what the difference is between AMD and Intel, RAM sizes or even CPU speeds. All they know and care about is the drive size, so they can figure out how much music and movies they can fit on their device; with the rest of the specs they just presume the bigger number is better.

Seriously, Apple changed the game way back in 2001 when, instead of saying “we have an mp3 player with 128mb of space which runs on a 16Mhz cpu blah blah”, they told people “Fit 1000 songs in your pocket” and showed how much fun people would have using their mp3 players.

Guess what? The iPod went on to become the biggest selling mp3 player ever (It also helped that it was a great product, but so was the Zune and we all know how that ended up).

Now they’re doing the same with phones and tablets, and still almost every other manufacturer has their head in the sand.

Samsung figured it out recently too. I saw an awesome ad for the Galaxy S III last night about what it could do, not what the tech specs were. And do you know who now leads smartphone sales around the world? Nope, not Apple, it’s Samsung! http://www.bgr.com/2012/05/01/apple-samsung-idc-market-share/

Wow, look at that: the companies that sell on emotion and benefits instead of specs make the most money! Nah, must be a coincidence, better go back to selling our 10% faster CPUs…


A Distributed, Open Source, Peer to Peer, Encrypted Data Store

The thing I love and hate most about Facebook is that it has ALL my data. It’s fantastic that third party programs such as AirTime are able to link into Facebook and get my data to personalize their site experience for me, but it’s bad that Facebook is the one in control of all this data, and one company having a monopoly over everyone’s information is a really bad idea.

What I’d love to see created is something similar to Bitcoin but for data: an open source network of computers that lets you store any arbitrary data in the network, encrypted with your own private key, then retrieve that data at any time, anywhere. I’d imagine it as a key-value store similar to Redis. Apps could then easily integrate with this network to store or retrieve any data when you give them the key for it. You could have different keys for different types or sections of data.
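Purely as a thought experiment, here’s what the app-facing API might look like (nothing like this exists; the package and method names are invented for illustration):

// hypothetical package - it does not exist
var store = require('p2p-datastore');

// You authenticate with your own private key; the network only ever
// sees encrypted blobs.
var me = store.connect({ privateKey: myPrivateKey });

// An app you've granted access stores and retrieves values by key:
me.set('music.favourites', ['Band A', 'Band B']);

me.get('music.favourites', function (err, favourites) {
    console.log(favourites); // ['Band A', 'Band B']
});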

Basically it would allow everything that Facebook does currently but with a data store that no one controls and no one can access unless you give them explicit permission to.

What could be possible using this data store:

- A radio for your car that connects to the data store via 3G, grabs all your favourite bands/songs/music and streams them, plus finds and streams related music automatically (like Pandora). You could then mark whether you like or dislike songs and it would save that back to the data store. Then at home on your PC you could load up your favourite music buying site, connect it to the data store, see all the songs you’ve recently enjoyed in your car, and purchase them instantly. It would completely disrupt the current radio and music industry and be amazing for the indie music scene.

- Your fridge auto-monitors the food you put into it and saves what you’ve consumed to the data store; a shopping list app then automatically generates a shopping list based on your personal food preferences and what’s currently missing from your fridge. Taking it one step further, a local grocer (or even farmer!) could tap into the data store and auto-ship you your weekly food based on what you’ve consumed.

- Dating sites could grab data about your likes/dislikes, partner preferences and so on to automatically find amazing matches for you (and most people are currently wary of linking dating sites to Facebook).

It would help create an amazingly interconnected world without the need to interface with Facebook or any other company. Also, there are already scripts out there for exporting Facebook data, so it should be easy enough to port your data into this new data store in seconds.

Of course a lot of this is currently possible via Facebook, but most companies are afraid of building technology that integrates deeply with it, whether for fear of it not being around in the future or a general reluctance to hand so much data to another company. With a data store such as this, companies are free to integrate as much as they like without worrying about anyone taking it away from them, blocking them out, or shutting it down.



Disappearing songs in Grooveshark

I recently discovered a bug in Grooveshark where songs started randomly disappearing. Some were probably removed for copyright infringement, but others were not copyrighted at all (old game songs, obscure niche bands etc.), so I started to investigate.

Upon investigation I discovered that a lot of the songs were actually still on the “My Favourites” page but weren’t showing up in “My Music” (even though they were ticked to say they should be). I think the cause was that when I added them, roughly 2-3 years ago, I only clicked favourite and not add to music. Grooveshark continued showing them in My Music for a while but seems to have recently stopped, hence the disappearing act.

To fix it, all I had to do was click the green tick on each song twice to remove and re-add it to my library, and the songs began showing up in My Music again.

Just putting this out there in case anyone else has similar issues and is searching for a fix.

Still having trouble finding your old music?

Here’s another method to get your music back. When you’re logged in you’ll see your name in the top right of the control panel; click on this, then go to your profile. On your profile page you’ll see a stream of songs you’ve recently played. Even songs that have been removed from your music still show in this list (as of 05/05/2012), and you can re-search and re-add them to your library from there.

MySQL using SELECT IN (x, y, z) vs one at a time

I finally deployed the ZenTester heatmapping re-write today and wanted to share 3 things I’ve learnt along the way about optimization.

1. Do “WHERE IN ()” instead of “WHERE =” wherever possible – The biggest thing I discovered was that when selecting individual rows from a database by doing something like:

SELECT * FROM database WHERE id = 5
SELECT * FROM database WHERE id = 6

It’s much, much faster to select them all at once rather than one row at a time. How much faster? In one section of the heatmapping I was grabbing visitor ids by IP address from the visitors table like so:

SELECT id FROM visitors WHERE ipaddress LIKE 'xxx.xxx.xxx.xxx'

The ipaddress column was a primary key, but even then every time I ran this query it took approximately 0.1 seconds. I combined all of these into one huge SQL query looking something like this:

SELECT id FROM visitors WHERE ipaddress IN ('xxx.xxx.xxx.xxx', 'yyy.yyy.yyy.yyy', 'zzz.zzz.zzz.zz')

Loading 1000 different rows this way took just 0.7 seconds, versus the roughly 100 seconds (1000 × 0.1s) the one-at-a-time approach took: about a 140x speed increase.

2. PHP arrays are really fast – I used to think PHP associative arrays were slow, and that it’d be better to just grab everything from MySQL one row at a time rather than caching it in PHP. Oh how wrong I was. I changed the overall structure of my data processing from the following (pseudocode):

while (data) {
    $visitorData = SELECT * FROM visitors WHERE ipaddress IN data->ipaddress
    $pageData = SELECT * FROM pagedata WHERE page IN data->page
    $siteData = SELECT * FROM sitedata WHERE site IN data->site
    //Process and insert data into another database
}


To this:

while (data) {
    $ipAddressArray[] = $data->ipaddress;
    $pageArray[] = $data->page
    $siteArray[] = $data->site
}

$ipData = SELECT * FROM visitors WHERE ipaddress IN ($ipAddressArray) 
while ($ipData) {
    $ipAddressMap[$ipData->ipaddress] = $ipData->visitorid;
}

$pageData = SELECT * FROM pages WHERE page IN ($pageArray)
while ($pageData) {
    $pageMap[$pageData->page] = $pageData->id;
}

$siteData = SELECT * FROM sites WHERE site IN ($siteArray)
while ($siteData) {
    $siteMap[$siteData->site] = $siteData->id;
}

while (data) {
    //process data using the maps instead of constantly querying the database
}


Instead of looping through all the data and processing it one row at a time, the code now figures out up front every single piece of data that will be needed from the database and grabs it in just a few queries.

All this data is then stored in associative PHP arrays (which hash their keys, so the keys can be literally anything; in one case IP addresses were used as keys). Everything is then processed using these arrays rather than pulling from the database every single time.

I’m processing approximately 10,000 rows of data at a time, and after implementing this change processing went from taking 5-10 minutes down to 10-15 seconds.

3. Batch your inserts – I also got a small speed increase by combining all my inserts into one large statement, e.g. INSERT INTO mytable (a, b) VALUES (1, 2), (3, 4), (5, 6) instead of three separate INSERTs. Don’t make the statement too large though, or MySQL will run out of memory when you try to run it (I found about 5000 rows per insert works best for me). Batching the inserts took the insert time from about 60 seconds down to about 10, so it wasn’t as huge an increase as the batched SELECT, but it’s noticeable, especially when you’re dealing with lots of data.
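If you’re building these statements from code, most database libraries can do the batching for you. This post’s code is PHP, but here’s the shape of it sketched with Node’s mysql package, which expands a nested array into the multi-row VALUES form (the table and column names are made up):

var mysql = require('mysql');
var connection = mysql.createConnection({ host: 'localhost', user: 'root', database: 'stats' });

var rows = [
    [1, 'home', 523],
    [2, 'about', 87],
    [3, 'pricing', 211]
];

// Becomes one statement: INSERT INTO pageviews (visitor_id, page, views)
// VALUES (1, 'home', 523), (2, 'about', 87), (3, 'pricing', 211)
connection.query('INSERT INTO pageviews (visitor_id, page, views) VALUES ?', [rows], function (err) {
    if (err) throw err;
    connection.end();
});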

NetBeans is awesome

If you still can’t find that ‘perfect’ IDE, give NetBeans a try. I was recently introduced to it and am still blown away by the amount of functionality and features it has. It’s similar to Eclipse, but I always found Eclipse too sluggish to use on Windows (everything seemed to take half a second to do, which adds up and gets frustrating very fast).

The main things I like:

  • Plugins for things like CoffeeScript, SCSS, Maven etc which make life a lot easier.
  • Shortcuts for just about everything, so if you’re a keyboard ninja you can get things done really fast.
  • It has everything else you’d expect in an IDE (code completion, jumping to definitions etc).

I’ve been using PHPDesigner for about 2 years now, and although it’s great for developing in PHP it doesn’t have the best support for other languages such as JavaScript, CoffeeScript and Ruby.

If you’re a Windows coder give it a whirl; it’s open source and free.

Coffeescript Classes and Public / Private functions

So I’ve seen some coders confused about how they should declare their functions in CoffeeScript. Here are some notes I took while learning how to make the equivalent of public and private functions for your class in CoffeeScript.

Public function - This is your bread and butter function; it’s what most coders use by default with CoffeeScript and is accessible from outside the class. Notice how it uses a colon after the function name instead of an equals sign.

class SomeClass
    functionName: ->
        alert('hello world');

Private function - This creates a private function, because it makes the function a variable inside the class. The function is a closure and can only be accessed by functions of that class. Notice how it uses an equals sign instead of a colon.

class SomeClass
    functionName = ->
        alert('hello world');

There are two catches with private functions. The first is that, because a private function is a completely separate function, @ (this) inside it will not point to the main object (even when it’s called from another function of the class). If you wish to use @ in a private function, you must call it with .call(this), which runs the function with the class instance as @. For example:

class Dog
    dogName: "fido"
    constructor: (@dogName) ->

    doStuff: ->
        alert('the dog is walking');
        sayHello.call(this);

    sayHello = ->
        alert("Hi! I'm "+@dogName);

ralph = new Dog("ralph");
ralph.doStuff();

peter = new Dog("peter");
peter.doStuff();

If you run this example yourself you’ll notice both dogs have separate names and share the private function sayHello. The sayHello function cannot be called from outside the class (if you try, you’ll get an error), which is exactly how private functions should work.

The second catch is that this private function is shared between every dog. That isn’t really an issue for functions, because whenever you use @ the @ variables are specific to each dog. It is an issue if you wish to create private variables though: if you create them this way they will be shared between every dog, which is most likely not what you want.
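You can see why by looking at roughly the JavaScript that CoffeeScript compiles the Dog class to (a simplified sketch of CoffeeScript 1.x output):

var Dog = (function() {
    var sayHello; // one variable in the class closure, shared by all dogs

    function Dog(dogName) {
        this.dogName = dogName; // instance property - unique per dog
    }

    Dog.prototype.doStuff = function() {
        alert('the dog is walking');
        sayHello.call(this); // run the shared function as this particular dog
    };

    sayHello = function() {
        alert("Hi! I'm " + this.dogName);
    };

    return Dog;
})();

Anything declared with = lives once in that closure, so a “private variable” created the same way would be shared by every instance.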

If you have any questions or queries let me know via the contact form.