Monday, October 29, 2012

First attempt at some sort of continuous deployment

So I've heard of this continuous deployment concept and it all sounds great. I was keen to give it a go myself but unsure where to start. So rather than attempt this with a production application in work, I picked a nice small app that I'd written to manage some of our application config.

This is a small web application hosted on one of our servers, with a database for persistence. Not much, you might say, but it still has all the ingredients needed to try some sort of automatic deployment. To give you an idea of what I was trying to get away from, I need to describe the evolution of the deployment.

What I started with

When writing the application I was not at all concerned about deployment; I could barely write HTML, never mind get the thing running on an actual server. However, as time went on I realised that I would have to have a go at getting this deployed out to a running server. This started with a very, very manual process. First I had to install all the prerequisites, in this case python 2.7, mysql server, the mysql-python connector, django and all those other little good things that it needed to run.
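To give a flavour of the manual effort, the prerequisite install was roughly the following. Package and command names here are illustrative rather than the exact ones I used; the DRY_RUN guard (default on) just prints each command instead of executing it.

```shell
#!/bin/sh
# Sketch of the manual prerequisite install. Package names are illustrative;
# DRY_RUN=1 (the default) prints each command instead of executing it.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run yum install -y gcc make          # build tools, needed to compile python 2.7
run yum install -y mysql-server      # the database for persistence
run easy_install MySQL-python Django # the connector and the framework
```

And all of that had to be repeated, by hand, on every new box.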

After that it's a matter of getting the code onto the machine. For this I had a bunch of hand-cranked scripts which relied heavily on the ssh keys on my local machine to do the deployment. Not ideal. This really became apparent when other developers wanted to get in to make some changes. I wanted to remove this reliance on me as a bottleneck for deployments.
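Those hand-cranked scripts were along these lines (hosts and paths are invented for illustration; the real thing only worked from my machine because it was my ssh key sitting on the server):

```shell
#!/bin/sh
# Roughly what the hand-cranked deploy looked like. Hosts and paths are made
# up; DRY_RUN=1 (the default) echoes the commands instead of running them.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run rsync -az --delete ./myapp/ deploy@appserver:/opt/myapp/
run ssh deploy@appserver "python2.7 /opt/myapp/manage.py syncdb --noinput"
run ssh deploy@appserver "sudo service httpd restart"
```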

Conversion to rpms

In stepped rpms, Red Hat's package management system. This was something we'd been investigating in work as a way to package our production code, so I wanted to see how I could implement it as part of a trial run. This took care of any scripts that had to run post deployment, as all of that is catered for in the rpm world. A better solution alright, however I was still taking the rpm down to my local VM after CI had built it and doing manual testing after an export from the live database.
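The post-deployment scripting lives in the spec file's %post section. A minimal, purely illustrative spec for an app like this, with invented names and paths rather than my real one, might look something like:

```shell
# Write out a minimal, illustrative rpm spec file. Not the real spec; names
# and paths are invented. The %post section is where the scripting that used
# to be run by hand after a deploy now lives.
cat > myapp.spec <<'EOF'
Name:           myapp
Version:        1.0
Release:        1
Summary:        Small config-management web app (illustrative)
License:        Proprietary
BuildArch:      noarch

%description
Small django app packaged as an rpm.

%install
mkdir -p %{buildroot}/opt/myapp
cp -r myapp/* %{buildroot}/opt/myapp/

%files
/opt/myapp

%post
# restart apache so mod_wsgi picks up the new code
service httpd restart || true
EOF

# CI then builds it with something like:
#   rpmbuild -bb myapp.spec
```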

Automated deployment to a test environment

In order to take me and my lovely VM out of the picture I really needed a test environment where I could tear down the database and recreate it as necessary. This meant that I would be able to deploy to this environment automatically when the CI build had completed. A much better solution. In order to do this I had to get a new environment up and running, and so it was back to compiling and installing python again. To hell with this, I said: to be able to do this anywhere I really needed packages for all my dependencies as well.

I was much more confident with rpms now, so I created one for each of the dependencies too; I now have a python rpm, a mysql-python rpm and a mod_wsgi rpm, all ready to set up a new environment if I so wish. There was another angle on this as well: any developer could get up and running with a development environment in no time, assuming they were using Red Hat or one of its derivatives (CentOS in my case).
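With the dependency rpms available, standing up a fresh CentOS box, dev or test, reduces to a couple of commands. Package names here are invented for illustration; the point is that the compile-python-by-hand step is gone:

```shell
#!/bin/sh
# Bootstrapping a fresh CentOS box from the dependency rpms. Package names
# are invented; DRY_RUN=1 (the default) echoes the commands.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run yum install -y python27 mysql-python mod_wsgi   # the dependency rpms
run yum install -y myapp                            # the application itself
```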

Current State

I'm now able to deploy the newly built rpm to the test environment. I also scripted an export of the live database, a refresh if you will, before each deployment so that the test database would be as close as possible to live data. This means no more testing on my local VM to make sure that the rpm and deployment are up to scratch.
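The refresh is just a dump-and-restore wrapped around the deployment, something like the following (host, user and database names are all invented here):

```shell
#!/bin/sh
# Sketch of the pre-deploy database refresh. Hosts and names are invented;
# DRY_RUN=1 (the default) echoes the commands instead of running them.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run mysqldump -h live-db -u backup appconfig --result-file=appconfig.sql
run mysql -h test-db -u deploy -e "DROP DATABASE IF EXISTS appconfig; CREATE DATABASE appconfig"
run sh -c "mysql -h test-db -u deploy appconfig < appconfig.sql"
run rpm -Uvh myapp-1.0-1.noarch.rpm
```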

Next Steps

What now? Testing, that's what. And lots of it. I'm of the opinion that in order to get to the point where I'm able to click the button to deploy to live, i.e. where I've deployed to the test environment and tested the change, I need a bunch of integration tests. Something that I didn't do well in the first line of development; yes, to my shame, I cut some corners on the testing front.
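The end goal is a CI step that gates the deployment on those tests, something with this shape (the commands are illustrative, not my actual pipeline):

```shell
#!/bin/sh
# Hypothetical CI step: only push the rpm to the test environment if the
# django test suite passes. DRY_RUN=1 (the default) echoes the commands.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

if run python2.7 manage.py test appconfig; then
  run scp myapp-1.0-1.noarch.rpm deploy@test-env:/tmp/
  run ssh deploy@test-env sudo rpm -Uvh /tmp/myapp-1.0-1.noarch.rpm
else
  echo "tests failed - not deploying" >&2
  exit 1
fi
```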

I'm really feeling the pain of this architectural blunder now. I have very little confidence that what I add to the app will not regress some other part of it. The technical debt that I have to pay down on this front is quite substantial. There was a second side effect to this that I didn't think of either: a social get-out clause, where other developers could see that I hadn't put the effort into proper testing and so didn't bother with it themselves, further increasing the technical debt within the application.

The lesson learned here is that testing, while not an immediate concern at the time of writing the application, is key to being able to take the human out of the picture. I now have a much deeper respect, when jumping in to write a part of an application, for the tests that need to be in place to support the change I'm making, and for the lack of confidence in my change that results if this is not done at the start.

Testing through your home firewall router

Recently I came across a problem where I needed to test a hosted service getting access to a port on my laptop. The general workflow was that I would initiate the request from the hosted service, which would then talk to a service running in a Tomcat on my local laptop.

There are a number of problems to get around when attempting this. First, you can't just use your IP address as you see it at a command prompt via ipconfig or ifconfig. The network address that you see there is the internal address that your home router has given you and every other device connected to it. The hosted service isn't going to have a clue how to get to 192.168.0.*

What you need here is the external IP address of your home router. There are a number of ways to get this, including websites that will display it for you, but I went to the source: the router. Luckily there was a configuration page which told me the external IP that my router was using.
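If you'd rather not dig through the router's admin pages, asking an external echo service works too, since the outside world sees you as that address; `curl -s ifconfig.me` is one such service (an assumption on my part: any "what is my IP" site does the same job). A quick sanity check on whatever comes back:

```shell
#!/bin/sh
# One way to find the external address without the router's admin page is an
# external echo service, e.g.:  curl -s ifconfig.me
# (assumption: any "what is my IP" service returns the address your router
# presents to the outside world).
# Sanity-check that a reply at least looks like a dotted-quad IPv4 address:
looks_like_ipv4() {
  echo "$1" | grep -Eq '^([0-9]{1,3}\.){3}[0-9]{1,3}$'
}

looks_like_ipv4 "203.0.113.7" && echo "plausible external address"
```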

The next problem is your firewall. Most modern routers come with a firewall, for good reason, to stop those nasties from getting into your local network. Since I was only testing for a short time I was happy to bore a hole through my firewall to my laptop using port forwarding. This is where you select a port on the router and a port on your laptop, and any traffic hitting your router via its external IP will be forwarded to that port on your laptop through its local network IP.

In my case I just forwarded port 80 to port 80 (the default), as I was able to set up the service on my laptop on that port. So after all this was done I was ready to test. Initiating the test from the hosted service still did not work however... ragin'.

The final thing you need to do is turn off your Windows firewall (if you're running Windows). Since I had already let the traffic through the router, it was the Windows firewall that was blocking it.
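A narrower alternative to switching the whole Windows firewall off is to open just the one port and close it again afterwards. The `netsh advfirewall` commands below (Vista/7-era syntax, run from an administrator prompt) would do it; they're shown here in echo-only form since this sketch obviously can't execute outside Windows:

```shell
#!/bin/sh
# Windows-side alternative to disabling the firewall: allow just port 80,
# then remove the rule when testing is done. netsh advfirewall is the
# Vista/7-era interface. DRY_RUN=1 (the default) echoes the commands, which
# is all that makes sense outside an administrator cmd prompt on Windows.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

run netsh advfirewall firewall add rule name=temp-port-80 dir=in action=allow protocol=TCP localport=80
run netsh advfirewall firewall delete rule name=temp-port-80
```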

Very happy with myself for actually getting that done, as I'd never tried it before. Good fun.

Oh and yes, I did quickly turn on my firewall again after the testing was complete and remove the port forwarding!