## Wednesday, 12 June 2013

### Deploying from SVN through Jenkins to AWS Elastic Beanstalk (Part 2): Linking Jenkins to AWS EB

Following on from Part 1 of this blog series on integrating Jenkins with AWS EB, this post concludes the two-part series by focussing on the AWS deployment process within Jenkins. At the end, we will briefly outline the limitations of AWSDeploy to EB and what you would need to look at if your needs are more complex.

### So what does Jenkins look like?

Setting up Jenkins basically requires executing the following steps:
1. Create a 'New Job' in Jenkins
2. Fill in the URL of your SVN repository
3. Decide on Jenkins' polling frequency for SVN
4. Fill in the batch process details, including MSBuild.exe and AWSDeploy.exe CLI, with appropriate parameters
5. Save the job

### Step 1: Create a 'New Job' in Jenkins

Nice and easy: from the main Jenkins page, click "New Job" in the left-hand menu.

When the new job page loads, fill in the details of the job you wish to create. You'll want to create a free-style software project.

I normally prefix the job name with 'RELEASE-', but for this demo I am being a bit slapdash. When you are happy with it, click the OK button.

 fig 1 - Creating a 'New Job' in Jenkins (click to enlarge)

### Step 2: Configure the Job

Jenkins then moves to the job configuration page which allows you to set up the SCM (in this case SVN), the Build Steps and Post Build Actions.

Hence:

1. Fill in a display name under "Advanced Project Options" if you want it

2. Under "Source Code Management" select the Subversion radio button and enter your repository URL. Jenkins will poll your SCM, and if you need to enter SVN credentials, a validation help link appears; click it to enter your SVN credentials.
 fig 2 - Links to enter your server's credentials. This server doesn't exist - just showing the link
3. Select your method of authentication. In my case, I chose Username/Password; enter your credentials there. Important note: the site runs unsecured by default. Once you have entered the credentials, click the OK button.

 fig 3 - Subversion authentication in Jenkins (click to enlarge)
4. Complete the rest of Jenkins' SVN config. I always check out a fresh copy so I don't have previous builds lying around and potentially making a mess of the build. Note that you can also check out multiple URLs if you have dependent projects. This can, of course, also be done through an SVN svn:externals declaration.

5. Build Triggers - This is an important feature that allows you to configure what sets off the build. For example, if you have a dependency chain of projects/solutions which are already set up as Jenkins jobs, you can choose which project(s) need to complete before this job runs. Projects are separated by commas.

Here you should select the "Poll SCM" option. This brings up a text box where you can enter the polling frequency in a cron-like syntax. Jenkins polls your SCM (SVN here) and stores the latest revision that it builds. If the revision number has changed on a poll, it then starts a checkout and build process.

Alternatively you can choose to build periodically, or combine the two.
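As an illustration, a "Poll SCM" schedule that checks SVN every five minutes would use the standard five-field cron layout (minute, hour, day of month, month, day of week):

```
# min  hour  dom  month  dow
*/5 * * * *
```

Jenkins will then compare the latest SVN revision every five minutes and only trigger a build when it has changed.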

6. The build steps - This is probably the key element when it comes to deploying to AWS. For simplicity, I have put all the steps in this demo into one build step so that it completes everything in one go. However, you would normally want to split them out into multiple steps, such as 'Build', 'Test' and 'Deploy', so you can stop at any point. You can pick up the build artefacts between the steps by using the same workspace environment variables.

Putting it all together still works, as long as a failing command returns a non-zero error code, but it isn't neat.
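If you do keep everything in one step, one way to make it fail fast is to guard each command yourself. A sketch in Windows batch, using the `|| exit /b 1` idiom, which aborts the script with a failing code as soon as a command errors (paths as per the demo project):

```
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe "%WORKSPACE%\AProject.sln" /p:Configuration=Release || exit /b 1
```

Jenkins marks a batch build step as failed when the script exits with a non-zero code.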

Clicking on the "Add build step" button opens up a text area where you can enter a list of Windows console commands. In my case, they are:

```
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe "%WORKSPACE%\AProject.sln" /p:Configuration=Release /p:Platform="Any CPU" /m
cd "%WORKSPACE%\AProject"
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe /t:Package /p:Configuration=Release
"C:\Program Files (x86)\AWS Tools\Deployment Tool\awsdeploy.exe" /w /r /v /DDeploymentPackage="%WORKSPACE%\AProject\obj\Release\Package\AProject.zip" "%WORKSPACE%\AProject\AWSDeploy.txt"
```

You will notice the %WORKSPACE% environment variable in this series of commands. This is the Jenkins workspace that the code has been checked out to. It is one of a number of environment variables that Jenkins makes available to build steps:

- `BUILD_NUMBER` - The current build number, such as "153"
- `BUILD_ID` - The current build id, such as "2005-08-22_23-59-59" (YYYY-MM-DD_hh-mm-ss)
- `BUILD_DISPLAY_NAME` - The display name of the current build, which is something like "#153" by default
- `JOB_NAME` - Name of the project of this build, such as "foo" or "foo/bar"
- `BUILD_TAG` - String of "jenkins-${JOB_NAME}-${BUILD_NUMBER}". Convenient to put into a resource file, a jar file, etc. for easier identification
- `EXECUTOR_NUMBER` - The unique number that identifies the current executor (among executors of the same machine) carrying out this build. This is the number you see in the "build executor status", except that the number starts from 0, not 1
- `NODE_NAME` - Name of the slave if the build is on a slave, or "master" if run on the master
- `NODE_LABELS` - Whitespace-separated list of labels that the node is assigned
- `WORKSPACE` - The absolute path of the directory assigned to the build as a workspace
- `JENKINS_HOME` - The absolute path of the directory assigned on the master node for Jenkins to store data
- `JENKINS_URL` - Full URL of Jenkins, like http://server:port/jenkins/
- `BUILD_URL` - Full URL of this build, like http://server:port/jenkins/job/foo/15/
- `JOB_URL` - Full URL of this job, like http://server:port/jenkins/job/foo/
- `SVN_REVISION` - Subversion revision number currently checked out to the workspace, such as "12345"
- `SVN_URL` - Subversion URL currently checked out to the workspace
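As a hypothetical example of putting these to use, an extra batch line could stamp the deployment package with the build number and SVN revision before it is archived (the paths match the demo project above and are illustrative only):

```
rem Keep a copy of the package tagged with the Jenkins build number and SVN revision
copy "%WORKSPACE%\AProject\obj\Release\Package\AProject.zip" "%WORKSPACE%\AProject-%BUILD_NUMBER%-r%SVN_REVISION%.zip"
```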

The batch commands perform the following functions.

The 1st command builds the solution containing the project, using the Release configuration. In a normal project I at least use the web.Debug.config and web.Release.config files, with MSBuild's match-and-replace mark-up merged into the main web.config file. A tutorial blog can be found here.
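For context, the Release transform is just mark-up in web.Release.config that MSBuild merges into web.config at package time. A minimal, typical example (the debug-attribute removal shown here is the stock Visual Studio template, not anything specific to this project):

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <!-- Strip debug="true" from <compilation> in the Release build -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```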

The 2nd command enters the directory of the built project, just so we can run the Package target for MSBuild.

The 3rd builds a standard deployment package out of the project.

The 4th line is the AWS deployment line. This actually carries out the deployment of the package created in step 3, using the AWSDeploy.txt file we created from within Visual Studio.
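For reference, the AWSDeploy.txt file that the Visual Studio toolkit generates is a plain key/value settings file. A trimmed, hypothetical example (the application and environment names and the region are placeholders, and a real generated file will contain more settings, including credentials):

```
Template = ElasticBeanstalk
Application.Name = AProject
Environment.Name = AProject-env
Region = us-east-1
```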

Post-build actions
A post-build action in Jenkins is an event that takes place after the build steps. Post-build actions run regardless of whether the build completed successfully, and encompass one or more of the steps shown in the screengrab below.

 fig 4 - The Jenkins post-build actions (click to enlarge).
In the email post-build steps, Jenkins allows users to be sent emails for unstable builds (the build succeeds but tests fail) and build failures, and to send separate emails to the people who broke the build.

You'll notice from the screenshot that there is the option of publishing JUnit tests. Obviously, MSTest results are not in exactly the same format. However, you can run a build step to convert MSTest's TRX output to HTML files using tools such as the MS Test Report Generator. There is a similar process for NUnit results.

Recommendations:
Don't forget to set up e-mail alerts within Jenkins for job events as a "Post-build action". You can also tie events to an RSS feed. In either case, you have the option of using them for failed builds or all builds, plus the email benefits identified above. Note, though, that Jenkins doesn't automatically set up an (or your) email server; you need a mail service/daemon running to send emails.

Also, I definitely recommend post-build steps to publish your test results. This may include a step to post to your build monitor if you have one. Maybe naming and shaming the individual breaking the build... not that you want a harassment suit on your company's hands of course ;-)

### Step 3: Run the darn thing!

When you check in some code, at the intervals specified for the Jenkins cron, Jenkins will poll SVN, clean checkout the code, build, test and deploy the code. Any failure will cause the build history to include the usual red light.

 fig 5 - Build history, including failures
You can, of course, kick the build off manually using the "Build Now" link in the top-left menu. In either case, there is a significantly increased delay over any local process, but that is to be expected, since the deployment is going over the wire and EB is incrementally building the environment. It will take longer the first time out; I've had to wait over 10 minutes sometimes.

The following is the example deployed to AWS using the above process. As well as using AWS Elastic Beanstalk, for my own work I have a Route 53 entry for the domain name and a back-end DB.

 fig 6 - Deployed EB example site (click to enlarge)

### Conclusion

Jenkins has always been a really easy-to-use system for CI. I've found it easier to use than ThoughtWorks Go. However, both Go and TFS give you more customisation options out of the box; you can certainly expand Jenkins using custom plugins. Obviously, there is quite a bit more to do to set up a full CI system than the above, but it gives you a framework to modify and work with.

Also, the use of the AWSDeploy.exe makes it really easy to deploy fairly simple environments.

Limitations
When you get up to the level of having to manage large, custom architectures and deployments, where detailed configuration and set-up of multiple AWS resources are required (EC2, ELB, SQS, S3, SWF, Redshift and big data instances, for example), a basic AWSDeploy text file won't cut it any more.

At that point, you would need to consider moving from a plain EB setup to using CloudFormation to set up a more complex cloud architecture. As it happens, EB basically generates a JSON template which is used as the CloudFormation template; you can see this if you go to the CloudFormation stacks page in the AWS Console. I suspect there will be times when you'll need to roll your own.