Friday, 27 March 2009

XmlMultiMassUpdate - Extending MSBuild Community Task's XmlMassUpdate

The MSBuild Community Tasks library adds a wealth of useful functionality to MSBuild through new tasks. One task we use on our build server is XmlMassUpdate, which merges certain configuration settings at build time based on the environment we are building for.

XmlMassUpdate takes a source XML document and updates it from another XML document from a particular document root. Our merging routine operates as described in Doran Yaacoby's excellent blog post.

The Problem


The issue we were having is that XmlMassUpdate can operate only on a single XML file. For our build process, we wanted to merge the contents of every substitutions.config file into every appSettings.config file in the entire solution tree, because we have multiple executing assemblies (i.e. web applications, class libraries for unit tests, console applications, etc.) being built in a single solution. Also, because we use a templated CruiseControl.NET configuration (and wanted to avoid the maintenance overhead), we could not hard-code individual XmlMassUpdate tasks for each solution to update the config files. The restriction to a single file at a time was causing real problems.
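For context, the stock XmlMassUpdate task is invoked against a single pair of files, something like the following (the paths and roots here are illustrative):

<XmlMassUpdate
ContentFile="config\appSettings.config"
SubstitutionsFile="config\substitutions.config"
ContentRoot="/appSettings"
SubstitutionsRoot="/appSettings" />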

The Solution


To overcome this we created our own XmlMultiMassUpdate task, based on the original XmlMassUpdate. The new task can take an item group containing multiple files and will update every file in the set. For example:


<ItemGroup>
<ContentFiles Include="C:\example\**\config\appSettings.config" />
<SubstitutionFiles Include="C:\example\**\config\substitutions.config" />
</ItemGroup>

<XmlMultiMassUpdate
ContentFile="@(ContentFiles)"
SubstitutionsFile="@(SubstitutionFiles)"
ContentRoot="/"
SubstitutionsRoot="/" />



Constraints


There are some constraints when using a file set (rather than an individual file) with the XmlMultiMassUpdate task:

  1. The number of files in the content item group and the substitutions item group must be the same.

  2. The original XmlMassUpdate allows the specification of a 'MergeFile' where the output will be written. When using multiple files, a merge file cannot be specified; instead, each substitutions file is merged into its corresponding content file, and the content file is overwritten with the new content.



The Full Source


I have not made the full source code for XmlMultiMassUpdate available in this post, mainly because it was written as part of this project at the NHS Information Centre and all such things should follow the appropriate channels. If you really think it would be useful to you though, please email me at talk@peppermint-it.com and I'll see what I can do.

Wednesday, 18 March 2009

Redefining Configuration Variables: StackOverflowException

The Problem


Today I made some changes to our CruiseControl.NET configuration in order to get a project working that differs slightly from our general project structure. After making the changes I restarted the service, but it wouldn't start. Looking in the Event Log, all I found was a rather generic "There has been an error in the .NET 2.0 Framework" kind of exception, which wasn't very helpful. What was going on?

Some Background


To understand the reason for the service termination, a decent understanding of CCNET's cb:define and cb:scope configuration declarations is needed. For those of you who may not be familiar with these declarations, I'll give you a brief run-down:

cb:define
This declaration allows you to define a variable in the CCNET configuration that can be used later. For example:

<cb:define workingDir="C:\example"/>


will create a variable called workingDir which can be used throughout the project like so:

<workingDir>$(workingDir)</workingDir>


cb:scope
The scope declaration allows you to override a previous definition for a certain scope. The following example shows this:

<cb:define workingDir="C:\example"/>
<!-- workingDir will equal C:\example -->
<workingDir>$(workingDir)</workingDir>
<cb:scope>
<cb:define workingDir="C:\different"/>
<!-- workingDir will now equal C:\different -->
<workingDir>$(workingDir)</workingDir>
</cb:scope>
<!-- workingDir will equal C:\example again -->
<workingDir>$(workingDir)</workingDir>


The Reason


Eventually I discovered why the service wasn't starting. I had assumed that the 'define' and 'scope' declarations behaved like variable definition and assignment in a programming language such as C#. In C# the following works:

string workingDir = @"C:\example";
Console.WriteLine(workingDir); // Writes 'C:\example'
workingDir = @"C:\different";
Console.WriteLine(workingDir); // Now writes 'C:\different'


This, however, is not the case. My configuration, in an attempt to achieve "C:\example\anotherLevel" as the workingDir within the scope, was:


<cb:define workingDir="C:\example"/>
<cb:scope>
<!-- Attempting to get C:\example\anotherLevel -->
<cb:define workingDir="$(workingDir)\anotherLevel"/>
</cb:scope>


This was resulting in a StackOverflowException which was killing the service when the configuration file was read and parsed.

The Solution


In the end I had to effectively redefine the complete path within the scope:

<cb:define workingDir="C:\example"/>
<cb:scope>
<!-- Redefine with full path -->
<cb:define workingDir="C:\example\anotherLevel"/>
</cb:scope>


While this results in a bit of duplication in the configuration file, I think that a working service is more important!
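One way to reduce that duplication (an untested sketch, but it avoids the self-referencing define that caused the recursion) is to build the scoped value from a differently named variable:

<cb:define basePath="C:\example"/>
<cb:define workingDir="$(basePath)"/>
<cb:scope>
<!-- No self-reference: workingDir is derived from basePath -->
<cb:define workingDir="$(basePath)\anotherLevel"/>
</cb:scope>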

Thursday, 12 March 2009

CCNET & Subversion: Invalid SSL Certificate

As we develop our automated deployment pipeline and refactor our build process, we have decided that now is the time to make the long-awaited switch from Visual SourceSafe to Subversion for our source code control. With an SVN server already installed and a repository ready and waiting, we set to work changing our ccnet.config to use Subversion source control blocks. It was then that this issue raised its head.

The Problem


Our SVN server was configured to use SSL for a little extra security and in line with best practice. The trouble was that the SSL certificate was self-signed (rather than issued by a recognised certificate authority), which meant the svn.exe console application used by CruiseControl.NET prompted for the certificate to be accepted. CCNET has no way to answer that prompt, so it simply errored then and there.

The Solution


To overcome this issue we logged in and manually ran the SVN console application against the repository, and when prompted chose to accept the certificate permanently. This had to be done as the same user under which the CCNET service runs.
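From a command prompt running as the CCNET service user, the session looks roughly like this (the repository URL is illustrative and the exact prompt text varies by Subversion version):

C:\> svn list https://svnserver/repos/example
Error validating server certificate for 'https://svnserver:443':
 - The certificate is not issued by a trusted authority.
(R)eject, accept (t)emporarily or accept (p)ermanently? p

Choosing 'p' stores the certificate acceptance in that user's Subversion configuration, so subsequent non-interactive runs (like those from the CCNET service) proceed without the prompt.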

Once this was done CCNET could access Subversion successfully and we were on our way to a successful switch.

Friday, 27 February 2009

FAST Search Profiles & the Search Business Center

The back end search functionality for the Signposting project is provided by FAST ESP. When setting up FAST, we configured various search collections with different data sources and processing pipelines. For example, content from the NHS IC site is known to be marked up with certain metadata, which we use to extract important information; this content is crawled using FAST's Enterprise Crawler data source. Other content is obtained from custom XML extracts from various databases, each with its own processing pipeline that maps the XML to fields in the FAST index. With all this working nicely we get down to searching.

Part of FAST ESP is the Search Business Center. This tool allows you to get nice search statistics graphs, such as top queries and top zero result queries. It also allows for boosting and blocking of results so content experts can manually rank content that they know to be authoritative higher. All exciting stuff that we were keen to check out. Unfortunately it just didn't seem to work. No matter how many searches we pumped in, no stats appeared. What was going on?!

Well, it turns out that we were searching against the main search engine cluster. This simply performs a search against all the available search collections, but no stats are produced. To get stats you need to create and use a search profile. When searches are made against a particular search profile, rather than the whole base cluster, the Business Center features become available.

To search against a particular search profile, we simply changed the 'view' property in our query string from 'view=espsystemwebcluster' (the default cluster) to 'view=livesearchsppublished' where 'livesearch' is the name of the search profile (i.e. for a search profile called examplesearch you would use 'view=examplesearchsppublished').

Monday, 16 February 2009

jQuery: First Impressions

Working on some UI stuff today I started investigating and using jQuery (http://jquery.com). My first impressions are positive, as it makes a lot of the more cumbersome bits of Javascript much easier. Here are some examples of the stuff I've used so far:


Selectors


Gone are the days of document.getElementById or using a for loop to iterate through the array returned by document.getElementsByTagName. jQuery makes it much easier to select items from the DOM using the new $ function:

  • To select an element by ID, use: $('#someId')

  • To select all h1 tags: $('h1')

  • To select all h1 inside the someId element: $('#someId h1')


Effects


There are lots of effects in jQuery. So far I've only used the show and hide effects. The longer-winded Javascript:
myElement.style.display='none'

or
myElement.style.display='block'

has been replaced by the shorter and more intuitive:
myElement.hide()

or
myElement.show()

Also, if you fancy a bit of a fade effect, you can simply pass a fade time in milliseconds to the method call:
myElement.show(250)

will show myElement with a quarter-second fade-in.

Utility Functions


jQuery enables you to replace the old-style, cumbersome for loops commonly used to iterate through an array with the easy-to-use $.each utility function. Check out the following example:

var names = ["Aaron", "Glenn", "Ian"];
$.each(names, function(index, name) { alert(name); });

This simple code iterates through each name in the array and uses it in a Javascript alert.

Cookies


As well as jQuery itself, I've discovered a cookie library built on jQuery that makes using cookies in Javascript a breeze. The library is courtesy of Klaus Hartl and can be found at http://www.stilbuero.de/2006/09/17/cookie-plugin-for-jquery/. As with all open source, the use of this library is done at your own risk!


The cookie library allows access to browser cookies using the following syntax:


  • Get a cookie value: $.cookie('cookieName')

  • Set a cookie value: $.cookie('cookieName', 'value')

  • Delete a cookie: $.cookie('cookieName', null)

  • Set a cookie value with a particular expiration: $.cookie('cookieName', 'value', { expires: 5 }) // expires 5 days in the future



That's it for now... I can safely say I'll be investigating jQuery further and am excited about what other goodies I'll find!

Friday, 30 January 2009

Doing It Right From The Start With Continuous Integration

The Signposting project is proving one of the best I've worked on as we've been given the time and the resources to do it right, right from the start of the project. This includes, not least, being able to set up a fully functional (almost) continuous integration environment.


Using CruiseControl.NET, the first step when creating any solution is to add the initial, blank solutions and projects into our build process. This way, with automated unit testing, potential errors can be caught and corrected early on. Our build process generally consists of three build types for each project:


  1. Code Diff: This build runs a tool which does simple metrics on the source code. It runs once a day and we don't care if it is running on successful code or not, so it is run separately to ensure it runs even if the build fails.

  2. Daily: Running once a day, this build performs unit tests, further code metrics and analysis and will run integration tests (when we get to that stage).

  3. Continuous Integration: Running each time a developer checks code into source control, the CI build compiles the solution and runs unit tests and test coverage analysis.



Using a well configured build environment like this means I, as a developer, have a lot more confidence in my code. The Test Driven Development model we use ensures up-to-date unit tests that are run locally before each check-in, meaning I am comfortable that bugs are kept to a minimum. It also encourages me to keep code changes small and check in often. It's great to have the CCTray notification icon in my task bar turn from yellow (building) to green (successful) with the message: "Yet another successful build"... geek that I am.


All in all, with sensible timescales, a great set-up, good project management and regular feedback from the 'client', this project has lots of promise!