Thursday, May 19, 2005

M*A*S*H

In Dinosaur Brains, Albert J. Bernstein described the M*A*S*H unit:

"You know the show: a bunch of intelligent, antiwar doctors are competent and loving. In a crazy situation, they keep themselves sane by going crazy. You see M*A*S*H units developing, especially with young, bright, creative people.

Notice the crazy hats, the rude parody of the corporate logo, the graffiti on all the posters, and the strange outfits in the software development department of your company or your own particular M*A*S*H unit. What you're seeing are the behaviours that intelligent, creative people come up with when they feel disenfranchised or under almost intolerable stress."


I think there are at least two things going on here. First, there's the intelligent and creative worker's tendency to decorate their workspace in a creative way. Secondly, there's the disenfranchised worker's tendency to attack the organisation.

The first should be encouraged as a creative mind's way of expressing their individuality; the second should be taken as a warning signal to management that there's a morale problem. The two shouldn't be confused.

Of course, there's a behavioural difference between having a giggle with the people you're working with and goofing off. There's a creative difference between having a photo of your child and a cuddly toy on your desk and having a piece of photography and some lego models.

We have a giggle, and we've got a lot of lego. We find the lego not only allows people to express themselves in their space, but is also a great tool for focusing the mind. When someone is mulling over a particularly tricky problem or musing on its future implications, they'll often pick up some lego and just fiddle.

Thankfully we've got a management team that accept the fact that we're not just mucking about. Not all companies are quite so forgiving...

Wednesday, May 04, 2005

REALLY don't fancy yours much

Looks like it was a really lucky escape from the ravenous Bulldog.

The Register have spotted Bulldog-hell

Software development companies need not prevail

In a blog entry here, Jay Fields suggests that the idea of in-house development is flawed and needs to die... (I'm paraphrasing)

Sorry Jay, but I really disagree. There's nothing inherently wrong with the idea. The problems you cite are problems that any short-sighted organisation will run into when attempting to build software. I've worked for a couple of software houses, and they suffered from exactly the same problems you describe: short-sighted recruitment processes, cost-cutting exercises, low quality code.

True, when you're working in-house these problems may be more common. The terms 'cost-centre' and 'profit-centre' are two that really cut to the bone.

However, I'm now working as an in-house developer with a team where the staff turnover is extremely low (around 10%), the team is built of very high calibre developers, morale is high, uptake of new technology is measured against reasoned benefit, code quality is higher than I've ever seen before, and customer satisfaction is through the roof.
We don't have any external people / consultants / software houses involved and I don't see any reason to have any.

In-house development isn't necessarily doomed to failure. Short sighted, cost driven organisations are the problem, and software houses are by no means immune to it.

Tuesday, May 03, 2005

Don't fancy yours much

A little while ago I was looking for a broadband connection. I almost went with Bulldog. Glad I didn't. Someone out there's really not happy with the service...

Bulldog Hell

Friday, March 25, 2005

Assertive Documentation

I've been reading another Pragmatic Programmers book: Pragmatic Unit Testing - In Java with JUnit [PUT]. Yet another excellent book that covers topics relevant outside the core of its material. That is, anyone who's using or thinking of using unit tests will get a lot out of this book, irrespective of their choice of language. Knowledge of Java helps, but is by no means mandatory, and the advice it gives is universal.


It talks about strategies for devising complete tests, reminds us that test code is production code like any other and states that test code should 'document your intent' (pg 6). By this I take it to mean that there should be a clear statement of what that tested code should and should not do. If the test code is to be documentation then it has to be clear, since unclear documentation is useless. However the book skips over an important means to that end: the assertion text.


The assertion text is one place where you can state your intention without having to worry about the general guff involved in code. You get to write a single sentence saying what it is that's being tested, outside the constraints of the programming language. In order to write a good assertion, think about what that assertion is for. It is a statement that should say clearly and completely, without support from its context, what is being tested. When an assertion fails, its text is taken out of context and stated without its supporting code. With the assertion text stated on its own, which would you prefer to read (using a JUnit-like format):



Testsuite: sortMyListTest
Testcase: testForException:
FAILED: Should have thrown an exception

or

Testsuite: sortMyListTest
Testcase: testForException:
FAILED: When passed an empty list,
an exception is thrown


The first statement (taken from the example code below) is incomplete. Whilst it can be easily read in the context of the code, taken away from its origin it loses so much that investigation is required to find the fault. The second statement tells you enough to be able to track down the problem without even looking at the test code.
As an aside: A well named test procedure can help as well. One option would be to change the name of the test procedure to testForExceptionWhenPassedAnEmptyList.


In addition, taking a little bit of time over the assertion text can help in clarifying the meaning of the tests. When you're writing the tests, if you can't state the assertion in an easy to understand manner then it may be a clue that there's a problem with the scope of the assertion. Are you testing for multiple things in one assertion? Is it clear to YOU what it is that you're testing for? That is, producing a smoke test that ends in the assertion 'bigProcThatDoesLoads does loads and loads of stuff, then some stuff and a bit more stuff, except when there's this condition when it does this stuff instead' is as much use as assertTrue( false ) when it comes to finding out what's wrong with the code.


Once you have a clear set of assertions sitting within well named procedures you can take the concept of the test code as documentation another step. That is, it should be possible to generate plain text documentation from the test results. Whilst the test code is accurate documentation, it's not quite as clear as plain text. Well written tests are pretty easy to read, but I always find that it's the assertion text I look at first. The generation process could remove duplicate assertion text as it goes along to ensure the result is a concise set of documentation.
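As a sketch of what such a generator might look like (the class and method names here are my own invention, not from [PUT]), one could collect the assertion texts from a test run, remove the duplicates, and emit a bulleted plain text document:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

// Hypothetical generator: turns the assertion texts gathered from a
// test run into plain text documentation, dropping duplicate texts.
public class AssertionDocGenerator {

    // Removes duplicate assertion texts, preserving first-seen order.
    static List<String> dedupe(List<String> assertionTexts) {
        return new ArrayList<>(new LinkedHashSet<>(assertionTexts));
    }

    // Formats a test suite's assertion texts as a bulleted document.
    static String generate(String suiteName, List<String> assertionTexts) {
        StringBuilder doc = new StringBuilder(suiteName + "\n");
        for (String text : dedupe(assertionTexts)) {
            doc.append("  * ").append(text).append("\n");
        }
        return doc.toString();
    }

    public static void main(String[] args) {
        List<String> texts = List.of(
            "When passed a list of integers, the list is mutated into ascending order",
            "When passed a list of integers, the list is mutated into ascending order",
            "When passed an empty list, an exception is thrown");
        System.out.print(generate("sortMyList", texts));
    }
}
```

The LinkedHashSet does the deduplication whilst keeping the assertions in the order they were run, which keeps the generated document reading like the test suite.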


Going back to the example given in [PUT] page 36, one could imagine generated documentation along these lines:


sortMyList


  • When passed a list of integers, the list is mutated into ascending order

  • When passed a list of strings, the list is mutated into ascending order according to ASCII values

  • When passed a list of objects, the list is mutated into ascending order according to the return from the method getValue

  • When passed a list all of the same value, the list is not mutated in any way

  • When passed an empty list, an exception is thrown



In order to do this you need your assertions to be made both in the negative and positive cases. That is, the [PUT] example is no longer valid:




public void testForException() {
    try {
        sortMyList( null );
        fail( "Should have thrown an exception" );
    } catch (RuntimeException e) {
        assertTrue( true );
    }
}



Rather, the test needs to be of the form (in “as close as I care about” Java):




public void testForException() {
    boolean bExceptionThrown = false;
    try {
        sortMyList( null );
    } catch (RuntimeException e) {
        bExceptionThrown = true;
    }
    assertTrue( "When passed an empty list, an exception is thrown",
                bExceptionThrown );
}



Also, when writing tests in the file driven manner there needs to be some way of reading the assertion text in from the file. Maybe those #'s cease to be comments and become assertion text signifiers. On a line of its own the assertion is for a family of tests; when placed at the end of the test data it becomes an additional qualifier. Since your test code is production code, you may even be using a generic file reader that will replace tags, e.g. {x}, with the test data...


Applying this to the example from [PUT] pg 40, describing a getHighestValue function:



#
# When passed a list of ints, returns the highest value
#
9 7 8 9
9 9 8 7
9 9 8 9

...
#
# When passed a single value, returns that value
#
1 1
0 0
#
# Works with the boundaries of 32bit integer values
#
2147483647 # (positive maximum: {x})
-2147483648 # (negative minimum: {x})


We would get the documentation:


getHighestValue


  • When passed a list of integers, returns the highest value

  • When passed a list of negative integers, returns the highest value

  • When passed a mixture of positive and negative integers, returns the highest value

  • When passed a single value, returns that value

  • Works with the boundaries of 32bit integer values (positive maximum: 2147483647)

  • Works with the boundaries of 32bit integer values (negative minimum: -2147483648)
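A minimal reader for a file in that format might be sketched like this (the class name and the exact parsing rules are my own invention, just one way the #'s could be interpreted):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical reader for '#'-annotated test data files: a '#' line
// with text sets the assertion text for the family of tests that
// follow; a trailing '# qualifier' on a data line appends extra text,
// with {x} replaced by that line's test data.
public class TestDataReader {

    // Builds one assertion text per data line in the file.
    static List<String> assertionTexts(List<String> fileLines) {
        List<String> texts = new ArrayList<>();
        String family = "";
        for (String line : fileLines) {
            String trimmed = line.trim();
            if (trimmed.isEmpty()) continue;
            if (trimmed.startsWith("#")) {
                String text = trimmed.substring(1).trim();
                if (!text.isEmpty()) family = text;  // bare '#' lines are just separators
                continue;
            }
            int hash = trimmed.indexOf('#');
            if (hash >= 0) {
                String data = trimmed.substring(0, hash).trim();
                String qualifier = trimmed.substring(hash + 1).trim();
                texts.add(family + " " + qualifier.replace("{x}", data));
            } else {
                texts.add(family);
            }
        }
        return texts;
    }

    public static void main(String[] args) {
        List<String> file = List.of(
            "#",
            "# When passed a single value, returns that value",
            "#",
            "1 1",
            "#",
            "# Works with the boundaries of 32bit integer values",
            "#",
            "2147483647 # (positive maximum: {x})");
        assertionTexts(file).forEach(System.out::println);
    }
}
```

Feeding the resulting texts through a duplicate-stripping formatter would then give you the bulleted documentation above directly from the test data file.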



The resulting process does sound an awful lot like JavaDoc, PHPDoc and the rest of those helper applications. Now I don't go for such things: I think the chances of the comments being kept up to date are pretty slim, it's very difficult to police, and the resulting documentation is patchy at best.

However, the assertion text lives within the code in a much more symbiotic manner. It may be plain text, but it is still part of the code, and when it's right you get an immediate benefit from it. You're writing your tests before you write your code, aren't you? Surely that means you're seeing those assertions fail pretty much as soon as you've written them. You can write a test that covers a lot of the functionality of the code before you run it, and when you do you get immediate feedback that tells you how good your assertion texts are.

Good assertion text makes it easy to use the failures as a specification for the next bit you need to write. It takes diligence, but it's a lot more likely to happen than accurate JavaDoc comments. And if the generated documentation isn't clear, then that may be a pointer to the fact that the unit tests aren't clear either.