Friday, September 15, 2006

The Google Test Automation Videos are here...

Well, not here, but here

I'll update the earlier posts with each relevant video link when I can...

Wednesday, September 13, 2006

XHTML Transitional

OK, so it's taken a bit of time, though very little effort (if you know what I mean), and I think it's probably one of the saddest exercises I've ever gone through (pretty difficult to believe, I know)... BUT the home page of this site is now XHTML Transitional compliant.

It's been validated with two separate tools.

First up, the obvious: the W3C's online validator

Secondly (because one is never enough), the much handier HTML Validator Firefox extension by Marc Gueury, which is based on the Tidy validator.

The plan is to keep any new posts to the site XHTML Transitional compliant, though I've got no plans to go back and fix earlier posts. Frankly, I've got better things to do with my time than remove the <br /> tags that Blogger sticks between <li> tags and change all the &s to &amp;s in the hrefs. :)
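To give a flavour of the job, a typical offending link looks something like this (a made-up URL, but the escaping rule is real):

<a href="archive.html?year=2006&month=09">September</a>

which XHTML insists on being:

<a href="archive.html?year=2006&amp;month=09">September</a>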

At least the site finally looks the same in Firefox and IE6 now!

Update: Is it just me, or is the commented-out CDATA tag inside the script tag (telling the browser that there's unstructured data) a bit of a hack? I know my JavaScript should be in a different file, but I don't have any hosting for the files, and I'm not going to set it up for two or three 1K files! Anyway, you'd think the spec would cover this problem in a more natural way. Maybe an attribute on the tag:
unstructured="true"
Or maybe:
type="CDATA"
Or maybe it could just be implied by the script tag itself?
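For anyone who hasn't had the pleasure, the hack in question looks something like this (a minimal sketch: the // comments stop the JavaScript engine tripping over the CDATA markers, and the CDATA section stops the XML parser tripping over any <s and &s in the script):

<script type="text/javascript">
//<![CDATA[
if (1 < 2 && true) { alert("Perfectly valid XHTML, honest."); }
//]]>
</script>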

I'm no XML expert, but the solution seems more than a bit nasty to me.

Google LTAC - A more personal note

Alright, so I've been going on about the Google LTAC quite a bit recently, but I wanted to mention a few more personal observations...

  • Is it a coincidence that the comedy presenters were called Adam and Joe?
  • Google might talk about work/life balance, but you're always at the conference, even when you're on the toilet, thanks to the highly informative 'Testing on the Toilet' hints and tips sheets ;) (see below)
  • Even the user interface 'simplicity innovators' can't help themselves when it comes to conference freebies... I never realised that I needed a coloured light on my pen until now. (Shame the light makes the pen a bit too heavyweight: the ink keeps clogging, meaning a slow startup time, and the blue LED keeps cutting out.)
  • Also, for a company that's very much 'of the now', a mouse mat is just soooo last millennium.
  • I've never been at a conference where there were so many laptops. Although I'm a little surprised that no-one brought any internet-enabled CI lava lamps with them.
  • Google may not be Evil, but they still gave us plastic cutlery and polystyrene plates (boooo).


And a couple of awards

  • Phrase of the conference: "That's a big bucket of suck"
  • Agile Pimp: Dan North, a man with an eye for spotting the delegate that's ripe for a bit of lean process
  • Free snack food of the conference: Innocent Smoothies.
  • Information Download Award: Adam Porter, watch the video (when it's out), you'll understand.
  • Demo of the conference: Jason Huggins, the cutting edge can cut you deeply when you've got an audience.


Testing on the toilet

Tuesday, September 12, 2006

Spare a moment for the little people

They're here, and they may need your help.

Also, it's good to see the new world order we were promised has finally arrived, though it might appear that Disney World has taken the political situation a bit far over here.

Sunday, September 10, 2006

A language for discussing performance testing

OK, so all I'm doing here is repeating the text that I previously had in a post about the Google London Test Automation Conference, discussing Goranka Bjedov's talk on performance testing.

I figured that it was sitting in the middle of a fairly large post, and I wanted it to be seen and reviewed by more people than would be bothered to plough through the other stuff.

It's a suggested series of terms by which different types of performance tests can be described:

  • Performance Test: Given load X, how fast will the system perform function Y?
  • Stress Test: Under what load will the system fail, and in what way will it fail?
  • Load Test: Given a certain load, how will the system behave?
  • Benchmark Test: Given a simplified, repeatable, measurable test, if I run it many times during the system's development, how does the behaviour of the system change? (There's a throwaway sketch of what I mean below.)
  • Scalability Test: If I change characteristic X (e.g. double the server's memory), how does the performance of the system change?
  • Reliability Test: Under a particular load, how long will the system stay operational?
  • Availability Test: When the system fails, how long will it take for the system to recover automatically?

In addition to the above there's one more term which, I'd suggest, isn't a type of test in its own right; rather, it denotes the depth of analysis being performed.

  • Profiling: Performing a deep analysis on the behaviour of the system, such as stack traces, function call counts, etc.
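To make the benchmark idea concrete, here's the kind of throwaway sketch I mean: a trivial, repeatable, measurable bit of timing code you can re-run after every change (someFunction is obviously just a stand-in for whatever you're actually measuring):

function someFunction() {
    // stand-in for the code under test
    var s = "";
    for (var j = 0; j < 100; j++) { s += j; }
}

var start = new Date().getTime();
for (var i = 0; i < 1000; i++) {
    someFunction();
}
alert("1000 calls took " + (new Date().getTime() - start) + "ms");

Run it before and after each change, keep the numbers, and you've got a (crude) picture of how the system's behaviour changes over the course of development.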


Any thoughts?