Tuesday, April 26, 2011

Scenario Planning and me. A short personal history.

I have been fascinated by scenario planning for quite a few years – actually for most of my adult life. This is because I grew up in apartheid South Africa, and so was briefly exposed to the power of scenario planning, as were most South Africans of my age.

A bit of history might be in order: trying to peer into the future has always been a human fascination. Way back when, we used extispicy. We also tried using the position of the stars at birth, the length of creases in a hand, and more recently, how tea leaves fall in the bottom of a tea cup. The list of things humans have done to divine the future is very, very, long.

In the 1950s various people tried to put the art of forecasting the future onto a more scientific footing. In the 1970s Shell managed to profit handsomely in the oil crunch – and ascribed their success to their scenario planning, built on those initial attempts to forecast the future.

Other companies were dazzled by this, and soon a lot of them were running their own scenario planning groups. Anglo American Corporation, a huge South African mining conglomerate, bought into the whole process in a major way.

Amongst the many scenarios they produced was a set that predicted two possible futures for South Africa: a “high road” resulting in peace, prosperity and stability, or a “low road”, where the country descended slowly into a spiral of ever-increasing conflict and brutality.

So alarmed were they by the prospect of the “low road” that they created a travelling road show that went from town to town, fronted by the head of their scenario planning division, a Mr. Clem Sunter. Those who attended were given an overview of scenario planning and its many successes up to that point in time, and were then walked through each of the two possible futures for South Africa. In excruciating detail.

The torch of newsprint was shone on the road show. It was newsworthy, after all. Editors editorialised, columnists columnised, and opinion pieces were opined. Letters were written, politicians gave speeches. For quite a few months fierce debate raged.

I lived in a small town, and so by the time the road show reached us the controversy was at its height. The town hall was packed. Mr Sunter didn't disappoint: the show was put together very well, and by the time he was finished I think that most of the audience felt that the “low road” was a very bad future to look forward to.

As did the apartheid regime – they asked for a private viewing and asked many questions. A few months later they started to unwind the whole apartheid edifice.

Were the two related? I can't say. But from my point of view it would seem that scenario planning had certainly played a key role in shaping the future of South Africa.

So it came as quite some surprise to me when I left South Africa that no one else in the world seemed to know very much about the power of scenario planning. Actually, most people knew nothing of scenario planning!

Every now and then I would turn to the web to see if I could find out more. But what I got was a confusing mass of information that obscured rather than revealed. Being a very busy software developer with a family, I never had the time to pull at that thread and find out any more about scenario planning, fascinated though I was.
Till now. I have got my hands on a copy of “Games Foxes Play: Planning for Extraordinary Times” by Chantell Ilbury and one Clem Sunter. A book that claims to transform scenario planning from “an esoteric discipline” to a “practical model” that allows “intense strategic conversation”.

It's been a fun read. I think I'd like to try to work through a very high level scenario planning exercise to get a feel for it. Now I just have to find my conversational partners...

Tuesday, April 19, 2011

CSS heuristics

As a developer I believe that styling web sites should be done by practiced design gurus, but that we developers should have at the very least a basic understanding of what those design gurus are up to.

Thus I have spent this last week wading through (and playing with) simple CSS experiments. It seems to my uninitiated eyes that the world of CSS is one of craftsmanship - knowledge gathered by bitter experience when bumping into the foibles of the various browsers.

So at the end of the week I have put what I have learned together and come up with the following list of heuristics for me to reach for and to try and follow whenever I have to touch CSS in future. Of course the trick is to keep this initial list up to date as I learn more...


  • Use external style sheets
  • Use a reset style sheet
  • Combine elements (don't repeat yourself, if at all possible)
  • Group your rules by location, ordering your style sheet from generic to specific. E.g.: Put generic classes first, then header, then navigation, etc...
  • Try to use composite (shorthand) properties and values where possible.
  • Alphabetise properties for easier reading.
  • Consider putting the colour scheme used, a version number and your name in a comment at the top of the style sheet.
  • Avoid browser specific hacks if at all possible. And if you have them in your CSS files, comment them so people understand why they are there!
  • Avoid unneeded selectors – but do consider sandboxing the effect of a selector by making sure it is bounded (i.e.: make sure that the application of a selector is restricted to only a specific area)
  • Specify units for non-zero values – don't specify units for zero values
  • Minimise the use of HTML element IDs
  • Refactor often, removing unused rules and properties, and fusing rules where possible
  • Try to follow object oriented CSS principles (separate structure and skin, separate container and content).
  • For complex CSS consider using XCSS or Sass
  • Have (and follow) a naming and coding convention which avoids underscores (for reasons of browser incompatibility). For an example of one see: http://na.isobar.com/standards/#_css
  • Keep CSS files under version control (a no brainer for a software developer)
  • Consider sizing your text in ems and using Javascript to find the correct starting pixel size. Then your base text should be 1em (* { font-size: 1em; })
  • Validate and prettify your CSS.
  • Consider using CSS sprites
  • Code link pseudo-classes in this order: Link, Visited, Focus, Hover, Active (Leave Visitors Focused on Hoovering Activities)
  • During development use separate style sheets with lots of comments. For production, compress, minify and consolidate into one file!
  • Avoid using !important (treat its use as an indicator of something being wrong in your styling)
  • If designing for both mobile and large screen, consider starting with the mobile design. That way you will be aware of the assets the mobile shouldn't have to load.
  • Know that CSS3 "pays off when it comes to production, maintenance and load times"
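To make a few of these concrete, here is a sketch of what a small style sheet following several of the heuristics above might look like (generic-to-specific ordering, alphabetised properties, shorthand values, a comment header, and the link pseudo-class order). The class names and colours are invented purely for illustration:

```css
/* site.css — v0.1 — J. Developer — colour scheme: #333 text, #0645ad links */

/* Generic rules first... */
body {
  color: #333;
  font-family: Georgia, serif;
  margin: 0 auto;      /* shorthand: vertical 0, horizontal auto */
  max-width: 60em;
}

/* ...then page areas, from the header down */
.site-header {
  border-bottom: 1px solid #ccc;   /* shorthand border property */
  padding: 1em 0;
}

/* Link pseudo-classes in LVFHA order */
a:link    { color: #0645ad; }
a:visited { color: #551a8b; }
a:focus   { outline: 2px solid #0645ad; }
a:hover   { text-decoration: underline; }
a:active  { color: #faa700; }
```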


  • If possible, give every page's body a unique descriptive class: this makes it easy to add page specific styling
  • Try not to make class names represent design i.e.: prefer “comment” over “right” - otherwise you are baking design decisions into the HTML
  • Try to avoid div and span soup in favour of meaningful HTML elements
  • Avoid style attributes in HTML tags
  • Use HTML 5's doctype: as it switches all modern browsers into standards mode – even if you aren't using HTML 5
  • Validate your HTML!
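And a minimal page skeleton following the HTML points above might look something like this (again, the page title and class names are invented for illustration):

```html
<!DOCTYPE html> <!-- the HTML 5 doctype: switches modern browsers into standards mode -->
<html>
<head>
  <title>About us</title>
  <!-- external style sheet; no style attributes anywhere in the body -->
  <link rel="stylesheet" href="site.css">
</head>
<body class="about-page"> <!-- unique descriptive class for page-specific styling -->
  <header><h1>About us</h1></header> <!-- a meaningful element, not div soup -->
  <div class="comment">A name that describes content, not design.</div>
</body>
</html>
```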

Wednesday, April 13, 2011

The naked server

Twitter have abandoned Rails for some parts of their application. I didn't find their description of the move to be fully informative.  E.g.: just what do they mean when they say "we launched a replacement for our Ruby-on-Rails front-end"? Predictably the move got a few comments on the Ruby/Rails newsgroup that I subscribe to. The general consensus seemed to be "so what?" I think we might all be missing the bigger picture. Perhaps this is just a logical step in the progression of web applications? If so, what does it mean for their future? To explain my thoughts, some history:
  • Way back when, pages viewed in a browser were fairly simple, being mostly static files that were dished up from directories by a dedicated server application.
  • Then we needed to make those static pages more dynamic, and to reflect knowledge related to the identity of the person who was viewing those pages. However, browsers were geared to rendering static pages, so we controlled what we could and put the smarts on the server, building frameworks that allowed us to customize each individual page for the viewer. Our first efforts at these frameworks were quite awful, IMHO, but as time went by we got better at it.
  • Now web browsers have a common embedded scripting language (ECMAScript), and can create and change whole pages, dynamically, on the fly.
See the flow here? The complexity started on the server, and then tunneled down those tubes to the browser. Currently it is spread across the browser and the server, with JavaScript frameworks and server side frameworks being combined to produce an application. Do we still need those server side frameworks to compose and build web pages? Surely we can now start to strip the server side of the equation back? In this vision of the future we would create the bones of the site using HTML and CSS, mix in JavaScript, and serve these as static files to the browser, which then makes the magic happen. The browser fetches any data required from the server in an easy to digest format, such as JSON.  The server loses the complexity that it gained to build dynamic web pages. I'm going to call this vision of the future the naked server.  If correct, I predict two things:
  • JavaScript/HTML/CSS is going to grow dramatically in importance (and that seems to be happening).
  • node.js is going to become popular. Because if you know JavaScript, and you simply want to fetch some data from the server, or write some back, why not use the same language and, possibly, files, on both sides of the tubes?
Over the next few weeks I am going to see if I can write a simple naked server application. Not being a JavaScript guru yet, I think I might stick to my knitting, and use naked servlets for the server side of the equation. Then I plan to add to my knowledge and to build a node.js cloud based application. This should be fun!

Wednesday, April 06, 2011

Software craftsmanship

Gojko Adzic has taken the trouble to go through the Hudson/Jenkins code base and critique it. He doesn't find the code to be pretty, and wonders what impact this might have on software craftsmanship.

I found the comments quite good. I wrote my own lengthy response, which I am now going to quote, as I want to add a footnote.

"Any long lived piece of code is going to have many people working on it, each with their own unique style and understanding of the code base. 

Every piece of functionality added is going to have its own motivating circumstance, so, for example, there might be time or financial limitations driving it that we, the code readers, are not aware of.

It is almost always easy to criticize others' code, but without insight into how that code was created and added to, the criticism is, to me, shallow. It perhaps gives us insight into the critic's preferred coding style, and also maybe teaches us a little.

But beyond that, the Hudson/Jenkins code works. Hudson/Jenkins is released often. Hudson/Jenkins is very popular. I think that is the true test of software. Regardless of our personal feelings on reading the code. 

You ask where does this code leave software craftsmanship? Hudson/Jenkins is the rock on which many continuous integration projects are based, so I guess it seems to show that software craftsmanship is really not as important as releasing early, releasing often, and getting immediate feedback from a wide community of users. That seems to trump code ‘quality’.

But you know what? It does look as though they could do with a little help cleaning that code base up. We should start by contributing some tests…"

The footnote:

I'm with Dan North. "No-one wants your steenking software – they want the capabilities it gives them."

The only craftsmanship the user of an application will experience is via their interface to the software. It is one of the oddities of the software world that we can write a perfectly literate, indeed beautiful, program that is totally unusable.  And as Hudson/Jenkins seems to show, we can write an abomination that is extremely popular, well liked, and very, very important.

At this moment I believe that the software craftsmanship manifesto is by developers, for developers. If it were any other way it would have usability and interaction at its heart, not code.

What software craftsmanship seems to offer is code that is more reliable, cheaper to build, and, if it passes the first hurdle of customer acceptance, easier to maintain. That's not bad: but I find myself wondering how many of its practices are well founded.

Take the 100 line constructor that Gojko complains about: in Code Complete, Steve McConnell writes "Decades of evidence say that routines of such length (>100 lines) are no more error prone than shorter routines. Let issues such as the routine's cohesion, number of decision points, number of comments needed to explain the routine, and other complexity-related considerations dictate the length of the routine rather than imposing a length restriction per se. That said, if you want to write routines longer than about 200 lines, be careful."

Surely we should forgo our beliefs in favor of research results?

During my career as a software developer I have seen fads come and go. Many seemed to have very little scientific underpinning, being based more on belief than on reality. I worry that, seductive and appealing though it sounds, software craftsmanship might be in the same boat.