I’ve just been catching up on Bruce Schneier’s blog, and this article on security following Hurricane Katrina made me think about some stuff.
Firstly, he’s spot on about security spending. I hope we have time in the UK to change our tack and spend the 3 billion+ planned for ID cards on something more worthwhile.
But what interested me more was this:
“Redundancy, and to a lesser extent, inefficiency, are good for security. Efficiency is brittle. Redundancy results in less-brittle systems, and provides defense in depth.”
This is where the approach of re-using code and removing duplication really hurts and where the Agile community really needs to re-think things.
The drive to reduce duplication has clearly had a negative effect on software. Software today throws up more bugs and error conditions than at any other time in history, and this can be attributed to the removal of duplication.
Removing redundancy, as any sysadmin will tell you, reduces your availability, and the same principle applies to duplication in code. The fewer routes there are through the code, and the fewer implementations you have of your business logic, the higher the percentage of your transactions that will end up going through those inevitable bugs.
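As a toy sketch of the idea (all function names and the bug are hypothetical, invented purely for illustration), here’s how duplicated implementations of the same business rule could let some transactions route around a defect that a single, deduplicated implementation would expose to every transaction:

```python
def fee_v1(amount):
    # One implementation of the fee rule: 1% fee, minimum of 1 unit.
    return max(amount * 0.01, 1)

def fee_v2(amount):
    # A duplicate of the same rule, harbouring a latent bug that
    # only bites a particular kind of input (zero-value amounts).
    if amount == 0:
        raise ValueError("legacy path chokes on zero amounts")
    return max(amount * 0.01, 1)

def process(transactions):
    # Transactions are spread across the duplicate code paths, so
    # only those that land on the buggy path AND carry the
    # troublesome data actually fail; the rest pass cleanly.
    ok = failed = 0
    for i, amount in enumerate(transactions):
        impl = fee_v1 if i % 2 == 0 else fee_v2
        try:
            impl(amount)
            ok += 1
        except ValueError:
            failed += 1
    return ok, failed
```

With the duplication in place, a zero-value transaction only fails when it happens to land on the buggy path; refactor the two routines down to a single shared implementation containing that bug, and every zero-value transaction fails.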
Ever wonder why so few banking transactions fail today? The answer’s simple. The massive duplication provides substantial redundancy throughout the code, allowing a high proportion of those transactions to pass cleanly through areas where the bugs aren’t relevant to them, only occasionally getting that fatal combination of a particular type of data and a particular bug.
If this duplication were removed through the ruthless refactoring that the XP community advocates, a far higher proportion of transactions would pass through that inevitable bug.