
Hyperterrorism

May 29th, 2003

My occasional correspondent, Graeme Bond, is, as far as I know, the only other person in Australia to have pointed out in public (and in advance) that the Y2K panic was based on patently spurious arguments.

Not surprisingly, Graeme is also a sceptic about cyberterrorism and sent me this link to a story headed ASIO slams cyberterror ‘hype’. I’ve been a sporadic reader of The Crypt Newsletter, a debunker of computer-related panics, and can particularly recommend this story on how U.S. infowar commandos smuggled a deadly computer virus into Iraq inside a printer. Set to go off on April 1, of course.

Update: Looking over my files, I realise that I forgot to mention Stewart Fist of The Australian, who also debunked Y2K and whose judgement in matters technological is usually reliable. Feel free to remind me of others I’ve omitted.

Categories: Mac & other computers
  1. May 30th, 2003 at 06:02 | #1

    But what exactly is your claim? That there was no problem, or just that lots of unnecessary stuff was done along with the necessary stuff?

    Having worked on this myself, I know that there’s no question that there was loads of legacy code that would have crashed if Y2K fixes hadn’t been implemented. I don’t know if people would have died or anything, but lots of businesses certainly would have been thrown into chaos.

    In addition, I also know that a lot of companies deliberately decided to upgrade their systems as long as they needed to fix Y2K stuff anyway. That may have been overspending, but it was deliberate.

    I’m curious: what do you think would have been the appropriate response?

  2. John
    May 30th, 2003 at 07:05 | #2

    The correct response was “Fix on failure” for all but the most mission-critical systems. Since Y2K-related bugs emerged gradually (and fairly infrequently) over the period leading up to and following 2000, this wouldn’t have caused serious problems.

    I did mention the upgrading as an offset against the cost of Y2K in some of the articles I wrote on this topic.

  3. Factory
    May 31st, 2003 at 00:30 | #3

    Hmm, whilst there was quite some scaremongering around Y2K, particularly around systems that had little chance of being affected (i.e. any system that doesn’t use COBOL or store any time-related data), ‘fix on failure’, erm, seems to be someone speaking outside his field. The kind of software that is vulnerable to Y2K problems is also the kind that is heavily focused on reliability. Whilst it may seem strange to a Windows user, in those systems one minute of downtime a year can be completely unacceptable.
    (Oh yeah and I’m a programmer, so like, now I have an excuse for all my poor understanding of economics, it’s speaking outside my field you see.. :) )

  4. John
    May 31st, 2003 at 00:39 | #4

    I’m not a programmer, but I don’t think that disqualifies me in this case. The main issues don’t require any detailed knowledge of programming. In fact, it was the ease with which this bug could be understood by nonprogrammers that was part of the problem.

    The Y2K remediation program was not confined to “the kind of software that is overly focused on reliability” – in Australia, at least, every computer and any piece of electronic equipment that had a clock of any kind had to be certified compliant. I saw microwave ovens listed as compliant, for example.

    In any case, I already made the point that fixing mission-critical systems was a good idea.

  5. Graeme Bond
    May 31st, 2003 at 15:51 | #5

    Kevin Drum, Factory and others still unconvinced by John’s Y2K scepticism would do well to read Nicholas Zvegintzov’s essay “The Year 2000 as Racket and Ruse” at http://www.softwaremanagement.com/References/year_2000.html, which cogently demolishes much of the Year 2000 mythology. It was first published in American Programmer in 1996, and Zvegintzov is a respected consultant and commentator.

    To my certain knowledge, at least some tertiary institutions in Australia were mentioning the year 2000 issue to students as early as the mid-1970s. It was an easily manageable software maintenance issue, much more easily identifiable than others. It should have been fixed in the normal course of maintenance of the old COBOL programs and should never have arisen in newer ones (the sketch at the end of this comment shows the failure mode in miniature). Prudent companies had it well in hand years before 2000, and those that didn’t deserved to bear the full costs (if any) of their last-minute panic.

    The series of Y2K Infrastructure Forums held jointly by the Federal and State Governments late in 1999 was most informative. The common theme was that, after spending millions upon millions of dollars, the organisations concerned had managed to locate and correct a few minor anomalies in displayed dates on event logs. Nothing that could cause disruption of service was discovered, and no safety issues were found.

    On the other hand, many organisations got undeserved tax breaks for ‘Y2K work’ that really had little to do with Y2K and a lot to do with upgrading systems for other reasons. More taxpayers’ money was wasted in the public sector on futile quests for the non-existent, while much-needed expenditure on maintenance of public infrastructure etc. was withheld to fund this folly.

    I dealt at length with these issues in two articles published by the Fairfax press: “Apocalypse how: making a mountain of the millennium molehill” and “Y2K an expensive scam”.
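
    A minimal sketch of the failure mode and the kind of fix under discussion (an illustration only, in Python rather than COBOL, and not taken from any of the articles cited): arithmetic on a two-digit year field goes wrong at the century rollover, and a simple ‘windowing’ rule of the sort widely used in remediation corrects it.

        def years_elapsed_2digit(start_yy, end_yy):
            # Naive arithmetic on a two-digit year field, as in old record layouts.
            return end_yy - start_yy

        def years_elapsed_windowed(start_yy, end_yy, pivot=50):
            # Windowing: treat YY below the pivot as 20YY, otherwise as 19YY.
            def expand(yy):
                return 2000 + yy if yy < pivot else 1900 + yy
            return expand(end_yy) - expand(start_yy)

        print(years_elapsed_2digit(99, 0))    # -99: the classic Y2K failure mode
        print(years_elapsed_windowed(99, 0))  #   1: correct after windowing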

Comments are closed.