Denial (Millennium Problem) LO10791

Magnus Ramage (magnus@comp.lancs.ac.uk)
Wed, 30 Oct 1996 22:17:23 +0000

Replying to LO10745 --

I found Sherri's remarks about the year 2000 problem with computers
interesting. I agree, it's all a bit of a mess, and it's quite possible
that many computers will suffer problems as the millennium approaches.
(It's not just on 1 Jan 2000 that problems will start - any accounting
programs that have a financial year different from the calendar year will
fail during 1999. And so on...)
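
As a minimal sketch of that fiscal-year point (not from the original message; the record format is hypothetical), consider records keyed by two-digit-year date strings. Any chronological sorting breaks as soon as dates in 2000 appear:

```python
# Hypothetical ledger entries keyed by YYMMDD date strings, as many
# 1990s accounting systems stored them.
records = ["991230", "000102", "991231", "000101"]

# Lexicographic order treats "00" (meaning 2000) as *earlier* than "99"
# (meaning 1999), so January 2000 sorts before December 1999.
print(sorted(records))
```

So a program closing a financial year that runs into 2000 would hit this during 1999, well before 1 January 2000 itself.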

One quibble with Sherri's message before moving on to the main point. She says:
>But most computers have hit the market in the last 20 years.
>While technology moves quickly didn't anyone think about this? We have
>complex relational data bases but we don't have the year 2000?

The problem is that while most computers (in bulk terms, due to all those
PCs) have come out within the past 20 years, the big companies that use a
lot of numbers have had them since the 1950s. And in those days, memory
and disk space were expensive (and besides, computer programmers like
abbreviating things) so the "unnecessary" four-digit years were replaced
by two-digit years, and the "19" assumed. Saves a lot of space if you have
a database with hundreds of thousands of dates of birth, say.
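
To make the "assumed 19" problem concrete, here is a minimal sketch (not from the original message; the function name and values are invented for illustration) of how two-digit-year arithmetic silently fails once the century turns:

```python
def age_in_years(birth_yy, current_yy):
    # Programs of the era stored only the last two digits of the year,
    # with the "19" prefix assumed, so plain subtraction works only
    # while both years fall within the same century.
    return current_yy - birth_yy

# Correct while both dates are in the 1900s:
print(age_in_years(55, 96))   # 41 -- someone born in 1955, in 1996

# Wrong once the current year rolls over to "00" (meaning 2000):
print(age_in_years(55, 0))    # -55 instead of 45
```

Multiply that by hundreds of thousands of dates of birth and the scale of the repair job becomes clear.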

And this became a cultural norm _even when there was more disk space
cheaply available_. I started programming in an insurance company in 1989,
using PCs which were primitive compared to today's models, but had plenty
of space compared to the 1950s computers. But I quickly saw that the
convention for writing a date was "30/10/96", not "30/10/1996", and so I
simply used that format. The thought of doing otherwise never occurred to
me; indeed, I don't recall ever considering what would happen in 11 years'
time, nor did we ever talk about it.

So the culture of computer programmers learned to program in this way,
and kept it like that. It doesn't help that the computing culture in
most large MIS departments is _extremely_ conservative about the way they
go about things: the technology may change, but the methodologies and
languages remain the same.

Sherri goes on (after some more things):
>I am
>not advocating anarchy -- just a brief meltdown. But could this date
>issue actually bring about chaos? Do you know how many of the services you
>receive are based on a computerized calendar? Are we living in denial
>about a technical aspect of this system and what can we learn based on our
>behavior?

It could cause chaos, yes. Although maybe it won't: the whole thing does
have a faint taste of the end-of-the-world-is-nigh hysteria that some
groups like to push towards the end of a century/millennium. (Admittedly
computer scientists are a rather different group from the usual prophets
of doom...)

A little eco-anarchist voice in me says "wouldn't it be a good thing to
stop our world relying so much on the infernal machine? Perhaps it could
bring about a fundamental rethinking of our relationship to technology, of
the kind that's so badly needed." Somehow I doubt that would be the
result, but the world stock markets suddenly grinding to a halt would do
us all a heck of a lot of good even if other things didn't. [And of course
I think computers have brought us benefits - like this discussion...]

A strange set of possibilities!

Magnus

--

Magnus Ramage
Computing Dept, Lancaster University, LA1 4YR, UK
Email: magnus@comp.lancs.ac.uk
Web: http://www.comp.lancs.ac.uk/computing/staff/magnus.html

If the only tool you have is a Hammer, then everything looks like a Process

Learning-org -- An Internet Dialog on Learning Organizations For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>