WebWeevers.com

A "Stumble Upon" Community Favorite

 
 

 

Y2K Info


The History And The Hype

Y2K History

  • February 13, 1984: Paul Gillin makes the first printed reference to the Y2K problem in Computerworld magazine.
  • September 6, 1993: Peter de Jager publishes the first printed warning of the dangers of the Y2K bug, also in Computerworld magazine.
  • 1993-1999: Governments and businesses worldwide spend somewhere between $300 billion and $900 billion fixing Y2K bugs. Fueled by inaccurate media coverage and gossip, many expect a doomsday scenario of chaos and destruction at midnight January 1, 2000.
  • December 1999: Governments and large businesses worldwide set up 24-hour Y2K crisis centers. After the U.S. warns of a worldwide terrorist threat to strike during the holidays, governments worldwide raise security for millennium celebrations to unprecedented levels. Many employees must work or be on call on December 31 and January 1.
  • January 1, 2000: The Y2K rollover occurs with minor problems. People worldwide are ecstatic. Most people believe that the Y2K problem is over, and many question whether there was ever a problem to begin with.


 

Computer scientists may disarm the Y2K bomb in time, but that doesn't mean they didn't screw up.

BY CHRIS TAYLOR

Two digits. That's all. Just two lousy digits. 1957, they should have written, not 57. 1970 rather than 70. Most important, 01-01-2000 would have been infinitely preferable to 01-01-00. Though most of the dire predictions connected with that date--the Year 2000 computer bug's moment of truth--are unlikely to come true, a little computer-generated chaos would provide a fitting conclusion to a 40-year story of human frailties: greed, shortsightedness and a tendency to rush into new technologies before thinking them through.

How did this happen? Who is responsible for the bug we call Y2K? Conventional wisdom goes something like this: back in the 1950s, when computers were the size of office cubicles and the most advanced data-storage system came on strips of punched cardboard, several scientists, including a Navy officer named Grace Murray Hopper, begat a standard programming language called COBOL (common business-oriented language). To save precious space on the 80-column punch cards, COBOL programmers used just six digits to render the day's date: two for the day, two for the month, two for the year. It was the middle of the century, and nobody cared much about what would happen at the next click of the cosmic odometer. But today the world runs on computers, and older machines run on jury-rigged versions of COBOL that may well crash or go senile when they hit a double-zero date. So the finger of blame for the approaching crisis should point at Hopper and her COBOL cohorts, right?
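
The six-digit shortcut described above can be sketched in a few lines of Python (an illustration of the arithmetic, not actual COBOL; the function name and values here are invented for the example):

```python
# With only two digits stored for the year, simple subtraction works
# inside one century but breaks the moment the odometer rolls over.

def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Age as a 1960s program would compute it: current year minus birth year,
    both stored as two-digit values."""
    return current_yy - birth_yy

# Works fine within the 20th century: born "57", now "99"...
assert age_two_digit(57, 99) == 42

# ...but at the rollover, born "57" in year "00" yields nonsense:
print(age_two_digit(57, 0))  # -57, not 43
```

The same ambiguity poisons sorting, interest calculations, and expiry checks: "00" compares as less than "99" everywhere a program assumed the missing digits were 19.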

Wrong. Nothing, especially in the world of computing, is ever that simple. "It was the fault of everybody, just everybody," says Robert Bemer, the onetime IBM whiz kid who wrote much of COBOL. "If Grace Hopper and I were at fault, it was for making the language so easy that anybody could get in on the act." And anybody did, including a group of Mormons in the late '50s who wanted to enlist the newfangled machines in their massive genealogy project--clearly the kind of work that calls for thinking outside the 20th-century box. Bemer obliged by inventing the picture clause, which allowed for a four-digit year. From this point on, more than 40 years ahead of schedule, the technology was available for every computer in the world to become Y2K compliant.

Programmers ignored Bemer's fix. And so did his bosses at IBM, who unwittingly shipped the Y2K bug in their System/360 computers, an industry standard every bit as powerful in the '60s as Windows is today. By the end of the decade, Big Blue had effectively set the two-digit date in stone. Every machine, every manual, every maintenance guy would tell you the year was 69, not 1969. "The general consensus was that this was the way you programmed," says an IBM spokesman. "We recognize the potential for lawsuits on this issue."

No one in the computer industry wanted to rock the boat. And no one could alter the course IBM had set, not even the International Standards Organization, which adopted the four-digit date standard in the 1970s. The Pentagon promised to adopt century-friendly dates around 1974, then sat on its hands. Bemer himself wrote the earliest published Y2K warnings--first in 1971, then again in 1979. Greeted by nothing but derision, he retired in 1982. "How do you think I feel about this thing?" says Bemer, now an officer at his own Y2K software firm. "I made it possible to do four digits, and they screwed it up."

Meanwhile, the torch of Y2K awareness passed to a new generation. In the fall of 1977, a young Canadian named Peter de Jager signed on as a computer operator at IBM. His first task was to boot up a nationwide banking system run on an IBM 370. When the machine whirred into life, it asked for the date. As De Jager, a mathematics major straight out of college, entered the number 77, a thought occurred to him. Did this machine care what century it was? With the impetuousness of youth, he marched off to his manager and informed him the computer would not work in the year 2000. The manager laughed and asked De Jager how old he was. This isn't going to be a problem until you're 45, he said. Don't worry, we'll sort it out.

And that, at least for the next 13 years, was the attitude De Jager adopted. "We used to joke about this at conferences," he says. "Irresponsible talk, like 'We won't be around then.'" But by 1991, De Jager, a self-described "nobody" in the industry, had decided he would be around. Four years later, he was giving more than 85 lectures a year on the topic and posting regular updates to his site, the Web's first for Y2K warnings, www.year2000.com.

And here's the curious thing. From 1995 on, Y2K awareness reached a kind of critical mass. Congress, the White House and the media all got wind of the bug at about the same time. After making too little of the problem for so long, everybody began to make, if anything, too much of it.

Why then, and not two decades earlier? Why De Jager, and not Bemer? Proximity to the millennium may have had something to do with it, as well as the increasingly ominous tone of the warnings. This was Bemer's dry 1979 prophecy of doom: "Don't drop the first two digits. The program may well fail from ambiguity." Twenty years later, here's De Jager's jeremiad: "The economy worldwide would stop...you would not have water. You would not have power..."

This alarmist language may yet be justified. By 1999 folly has compounded folly. In many cases, the original COBOL code has been rejiggered so many times that the date locations have been lost. And even when programmers find their quarry, they aren't sure which fixes will work. The amount of code that needs to be checked has grown to a staggering 1.2 trillion lines. Estimates for the cost of the fix in the U.S. alone range from $50 billion to $600 billion. As for Y2K compliance in Asian economies still struggling with recession? Forget about it.
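
One widely used repair of the era, date "windowing", kept the two-digit field but interpreted it against a pivot year instead of assuming 19xx. A minimal sketch, with the pivot value chosen here purely for illustration (real shops picked their own):

```python
# "Windowing": two-digit years below the pivot are read as 20xx,
# the rest as 19xx. Cheaper than widening every stored date field,
# but it only postpones the ambiguity by one window.

PIVOT = 30  # assumed pivot: 00-29 -> 2000s, 30-99 -> 1900s

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Map a stored two-digit year onto a four-digit year."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(99) == 1999
assert expand_year(0) == 2000
```

The alternative fix, expanding every date field to four digits, was safer but meant hunting down each of those date locations first, which is exactly the problem the paragraph above describes.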

The fact is that no one on the planet really knows what will happen when 01-01-00 rolls around. Whether we'll be glad we were panicked into action or we'll disown the doomsayers depends on how diligently the programmers do their job in the next 50 weeks. One thing is already clear. In a century in which man split the atom, spliced genes and turned silicon into data, the tale of Y2K--how we ignored it for 40 years, then flew into a tizzy--will not be remembered as our finest hour.

Taken from time.com - January 11, 1999

Winner of the International Association of Web Masters and Designer's "Golden Web Awards" for the years 2003 - 2004
 
 
 
notice: © 1997+ All rights reserved. Another webweevers.com production, a division of HidalCorp.
 
 

 

 



Background photo: "Fire @ F-Holes Cove," Blue Ridge Reservoir, AZ, September 4, 2007. © C. Czach Hidalgo 1996+ webweevers.com
