"Has been a lifesaver so many times!"
- Catherine Rampell, student @ University of Washington
"Exactly the help I needed."
- Jennifer Hawes, student @ San Jose State
"The best place for brainstorming ideas."
- Michael Majchrowicz, student @ University of Kentucky
What Is Y2K, and What Effect Did It Have on Modern Culture?
Two seemingly small digits may turn January 1, 2000 from a worldwide celebration into a universal nightmare. The bug affects companies everywhere, many of which are paying millions upon millions of dollars so that their computers can recognize the difference between the years 2000 and 1900. One of the world’s most detrimental dilemmas, the year 2000 computer bug is an extensive and interesting problem that everyone must face. The definition and implications of the Year 2000 crisis are as unique as the steps modern human culture is taking to prevent them.
Like everything else it handles, a computer processes dates as numbers. To the outside world, date values can have numerous formats and meanings, but to the internal workings of a computer, a date is just another set of numbers (Kendall 63). When the year part of a date is expressed using two digits, as in "4/5/96", the possible values for that year part range from 00 to 99. Obviously, if the year part were expressed using four digits, the values could range from 0000 to 9999 (“The History and the Hype”).
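The storage convention described above can be sketched in a few lines. This is an illustrative model, not actual legacy code; the function name `to_two_digit` is hypothetical, and real systems of the era typically did this in COBOL record layouts rather than Python.

```python
# A minimal sketch of two-digit year storage, as many legacy systems used.
# Only the last two digits of the year are kept, so the century is lost:
# 1996 and 2096 become indistinguishable once stored.
def to_two_digit(year):
    """Store a year the way a two-digit date field does."""
    return year % 100

print(to_two_digit(1996))  # 96
print(to_two_digit(2096))  # 96 -- the century information is gone
```

The point of the sketch is that the loss is silent: nothing in the stored value records which century it came from.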
In real life, if one adds 1 to 99, the answer is 100. However, if one tells a computer to add 1 to 99 and also specifies that the result must be no more than two digits, the answer is 0, or 00. Now consider the effect of this numeric example on a date that expresses year values with two digits. If one takes the date "4/5/99" and tells the computer to add 1 to the year part, the result looks like "4/5/00" (Kendall 68). To most humans, this date suggests that the year part represents the year 2000. But to a computer (and this is basically the problem), because the numeric representation of the year is effectively zero, the year is interpreted as 1900. According to the logic of the computer, adding 1 year to 1999 results in the year 1900 (Johnson Interview).
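The rollover described in the paragraph above can be shown directly. This is a hedged sketch of the arithmetic, not any particular system's code; the function names are invented for illustration.

```python
# The "add one year" bug: arithmetic confined to two digits wraps 99 to 00.
def add_year_two_digit(yy):
    """Add one year to a two-digit year field."""
    return (yy + 1) % 100

# A program written with a hard-coded 1900 assumption then reads "00"
# as the year 1900 rather than 2000.
def interpret_1900s(yy):
    return 1900 + yy

next_yy = add_year_two_digit(99)
print(next_yy)                   # 0 -- i.e., "00"
print(interpret_1900s(next_yy))  # 1900, not 2000
```

The wraparound itself is harmless; the damage comes from the second step, where the missing century is filled in with a fixed assumption of 1900.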
The whole question is one of interpretation. Humans can usually distinguish the intended value of a two-digit year from the context of the date within its subject matter. For instance, if one writes "I will graduate 6/1/01", most people will automatically assume that the year in question is 2001. If the same thing were said to a computer, the chances are that it would interpret the year as 1901 (Marcus 34). Basically, the Year 2000 problem is defined as the “inability of computer programs to correctly interpret the century” from a date that has only two year digits (Johnson Interview).
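A machine can be given a crude version of the human's contextual judgment. One standard remediation technique (not detailed in this essay, but widely used in Y2K fixes) is "windowing": a pivot value decides which century a two-digit year belongs to. The pivot of 50 below is purely illustrative; real systems chose pivots suited to their data.

```python
# Windowing: map a two-digit year to the century a human would assume.
# The pivot is an assumption -- years below it are read as 20xx,
# years at or above it as 19xx.
PIVOT = 50

def interpret_windowed(yy):
    """Interpret a two-digit year using a fixed pivot window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(interpret_windowed(1))   # 2001 -- "6/1/01" read the human way
print(interpret_windowed(96))  # 1996
```

Windowing only postpones the ambiguity (a pivot of 50 breaks again for dates after 2049), which is why full four-digit expansion was the preferred, though far costlier, fix.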
On the face of it, the specification of the problem appears fairly simple, and many may assume the solution is equally simple. After all, how much of a problem can two digits cause? As the reader will discover, those two digits will be the reason for the largest and most costly task ever undertaken by any industry, worldwide.
Almost invariably, the first question asked when somebody learns about the Year 2000 problem is, "How was this allowed to happen?" (Johnson Interview). To most people, the thought that so much damage could be done, by so many people, over such a long period of time, and completely undetected, is absolutely beyond belief (“Y2K: So Many Bugs... So Little Time”).
The fact of the matter is that the Year 2000 issue has been there all along. Programmers have been aware of the problem for years. Unfortunately for us, because of the "I won't be around in 15 years, so it doesn't concern me" attitude that programmers displayed, the problem went unchecked (Johnson Interview). It is only because the likely implications of the Y2K crisis are almost on top of us, and because companies now stand to lose large amounts of money, that the issue is finally receiving the attention it deserves (“Y2K: So Many Bugs... So Little Time”).
When examining the underlying reasons for the problem, two culprits arise. The first, and certainly the most instrumental, is the cost of storage space in the 1960's and 70's (Blair Interview). During this era, the cost of storing data was far from insignificant. In an effort to minimize storage costs, most projects made a drive to