MrFelony wrote: I bet Walmart came up with the hysteria. They must have made a SHOOT load of money off of it.
Can't blame them for exploiting stupid people.
Bird Flu
- Devil_Dante
- Crusher of Dreams
- Posts: 1629
- Joined: Fri Apr 22, 2005 3:47 pm
- Location: In the middle of nowhere
- martyr3810
- Mastered PM
- Posts: 188
- Joined: Sun Jun 19, 2005 10:33 am
- Location: An Unmarked Grave
:shrug: I'm still of the opinion that Y2K and SARS were manufactured panics... I don't see this Bird Flu being much different - currently, at least. Were any of the events Killfile describes to actually occur, I suppose there would be legitimate cause for panic, but mostly I think it's just drug companies and the media capitalizing on it.
I am without a home and in that I find my place in it all.


- Killfile
- Flexing spam muscles
- Posts: 587
- Joined: Thu Jun 23, 2005 8:54 pm
- Location: St. Petersburg - 1917
- Contact:
Psi will back me up on this. We program computers for a living (what, you think anyone can earn a living on a history degree?).
The Y2k issue was (and indeed remains, but we'll get to that in a sec) a very real issue in computer science. You're all familiar with the background, but there's a little more to it. With modern programming languages, you can store the date as an integer and rely upon the fundamental reality that when you add 1 to 999 you're going to get 1000.
But that kind of reliability costs memory, and back in the early 1970s, when the IT infrastructure of the country was just coming into being - well - 640k wasn't just enough for anybody - it was a decadent amount of memory. To save space, programmers would store things in the minimum possible number of bits and would even (loosely) define the operations possible on those bits so as to avoid spending memory on things they didn't need to do.
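To make that concrete, here's a rough sketch in C of the kind of bit-squeezing I mean - the struct name and field widths are made up for illustration, not any particular vintage system's layout:

```c
#include <stdio.h>

/* Hypothetical packed date record, in the spirit of 1970s space-saving
 * layouts: year, month, and day squeezed into a handful of bits instead
 * of three full-sized integers. The year field is two digits by convention. */
struct packed_date {
    unsigned int year  : 7;  /* 0-99 used in practice (field can hold 0-127) */
    unsigned int month : 4;  /* 1-12 */
    unsigned int day   : 5;  /* 1-31 */
};

int main(void) {
    struct packed_date d = { 79, 12, 31 };          /* 31 Dec (19)79 */
    printf("record size: %zu bytes\n", sizeof d);   /* far smaller than three ints */
    printf("stored date: %02u/%02u/%02u\n", d.day, d.month, d.year);
    return 0;
}
```

The whole record fits in a couple of machine words, but the year field can never carry more than two digits' worth of information - the expiry date is baked into the representation itself.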
Thus the familiar Y2k problem was born. Store the year as 79 instead of 1979 and suddenly you've got issues with what happens when you add 1 to 99. It should go to 100, but the two-digit field overflows and you just get 00.
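A minimal sketch of the wrap itself, assuming the year is kept as two decimal digits (which behaves like arithmetic modulo 100):

```c
#include <stdio.h>

int main(void) {
    /* Two-digit year field: only the last two digits ever survive,
     * which is effectively arithmetic modulo 100. */
    unsigned int year = 99;                 /* i.e. 1999 */
    unsigned int next = (year + 1) % 100;   /* the "add 1 to 99" step */

    printf("99 + 1 -> %02u\n", next);       /* prints 00, not 100 */

    /* Anything that compares years numerically now believes that
     * "00" (2000) comes before "99" (1999). */
    printf("is next year later? %s\n", next > year ? "yes" : "no");
    return 0;
}
```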
Now we're not talking "rise of the machines," but there are very real consequences to this kind of error. First off, processes that are supposed to run every X number of seconds/minutes/hours/days rely upon the date and time to tell them when to run. Unexpected data can have unexpected results - and we know the data to be unexpected, because if the programmers HAD expected it they wouldn't have stored the date as 79 in the first place.
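Here's a hypothetical example of how that bites a scheduler - a made-up should_run check that computes "years elapsed" from two-digit years and silently stops firing once the century rolls over:

```c
#include <stdio.h>

/* Hypothetical yearly job: run when at least one year has elapsed
 * since the last run, with both years stored as two digits. */
static int should_run(unsigned int last_run_yy, unsigned int now_yy) {
    int elapsed = (int)now_yy - (int)last_run_yy;  /* naive difference */
    return elapsed >= 1;
}

int main(void) {
    printf("1998 -> 1999: run? %d\n", should_run(98, 99));  /* 1: runs            */
    printf("1999 -> 2000: run? %d\n", should_run(99, 0));   /* 0: elapsed is -99  */
    return 0;
}
```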
More problems arise when we start talking about the internet, though. Almost every network service relies upon synchronized time to work out how to run. Indeed, synchronized time is so important that NTP (the Network Time Protocol) takes up much of any high-level networking course. With divergent times, encryption stops working, packet handling dies, and distributed applications - web clusters, database farms, search engines, and so on - all go down the tubes.
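As a toy illustration (the real protocols are far more involved), here's the kind of freshness check a lot of services apply to timestamps; accept_message and the five-minute window are just assumptions for the sketch:

```c
#include <stdio.h>
#include <stdlib.h>

/* Toy freshness check: accept a message only if the sender's clock and
 * ours agree to within MAX_SKEW seconds - a pattern similar in spirit to
 * what authentication tickets and signed packets rely on. */
#define MAX_SKEW 300L  /* five minutes, an arbitrary window for this sketch */

static int accept_message(long sender_time, long local_time) {
    return labs(sender_time - local_time) <= MAX_SKEW;
}

int main(void) {
    long local = 1000000L;  /* pretend "now" on our machine, in seconds */

    printf("well-synchronized sender: %d\n", accept_message(local + 10, local));
    printf("sender a day out of sync: %d\n", accept_message(local + 86400, local));

    /* A machine whose date just wrapped is out by decades, so every
     * message it produces or receives fails this kind of check. */
    return 0;
}
```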
Could some of these errors have triggered nuclear meltdowns, plane crashes, and the like? Doubtful - most systems capable of such catastrophic failure are required to have a human monitor at all times.
But financial collapse was a real possibility. With that collapse comes a different kind of human misery - a run on the banks and an economic depression the likes of which the world hasn't seen since the 1930s.
Y2k was real - but most of the media outlets had no idea HOW it would happen.
Of course, it can still happen. Unix (which runs on almost everything that runs the internet) stores the time as seconds since midnight UTC on January 1, 1970 - the Unix epoch. That second count is going to exceed the maximum value of a signed 32-bit integer in January 2038. Before that happens, we need to fix Y2k+38. But as they said 30 years ago in 1975 - we've got a while, and there's no way today's code will still be in use then.
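Here's a sketch of that rollover, using a plain 32-bit signed integer to stand in for the old 32-bit time_t (modern 64-bit time_t doesn't have this problem):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Seconds since 1970-01-01 00:00:00 UTC, as a 32-bit signed time_t
     * stores them. INT32_MAX seconds lands at 2038-01-19 03:14:07 UTC. */
    int32_t t = INT32_MAX;
    printf("last representable second: %ld\n", (long)t);

    /* One second later the count wraps around to INT32_MIN, which naive
     * code reads back as a date in December 1901. (The wrap goes through
     * an unsigned cast here to keep the example free of undefined
     * behaviour.) */
    int32_t wrapped = (int32_t)((uint32_t)t + 1u);
    printf("one second later:          %ld\n", (long)wrapped);
    return 0;
}
```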
Killfile wrote: (what, you think anyone can earn a living on a history degree?)
Heh, that made me think of Jamie Cullum's great song Twentysomething, where he sings about how he's an expert on Shakespeare but the world doesn't need an academic.
Sorry for being off-topic, but I really recognize myself in that song; hopefully I won't end up like him.
The ink of a scholar is worth a thousand times more than the blood of the martyr- The Quran