The Y2K bug

The Y2K bug.

Were tech companies actually scared, or was it all a scam?

Mostly a scam, for some stuff a real issue though. There were tons of bullshit companies providing y2k audits to businesses.

Yes.

I worked at a dotcom startup at the time. We knew Y2K was bullshit, but the managers of the building we were based out of were terrified. I remember they had people come around and make sure that the electrical and data systems would work on Jan 1st. Those fuckers even had the lifts and air conditioning systems tested.

You tell me, I still haven't left my Y2K bunker and can't tell if it's safe to go outside yet.

Actually it was a real concern. Massive amounts of money were spent to prevent catastrophic failures. Ignoring the problem would probably have had serious consequences. Still buttmad that the HP computer I bought in 1995 was not Y2K ready.

My first IT job was doing regression testing and bugfixes for Y2K. It was a legit technical issue, but the impact was overplayed. It would cause bugs here and there, accounting issues, maybe some over/under payments and interest miscalculations. I highly doubt anyone would have died from it.

>Those fuckers even had the lifts and air conditioning systems tested.

what in the fucking hell

I'm hyped for the Y2K38 bug. There will be so much work to be done for people working in rolling stock, aviation, process control and power; those industries are very vulnerable.

If you somehow still use an 8086 or 286 PC in 1999, yes it is an issue

What are you talking about? My 286 works fine.

youtube.com/watch?v=ddzbxJasID4

The media articles were mostly hyperbole but it was a legitimate issue back in the day.

You had to test every single thing if you wanted to be sure there would be no issues. There was no guarantee that anything would work at 00:00 on 1 January 2000 without testing it first. Even in the '90s there was a lot of legacy code in use that was never designed with Y2K support in mind.

It's not, stay inside.

All I remember is that when I was a kid I changed the date on our 386 to the one in OP's pic to test it out. That was before 1998, so it was a hot topic for a while.

I understand if the lifts were modern (though it would be overkill in 1999 to have a lift that knew the date), but aircon? wtf, nobody's life depends on that.

>10 million AC systems die simultaneously
>no central heating in murrica
>die of cold or business shut down for months

I should try to get some embedded and legacy experience so I can be a rich old wizard when that time comes around.

Why would an AC system shut down from thinking the year is 1900 or similar and not 2000?

More like 0.1% of 10 million AC systems die simultaneously; 1*10^4 people need to remove power and/or a BIOS battery before they can cool off.

Error state. The central control system could go bananas and stop working.

Do you want to be one of the 100,000 businesses that need to buy a new AC system at the same time and have your business suspended for a few months? Or do you spend a significantly smaller effort by testing and fixing it 2 years before the potential meltdown?

And yet this didn't happen. Even things which used dates explicitly didn't just die suddenly, for the most part:
>In the United States, the US Naval Observatory, which runs the master clock that keeps the country's official time, gave the date on its website as 1 Jan 19100.[31]
>In the United States, 150 slot machines at race tracks in Delaware stopped working.[30]
Holy shit man, 150 machines!? No way!

10,000
and the point is that nobody's life really depends on AC, and in 1999 most AC systems would be very, very basic from a logic standpoint, likely date-agnostic. So the chance of being adversely affected was very slim.

Also many systems could be forced to an earlier date to postpone any repairs. This is before IoT.

You don't think that part of why it never became a big issue was that everyone spent the 3 years before Y2K fixing their business-critical systems?

So the correct course of action would be to ignore the issue like a retard and not test the system beforehand? Yeah, sounds like a great plan. Which is why those of you who were servicing AC systems in the 90s still do it that way now.

Surely not without an adapted DOS clock device driver that corrects at least the year reported by the RTC.

Maybe. Maybe not:
>Countries such as South Korea and Italy invested little to nothing in Y2K remediation,[61] yet had the same negligible Y2K problems as countries that spent enormous sums of money

Most computer systems really don't need to know the date to not catastrophically fail.

>>Countries such as South Korea and Italy invested little to nothing in Y2K remediation,[61] yet had the same negligible Y2K problems as countries that spent enormous sums of money
also

>The lack of Y2K-related problems in schools, many of which undertook little or no remediation effort. By 1 September 1999, only 28% of US schools had achieved compliance for mission critical systems, and a government report predicted that "Y2K failures could very well plague the computers used by schools to manage payrolls, student records, online curricula, and building safety systems".[63]
>The lack of Y2K-related problems in an estimated 1.5 million small businesses that undertook no remediation effort. On 3 January 2000 (the first weekday of the year), the Small Business Administration received an estimated 40 calls from businesses with computer problems, similar to the average. None of the problems were critical.

I have not turned it on in years, but it worked fine then... I did not look at the clock

Couldn't all of this be solved by just chucking the year in a 32-bit int and the rest of the date into another 32-bit int?

We would have over 2 billion years before this format had to be updated, at the whopping price of an extra 32 bits.
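
Roughly, the layout you're describing would look something like this (made-up struct, not any real standard, field names invented just to show the idea and the range you'd get):

#include <stdint.h>

/* Hypothetical split representation as proposed above: a full 32-bit
 * signed year plus a separate 32-bit field for the position inside
 * that year. */
struct split_date {
    int32_t  year;            /* roughly +/- 2.1 billion years of range */
    uint32_t second_of_year;  /* 0 .. 31,622,399 even in a leap year, fits easily */
};

The catch is that comparing, sorting and adding intervals gets clumsier than with a single seconds counter, which is one reason a plain counter is the usual choice.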

A bigger issue will be 2038.
We didn't have so much technology in 2000 to really cause an issue; the biggest problem was two-digit years that all had to be converted to 20xx.

>You had to test every single thing if you wanted to be sure there would be no issues. There was no guarantee that anything would work at 00:00 on 1 January 2000 without testing it first. Even in the '90s there was a lot of legacy code in use that was never designed with Y2K support in mind.

Oh yeah, that makes total sense, until you consider that even then these things didn't have date settings, so it wouldn't have made a difference at all.

>What is the past tense

They fixed it by using a 32-bit signed integer. The problem will repeat itself in 2038 when it rolls over. I know of train control systems that will have to be fixed because the date will be fucked up. The date in the train will mismatch with the signalling system and make the train apply full emergency brake.

The only way to fix it is to replace the hardware the train software is running on, and that will be extremely expensive considering all the certifications it must pass. Probably cheaper to just buy new trains, despite there being some 10 years left before their planned obsolescence.
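
A minimal sketch of what actually breaks, assuming the classic setup where time is a signed 32-bit count of seconds since 1970 (i.e. old 32-bit time_t):

#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Print what a given 32-bit second count means as a UTC date. */
static void show(const char *label, int32_t seconds_since_1970) {
    time_t t = (time_t)seconds_since_1970;  /* widened so gmtime() can handle it here */
    struct tm *utc = gmtime(&t);            /* may return NULL for negative values on some libcs */
    if (utc)
        printf("%s: %04d-%02d-%02d %02d:%02d:%02d UTC\n", label,
               utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday,
               utc->tm_hour, utc->tm_min, utc->tm_sec);
}

int main(void) {
    show("last representable second", INT32_MAX); /* 2038-01-19 03:14:07 */
    show("one tick later, wrapped", INT32_MIN);   /* 1901-12-13 20:45:52 */
    return 0;
}

A controller that suddenly thinks it is 1901 while the system next to it thinks it is 2038 is exactly the kind of mismatch that trips a failsafe.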

Our investigation found that the reactor control software at a nuclear power plant would have caused an emergency stop at the rollover. The end result was that they bought an entirely new process control system and had it installed before the rollover.

As I said, wouldn't this be solved by giving the year its own 32-bit signed int and the rest (time + day/month) its own 32-bit signed int?
That way even 32-bit computer systems could have a datetime system that works for 2 billion years.

>>>>>>>>implying they need new AC in january

The fix is using a signed 64-bit time_t integer

The Y2K shit was that all electronic devices were magically going to stop working, not a ((((((((computer))))))) database issue like 2038

BIGINT TIME BOYS

>The fix is using a signed 64-bit time_t integer
But lots of computers aren't 64-bit.

Can't you just use unsigned integers at that point?

How do you know that? An AC unit might keep track of running hours or time until the next maintenance. An unhandled exception in those systems might cause a chain reaction that could bring the whole system down. But you're correct that this was more about Y2K panic than logical reasoning.

Memory-based issues should not occur before 2038. Y2K bugs are almost always logical errors rather than hardware limitations.

Computers do not process time in hours or days or years. Time is (or should be) always calculated in seconds and then converted to human-readable formats.

But yes, switching from 32-bit to 64-bit numbers would push the rollover year so far into the future that the universe will end before it occurs.
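
For scale: a signed 64-bit seconds counter runs out after 2^63 - 1 seconds, which is about 9.2 * 10^18 s, or roughly 292 billion years after 1970, around 20 times the current age of the universe.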

>An AC unit might keep track of running hours or time untill next maintenance.
>1990s

Not needed.
You don't need a 64-bit computer to use 64-bit datatypes.
Some use unsigned 32-bit integers to push the rollover another 68 years or so into the future, to 2106.
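
Napkin math on those two options: 2^31 seconds is about 68.1 years, so a signed counter starting at 1970 runs out in January 2038; 2^32 seconds is about 136.2 years, so an unsigned one lasts until February 2106, at the price of no longer being able to represent anything before 1970.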

>m-muh time telling program needs to be speedy fast so I'll use C without bigints
>bugs? In my program? But I'm an expert! That's why I chose C after all
>my programs are flawless, I'm glad I'm using speedy 32-bit signed integers

It doesn't matter; 32-bit computers can handle 64-bit or even larger numbers. The only issue is that it slows down operations, because these numbers have to be processed in multiple parts.
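
For example, something like this still works when compiled for a 32-bit target (e.g. gcc -m32 on x86); the compiler just lowers each 64-bit operation into a pair of 32-bit instructions:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void) {
    /* A 64-bit seconds counter on a 32-bit CPU: the arithmetic below is
     * done in two 32-bit halves under the hood, slower per operation but
     * with the full 64-bit range. */
    int64_t t = 2147483647LL;  /* where a 32-bit signed counter tops out */
    t += 1;                    /* no wraparound with 64 bits */
    printf("one past the old limit: %" PRId64 "\n", t);  /* prints 2147483648 */
    return 0;
}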

Might as well rewrite the whole system while you're at it.

what

I use bigint in postgresql

idc about your c shit

They were scared, but most bugs were patched in the early 1990s so there were no issues in the end

>Deep Impact was lost when its internal clock reached exactly 2^32 one-tenth seconds since 2000 on 11 August 2013, 00:38:49 UTC.
People still think time rollovers can't cause serious issues.
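
The numbers check out, too: 2^32 tenth-seconds is 429,496,729.6 s, about 4,971 days or 13.6 years, and counted from 2000-01-01 00:00 UTC that lands exactly on 2013-08-11 00:38:49 UTC.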

My 286 saves the current year without problems

>a scam
It wasn't so much a scam as people being stupid and companies making money off of that (as they always do). It's like selling hats that shield your head from the governments' mind-control rays - the market is already there; you don't need to come up with a scam yourself.

In theory, one way to fix the Unix time bug for 32-bit systems would be to change the offset.
That will need to be handled on a case-by-case basis.

But I wonder if there is a more system-wide thing you could do to manage it instead.
You COULD virtualize 32-bit Unix time to 64-bit. But it would be a pain in 50 asses, and some programs will still have a shitfit anyway because they deal with 32-bit values only, leading to memory issues.

Not sure what option would be cheaper overall.
Depends how many programs have hard limits on the time data type.
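
The offset idea in its crudest form looks something like this (purely illustrative: the epoch constant is real, the function names are made up, and every consumer of the value has to agree on the new epoch, which is why it only works case by case):

#include <stdint.h>
#include <time.h>

/* Hypothetical rebased clock: count seconds from 2000-01-01 00:00:00 UTC
 * instead of 1970. A signed 32-bit counter then rolls over around 2068
 * instead of 2038, buying roughly 30 extra years without changing the
 * storage size. */
#define REBASED_EPOCH 946684800  /* Unix time of 2000-01-01 00:00:00 UTC */

static int32_t to_rebased(time_t unix_seconds) {
    return (int32_t)(unix_seconds - REBASED_EPOCH);
}

static time_t from_rebased(int32_t rebased_seconds) {
    return (time_t)rebased_seconds + REBASED_EPOCH;
}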

On a per-company basis, the software option's probably cheaper in almost every embedded case, because otherwise they'd have to redo entire hardware systems to work with whatever botnet chip they're getting from (current year).

>Y2K was nearly 17 years ago

Although there was a lot of fear mongering about the millennium bug, the general feeling I remember of that time was mostly excitement and optimism for the future

youtube.com/watch?v=VT_GG7q3vhs

Where did it all go wrong?

People were kinda scared. But then 2000 hit and nothing happened, everyone was like KEK

Damn, you are not wrong, having this kind of experience will surely be valuable.

It went wrong for a while. This year we shifted back into optimism mode

Yeah, I remember back then IT people were viewed as gods and it was a scam. Today, saying you work in IT = jobless.

Yes. AC systems didn't exist in 1900, so how would they work if they thought that was the date?