this post was submitted on 06 Nov 2023
1164 points (98.3% liked)

[–] [email protected] 10 points 1 year ago (1 children)

Early computers had very limited resources (RAM, storage, etc.) - the first computer I worked with only had 4K of RAM, for example. It often made sense to store only the last two digits of the year as an optimization in many common tasks computers were used for, since both the 1800s and the 2000s were far enough away that most basic date calculations worked fine. Also, the industry was changing rapidly, and few people expected their software to be used for more than a few years - certainly not for decades - so the focus was usually on solving the immediate task as efficiently as possible, without much consideration for the distant future.
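To make that concrete, here's a minimal illustrative sketch (not from any real system) of how two-digit year arithmetic goes wrong at the rollover, assuming the common convention of treating a stored "YY" as 19YY:

```python
def years_elapsed(start_yy: int, current_yy: int) -> int:
    """Naive duration calculation using only the last two digits of each year."""
    return current_yy - start_yy

# Works fine while both years are in the 1900s:
print(years_elapsed(50, 99))   # 1950 -> 1999: 49, correct

# Breaks the moment the date rolls over into 2000 (stored as 00):
print(years_elapsed(50, 0))    # 1950 -> 2000: -50, nonsense
```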

However, it turned out that a lot of the code written in that period (the 70s and 80s) became "legacy code" that companies kept relying on for far longer than anyone expected, to the point that retired COBOL programmers were being hired for big $$ in the late 90s to come back and fix Y2K issues in code written decades earlier. Many large systems had some critical ancient mainframe code somewhere along their dependency chains. On top of that, even the code that was meant to handle Y2K wasn't always tested well, and all kinds of unexpected dependencies cropped up, where a small bug here or some forgotten non-compliant library there could wreak havoc once the date rolled over into the 2000s.

A lot of the Y2K work was testing all the systems and finding all the places such bugs were hiding.

[–] [email protected] 3 points 1 year ago

That's interesting, thank you!