In this thread: mostly people that don't know how timekeeping works on computers.
This is already something we're solving for: the move to 64-bit timestamps is, at this point, like 90% or better done.
See: https://en.m.wikipedia.org/wiki/Year_2038_problem
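For contrast, 2038 is a genuine storage-level limit, not a display one. A rough C sketch of where the old signed 32-bit counter tops out (assumes a POSIX-ish system with gmtime_r; the exact constant is just INT32_MAX):

```c
/* The 2038 problem in one number: a signed 32-bit seconds counter
 * maxes out at 2^31 - 1, which corresponds to 2038-01-19T03:14:07Z. */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    time_t last_second = (time_t)INT32_MAX;  /* 2147483647 seconds since epoch */
    struct tm out;
    char buf[64];

    gmtime_r(&last_second, &out);                         /* seconds -> calendar date (UTC) */
    strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", &out);
    printf("32-bit time_t overflows after: %s\n", buf);   /* 2038-01-19T03:14:07Z */
    return 0;
}
```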
Timekeeping is commonly stored as a binary number representing how many seconds have passed since midnight UTC on January 1st, 1970 (the epoch, 1970-01-01T00:00:00Z). Unlike 2038, where a signed 32-bit counter actually hits its power-of-two limit, the year 10,000 doesn't land on any boundary of that counter at all. So any discrepancy between "year as a 4-digit number" and "year as a 5-digit number" is entirely a display issue (front end). The thing that does the actual processing, storing, and evaluation of time gives absolutely no fucks about what "year" it is, because the current datetime is just a binary count of seconds since the epoch.
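To make that concrete, here's a rough C sketch (assumes a 64-bit time_t and the nonstandard timegm(), which glibc and the BSDs provide): the year 10,000 is just another value of the seconds counter, nowhere near the 64-bit limit.

```c
/* Minimal sketch: 10000-01-01T00:00:00Z is an ordinary 64-bit timestamp,
 * and the standard conversion/formatting routines handle it fine. */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct tm y10k = {0};
    y10k.tm_year = 10000 - 1900;  /* tm_year counts years since 1900 */
    y10k.tm_mday = 1;             /* January 1st, year 10000, UTC */

    time_t t = timegm(&y10k);     /* broken-down UTC time -> seconds since epoch */
    printf("seconds since epoch: %lld\n", (long long)t);

    struct tm out;
    char buf[64];
    gmtime_r(&t, &out);                           /* and back to a calendar date */
    strftime(buf, sizeof buf, "%Y-%m-%d", &out);  /* %Y happily prints 5 digits */
    printf("formatted: %s\n", buf);               /* code that assumed 4 digits is the bug */
    return 0;
}
```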
Whether that gets displayed to you correctly doesn't matter in the slightest. The machine will keep functioning even if you see some weird shit, like the year showing up as "99100" because some lazy person hard-coded "99" as the first two digits and appended (current year - 9900) after it: the year 9999 displays fine as "99" + "99", but the year 10000 comes out as "99" + "100".
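Here's what that hypothetical lazy formatter would look like, mirroring the classic "19100" Y2K display bug:

```c
/* Hypothetical "lazy" year formatter: hard-code the "99" prefix and
 * append (year - 9900). Works for 9900-9999, garbles everything after. */
#include <stdio.h>

static void print_year_badly(int year) {
    printf("99%d\n", year - 9900);
}

int main(void) {
    print_year_badly(9999);   /* prints "9999"  -- looks correct */
    print_year_badly(10000);  /* prints "99100" -- purely a display bug */
    return 0;
}
```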
I get that it's a joke, but the joke isn't based on any real understanding of how timekeeping works on computers.
The whole Y2K thing was a bunch of fear mongering horse shit, too. For most systems, the year would just have shown as "19100" (from sticking "19" in front of tm_year, which counts years since 1900), as "1900", or simply as "00" (or some variant thereof).
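Same pattern as above, sketched out: struct tm stores the year as years-since-1900, so naive formatting that pastes "19" in front of tm_year prints "19100" once the year hits 2000, while the underlying seconds counter keeps ticking along unaffected.

```c
/* The classic Y2K display bug: tm_year is years since 1900,
 * so "19" + tm_year becomes "19100" in the year 2000. */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct tm y2k = {0};
    y2k.tm_year = 2000 - 1900;   /* tm_year = 100 in the year 2000 */
    y2k.tm_mday = 1;

    printf("naive display:   19%d\n", y2k.tm_year);       /* "19100" */
    printf("correct display: %d\n", 1900 + y2k.tm_year);  /* "2000"  */
    return 0;
}
```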