@glupson, Y2K mainly affected legacy systems. That is, systems written on OSes from the '60s, implemented in the '70s, elaborated in the '80s, and fudged along through the '90s.
It was all because memory was so scarce in 1960. Back then a mainframe ran on core memory: tiny magnetic rings, or cores, each magnetized in one direction or the other to store a single bit. A mainframe computer the size of a truck might have 32K of core. Every bit was precious, literally.
It takes 4 bits to encode a decimal digit, so the software stored years in 8 bits, not 16; that is, '66' uses 8 bits, '1966' uses 16. Anyhow, 40 years is a long time in computing, we'll all be retired by 2000, and "Use computers for air traffic control? Huh? What are you smokin'?"
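To make that concrete, here's a minimal sketch of packed BCD storage, in modern C rather than the COBOL/assembler of the day; the function names are mine, not from any real system. Two digits per byte means a two-digit year fits in 8 bits, and a four-digit year would double the storage for every date on file:

```c
#include <stdio.h>

/* Pack a two-digit year (0-99) into one byte of BCD:
   tens digit in the high nibble, ones digit in the low nibble.
   '66' becomes 0x66 -- exactly 8 bits. '1966' would need a
   second byte, doubling the cost of every stored date. */
unsigned char pack_year(int yy) {
    return (unsigned char)(((yy / 10) << 4) | (yy % 10));
}

int unpack_year(unsigned char bcd) {
    return ((bcd >> 4) * 10) + (bcd & 0x0F);
}

int main(void) {
    unsigned char y = pack_year(66);
    printf("stored: 0x%02X, read back: %d\n", y, unpack_year(y));
    /* prints: stored: 0x66, read back: 66 */
    return 0;
}
```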
Net result: deep inside almost every big application was a date routine that could only handle two digits, and that failed (an error, a crash, or garbage arithmetic) when the year rolled over to one beginning with zero. The prospect was a return to no computing whatsoever for society's longest-standing (and hence most important) applications, with no paper backup. Overnight.
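A sketch of the failure mode, again with hypothetical names: with only two digits stored, "2000" and "1900" both come out as 00, so any arithmetic across the century boundary goes wrong.

```c
#include <stdio.h>

/* With two-digit years, subtraction works fine for decades --
   right up until the current year wraps to 00. */
int age(int current_yy, int birth_yy) {
    return current_yy - birth_yy;   /* fine in '99, nonsense in '00 */
}

int main(void) {
    printf("age in '99: %d\n", age(99, 66)); /* 33 -- correct */
    printf("age in '00: %d\n", age(0, 66));  /* -66 -- the Y2K bug */
    return 0;
}
```

A negative age like that might merely print wrong, or it might index off the end of a table or fail a sanity check and abort the whole run, which is where the crashes came from.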
Easy to fix once you took the trouble to understand the issue. Thank goodness most of those responsible did.
Sorry to digress.