If you've been around computers long enough, you've seen this message. It might have been decades ago, but it's held up as an example of idiocy for the same reason people make fun of the McDonald's Hot Coffee lawsuit. Also, just like with the McDonald's case, the people making fun of it are really, really missing some important info.

This error message is not an accident. This is not someone being dumb. This is… an artifact of the A20 line hack and it's glorious and stupid but not for the reasons you think.

When the first IBM PC hits the market almost 35 years ago, a bunch of programmers end up relying on a quirk of its memory addressing: the address space is only 20 bits wide, so an address that points just past the top of the 1 MB range wraps around and lands back at the beginning of RAM. They lean on this wraparound on purpose because it's easy and saves some processor ticks and, well, it works. This is happening out in the field in thousands of businesses, and all the software is pretty much custom written, so there's no central 'update' depot.
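The wraparound trick can be sketched in a few lines of Python. This is a toy model (my own sketch, assuming the 8086's standard segment:offset arithmetic and its 20 address lines), not anything the original programmers would have written:

```python
# Toy model of 8086 real-mode address arithmetic.
# physical = (segment << 4) + offset, truncated to the chip's 20 address
# lines, so anything past 1 MB "bounces" back to the beginning of RAM.

def phys_8086(segment, offset):
    """Physical address as the 8086 forms it: 20 bits, wraps at 1 MB."""
    return ((segment << 4) + offset) & 0xFFFFF  # keep only 20 bits

# FFFF:0010 points 16 bytes past the 1 MB boundary...
addr = phys_8086(0xFFFF, 0x0010)
# ...and lands back at address 0 — the wraparound the text describes.
print(hex(addr))  # → 0x0
```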

When the AT comes out with the 286 and its extra address lines, the same addresses no longer wrap; they run straight past the old 1 MB boundary into memory that was never supposed to be there, and programs that depend on the wraparound crash. They don't figure this out until very late in the ship process. IBM realizes that if they ship their new flagship systems as-is, thousands and thousands of businesses will experience huge problems and they'll probably blame it on the machines. This will be a disaster.
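The breakage is just the same arithmetic on a wider bus. A small sketch (assuming the 8086's 20 address lines versus the 80286's 24; both widths are per the chips' datasheets):

```python
# Same segment:offset math, two different address-bus widths.

def phys_8086(segment, offset):
    """8086: 20 address lines, so the sum wraps at 1 MB."""
    return ((segment << 4) + offset) & 0xFFFFF

def phys_80286(segment, offset):
    """80286 real mode: 24 address lines, so the same sum does NOT wrap."""
    return ((segment << 4) + offset) & 0xFFFFFF

# The pointer that used to bounce back to address 0 on a PC...
print(hex(phys_8086(0xFFFF, 0x0010)))   # → 0x0
# ...now sails straight past the 1 MB line into brand-new territory.
print(hex(phys_80286(0xFFFF, 0x0010)))  # → 0x100000
```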

"OH SHIT", they say, "WE NEED TO FIX THIS". They also needed to turn off Caps Lock but that's a different story. The fix they settle on is a gate on the new 21st address line (the A20 line itself): hold it low, and any address that crosses the old limit wraps back to the beginning, just like before. But the gate needs something to drive it, and there's a shortage of processing. Anything that takes on this job WON'T be doing its real job, and that'll slow something down.

"We need to find a processor that can take on this job without hurting performance!" they now cry. They look all over. Disk controller? No, it's busy. They need every bit of power for I/O. Graphics? GOOD LUCK, this is a game of inches and picos with pixels sprinkled everywhere and they can't spare it there either.  Even if they use the CPU, they'll take a performance hit and MIPS is everything.

Eventually some engineer (maybe the whole CAPS LOCK thing is still fresh on his or her mind) points at the keyboard and drunkenly rasps "What about that?" It turns out there's a little processor in the keyboard that's not performance bound. Nobody is running benchmarks on keyboards, at least not that this will affect. They write a hack… IT WORKS!

Post fix, the keyboard's little processor then sits there between keypresses holding the A20 gate closed, so that 1980-era business code running wildly through the memory gets delicately flicked back to the beginning when it crosses the previous limit.
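Functionally, the whole fix boils down to masking one address bit. A toy model (my own sketch, not IBM's actual circuit), assuming the gate simply forces address line 20 to zero when closed:

```python
A20_BIT = 1 << 20  # the 21st address line, counted from line 0

def gate_a20(physical, a20_enabled):
    """Toy model of the A20 gate: with the gate closed, line 20 is forced
    low, so addresses just past 1 MB alias back to the bottom of RAM."""
    return physical if a20_enabled else physical & ~A20_BIT

print(hex(gate_a20(0x100000, a20_enabled=False)))  # → 0x0       (wraps, old behavior)
print(hex(gate_a20(0x100000, a20_enabled=True)))   # → 0x100000  (the new memory)
```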

Eventually, this memory management was moved onto the motherboard chipset and the keyboard was no longer required. Modern computers account for little hacks like this by the hundreds in their BIOS, and the little processors in keyboards are once again free to go back to waiting for you to type the next letter in your Great American Novel instead of performing a vital chunk of memory management.

And that's why it was so important to have a keyboard plugged in way back when. It wasn't a dumb error; it was a leftover chunk of computer history that stuck around for a few years after the original bug it addressed had been fixed. And the original fix for that bug was far from stupid, whatever everyone who's seen that message has suspected. It was a kinda crazy smart hack if you think about it. Maybe it's not stupid if it works.