But hey, feeding everything from a single microcontroller is $2 cheaper...
Anyway. A big part of that class was learning to clean, repair, and manage printers. Again, it was the '90s, and we were high school kids. We came out with a lot of solid computer skills, but the printer stuff really stuck with me. I've done technical support over the years and have set up hundreds of printers.
The printers of today are awful landfill fodder compared to the Okidatas of the '90s. Pure simplicity and speed versus FULL COMPUTERS, with scanning, faxing, and every other imaginable feature crammed in, with zero hope of the user doing anything other than replacing the toner.
The first LaserWriter, in 1985, had more processing power than the Macintosh it was sold to accompany.
Printers have been full computers for a long time now. As we expect them to do more and more, the computers in them get more and more complex.
Who does? Who asked for firmware updates that block third-party ink, 1GB "drivers", or a full-color "test print" every time you switch the thing on, ...?
Printing reliably doesn't sound too demanding; manufacturers reached that point long ago, and I haven't seen much groundbreaking innovation since. Sure, things like WiFi were added, but that doesn't require cutting-edge technology - consumer devices handled it 20 years ago, and more reliably than the printers I've used. I also haven't heard of anyone being excited about NFC in printers, and from experience I can say it's nowhere near intuitive or frictionless enough to justify the integration.
Maybe. But how expensive were they?
I can buy a good laser printer for under $200 these days. It will be more compact, lighter, and mechanically simpler, and it will use way less power than older printers. Something has to give.
Some older printers were genuinely overengineered (which in many cases did make them more reliable), but that came at a cost. Turns out, consumers didn't want to pay it.