To use my father as an example: he sat at work with computers on his desk for 20 years, running everything from Windows 3.11 to Windows XP. He never had trouble using or understanding the computers in front of him, because a pattern learned in one place generally applied everywhere else on the system. Buttons had a uniform look. Radio buttons worked the same way. Menus were in predictable places and had similar structures. Toolbar icons, if present, had a uniform appearance. Folders and files were presented uniformly. Tooltips appeared over many things when you hovered the mouse pointer over them. Embossed, solid, thick, and dotted lines each had a meaning designed to guide the user. Disabled controls looked visibly non-interactive. Even the "Open", "Print", and "Save" dialogs were globally consistent.
In that regard, older versions of Windows were beautiful in their simplicity: they were uniform and predictable. The same is true of classic Mac OS. This is because both Microsoft and Apple spent a lot of time and energy working with real-world users to figure out what made sense and what needed work. As a result, both ended up with clear user interfaces built on learnable visual cues.
Then the flat design trend took hold, and user interfaces have never been less clear. UI elements are now overwhelmingly flat or inconsistent in appearance. It isn't obvious at a glance which objects are interactive and which aren't, or what the side effects will be when you click on a given thing. Electron made this worse, effectively leaving developers (who usually have an insufficient understanding of user experience or design) to roll their own user interface toolkits and build applications that look nothing like anything else on the system. So you don't just have to contend with operating system controls that aren't as friendly as they used to be; every application now plays by its own rules, with its own design and its own learning curve.
Now we're in a situation where very little of what you learn in one application applies in another, which is the epitome of user hostility. I spend a lot of time remotely diagnosing wildly inconsistent applications over the phone, trying to help my father make sense of the mess that is modern desktop computing. He rightly finds it confusing and overwhelming. Even in my own experience, modern macOS isn't much better.
At this point, Microsoft's "Official Guidelines for User Interface Developers and Designers" (2001) and Apple's "Human Interface Guidelines" (1995) should be mandatory reading for anyone who thinks they know better.
There's real psychology and methodology to building a good user experience. Every single detail matters.