Computing as a whole has a human factors problem. The people most able to fix this problem - programmers - are also the people least likely to recognize that there is a problem to fix. Indeed, programmers are invested, both literally and subconsciously, in the current order. Programmers implicitly place themselves at the top of a computing class hierarchy that strictly sorts "developers" and "users" by whether they are paid or pay, and developments that would let "normies" program to a useful level are staunchly opposed. The current front line of this debate is LLM programming, which is a goddamned revolution as far as non-programmers are concerned; programmers hate that non-programmers might now be able to write programs, and they tell scary stories about security vulnerabilities. But it's not new - before they were sneering at LLMs, they were sneering at LabVIEW, and telling scary stories about maintainability. Or Excel.
The smartphone revolution was really a user interface revolution, but in making computers accessible it also infantilized their users. This philosophy is now baked in to such an extent that not only are you not allowed root on most computers, but writing an "app" has a vastly higher barrier to entry than writing a shell script, or even a basic Hello World in C. Programming is becoming harder, not easier. Compare this to the 80s, when a child or teenager's "game console" might have been a ZX Spectrum or a Commodore 64, which booted straight into BASIC - a language which, incidentally, remains a much better introduction to programming than Python.
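For the sake of concreteness, here is roughly the entire barrier to entry on the C side - a sketch assuming nothing more than a stock compiler like gcc or clang is already installed:

```c
/* hello.c - the whole program.
 * Compile and run with:
 *   cc hello.c -o hello && ./hello
 */
#include <stdio.h>

int main(void)
{
    printf("Hello, world!\n");
    return 0;
}
```

Nothing about writing and shipping the equivalent "app" on a modern phone is anywhere near this short, which is the point.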