OO is important, and if in three years of computer science there's no time to teach it, then you have to ask what the heck they are doing.
OO is everywhere and it’s an important software concept and is the right solution for certain classes of problem.
Inheritance, polymorphism and encapsulation inform every major professional programming language and framework (except C). Even Go, which in many ways is an active response to OO, uses most of those concepts extensively. One major challenge for people programming in Go is how to adapt the familiar patterns to it.
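To illustrate the point about Go, here is a minimal sketch (the `Speaker`, `Dog`, and `Robot` names are invented for the example) of how Go expresses polymorphism and encapsulation without class inheritance: interfaces are satisfied implicitly, and unexported fields hide state.

```go
package main

import "fmt"

// Speaker is satisfied implicitly by any type with a Speak method —
// Go's stand-in for subtype polymorphism, with no "implements" declaration.
type Speaker interface {
	Speak() string
}

// Dog hides its state behind an unexported field: encapsulation.
type Dog struct{ name string }

// Robot is an unrelated type that happens to satisfy the same interface.
type Robot struct{ id int }

func (d Dog) Speak() string   { return d.name + " says woof" }
func (r Robot) Speak() string { return fmt.Sprintf("unit %d beeps", r.id) }

func main() {
	// Both types are usable through Speaker without any inheritance hierarchy.
	for _, s := range []Speaker{Dog{name: "Rex"}, Robot{id: 7}} {
		fmt.Println(s.Speak())
	}
}
```

The familiar OO ideas are all present; what's missing is implementation inheritance, which is exactly the adaptation Go programmers have to make.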
Now, I'd probably not teach the whole GoF catalog as an exercise in memorizing patterns, but teaching a few of the most common ones while conveying the concept of patterns itself (probably the least well understood concept in development) seems sensible.
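As a concrete example of the kind of common GoF pattern worth teaching, here is a sketch of Strategy in Go (the `SortStrategy`, `ByLength`, and `Report` names are invented for illustration): the consumer depends only on an interface, so the algorithm can be swapped without touching the consumer.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// SortStrategy is the interchangeable algorithm — the Strategy pattern.
type SortStrategy interface {
	Sort(words []string)
}

// ByLength orders words shortest-first.
type ByLength struct{}

func (ByLength) Sort(w []string) {
	sort.Slice(w, func(i, j int) bool { return len(w[i]) < len(w[j]) })
}

// Alphabetical orders words lexicographically.
type Alphabetical struct{}

func (Alphabetical) Sort(w []string) { sort.Strings(w) }

// Report depends only on the SortStrategy interface, not on any
// concrete ordering — the strategy is chosen by the caller.
type Report struct {
	strategy SortStrategy
}

func (r Report) Render(words []string) string {
	r.strategy.Sort(words)
	return strings.Join(words, ", ")
}

func main() {
	fmt.Println(Report{strategy: ByLength{}}.Render([]string{"banana", "fig", "apple"}))
	fmt.Println(Report{strategy: Alphabetical{}}.Render([]string{"banana", "fig", "apple"}))
}
```

The lesson is less this specific pattern than the general idea: a named, reusable shape for a recurring design problem.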
I understand that not every developer is a big fan of OO but that doesn't mean we can ignore it.
OO is out of fashion just like blockchains and NFTs are out of fashion, and in the same way AI will fall out of fashion in the future. The huge hype around it will die, and what's left covers the few genuinely useful scenarios.
OO isn't debunked and it's not out of fashion.
All that's happened is that it's no longer gospel that "the only way to program is OO".
I write code that has all sorts of styles and approaches that fit the task at hand and sometimes the right tool for the job is OO.
I get the sense you're saying that OO has been proven to be hokum, that no one should learn it or do it anymore, and that all that remains of OO is the smoking ruin of 20 years of Java. That's not correct at all.