While "teachable moments" in most contexts aren't fatal, the overall point that it's better to learn from other people's mistakes remains the same.
From my experience on this planet, purely theoretical teachings tend to be forgotten in favor of things you've actually used and done.
I've had plenty of times where I lacked both the theory and the scraped elbows (creating shared state clusterfucks that seemed clever at the time), and times when I was missing the theory alone (mechanically reaching for the OO hammer because there was a nail-shaped screw in front of me).
In all the programming instances I can think of where I had the theory but not the experience, I knew there was a lot I didn't know, so I read about the implementation issues and went and bothered smarter people about it. Of course I wrote (and still write) plenty of shitty code in various contexts. I guess we can say that's the experience happening, but that's a bit unsatisfying as a model.
With theoretical learning you get measurably better: if you've literally never even heard of time complexity, or read the latency numbers everyone should know, you're more likely to write code with garbage performance. Once you know about it you're unlikely to write grossly nested loops making unnecessary un-batched requests to servers 12 timezones away, or if you do you'll at least feel gross about it, and it will be a (bad) choice, not a mistake.
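To make the anti-pattern concrete, here's a minimal sketch. The `fetch_price`/`fetch_prices_bulk` functions are entirely hypothetical stand-ins for some remote API; the point is the shape of the call pattern, not any particular library.

```python
def fetch_prices_naive(users, items, fetch_price):
    # The "grossly nested loops" version: one round trip to a faraway
    # server per (user, item) pair, i.e. len(users) * len(items) requests.
    return {
        (u, i): fetch_price(u, i)
        for u in users
        for i in items
    }

def fetch_prices_batched(users, items, fetch_prices_bulk):
    # One round trip: collect all the pairs locally, send them in a
    # single batched request, and zip the results back together.
    pairs = [(u, i) for u in users for i in items]
    return dict(zip(pairs, fetch_prices_bulk(pairs)))
```

Both return the same mapping; the difference is that the first one multiplies your per-request latency by the size of the loop, which is exactly the thing you notice once you've internalized those latency numbers.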
Experience with a particular technology helps too. Knowing the Widget API inside out, which Widget calls are lazy vs eager, and which get cached or whatever will improve your code and get it written faster, but the gains are much more marginal than those from theoretical knowledge. With enough practice, learning new APIs gets easier. Remembering arbitrary incantations is something everyone eventually gets fairly good at; I think most of HN would do very well at Hogwarts. Assuming you already know how to program, I bet just reading something like Clean Code is likely to improve your JS quality more than an equivalent time spent churning out new JS (even though the example language in CC isn't JS, so any benefit would be ported through theory).
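The "Widget API" is made up, but the lazy-vs-eager-vs-cached distinction it gestures at looks something like this sketch (all names here are invented for illustration):

```python
class Widget:
    def __init__(self, spec):
        self._spec = spec
        self._rendered = None  # cache slot for the lazy path

    def render(self):
        # Eager: does the (pretend-expensive) work every time it's called.
        return f"<widget {self._spec}>"

    @property
    def rendered(self):
        # Lazy + cached: the work happens once, on first access,
        # and the stored result is returned on every later access.
        if self._rendered is None:
            self._rendered = self.render()
        return self._rendered
```

Knowing which of these two paths a real API takes is exactly the kind of incantation-level knowledge that speeds you up on that one API but doesn't transfer anywhere else.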
It's very cheap to implement theoretical concepts in practice, and I think a lot of the dismissiveness towards "theoretical" things is defensive insecurity and/or laziness, probably invoked at about the same rate that "premature optimization" gets trotted out for the same purpose. Software has a very narrow gulf between the practical and the theoretical.
In domains where experience and theory are farther apart, I propose that it's not so much that experience beats theoretical knowledge; rather, the theoretical knowledge is frequently just plain wrong, or extended beyond its applicability.
Look at the FBI's terrible high-school surveillance program that was linked here recently. It's based on really garbage pseudo-science about extremist radicalization. There is better, more modern research on the subject that should be used instead, but even then caution should prevail, and decisions based less on theory and more on informed judgment and experience might ultimately be wiser. In a healthy field of study and practice, as time goes on and more data accumulates, theory increasingly approaches reality.
Now consider, for example, the math courses you take in college. Here the important part is to learn and deeply understand abstract concepts, but then you explicitly need to practice in order to tie them together so that you're actually able to use that knowledge.
I think both aspects of assimilating knowledge are absolutely crucial in all domains, but often one of them is naturally the default, while the other requires some effort.