This isn't going to change any time soon and it's wise to prepare strategies in advance.
A few days ago a guy complained about having to write code during the job search process. Most comments I read here on HN were not sympathetic to him: he was advised to send out fewer resumes, show more passion, etc. "Show some code" is a totally different request from "Show me the code for this exact problem I made up" - yet many programmers here were commenting as if they had never had to do a job search.
My point is that before blaming others, it might be good for the community to introspect and see if the cause is truly from outside.
At work, my project is purely C++, but if I have to write a tool for data analysis or to test a subsystem, I'll use C# so I can improve my .NET skills. Before, I used Python for that, but C# became more appealing.
Same thing at home. Web projects (of which there are few!) use PHP, everything else is C/Assembly (I do a lot of microcontroller stuff) or C# front ends using Visual Studio Express.
If you understand the differences between the OO concepts in Java, Ruby and Python, then you'll find it easy enough to pick up the surface layer (the syntax) of the languages.
If you're in a java shop and learning Python, have a look at jython and see where you can use it for testing, automation, etc. Even if only for your own local scripts.
1. I learn it "enough" to know what it's good at, what it's bad at, and to have written enough code by hand to pick it back up quickly later (particularly if I set up a test-driven learning environment).
2. Even when I forget the language's particulars, it changes the way I see other programming languages. For instance, while I don't use Ruby in my day job yet, I still think of things I can do with method_missing and how I might approximate that same power and flexibility in my work where appropriate. In short, learning languages helps me program "into a language" rather than "in a language", to borrow Steve McConnell's terminology.
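To make the `method_missing` point concrete, here is a minimal sketch of the kind of flexibility it gives you - a dynamic proxy where "methods" are looked up in a hash at call time. The `Settings` class and its keys are invented for illustration, not something from the comment:

```ruby
# A tiny dynamic proxy: any getter that isn't defined as a real method
# is looked up in the wrapped hash instead.
class Settings
  def initialize(values)
    @values = values
  end

  # Intercept calls to undefined methods, so settings.timeout works
  # without declaring a timeout method up front.
  def method_missing(name, *args)
    key = name.to_s
    @values.key?(key) ? @values[key] : super
  end

  # Keep respond_to? consistent with method_missing.
  def respond_to_missing?(name, include_private = false)
    @values.key?(name.to_s) || super
  end
end

settings = Settings.new("timeout" => 30, "host" => "example.com")
puts settings.timeout  # => 30
puts settings.host     # => "example.com"
```

Approximating this in a static language usually means reaching for reflection, code generation, or an explicit lookup-table API - which is exactly the "into a language" mindset: you carry the idea even where the shortcut doesn't exist.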
Other than that I agree with the main idea of the article...
If you do have all the skills he listed, how do you make the leap into doing real top-level creative work?
For example, one of the top links on Hacker News now is a profile of Brad Fitzpatrick, and it's pretty much accepted that he's done some industry-changing work (LiveJournal, memcached, Mogile, OpenID, Pubsubhubbub, etc.) But if you were familiar with his work c. 2002, it wasn't all that impressive. Yeah, he was a good programmer, but he just ran a website with some modest success. Several of us have done the same.
I've heard the same applies to other programmer luminaries, e.g. John Carmack.
Somewhere along the line, some programmers start really distinguishing themselves while others remain merely "good". And I don't think it has to do with ploughing all your efforts into one project. People like Brad Fitzpatrick, Jamie Zawinski, Paul Buchheit, or Rob Pike are known for multiple contributions. Is it just the cumulative effects of time, or is there something specific they do with their time that propels them from good to great?