Yep, I know that if I don't take care of my own training, it's just not going to happen; most companies are not so altruistic that they'll hand me everything on a silver platter.
But at the same time, a company that never hires people unless they already have the exact skillset they're looking for, a company that fires people on a whim because priorities change, and a company that provides zero incentive for people to keep learning (e.g. with 20% time or a willingness to let employees experiment with new tech) – well, those are not companies I want to work for.
It wasn't until my first few times sitting in on hiring discussions that I encountered professional engineers who didn't take an outside interest in expanding their toolset.
There is absolutely nothing wrong with them. They are fantastic folks who are very good at their chosen profession. And for the majority of professions (I only have anecdotal knowledge of this, but I would love to hear examples from other fields) that is enough to ensure stable employment. But I can't imagine the fear, uncertainty, and doubt that accompanies company layoffs and downsizing when you haven't played outside of your comfort zone in a while.
This is just a long-winded way of saying, “great summation, SheepSlapper!”
I got into software because there was so much to learn and explore, so this realization still baffles me. Why on earth would someone want to do this job and not want to learn new things? It's like a baseball player who hates being outside.
Not only that, but oftentimes I'm faced with a situation like the author's, where people I've worked with actively prevent those around them from learning new things on the job. "No, don't write this standalone module in Python, our standard is PHP; it was good enough 5 years ago, it's good enough now!" (in a four-man shop).
As someone who loves constantly learning more, it's suffocating to be around people who are so paralyzed. I simply cannot fathom the fear that drives someone to respond to an offer to learn something new on the job with, "No thanks, I'm happy becoming obsolete, and you can't learn it either, because I might have to support it one day, and I'm not interested in learning anything new!"
That irrational fear of technological change is exactly what I'm talking about. There's a joke among some of my friends: "What's the best way to get a mediocre .NET programmer to stop talking about safety and reducing bugs? Bring up the safest, most secure .NET language that exists: F#."
As the joke goes, most of these developers are confronted with the reality that they don't really care about bugs enough to bother learning a different, fully supported .NET language that automatically catches entire classes of bugs, is dramatically safer than C#, and is faster to write.
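The "entire classes of bugs" claim is about ML-family type systems like F#'s. As a rough illustration in TypeScript (not F# itself, just an analogous mechanism), a discriminated union plus an exhaustiveness check turns "forgot to handle a case" from a runtime bug into a compile error; the names below are made up for the example:

```typescript
// A discriminated union: every payment state is an explicit case.
type Payment =
  | { kind: "pending" }
  | { kind: "settled"; amount: number }
  | { kind: "failed"; reason: string };

function describe(p: Payment): string {
  // The compiler checks this switch for exhaustiveness: add a new
  // "refunded" case to Payment and this function stops compiling
  // until it is handled -- a whole class of bugs caught before runtime.
  switch (p.kind) {
    case "pending":
      return "still processing";
    case "settled":
      return `settled for ${p.amount}`;
    case "failed":
      return `failed: ${p.reason}`;
    default: {
      // Unreachable: `p` narrows to type `never` here, which is what
      // makes the exhaustiveness check a compile-time guarantee.
      const unhandled: never = p;
      return unhandled;
    }
  }
}

console.log(describe({ kind: "settled", amount: 42 }));
```

In F# the same idea shows up as sum types with pattern matching, plus non-nullable references by default, which is where the "safer than C#" claim comes from.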
Most people who have told me they "care about bugs, not languages" were just saying something nebulous to kill the discussion and defend their life choices.
I'm not trying to be snarky to you, I'm actually very interested to know what changes you've made that are not technological to reduce bugs, fix security holes, etc. I'm always on the prowl for such stuff, and this sounds intriguing.
Hear, hear. There must be 100+ "frameworks" out there, all for the task of rendering a web page. If you learnt them all, you would be stupider than when you started. Pick a handful of technologies - ones that will last - and get deep into them. Step off the crazy treadmill and go for quality, not quantity.
If you had gotten into a "full stack" (lol) of Unix, Oracle, C++ in 1994, you would still be very, very employable today as long as you remained more or less current with them, and you still will be in 2024. Whereas if you learn "frameworks", you'll be starting again from scratch every year or two.
I am baffled by him. After spending two years with him, I still can't understand his motivations. I quit that job in large parts because of him.
Unless there is a strong reason, a codebase should be in one language. Having it in multiple languages makes the learning curve for newcomers unnecessarily steep, especially if the company is willing to hire juniors. It also makes maintenance more difficult and expensive. Too many languages will also force your people to have very shallow knowledge.
"We are going to do this one module in Python" should happen either because you are considering moving to Python altogether or because the usual language is a really bad fit. "Someone wants to learn it" is both a bad reason and unprofessional.
I spent the last 10 years learning programming languages and PLT; it's a hobby of mine and I put quite a lot of effort into it. While I certainly forget some of the things I learned (I regret not using C on a daily basis and not keeping up with C++ advancements), at any given time during those past ten years I was fluent in at least 3-4 languages. Without needing a refresher, right now I can speak, and really know in depth, the following languages: JavaScript, OCaml, Erlang, Racket, Python, LiveScript and Pharo Smalltalk. There are about 20 other languages I could become similarly fluent in with a week's effort. It's not shallow knowledge; it's just 10 years of work. There is really nothing forcing you not to have a deep understanding of many different languages and technologies.
On the professional side - I have quite a big system under my care which is a mix of Python, JavaScript, Ruby and Erlang, not to mention bits of C, shell scripts and Makefiles, two compile-to-JS languages, and some compile-to-CSS and compile-to-HTML languages. The system works - and believe it or not, working on it is a pleasure, and using the right tool for the job really feels liberating. It lets me move twice as quickly with twice as good results as I'd get trying to write, for example, a fault-tolerant, concurrent backend service in Python instead of Erlang (or a quick data-mining script in Erlang instead of Python, for that matter).
I believe using the right tool for the job is the very definition of professionalism. Of course, the learning curve is probably steeper for newcomers, but for professionals above a certain level, language specifics are rather easy to grok, and becoming fluent in a language takes a few weeks tops. Besides, it's not like every team member is required to know every technology used in a project - there's a tech lead for that, and I wouldn't want to work with one who can't easily convert iterative algorithms to tail-recursive ones and switch from algol-like to prolog-like to python(-like) syntax on the fly.
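The "iterative to tail-recursive" conversion mentioned above is mechanical: loop state becomes accumulator parameters, and the loop body becomes a recursive call in tail position. A minimal sketch in TypeScript (not one of this commenter's languages, purely for illustration):

```typescript
// Iterative version: a loop with a running accumulator variable.
function sumIter(xs: number[]): number {
  let acc = 0;
  for (const x of xs) acc += x;
  return acc;
}

// Tail-recursive version: the loop state (index and accumulator)
// becomes explicit parameters, and the recursive call is the last
// thing the function does. (Note: most JS engines do not actually
// perform tail-call elimination, so this illustrates the
// transformation, not a performance recommendation.)
function sumRec(xs: number[], i = 0, acc = 0): number {
  if (i === xs.length) return acc;
  return sumRec(xs, i + 1, acc + xs[i]);
}

console.log(sumIter([1, 2, 3, 4]), sumRec([1, 2, 3, 4])); // both 10
```

In languages that guarantee tail-call elimination (Erlang, OCaml, Racket, Scheme), the second form runs in constant stack space, which is why the conversion is a bread-and-butter skill there.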
Anyway, I know what I know and I know what I do and you're basically saying that these skills are irrelevant and using them would be unprofessional. In short, what I'm saying in response is: bullshit.
So it's not to say they aren't interested, but rather that they lack the time to dedicate. Therefore, it helps if the company they work for can offer a little time to learn new things.
My current hypothesis is that people get so used to internally defending their neglect of the skills that earn them travel and shelter that when the opportunity arises to actually get paid to improve, all the reasons they normally use prevent them from doing even that.
Your life doesn't have to revolve around programming to invest time in advancing your own career.
> so it's not to say they aren't interested
Well, they aren't interested in learning things on their own, then. You said it yourself.
Now, what about an alternative world where he did not "get OO"? Or perhaps a life where he had children and no time at work to learn? Or one of these newer, not-quite-as-successful software companies with no money and no extra time?
Keeping up with new tech requires time, and money. Start ups provide neither of these. Even bigger "start ups" attempt to keep up the illusions of a smaller company, including mandatory overtime and no extras (e.g. tuition reimbursement, sabbaticals, more than 2 weeks of PTO a year, etc.).
The other thing: computing as a career is quite a bit harder, more complex, and more competitive than it was when the author had their formative years.
The real rallying cry is how do you make an industry that respects career advancement?
Using children as an excuse is laziness. It might not be as easy, but having children does not preclude you from learning new things or advancing yourself. It requires effort and planning, but frankly, it is the wrong excuse to hide behind.
> Keeping up with new tech requires time, and money.
It requires time and effort. Money is rarely an issue.
> Even bigger "start ups" attempt to keep up the illusions of a smaller company
I'm going to assume that you just have bad experiences, because this is hardly my experience.
> Today keeping up is a ridiculous job sometimes.
Even though he advises people to keep up, he actually kind of admits how ridiculous this is today.
It's not possible to keep up anymore; there are just too many people creating too many things: languages, frameworks, technologies.

> I was writing web applications when I first heard of Ajax (a few months after the term was coined) and I started using it; again I wound up teaching my teammates about the new thing first. Sadly it scared the architecture team who thought I had bought some new technology without approval and wondered if it was supported. None of them had heard of it (since they didn't pay much attention) and when I told them it was just Javascript they were only barely mollified.
I can imagine being an architect and having a programmer like that, bringing up every hip thing he encounters just because it is cool and new... probably not even considering all of the ramifications. Yeah, sure, AJAX is here to stay (as we know now), but how many "promising" technologies are now long dead?
I like staying a bit further behind the edge. I follow the direction of technology, but I use it only when it is proven and supported well enough. Well, usually. :)
The article was a bit heavy on the narcissism; I wouldn't want to work with such a person. I'd rather work with the Brazil-esque Robert De Niro type: "Listen, kid, we're all in it together."
I think that ultimately, many people in the industry only get to learn new tech when they leave for their next job. The pressure is momentarily reduced while they learn at their new job.
Just my 2 cents.
I guess that's a combination "back in my day/get off my lawn" statement, plus a little whining, and maybe a humblebrag, but I don't think that's an unusual story at all for software developers.
I interned at IBM during grad school with a team of consultants that all did enterprise Java stuff for financial institutions - that was very different. IBM would frequently send those developers away for a week or more at a time, multiple times a year, to get training on specific technologies. I'm not sure how common that is anywhere other than IBM, though, or if IBM even does that anymore. Maybe Google does it? I don't know.
Sometimes I deal with developers who either can't or won't teach themselves anything, and can't or won't learn by doing. They absolutely need someone to hold their hand and explain things to them every step of the way, and they will just throw their hands up in the air and fail before putting any time into trying to read up on whatever topic is giving them trouble. I don't know what to attribute this to, so I'm trying really hard to not jump to the conclusion that they suck or they don't care or whatever. I'm sure a lot of them do just suck at their jobs and/or just don't care, but maybe some of them have genuine problems with learning that aren't their fault. The only thing I can say for sure is that this is a trait that is a major impediment to their careers and getting their jobs done without sucking up too much of their cow-orkers' time (as we all know, orking cows requires long stretches of uninterrupted concentration).
TL;DR Spot on, and being able to develop your own technical skills to keep up to date and expand your horizons is absolutely critical to being a really successful developer. You are also the only person that you can count on to do this for you. You can't really count on any employer, even some mythical ideal company with bottomless resources that treats each employee as a magical snowflake, to do this. Even if your company does provide training, it's not necessarily going to be the training you want or need to receive.
I think not making your employer pay a fair share of your training is like being one of those people who stay 4 hours extra when their project isn't late. People do it through a mixture of anxiety, peer pressure, and possibly not liking their children all that much.
To me, it is your responsibility to learn the tech you want to use in your next job, but it is fair to ask the company to pay for new skills you need for them.
From the company's perspective, it is worth spending the money up front so that you don't mess up a project by not really knowing what you are doing.
It was just three years ago that my main responsibility was maintaining code on a black & yellow terminal for a VMS server. Another couple of years and I could easily have been one of those people pushed out of the industry with no easy way back in.
Although my company has provided an avenue for me to transition to doing things with the LAMP stack, it is still in some sense legacy. It's a large website codebase that started over a decade ago.
I have made the choice that I'm done with being legacy and am doing whatever I can to learn current tech. I would even be willing, sometime later this year, to take a new job at a junior level just so I can cut loose the legacy code crap I am tied to. At this point it feels mostly like a bunch of anchors holding me down. I want a new job where I can learn from the people around me and truly be focused on my direction.
I can only speak for myself, but the transition improved my life immeasurably. I can't even imagine how different things would be if I had stayed. Keep at it. If I can do it then you can too.
I'm currently in the state of mind that the true way to stay ahead of things is to keep studying, every week, something new. Yes: every week. I take at least 15% of my work-time and use it for self-enlightenment - whether it's learning how to embed the Lua VM somewhere, tinkering with RethinkDB, sharding my Mongo instances, or whatever. Constant change is the only constant in this industry; one must change oneself, constantly, to catch up.
This isn't so easy to do if you're not into enlightenment, alas.
Of course it's all a product of culture and supply and demand (it's systemic): if there are enough great programmers willing to learn everything on their own time, then of course it will become the norm that programmers should learn everything on their own time. And, of course, that's great for the employers.
It feels to me that that's the sense in which the young man's comments were meant. It doesn't seem unreasonable in that light. So the compensation he'd like isn't entirely monetary in nature, that's hardly unique.
I've devoted this year to learning some Statistics. Will check if my hypothesis is true in a few months.
"If you don't keep learning, keep reading, keep improving your skills eventually that nasty steamroller behind you will flatten you permanently. Then your career is likely over."
and
"And that clanky monster breathing down your neck has an endless supply of fuel."
Egads. Not blaming the messenger here, he's right. It's a tough field. So the pay is extraordinary, right?
http://money.usnews.com/careers/best-jobs/rankings/the-100-b...
Take a look at these jobs, and in particular, look at the pay in higher salary regions. The best job, software developer, earns 116k a year on average in San Jose.
The average registered nurse earns 122k a year in San Jose. The average dental hygienist in SF earns about 106k a year. Nurse practitioners clock in at 125k a year.
There are all kinds of ways to interpret this data, and in the end, I'm talking about the greenness of the grass somewhere else. Not that I wouldn't welcome comments about these comparisons, I just want to make it clear that I acknowledge these other fields come with their own stresses and challenges and barriers to entry (and I don't object to good salaries in these fields at all). And everyone has to keep learning...
But is there a steamroller that threatens to make dentists obsolete, and do dentists have to bet the farm, so to speak, on whether to learn "enterprise java beans". It does seem particularly relentless (and difficult to predict) in software, and the career stakes are very, very high.
I think programming can be a wonderful career for some people. I think the main reason I pay so much attention to this sort of thing is that I often think about pay and work conditions for software developers within the context of claimed "shortage", as this is frequently discussed (and until recently, often accepted without question) in the mainstream media.
Judging from this informative blog post, it takes a very unique wiring to really thrive for a career as a software developer. Can we really say there's a shortage of people willing to put themselves in the path of a steamroller? (The author of the blog post in no way made this claim, this is just a question I'm turning around in my own mind).
I agree and disagree. It's a moral responsibility of the employer. Work takes up such a large portion of a person's time and energy that if the company isn't invested in the employee's progress, he owes that company nothing. My work ethic is strong as hell, but if I get the sense that management isn't interested in my progress, I slack as a matter of principle. If your manager isn't looking out for your career and you put more than about 10-15 hours per week in on your assigned work, you're just a chump. (In the MacLeod analysis, a Clueless.)
That said, expecting your employer to manage your progress and education is unreasonable, because no company can possibly account for the variations in peoples' abilities and desires. Even if your employer is genuinely well-intended and wants you to advance-- let's ignore the 80% of companies that aren't this way-- your company will figure out where you should go much later than you will. That's why open allocation is the best solution: the workers can figure out what's worth working on faster than central/upper management.
So, yes, the employer has a moral responsibility to give the employee time and resources to look out for her career (and, if it doesn't, engineers should slack). However, for the employee to put the self-executive responsibility of picking out what to learn on the company is, in practice, an irresponsibly bad idea.
> By my third year I saw the microcomputers were going to be the future and wiggled my way into the group that worked with them.
The problem is that most modern companies have such mean-spirited, insane policies regarding performance reviews and internal transfer that internal mobility is pretty much impossible in them. At a closed-allocation tech company, the only time you can realistically get a transfer is when your performance history is in the top-10%-- in which case, lateral transfer is a terrible idea anyway, because you should wait for the promotion instead of restarting the clock. Closed allocation and Enron-style performance reviews are all about inhibiting mobility, i.e. keeping the poors in their place.
But once you discover you are obsolete, it's too late. Assuming your employer will retrain you is a fool's pipe dream. These days employers may drop you, your job, your projects, or even the whole company without much notice, and then you have to find a new job. The retraining you expected from them instead is not going to happen.
This is why I hope to see a French Revolution-style uprising. Silicon Valley looked like a way out, a "middle path" between serfdom and violent revolt. Now that that middle path is closed due to the VC good-ol'-boy network, I think that a (probably global) class war is just an eventual necessity. It may come next year, and it may come in 50 or 100, but I hope that it's the last major war humanity has to endure.
> In programming you need to look forward because the only thing behind you is that nasty steamroller.
Honestly, I get the feeling that this guy was very lucky. He had the autonomy to pick new technologies and he picked winning horses. Imagine what he'd be writing if, instead, he'd learned Blackberry app development. Or, what he'd be writing if his manager, long ago, had fired him for attempting the transfer to the microcomputer team (possibly forcing him to take a suboptimal job due to financial pressure, with long-term effects on his career). He should at least attribute some of his success to having been luckier than most engineers.