>Our hiring “secret sauce” largely stems from the fact that it seems to take significantly less time for someone with leadership and community skills to develop technical skills than the other way around. I’m seeing a large number of people who graduated from code bootcamps 3 and even 2 years ago now handily and gracefully filling the role of senior developer.
This statement makes me concerned that they have greatly devalued technical skills. There are not a whole lot of areas where we would consider a three-month course plus 2 years of experience anywhere close to senior. According to http://www.plumbingapprenticeshipshq.com/how-to-become-a-jou... becoming a journeyman plumber (mid-level) requires a 4-5 year apprenticeship!
When technical ability is devalued so much in a company, there is a real danger that this turns into a Dunning-Kruger clique, where "senior" developers who have been programming for 2.5 years automatically favor hiring experienced business people over experienced developers (remember, you can turn someone who has never programmed into a senior developer in a little over 2 years).
Think about law, medicine, engineering, or the military. We would never call a lawyer that started training 3 years ago a senior attorney. A doctor after 4 years of medical school is called an intern and basically expected to mess stuff up. As mentioned above, even a plumber with 3 years of experience is an apprentice. Why should we as software developers devalue our craft so much?
I'm not going to justify their thinking but I'll attempt to explain where it probably comes from.
The context for their perspective is crucial. Notice that their claim that it takes "less time to develop technical skills" is immediately followed by a sentence praising graduates of "code bootcamps". They also prominently espouse Ember.js[1].
To make sense of that, we can (roughly) divide programmers into two groups: (1) CRUD/LOB (line-of-business) and (2) algorithmic/embedded.
(1) is programming the enterprisey, forms & fields, "back office" apps. It was COBOL, dBASE/Clipper, Visual Basic, Microsoft Access, C# Winforms, 4GLs like Oracle Forms & SAP ABAP, and now Javascript frameworks such as Ember.js/Angularjs. Basically, slapping a client GUI in front of a database backend. Whether that client GUI technology is Visual Basic, mobile phone Javascript, or an iOS Swift app... that choice is more about whatever zeitgeist of programming you happen to be living in than any inherent difficulty levels between the technologies. The idea is to take the high-level frameworks+libraries and glue them together to deliver value to the business.
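To give a (hedged) concrete feel for that glue style, here is a minimal sketch; Express and the made-up Invoice resource are purely illustrative choices:

    // Group (1) in miniature: wire framework routes to a data store.
    // Express and the Invoice shape are illustrative assumptions only.
    import express from "express";

    interface Invoice { id: number; customer: string; amount: number; }

    const app = express();
    app.use(express.json());

    const invoices: Invoice[] = [];   // stand-in for the database backend
    let nextId = 1;

    // Create
    app.post("/invoices", (req, res) => {
      const invoice: Invoice = { id: nextId++, ...req.body };
      invoices.push(invoice);
      res.status(201).json(invoice);
    });

    // Read
    app.get("/invoices", (_req, res) => {
      res.json(invoices);
    });

    app.listen(3000);

The skill here is less about any single call and more about picking the right frameworks and keeping years of such glue maintainable.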
(2) is programming of realtime kernel schedulers, complex distributed computing algorithms, search engines, database storage engines, machine learning, ray tracing graphics and physics engines for video games, audio DSP, traversing graph nodes, control theory for drones and Mars rovers, etc. This would be more "engineering" type of coding rather than "integration/glue" coding. Typical programmers we'd think of in this group would be Jeff Dean (Google MapReduce/Tensorflow), John Carmack (Doom), Fabrice Bellard (ffmpeg).
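By contrast, and only as a toy hint at the flavor (not the difficulty) of that list, group (2) work tends to mean writing the algorithm itself from scratch, e.g. a graph traversal:

    // A from-scratch breadth-first traversal: no framework to glue, just the algorithm.
    // The adjacency-list representation is an assumption for illustration.
    function bfs(adjacency: Map<string, string[]>, start: string): string[] {
      const visited = new Set<string>([start]);
      const queue: string[] = [start];
      const order: string[] = [];
      while (queue.length > 0) {
        const node = queue.shift()!;            // dequeue the next frontier node
        order.push(node);
        for (const neighbor of adjacency.get(node) ?? []) {
          if (!visited.has(neighbor)) {
            visited.add(neighbor);
            queue.push(neighbor);
          }
        }
      }
      return order;                             // nodes in breadth-first order
    }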
The programmers in group (2) wouldn't say it's easy to take "leaders" and add programming skills to them. However, that sentiment is often expressed by group (1) programmers. I'm not saying it's the "right" philosophy, but it's an observation I've made repeatedly. The CRUD programming is often seen as just a longer, more elaborate version of programming the TiVo / thermostat / lawn sprinkler system. It doesn't seem like group (1) is "devaluing" themselves; instead, they honestly just think "programming skill" isn't really a big deal.
Group 1's job is to save the company money, or make existing systems easier to use/more efficient. They help increase sales by implementing metrics/ab testing/etc. A top member of group 1 will be leveraging technology to automate costs & complexity away. If the company nets more money because a Group 1 person is employed, they're doing their job.
Group 2's job is to invent core technology, often from scratch. They are well-versed in theory and application, and their deliverables take lots of time. If the company gains a patent because of a Group 2 person, they're doing their job.
Group 1 is business (and sometimes public) facing, so soft skills are much more important. Group 2 IS usually the business, so tech skills are #1. The two groups have the same components in their pie charts, just in different proportions.
This model also works well for understanding many comments here on HN.
IMHO most of the world runs on project management, social skills, soft skills, "changing the world" by making a better mousetrap, increasing the conversion rate by 0.5%, decreasing the customer support response cycle by 2.5 minutes, and raising the satisfaction/loyalty survey by 3 points.
I think most people have moved on to (1) because of the universal adoption of the web. (1) makes the most splashes because it's more relatable, accessible and, dare I say, more profitable? (2) still exists, but it seems to have declined; in the late 90's its people were the dominant icons/heroes of computing (Phrack/2600, compared to today's "founders"). It takes more fortitude and individual direction to push yourself to dive deeper into the stack and take on kernel and distributed computing problems without the extrinsic reward of money.
In a sense, it's kind of like students choosing business consulting vs. going into (pure science or humanities, not pre-professional) grad school.
But that being said, unless you are working for a very conventional small or midsize business with no plans for rapid growth (and such businesses rarely employ large teams that need leadership anyway), it's never going to be "just CRUD" or just "slapping a client GUI in front of a database backend". With large datasets or many users you have to plan your storage and query patterns for scalability; when precision is crucial you have to design for consistency; when a real-time response is required you need to reduce latency; when business requirements change often you need to design for flexibility; when your company has multiple clients you need to design a good API; and so on. In most cases you must balance several of these requirements, and I haven't even gotten to security yet, which is an all-too-often neglected requirement for almost every system out there. I also didn't touch on the peripheral requirements: having automated tests (and designing for testability), designing a CI pipeline, and educating junior devs about revision control workflows and DevOps practices.
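To make just one item from that list concrete, here's a rough sketch of a query pattern that stays cheap as data grows; the cursor-based pagination and the Order shape are illustrative assumptions, not a prescription:

    // Cursor-based pagination: the client passes back the last id it saw instead
    // of a numeric offset, so the store never re-scans rows it already returned.
    // (In a real system the cursor maps to an indexed "WHERE id > :cursor" clause.)
    interface Order { id: number; customerId: string; total: number; }
    interface Page<T> { items: T[]; nextCursor: number | null; }

    // `orders` is assumed to be sorted by id ascending.
    function listOrders(orders: Order[], afterId: number | null, limit: number): Page<Order> {
      const items = orders
        .filter(o => afterId === null || o.id > afterId)   // WHERE id > :cursor
        .slice(0, limit);                                   // LIMIT :limit
      const last = items[items.length - 1];
      return { items, nextCursor: items.length === limit && last ? last.id : null };
    }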
Handling all of that doesn't require the same technical skills needed for writing kernel code, a ray tracing engine or a database engine (all of them different skills in their own right), but to be able to design a good system and give guidance to junior developers you definitely need more than 2-3 years of experience.
Yes, many companies choose to ignore what I've just said, hire low-level programmers anyway, and often put an MBA (who did some formulas in Excel back then) in charge to (micro-)manage them. Of course the products they turn out are buggy, hard to upgrade, fall to pieces under the smallest hint of load, end up requiring expensive hardware, and have gaping security holes; all in all, their failings end up costing the business dearly.
This blasé approach to group 1 development is very common in the so-called "enterprise world", but that's just because the organizational elite over there (as opposed to most of Silicon Valley) is not even tangentially technical. That's understandable: take a CS-driven company like Google or an engineer-driven company like Facebook, and you'll get the mirror image. But there's no cold truth about programming skill not being a big deal here. Group 1 development is at the very least as demanding a skill as plumbing.
Frankly, there's too much money being made at average skill levels that don't require any degree. I don't think there's an economic incentive to go for more regulation. Better yet, there's a ton of need, and some of the work is just plain easy once you do have 1-2 years of experience.
There are positions that don't directly affect life/liberty, unlike law, medicine, engineering, and the military. Making our industry like theirs would require legal regulation of titles and work.
Open source would then become a legal grey area (because open-source code is used right now in software that affects life/liberty) unless you are willing to police open source and make sure all contributors are qualified, or hold companies responsible for only using open source whose contributors are all qualified. Either way, it's a pretty massive blow to the whole idea of open source.
If you want this to happen organically, then that's just going to take time on the scale of generations. Law, medicine, engineering, and military have all had a ton of time to develop this versus software engineering. Right now we're just cavemen protecting and healing our villages with shell scripts and compilers.
To bring the idea into a couple of the other fields you mention, consider:
A lawyer, just out of law school but with 10 years of software experience and recent work in patent submission.
A doctor, just out of medical school but with 10 years experience as a registered nurse.
An engineer, just out of college but with 10 years in construction, specifically welding structural beams and overseeing other welders.
The one that I have personal experience with is the military. Soldiers respect as senior their peers with significant outside experience, even if the regulations do not allow them to be promoted officially.
You can also consider, among officers, the brand-new 2LT who was enlisted for many years prior. While they are a 2LT just like any other butterbar, they are respected beyond their rank.
I don't think that the article is claiming that everyone 2 years out of a code bootcamp can be considered a senior developer, but that there are exceptional individuals who can.
With Rails you can learn and build a web app in weeks. With AWS you can learn cloud architecture in months.
Spend two years focusing on the right tech, get sufficient depth, and you are an expert.
There are also infinite resources to help with this, from books to blog posts to a culture of mentorship in the workplace.
Law and medicine should be so lucky.
The company in the OP isn't in the business of systems engineering, they're in the business of software integration. I suspect if they were in the business of engineering they'd have a much different view of what it takes to be "senior". Because they'd be able to measure it in terms of lawsuits, audits, and fines rather than in slipped deadlines for web apps.
The really hard bit is:
Building a web app that can evolve and grow to build a business over several years, eventually with a large team.
And that's where traditional Rails development fails, and we still haven't quite worked out the best way to resolve that.
I recall my own feeling of entering another universe when I finally started to systematically look at the assembly code generated by the compiler, and it took me 6 years to develop the need for this.
It has been at least 5 years since I removed 'Senior' from my LinkedIn profile ;-)
> Without a clear definition of “senior developer”, we have no clear path for our own employees to get there.
This seems like an anti-pattern of circular logic to me: Why would you want to “get there” when it's unclear what “being there” even means?
And as others have pointed out, it's not "technical ability" that is being devalued; more likely it's that a particular subset of technical knowledge is devalued -- perhaps appropriately for the situation -- but it just happens to be a subset that you personally feel is extremely important (and I might well disagree with you on whether that subset, or any specific subset, is actually essential).
And that's without getting into the irony of how much we as an industry approve of successful high school or college dropouts who learn to code well without completing a formal CS degree, but then turn around and, frankly, shit on the idea of actually hiring any such people because it'd require lowering our technical standards.
I liked their ideas around direction given vs. direction required. I liked their distrust of the notion of "cultural fit". I liked that they identified leadership and connectedness as distinct skills, and that a senior developer has both technical skills and at least one of the other two (leadership or connectedness).
But true technical expertise requires the ability to make holistic, contextual decisions. That kind of stuff takes experience shipping multiple products, full-stack exposure to at least one or two mature software stacks, and awareness of the historical context of how technology has changed. That is, it's not so much about "years of experience" as it is "diversity and quality of experiences".
This included a slide with a hypothetical "candidate A" who had no technical experience, but great communication skills, and "candidate B" who had extensive technical experience but no "soft skills". Who do you hire for a hypothetical technical role? "Candidate A" was presented to us as the correct answer.
Nevermind the fact that our company has never historically demonstrated the ability to technically train an employee...
So, yes. Devaluing technical skill appears to be all the rage in HR these days.
You'd rather have an honest disagreement than have someone bumbling around the code quietly (and possibly quickly) making a mess that nobody notices until the debt is piled so high it blots out the sun.
He also makes reference to the Dreyfus model of skill acquisition which is worth keeping in mind: http://en.wikipedia.org/wiki/Dreyfus_model_of_skill_acquisit...
However, I think you sell the initial idea short - that seniority DOES ultimately boil down to direction required / direction provided.
The only place this summary would fall short is with those people who can bust out anything on their own in record time but have trouble with teams... and even they provide direction, if only by setting a technical direction/example.
When you expand to the Venn diagram, you're elaborating on the kinds of direction that can be required or provided, but that core remains: ultimately, seniority happens as you go from needing direction to providing it.
Now, within different industries, the points within each circle can change (compare startups, where now is better than perfect, to industrial plants, where perfect is better than now, to research, where now and perfect are sacrificed to hard and novel).
I'd say it's even likely that, given a different culture and/or problem domain and/or ???, you get different circles entirely - although it's hard to argue with "ability to do the job", "ability to work with others" and "ability to cause direction". What are things that don't fit into one of those?
Some people also argue about the difference between calling yourself a coder, programmer, software developer or software engineer, even taking the debate as far as questioning whether a software engineer is an actual engineer.
To be honest, these debates are sort of empty unless there is clear regulation and disambiguation around what seniority is. You can define seniority in many ways.
I can throw darts on a map and find billions of people capable of "connectedness" and "leadership". Finding you capable "technical" talent on the other hand...
This ease is mostly due to connectedness and leadership being naturally grown out of decades of organic social interactions that people need to do in order to... you know... survive. You can accidentally reinforce these properties just by being 1.) a member of the human species and 2.) alive.
No one accidentally reinforces the understanding required to mentally maintain the thousands of arcane exceptions produced as C-Levels pivot endlessly across outsourced teams hired at the cheapest price, compiling the world's worst spaghetti code responsible for an everything-on-the-line application.
And if this seems bitter, then look into your own organizations and experiences: Who gets fired/quits the most? Human Resources, C-Levels, Marketing or the Technical team? Whose ass is on the line when the fires start? Who is expected to work 10+ hour days every day for all of eternity?
Looking for unicorns does not make you a unicorn hunter. It makes you delusional. A senior knows unicorns don't exist even if people look to him/her as one.
You really do need some level of each.
The difference is that you can clear a pretty low bar in terms of social skills, being approachable and being able to communicate, vs. a high bar technically.
If it were popular to use Programmer I, Programmer II, and Programmer III, then they would call it that.
By the way, going to a lot of conferences or having a lot of Twitter followers does not make you more 'senior' than me. It might make it easier for you to negotiate a 'senior' title and salary though, just by virtue of your being good at self-promotion.
Or MTS/SMTS/LMTS/PMTS with a gazillion different levels for each
I have spent many a year as a mid-level exec in medium-size companies. Wherever I've been, I have demonstrated ownership and leadership (I'm not bragging, that's just how I work). However, I have always felt a 'lack' of technical expertise, probably because I fit so well into the leadership roles that I never spent that long on 'deep' technical work. Recently, I have sought to remedy this. I have traded my mid-level 'get stuff done' role to become a developer, to scratch an itch that has been there since I programmed BASIC on my ZX Spectrum. I'm no spring chicken, and I have spent the past 5 years getting up to speed on modern web and mobile development techniques and have pushed myself as a developer to anyone who will listen. In my latest role I've already been earmarked for a lead developer role, and I think this validates what the article is saying and also what a few comments on here are saying: I have demonstrated that I am technically competent, with 3-4 years of solid experience (which for 90% of jobs out there is probably enough), but my approach to working (owning problems and solutions, leading and mentoring others) shows that I can be relied upon to deliver product.
Just to counter the self-congratulatory tone of the above (I'm English, it makes me uncomfortable), I'm acutely aware of my shortcomings both technically and personally; I should write more tests, I should plan more before writing code, I sometimes don't speak up because I don't want to look stupid etc. But I think everyone has shortcomings that they learn to accommodate or change, it's all part of life.
Key characteristics:
- Not chasing every new and shiny framework/tech
- Singular focus on stated objectives/milestones/deliverables
- Understands the "big picture" as well as the granular details, so that they can provide advice and leadership about implementation
- Acts as a buffer between "techies" and management in every cycle of iteration
It's very likely that my bullets are worded poorly, but my overall point is that a senior developer is someone who has the experience to liaise between those with ideas and those with technical skills. In some environments the sr. dev is the person doing the tech work, in other environments they're leading a team; in any case they have the knowledge, skills and experience to produce results.
(The Venn diagram is a nice starting point, though.)
So, when you are looking at jobs or hires, always consider the context.
I believe you don't hire people for jobs they've done, you hire people to do a job, and you ultimately fire people for not doing their jobs. What they did in the past might have zero bearing on what the person is capable of doing. A resume only tells you how good they are at writing resumes (especially if you don't follow up and confirm it, and who has time for that?)
Admit that the criteria are undefined, and stop trying to have control where there is none. Put people on the job without interviewing them [0]. Fire them if they prove they can't do their job. But I think it is much more likely that you will find yourself surprised by a person stepping up to the challenge presented to them.
The job doesn't care how you found the candidate to do it. You don't really have any control, anyway. So quit wasting time (and money) on the process.
[0] at least, not in this way. I prescreen candidates for general intelligence and social graces. We do not hire people who think it's ok to tell racist jokes at parties. Unfortunately, this is a recurring issue.
What kind of timescale would you consider reasonable for this?
Because the cost of changing jobs once you have a family means anything less than 3 months is pretty unethical. Unless you tell them up front, but that's just probation, which means good luck hiring anyone.
I am up front about the process and the possibility it could end very early. I typically suggest to them that they start at 10hrs/wk while at their current job to see if they like the work. I've not found it difficult to hire people in this way.