There's a fundamental skill that a good programmer has to have, and that is to be able to take a novel problem that they haven't seen before and break it down to solve it in a sensible way.
There are plenty of programmers who fake their way through a career without having that skill. They just copy stuff and never really understand it. They're fine for some types of programming career, but if your business involves solving new and novel problems then you have to know which type of programmer you're hiring.
A contrived live coding exercise gives you a strong signal on this. It does have a decent chance of producing a false negative, but only a very small chance of a false positive, and that's the trade that has to be made with this kind of approach.
Is a better option to not do this kind of assessment, hire the person, find out that they can't do the hard bits and then fire them within 6 months? I'll take some convincing of that...
But who does that professionally as a stand-up performance, clock ticking, a judge breathing down your neck who has been equipped with a script that tells him things like “if candidate doesn’t do X within the first ten minutes it’s very bad”?
Doing well in that situation depends more on social performance skills than problem-solving skills. You’re essentially trying to infer the interview script that the interviewer has in mind and act it out convincingly.
If someone says “I’d like to start by taking ten minutes alone in another room and think it out first”, will they get hired? Probably not. It’s not the expected performance even if the solution is fine.
Requiring salesmanship to enter a company may well correlate with requiring salesmanship to advance and be recognized at the company. (I don't regret that job, because the rank-and-file were great, but I don't want the same structure of overlords.)
This time around, I am fortunate to have effectively lifelong runway, so I'm now trying to handle an interview as if I were discussing a problem with a co-worker. If that filters me out, that's probably best for me, too.
When interviewing I always start with a discussion about the problem. On the whiteboard. I'm explicit with the candidate that this is totally expected.
I'm actually measuring something perhaps more important than the code that will get written later: can this person formulate a plan and have a technical discussion?
As for nervousness about being forced to partake in a stand-up performance, I'd argue that "social performance skills" can work against you, since the more "antisocial" you are, the more you can ignore (or are oblivious to) what other people think about you and can focus on your task. It's only people who actually have the minimum requisite "social skills" who would be conscious of other people intensely watching them "perform".
But even if you're right -- interviewing is inherently a "social performance" activity. If you're not coding you're doing some other stand up performance to present yourself anyway.
I’m talking about the “detect a cycle in a linked list” kind of question. If you ever actually need to do that in practice (though I would question the choices that lead up to that), it’s easy enough to google. The hard part in practice is figuring out that your messy practical problem decomposes into an algorithmic question, but that skill is rarely tested in interviews.
[Edit: I had in mind "detecting the cycles" rather than "detecting a cycle" but wrote the wrong thing. Mea culpa.]
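For reference, the "detect a cycle in a linked list" question mentioned above is classically answered with Floyd's tortoise-and-hare algorithm. A minimal Python sketch, assuming a bare `Node` class invented here for illustration:

```python
class Node:
    """Minimal singly linked list node (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's tortoise-and-hare: advance one pointer by one node
    and another by two; they meet if and only if there's a cycle."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The fast pointer laps the slow one inside any cycle, so no extra memory is needed. Which rather proves the parent's point: recalling this trick is memorization, while recognizing that a messy real-world problem reduces to it is the skill that goes untested.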
Your actual job will be more like implementing a maximally performant directed graph in safe Rust. (Not really. It won't be that interesting.)
But, testing if the candidate has seen a problem before may be a proxy for something meaningful.
Honestly speaking, 70% of the candidates could do the job, but 90% fail because they can't pass the coding question. It's a filter with a serious problem. Yeah, it's all we've got, but a failure rate like that deserves a serious discussion, because it's crazy.
But with platforms like leetcode and more, the candidates are split into those who have had the time to prepare (and see the problem before) and candidates who can’t put in that time. And most modern interview processes do not give you credit for thinking through it. You only get through if you solve it the right way, and fast. There is no time to “think and solve a novel problem you’ve never seen before”.
Interview processes that do it that way are fundamentally broken. I wouldn't want to work at such a company. Interviews are two-way streets, and that would be a case of the company failing the interview.
The real benefit of this sort of interview practice is not learning whether or not the candidate can arrive at the correct answer. If we're at the point where this level of interview is even happening at all, you have (or should have) reasonable confidence that the candidate is capable of solving the problem. Whether or not that they do so in the interview is irrelevant. The real benefit is in being able to see how the candidate thinks through the problem.
While I think these whiteboard exercises are not the best approach, when I've been the interviewer at companies that required it, I chose a very difficult problem, gave a time limit, and told the candidate that I do not expect them to actually arrive at the solution within the given time. I just want to see how they approach the problem.
The problem with this is that many, many interviewers do this badly. They think they are looking for someone who gets the right answer. Or they are not good at assessing how other people break down and analyze problems.
It is artificial. All interviews are artificial. This is why everyone would prefer to hire candidates they already know a lot about. But a good interviewer isn't looking for a perfect performance.
Programming puzzles are the equivalent of the GRE for programming jobs. It might tell you something about the candidate, but I'm skeptical that it measures the most important traits and skills that really good programmers have. It's an empirical question, but your skill at solving programming puzzles might not predict your ability to systematically break down large problems, write readable, robust, well-documented code, or design and debug complex systems.
I also have my own theory about this which might just be my own personal bias. I feel like standardized tests encourage conformity in thinking. By relying on standardized tests, you are selecting for individuals who are primarily motivated by external measures of success and approval. On the flip side, you are selecting against individuals who are intrinsically motivated to learn and build things, but who may not care about those external factors. My theory is that organizations with too much of the former and too little of the latter are going to have issues with groupthink and will be less likely to be innovative.
Totally agree
The author suggests take-home coding exercises with "maybe ... a live call to discuss their solution" as giving a stronger signal than a live coding exercise.
The author also suggested to "present them with a complex technical problem that requires architecture-level decisions across various parts of a technology stack, and have them talk through how they would approach it" as a better signal than live coding interviews.
So even if what you say is true, you aren't presenting a devil's advocate position. Instead, it makes it seem like you think the alternative to live coding is no assessment, which the author very clearly argues is not true.
It's more that businesses don't know what type of programming they do and are just hazing people for no reason.
More like one person is just hazing people due to a strongly held subjective opinion like yours.
Don’t know the better option, just the limitations of this one
Since there is no CS equivalent of a professional engineering exam, employers have no guarantees that applicants will meet minimal qualifications, and so we continue to see an arms race of increasingly complicated whiteboard / leetcode interview processes.
Immediately after the time was up I magically started to realize how much I'd overcomplicated my one almost-finished answer and quickly came up with a much more efficient answer. Suddenly it started clicking how simple the first two problems were and how easy it would have been to crank them out if I hadn't been panicking about the tech bubble collapse.
The recruiter later asked me how it went and I grumbled something about how it was a leetcode test. He said "oh well they need to make sure you have the skills for the job." At that point I was over the whole thing and honestly pretty fired up.
Over the next week I proved to myself that I do have the technical skills for that job, and that's honestly what counts.
¯\_(ツ)_/¯
Fortunately, I interview well and can discuss many, many technical concepts deeply. But put me in front of a whiteboard and I couldn't tell you my mom's name.
- C/C++ on Win16/Win32
- Assembly language development with Z80/8051/ARM on embedded microcontrollers
- Java (core java, Servlets, J2EE)
- Ruby on Rails
- NodeJS / Javascript
- Worked with AWS tech (the full stack)
- Relational DB (MsSQL, PostGres, MySql), NoSQL db (MonoDB)
- Coded for Linux/Unix, MacOS, Windows 16/32, PalmOS, iOS
I can provide references for each of those skillsets from my past colleagues.
You know what - I probably couldn't pass half of the insane coding puzzles these interviewers throw out. Not because I can't solve them; I just don't remember enough of the syntax or library semantics off the top of my head.
Given my experience, can we just assume that I'm a competent coder (maybe not the top 1%, but at least the top 20%) and talk about the job and how I can contribute? I mean, it's almost insulting if you ask me to make a linked list/reverse a binary tree or other such nonsense while looking over my shoulder with a time limit.
If there was a way to verify what you say reliably, then of course that would be better. But there isn't, and writing down that list is extremely easy - in fact, half of the CVs I've ever seen look very similar in terms of the length of the list, amount of technologies mentioned, etc. Even for people with far less experience.
There has to be some way to check whether someone actually knows what they're doing. For sure some of the time a strong reference is enough proof. That's why people in the industry a long time with many contacts will often go from job to job without even interviewing anywhere - they just move to places with former colleagues that already know them.
But for a new place that doesn't know you, thinking it's insulting to show what you know is... weird. Is it going to be insulting on day one when you actually have to do the work?
> Given my experience, can we just assume that I'm a competent coder and talk about the job and how I can contribute? I mean, it's almost insulting if you ask me to make a linked list/reverse a binary tree or other such nonsense while looking over my shoulder with a time limit.
I might be inclined to agree had you not gotten all the names of the databases in your list wrong.
jeez, i would reject if they took more than 5 minutes
When I'm the applicant, I make it a point to take control of the narrative by saying something like:
> If it's alright with you, I'd like to treat this as an opportunity to show how I approach problems in general rather than how I'd solve this specific problem. I'll speak stream-of-consciousness as I go through it so you can get an idea of how I'm thinking about it. Feel free to ask questions if you'd like; I'll rely on you to decide whether it's more important to you that I complete the task or explain my reasoning. I'm happy to switch to pseudo-code or just discuss potential approaches if we run short on time.
When I'm the interviewer, I open with pretty much the same thing. My goal is to put the applicant at ease (to the extent that I can) and make it clear what I'm trying to get out of the session:
> First, let me say that it doesn't matter to me if you complete the exercise or not. At this stage of the interview process I'm confident that you're more than capable of solving the problem, so let's use this as an opportunity to get to know each other and see if the way we think about logic is compatible. I'd love it if you could point out things you'd change, but don't worry about trying to 'finish' or end up with production-ready code. It's just a means to an end.
i had one who could not deal with that at all. he preferred to show me some of the code that he had worked on. fine, i let him do that instead. i passed on him not because he refused the coding session but because we didn't communicate well enough for me to be confident that i could work with him.
as a candidate i would skip the introduction though and just start talking like i would when pair programming, mainly because i'd be uncomfortable to ask for permission first.
It’d be more akin to pair-programming, which some of the interviews I’ve conducted have evolved into, depending on the strength of the candidates.
Would that capture enough to get a sense whether this person gets the gist of the code being written such that they could replicate the same output in a less contrived scenario?
That job had a lot of people working in pairs very frequently so it made tons of sense to do it that way there.
there is also the problem that instructions given by the candidate may not be clear and then i have to decide whether i just assume what they meant (because i already know how to solve the problem) or if i try to take them literally leading to frustration.
What else are we supposed to do? Take the fact that you can talk a good game as enough of a signal to invest 10s of 1000s? Assume that everyone with 20 years experience is as good as everyone else?
The problem is that there are no reliable signals. Most developers I have interviewed have a massively inaccurate ability to judge their own ability (in both directions). I've lost count of the number of times candidates have promised that they can just learn whatever they don't already know and haven't been able to do it to any degree.
Qualifications are meaningful in some contexts more than others but most people in the UK don't have comp-sci qualifications.
So yes, I will use various coding exercises because, depending on the level, it shouldn't faze someone to be given something quite simple, and I get to see how they approach it (do they write tests first? Ask some good scope questions? Explain why they've done something the way they did?).
I have failed one of these tests in the past thinking I was a good Developer (I am!) but I don't blame the test or the process, I realised that my approach was haphazard and not an objective good look to an Interviewer so it was actually helpful.
Interviewer: "Gosh darn it, you're hired!!"
Whether this is actually the case comes down to the hiring manager.
If the hiring manager decides to optimize for performance in the coding interview, then yes, everything said here is true. They will typically hire people who can perform well and fast at simple coding-test-type problems above any other desirable attributes.
If however, they simply evaluate the ability to work through (at any reasonable pace) a fairly trivial coding task, but make the hiring decision on bulk of the rest of the interview, then it shouldn't be a problem.
The problem is most hiring managers have not been selected for their ability, or even trained, in interviewing. A coding test is easy to set up (or copy from somewhere), administer and evaluate. It is often literally the least they could do.
What this post describes is simply hiring managers who lack interviewing skills. Personally, I would probably want to avoid reporting to such a person.
People who cannot perform socially or technically on the spot in a completely unnatural setup are sort of left in the dust. I'm not sure I have a solution other than throwing out technical interviews and actually trusting people's prior work. I have work in the public record (published academic papers, patents) but none of that seems to mean anything in an interview, lol. It's only about whether you can perform right here, right now, for 1-4 hours.
I've said it on other threads. (1) Trust people's past employment, using background checks or something to make sure they actually worked where they say they worked. (2) Look at their body of public work if they have any. (3) Just hire the people who look good on those two metrics and you'll probably end up with a good employee 90% of the time and save countless hours and dollars. I'm almost convinced random choice + team vetting of resumes + a little background check would be just as effective as endless technical interviews.
Unfortunately this varies dramatically from person to person during a high pressure interview.
I interviewed a very senior developer once that actually used to be my current manager's boss. The coding question was pretty simple and most people didn't struggle too much with it. He BOMBED it. Like, you could have taken someone with 30 minutes of coding experience and they would have gotten as far as he did.
It was also very obvious he was just incredibly nervous so I put him through to the next round anyway. We hired him and he was great.
I agree, although this is why I feel a small project and participation in code review is a more realistic and useful gauge for real world ability.
Can the person understand the code base well enough to propose a sensible change, and how collaborative are they in the code review process? Do they document/comment their code, add tests, do they answer questions and take criticisms patiently, is the solution simple, does it work?
You'll learn about zero of these things watching them write a fizzbuzz in a shared editor window.
I don't think that's a fair statement. One of the alternatives, take-home coding exercises (with possible live call afterward), is a kind of "coding during the interview process", yes?
The essay more specifically concerns a particular type of coding evaluation, with a live evaluator who is unknown to you, in a setup that inhibits many normal problem-solving techniques (pacing, drawing notes, thinking silently for 10 minutes, etc.).
> It can literally be a simple task that one would expect any working software developer to be able to complete without too much fuss.
I think the two of you are in agreement even for that point. That is, the author seems to agree, writing "They likely do a great job filtering out people who are incapable of programming".
Yeaaaaah, that's not how any of this works. I've done 100+ interviews for MegaCorp. Who gets hired is ultimately a dice roll, no matter how "data driven" we call it. Did you get a good loop? Was one of your interviewers in a bad mood? Did you end up with a hiring manager who "used to code" and now measures everyone against whether or not they use outdated OOP techniques everywhere? Ah crap, did you get Gary? That guy sucks so much. He asks "hard" questions to make himself feel good.
The sausage is what you'd expect if you remember one key thing: it's humans on the other side of the desk. They're finicky and arbitrary creatures.
Grinding leetcode at least scales horizontally to a huge number of companies.
Homework is typically useless outside of the single company you're doing it for.
I generally refuse all takehome assignments unless:
1.) It sounds uniquely interesting and fun to do.
2.) The company is prestigious enough, or pays well enough, that making an effort to get the job is worth it.
It really depends on how you create the test and how you review it. You can prepare one where the actual solution is 10 lines of code and anything else is exactly for showing what the candidate knows / could do.
I used something like "read data from CSV, write it to sqlite, treat it like a mature production app, feel free to use placeholders (usage doc goes here), go nuts". Then the test was really about knowing about error recovery, encoding issues, documentation, error reporting, monitoring, ci pipelines, etc. Many badly organised tests don't make the whole concept of a take home test bad.
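To make that concrete, here is a hedged sketch of the kind of bare-bones starting point such a take-home might begin from (the file name, table name, and schema are invented for illustration). As described above, the test itself is about everything a candidate adds around this core: error recovery, encoding handling, reporting, monitoring, and so on.

```python
import csv
import sqlite3

def load_csv_to_sqlite(csv_path, db_path):
    """Load rows from a CSV file into a sqlite table.

    Deliberately bare-bones: a "treat it like a mature production
    app" answer would add encoding detection, per-row error
    reporting, logging, idempotent re-runs, etc.
    """
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS records (name TEXT, amount REAL)"
        )
        with open(csv_path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            rows = [(r["name"], float(r["amount"])) for r in reader]
        conn.executemany("INSERT INTO records VALUES (?, ?)", rows)
        conn.commit()
        return len(rows)
    finally:
        conn.close()
```

The ten-line core is trivial on purpose; the review conversation is about what the candidate chose to harden and why.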
No take home test is even needed. A simple conversation would let you know if they understand the importance of those things.
Having done take home before from the hiring side, it was incredibly time-consuming for us. We had someone anonymize the three finalist submissions, and then we had three people each individually review and comment on each one, and then we got together to discuss and choose the final candidate. Once we agreed on one, only then was it de-anonymized.
All-in, it took way more time than a single developer doing three live coding interviews. But my guess would be that most companies wouldn’t be willing to be that deliberate with take home.
Maybe programming in another few years will just be glorified autocomplete and little more.
And perhaps testing people on how to write code was a mistake to begin with. It's one thing to write code, but reading code is another.
That solves the skin in the game problem.
At least, that way you might get something out of it instead of nothing, although I definitely do agree that being paid for it is the only way to solve the skin in the game problem.
Of course, in order to assess the answers to these questions in an effective way, you need to be very knowledgeable yourself, and that's the real root of the problem here.
What's important is how people approach computational problem solving, not if they can write a solution in 1 to 45 minutes. Really, who cares?
One of my go-to examples is trying to work from home. Which is great, yet has its challenges. We have three German Shepherds. They are lovely. However, when a delivery arrives or the gardeners are out nearby, well, it's mayhem for a few minutes. I've come to understand that I should just take a coffee break when that happens. I can't even do mid-level math, much less focus on a difficult CS problem during those moments. And it takes a good 30 minutes before my head can be back on task.
Stop treating software development like performing art or athletes having to perform in the heat of a game. That is not what we do. At all.
Write a function that takes in a string and returns 1 if the amount of letters in the string is odd. It returns 0 if the amount of letters in the string is even.
If you can do this, we are good. I dont need NP hard or whiteboarding or algorithms. I can tell how good you are just by talking to you about your background.
"I can't remember of n % 2 returns 0 or 1 on even, so in place of that I wrote (if ((n/2)* 2 == n) for even, assuming integral division"
Another problem is that we're prone to thinking that being able to do well on tests equates to doing well in life and work--despite a stunning lack of evidence in support it.
After many years of sitting on both sides of the table, I've come down to this: "hire lightly and fire lightly". In practice, this means that beyond the (very) basics, hiring is not algorithmic; it's a crapshoot.
It's a real skill you'll actually employ. You're coming at the code cold, which is actually a realistic scenario you'll encounter on the job. Your ability to catch bad ideas and prevent them from getting literally codified is a valuable skill. And all of that is worthless if you're in a state where you can see a mistake, but are too afraid to speak up; this gets tested too.
It might not be so great for newbies and people fresh out of college, but even they should be able to read the code and discuss it.
The candidate would come to the office (when we were still doing in-person interviews) and meet a couple of the developers for a little bit of chatting. They would then be given a couple of printed pages of Java code with a few basic classes representing banking accounts with some typical methods for depositing, withdrawing, persisting to a database etc. (with all of the details stubbed out) - and they would be given these instructions:
Read the provided Java source code files and identify any problems that you can find.
The problems can include things such as actual bugs, design problems or code quality issues.
You do not need to search for syntax errors - the code does compile.
You have 15 minutes to find as many problems as you can.
The candidate would be left alone with the papers and a pen and would spend the next 15 minutes looking over the code by themselves. The rest of the interview would then be spent discussing their findings. Most candidates would find the obvious problems in the logic, missing null-checks etc., while trickier things like synchronization issues were missed by quite a few. Even though we had a list of all the bugs/issues that had been put into the code, the important part wasn't for a candidate to check off as many of these as possible - the important part was the discussion about the issues that followed.
After the candidate had told us what they found, we would start hinting about the remaining issues and eventually tell about all of them. How quickly someone would pick up on an issue when it was pointed out told us quite a lot. It was a way to get a feel for how the candidate thought and reasoned about code, without the pressure of them having to actually write code with someone looking over the shoulder.
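For anyone wanting to try this format, here is a hedged Python rendition of the kind of stubbed class described above (the original exercise used Java; the class and method names here are invented). The seeded problems are labeled for the sake of this sample; in a real exercise they would of course be unmarked for the candidate to find:

```python
class BankAccount:
    """Stubbed account class for a code-reading exercise."""

    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):
        # Seeded issue: no validation, so a negative "deposit"
        # silently withdraws money.
        self.balance += amount

    def withdraw(self, amount):
        # Seeded issues: no overdraft check, and on a shared account
        # this read-modify-write is not thread-safe.
        self.balance -= amount

    def persist(self):
        # Stubbed out, as in the exercise described above.
        pass
```

The discussion then runs exactly as described: which issues did the candidate spot unaided, and how quickly do they recognize the rest once hinted at?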
Throwing it through FindBugs is always a good option, and should find something.
The juniors spot obvious errors. The seniors also spot logical flaws and the conversation quickly moves into "how could this be done better, overall".
As a devops engineer (my main job), I also try to explain to interviewers that on any given day I am reading and writing half a dozen languages of vastly different paradigms, and although I'm very proficient in many of them, I definitely need to reference things that maybe shouldn't "need" to be referenced (confusing bash/python loop syntax is a common example for me). This rarely slows me down in reality, but it will definitely cause me to fail interviews I shouldn't.
If I was an interviewer, I wouldn't care if a dev knew whatever esoteric language syntax or API calls by heart. I'd just expect them to know how to use them intelligently. The former does not always imply the latter.
An impressive number of people still fail the interview despite the questions being pretty simple (in leetcode terms, probably on the easier side of medium).
YES! Yes, exactly. A consistent and unbiased process that reliably weeds out people who are very bad at programming is incredibly useful. I'm not convinced live coding interviews are either of those things, but assuming they are, they are absolutely worth the listed downsides. Do they filter out lots of great programmers along with the bad applicants? Yes, totally, and that's a major waste. But all of the listed alternatives are less reliable, less consistent, slower, and friendlier to cheating.
I would love to eliminate live coding interviews where I work. I hate the things. But I have never encountered a mostly consistent and kinda objective solution that compares. I was hopeful that the essay was leading to a proposal for one and disappointed once again at its lack. Please, someone tell me what the giant tech companies should use instead, and I will gladly throw these "please reverse this linked list" interviews into the trash.
and very unnatural
and insulting to an experienced person (who's been promoted, who's survived many layoffs in the past, etc.) with many accomplishments. Someone who has clearly solved and shipped, repeatedly. And one who has artifacts visible out in public, and has praise and testimonials from former bosses, coworkers, clients, etc.
This type of interview related blog post seems to hit the front page every week. Almost everyone agrees that interviewing doesn't feel good and seems overly complicated/difficult/whatever.
I hate that interviewing is a skill that has to be developed, in large part, separately from other engineering skills. I also don't really enjoy SQL/Database work. But I've gained enough competence in SQL that I'll be fine in most jobs. The same is true for my interviewing skills.
This format is popular because it's the best time/effort trade-off for both the company and the candidate. It's massively flawed, but everything else attempted so far turned out to be even worse.
Is programming weird because you can just ask someone to prove they know how to use a hammer? And so other industries just have to hire based on work history and/or bias "culture fit" during the interview? And they suffer terribly from people who can talk the talk but not walk the walk?
Or is programming weird because there's so much propensity for people to be able to talk but not walk? We rely on nerdspeak and jargon so much that just being able to prattle on in a dilbert-esque way would otherwise convince someone to hire you?
There's lots of other ways, but some of them suck.
Some fields attach a lot of weight to your alma mater, maybe they only hire people who studied law at Yale or Harvard.
Some fields require not just a degree, but also years of study under an industry veteran. Sometimes that also involves hazing like working 70-hour weeks, for some reason.
Some fields require work-sample tests where you show up at a given location and demonstrate your abilities on demand.
Some fields require not only a degree, but also years of working for free in order to break into paid work. And the paid work is far from guaranteed, that free work only pays off for 10% of people.
Some fields don't offer permanent employment, instead hiring people for much shorter periods - so bad hires can just not be rehired for the next project.
The candidate gets some code and a ticket description of what the code should do.
If he spots an issue, he can fix it right in the code or simply explain it.
I'd say that testing the ability to read and understand code is both less stressful for the interviewee (no "judging every keystroke" stress) and more helpful for the interviewer.
Works great.
In my experience conducting these live coding interviews, it's almost a universal rule that if the candidate starts coding immediately, they will waste a lot of time on irrelevant details, or take a long time to see that their approach can't work.
I always try to encourage the candidate to talk through their solution or draw a diagram if it helps them. Candidates who follow this advice always perform better.
To me this is the key to the whole problem. For many companies live coding interviews are a cargo cult approach to interviewing. Since the people that do them don't understand how to evaluate others they become a checkbox exercise that only evaluates if the candidate would tackle the problem the same way the interviewer thinks they would tackle the problem.
I refuse to take those interviews. I had one company that promised they wouldn’t give me a live test, and when I told that to the interviewer who was trying to give me a test he said “hm, well we’re going to do it anyway”
I passed the test and was given an offer which I shot down for the company which respected my terms
In the end the other company was left high and dry when they needed the new team member
Just do it. Enough of us see this for what it is. If enough of us act on what we think, we change the industry.
The other problem is after lay offs, if all companies ask you to dance then you have no choice. Dance or stay unemployed.
Obviously nobody wants to "dance" and nobody would if they had the choice.
Sometimes you're desperate and will do things you otherwise wouldn't. That isn't and shouldn't be thought of as the norm.
The author concludes with a decent summary of the issue (i.e., is this the best method among bad options?) but doesn't actually find a better way of doing things.
And in the million forum threads on this issue, no one else has either.
But I don't know that many people. Properly conducted live coding is the least worst alternative out there for complete strangers.
Works great.
There is a nice side effect to doing this as a coding interview: there are often obvious indicators that a candidate is a poor fit -- for instance, big knowledge gaps in the standard library of the primary coding language.
More important is how the candidate structures how they would solve the problem, and how they communicate it to the person on the other side of the discussion. Are they taking testing into account? Do they iterate rapidly, or have a monolithic solution they have a hard time conveying?
For lack of an effective whiteboarding solution with remote interviews, coding interviews are here to stay. They should be reframed, though, as focusing on collaboration -- not on being a leetcode champion.
The way we do these at my current job is extremely productive for us. We look for two things, apart from basic competency: problem solving, and asking for help.
It's structured as two 90-minute pair programming sessions. The interviewee shares their screen, and we work through the problems together. Obviously it's pretty hands-off, but we guide and nudge where appropriate. Here and there, when they use something that relates to a deeper topic, I'll ask questions to gauge how deep their knowledge goes, like asking if they know how C#'s foreach works under the hood. Not as a selection criterion, but simply to get a sense of how much they know.
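For what it's worth, the "foreach under the hood" question is asking whether the candidate knows the loop desugars to explicit enumerator calls (GetEnumerator/MoveNext/Current in C#). The same pattern exists in Python's iterator protocol, so here's a rough sketch of the idea in Python (the helper name is just for illustration):

```python
def desugared_for(iterable, body):
    """Roughly what `for item in iterable: body(item)` expands to."""
    it = iter(iterable)           # like calling GetEnumerator()
    while True:
        try:
            item = next(it)       # like MoveNext() + reading Current
        except StopIteration:     # like MoveNext() returning false
            break
        body(item)

collected = []
desugared_for([1, 2, 3], collected.append)
print(collected)  # → [1, 2, 3]
```

A candidate who can explain this level of detail, even imperfectly, is usually comfortable a layer below the syntax they use every day.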
Use of a search engine is openly encouraged. A lot of the time, we don't even care if the program actually runs. If they struggle with syntax or the correct function overload, we'll help them out after giving them a little time to find the solution.
I also throw in a problem designed to get them stuck, and ask questions I expect they can't answer. A good programmer asks for help and admits when they don't know. A bad one bullshits their way through.
We want to hire programmers who can do a real job in the real world. Implementing red/black trees on a whiteboard blindfolded isn't a job skill, it's a party trick. That's not something a programmer will ever need to do.
In the real world, real people use google and stack overflow. They don't have encyclopedic knowledge of the entire language's syntax. They ask their coworkers for help or opinions.
Our interview process is designed to show us how a person will function in a scenario as close to the job as possible. Because that's what we're hiring them for. We look for their ability to work through a problem with the resources that everyone always has. We look for how they work with others and how much they lean on coworkers.
This has worked out extremely well for us. We've hired some very talented individuals, and have totally avoided the archetypal shitty dev. The people we hire immediately mesh with the team, and learn and grow the way all programmers do.
Granted, we are a small company and we have the time to have our own programmers giving interviews. We also have a much higher need to be so selective. But every single person who has made it to the technical interview has remarked unprompted that it's the best interview they've ever had. And I mean 100%.
It's because we treat candidates the way we'd treat our own employees. They get to know what the job is like, and we get to know how they'll do the job.
First, how we got here. I didn't experience this firsthand, so I'm just recounting what I've heard and read from "old timers". My understanding is that before whiteboard, leetcode-style interviews became the norm, tech interviews were mostly unstructured and quite informal. In the really old times (pre-90s) you could get a job just by knowing how to use a computer, and I believe this wasn't that unusual in the 90s and early 2000s either. I still remember hearing a founder-CEO bragging that his interview process was a 30-minute chat with each candidate, and if he liked them, he would hire them.
From what I could gather, Microsoft was the first big company to start using these "coding challenges". They then became wildly popular thanks to Joel Spolsky [1], the publication of Cracking the Coding Interview [2], and Google, which made brain teasers world famous.
Second, why we are still here. I believe there is a huge cargo-cult factor: companies want to copy big tech, and alumni from these companies go on to found their own. This kind of interview has been honed and polished over the years, landing in a local optimum. An entire industry of websites and products has been built around it, and there are many entrenched interests. People might hate it, but the process works well enough for tech companies that they don't need to worry too much about it. Another under-appreciated factor is scale. This kind of interview sort of scales well, which is important when you hire at massive volume; that's why things like "have the candidate come to work for a day and pay them" won't work for companies that need to screen thousands of people a week. Lastly, a standardized and well-known process introduces guardrails that avoid some obvious pitfalls. The book "Working Backwards" explains how the Bar Raiser program was created at Amazon after a bad senior-leader hire used the unstructured approach to hiring to build an empire misaligned with the company. At a big enough company this is bound to happen sooner or later.
Third, where do we go from here? I find it extremely unlikely that existing big companies will change their methodology any time soon. It might not seem like it, but for a company like Google it would be a massive undertaking to overhaul their hiring process. It would take years to reach the level of efficiency and effectiveness of the current system, and surely there would be tons of opposition.
I believe the only way forward is for new companies to experiment with other methods of hiring, particularly at the beginning when they are nimble and can experiment freely. As they grow they will face challenges scaling, polishing and standardizing their process. At some point they will become the next generation of Big Tech, and the cargo cult wheel might spin again.
In any event it seems we need some sort of structured approach to hiring where we assess the match between company and candidate.
[1] https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guid...
Regarding point 1: big companies also need regular, standardized hiring processes for anti-discrimination reasons and to simplify comparisons across larger numbers of candidates. Bureaucracy tends to follow from involving large numbers of people.
1. Tech phone-screen: work through a simple exercise relevant to the job (no leetcode stuff, and the code didn't need to run, i.e. it could be pseudocode).
2. Take-home exercise: where I got the chance to write quality code in peace, plus a technical design doc describing my process and findings. I'm certain the take-home made up 80% of the decision.
3. On-site: chatted with both engineers and non-engineers about the exercise, and how I would've approached new hypothetical but real-life requirements.
Everything past the take-home exercise mostly revolved around that exercise. If, say, I had cheated on the whole exercise, I guarantee it would've been immediately evident for many reasons – including the fact that, while working on the exercise, I discovered and reported a bug in their own production software.
Sure, you can often tell the folks who know nothing when asking them to explain the code, but it's increasingly difficult to tell a great engineer apart from a mediocre one who's cheating, and once you start hiring cheaters the toxic effect on culture sets in fast.
"So at that point, do they want to see you muddle through it, or would they rather see that you know to have ChatGPT run through the initial pass and then refactor?"
"If a company is evaluating engineers with questions that can be easily answered by AI in seconds, what are they really evaluating for? Perhaps they’d be better off hiring a chat bot."
I would hope that all SWE candidates would understand software licensing to some extent and how to behave in a way that would not put the hiring organization in a legally risky position.