Were algorithmic "gotcha" questions still asked? Were the questions easier? Was the bar just higher?
I'm also guessing there was no 'coderpad' or 'hackerrank' - was everything just done on a whiteboard and pseudo-coded?
People were super harsh about attention to detail, e.g. you had to write code that would compile on the whiteboard and you would get penalized to some degree for making mistakes, especially if there were enough to make it clear that you hadn’t really been using the language you claimed to be proficient in.
The book isn't useful anymore for interviews, but it is a fun read if you like brainteasers.
I’m just bitching about having to code a binary search on a whiteboard at Google, when I had previously written a search engine and indexed two countries, interviewed by a man who wears a cowboy hat to work. (To be fair, it was a nice hat.)
I didn’t get the job; they claimed I wasn’t good enough at coding. Wtf! Well, f*k them then.
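For readers who haven't had to do it on a whiteboard: the exercise in question is small. A minimal sketch in Python (the function name and inputs are my own illustration, not anything from an actual Google interview):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1
```

The classic whiteboard stumbles are the off-by-one choices: `<=` vs `<` in the loop condition, and `mid + 1` / `mid - 1` vs reusing `mid`.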
Unfortunately a lot of tech is far too biased towards a small set of learnable skills that are sometimes hard to demonstrate in an interview setting, and the result is that a lot of good people don’t get offers for jobs that they could totally crush in.
(And yes, we did these on whiteboards.)
It wasn't that long ago that Joel Spolsky proposed fizzbuzz as an interview screen...and he really meant that you should ask that question. The idea was that you demonstrate that you can write code using a simple test, then move on to more important factors. Can you imagine such a thing today?
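For anyone who hasn't seen it, fizzbuzz really is that simple; a sketch in Python (returning a list rather than printing, so it's easy to check):

```python
def fizzbuzz(n):
    """Return the fizzbuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The point of the screen was never the code; it was that a surprising fraction of applicants couldn't produce even this.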
Where are you seeing LC hard questions? I’m in a fairly big Slack where people have been comparing notes on interviews for years. The number of people who actually get LC hard problems is extremely small.
A lot of people will claim they got LC hards immediately after an interview when they couldn’t solve them, but when you ask them to describe the question and someone looks it up, it’s always a medium.
There were also sites where people could report recent LC problems they received from specific companies. Hard questions rarely made the list of commonly asked questions.
The only exception seems to be people interviewing at certain companies in India for some reason.
In fact, I had a rather hilarious circuit where one interview hit me with a leetcode hard that I couldn't solve, and I worked out the answer when I got home. Later, I went to another interview that asked me the same question. They thought I was a genius.
I remember a fad that year was asking questions about doing depth-first search in 2D grids. Once you learned that core trick well (mainly handling the boundary conditions and traversals without messy code, which takes practice), about 80% of interview questions opened up to you. It was crazy how many companies were asking variations on that core question. So you'd do a few interviews naïvely, bomb them, figure out the answers to the fads of the day, and be well prepared by about your 3rd or 4th go-round.
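The core trick looks something like this. A hedged Python sketch of my own (the usual variant counts 4-connected regions of 1s in a grid); the clean move is folding the boundary check, the zero check, and the visited check into a single guard at the top of the recursion:

```python
def count_regions(grid):
    """Count 4-connected regions of 1s in a 2D grid (list of lists of 0/1)."""
    if not grid:
        return 0
    rows, cols = len(grid), len(grid[0])
    seen = set()

    def dfs(r, c):
        # One guard handles out-of-bounds, zeros, and already-visited cells,
        # so the recursive calls below need no pre-checks.
        if not (0 <= r < rows and 0 <= c < cols) or grid[r][c] == 0 or (r, c) in seen:
            return
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            dfs(r + dr, c + dc)

    regions = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                regions += 1
                dfs(r, c)
    return regions
```

Once you have that skeleton down cold, the fad variants (islands, flood fill, word search, maze reachability) are mostly reskins of the same traversal.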
Idiocy, all of it. Even leetcode medium is too much to be pushing for a correct answer in an interview [1]. It's Kabuki theater for engineers, where the interviewee is pretending not to have memorized the answer, the interviewer is pretending that the interviewee hasn't memorized the answer, and everyone is pretending that this is a "signal" that matters.
[1] Assuming that they haven't seen the question, of course. I will say that a question of that difficulty level can be useful to push someone to their limits for other reasons. But these days, it's mostly just a pass/fail screen, and you do the leetcode medium perfectly while tapdancing backwards, or you don't move to the next round.
I realize this isn't quite your point, but the GP also brought up that not too many years ago fizzbuzz was the literal test. There's a pretty big jump from fizzbuzz to leetcode hard. In fact, *any* leetcode is a jump up.
While not quite the same issue, one thing I've found is that I want to see more LOC from candidates. And a lot of the algo questions don't lend themselves to that. What I really want to see are their habits, style, and preferences.
In normal hiring, i.e. LinkedIn, or even college hiring where they look at resumes first, this nonsense rarely happens.
The other half quickly solve it and we move on to better stuff.
I've never been fond of whiteboard technical interviews, which used to be the norm; I really struggle to draw and talk at the same time. I do fine in interviews normally. I tend to be more of a delivering-value-for-the-business kind of developer and strongly emphasise this in my interviews.
I'm not writing operating systems, and neither are most of the other people I work with and hire.
These days, I don't do leetcode. If a company insists, I walk away. I have better things to do with my time than memorise a bunch of useless information (for some value of useless).
I'm mostly on the other side of interviews now and firmly believe that you can only get a limited set of signals during an interview. The "Thinking, Fast and Slow" view is that we're not very good at evaluating people from a gut feeling.
You might think this would push me towards leetcode and other quantitative measures; however, I'm much more interested in working out whether you can be good on a team. The last thing I want is an asshole 10x engineer who makes everyone else unhappy. If you can't actually program, I'm going to work that out by watching your PRs, and you won't pass probation. I'm not suggesting I do NO checking in the interview, just that I put limited stock in what can be gleaned during this process.
Leetcode problems are almost useless for determining what matters in a professional engineer. But the stuff that really matters (communication skill, clarity, patience, flexibility / lack of dogmatism, taste, constructive criticism, political savvy, prioritization of constraints, willingness to write documentation, reading code and finding bugs, etc.) is not something a new-career engineer a few years out of an undergrad CS program can competently evaluate, because they're probably not very good at these things themselves. And in this industry, that's largely who is doing the interviewing.
But the cargo-culted nonsense before leetcode was the brain teasers. "How many ping pong balls can fit into a 747?"
Also nonsense.
But are we accounting for their comfort and making sure they all have TV screens? Is it okay if some of the ping pong balls get damaged as we try to fill the aircraft? Are we completely emptying the aircraft or leaving the seats in? Does the plane still have to fly, and if so, which areas have to remain accessible under FAA regulations? What is the weight of a ping pong ball, and with enough of them, does the total come close to the allowed take-off weight?
Some of how you respond to these is an indication of how you will respond to challenges and frustrations in the team.
Also, a famous question around that time was 'why are manhole covers round?'. (Thought there was a book from that era with that name but not seeing it; only found a newer book.)
I remember doing interview prep at the time by trying to memorize the answers to a bunch of these puzzles. Leetcode is useful in comparison to that. I had an interview once where the interviewer could tell I knew these, and he asked me to tell him about others I knew.
The thing is, at the time there were no books of commonly asked interview questions, no online leetcode lists, etc., so the assumption was that the applicant was basically going in cold, and there wasn't an expectation that they would be able to instantly solve the problems.
It was more important for them to talk out loud so that the interviewer could observe their thought process.
Also, topcoder started in 2001.
All the other companies did the same as the non-leetcode companies do today. IQ test type things ("next shape in the sequence"), Fermi problems ("how many piano tuners"), behavior questions ("tell me about a time when..."), code review ("look at this [bugged] code, what improvements would you suggest"), systems analysis ("explain the sequence of events that happen when..."), systems design ("draw an architecture diagram of a project you recently worked on and explain it") etc.
Still a great read. The base idea is that they're smart enough to solve problems and they get things done, rather than just blabbing about ideas they won't implement. So they'd write code to prove they can get things done.
Brainteasers were okay at testing the former, but bad at testing the latter. This somehow evolved into leetcode culture or take-home assignments.
Hunter & Schmidt 1998 [1] pretty much boils down to this. Their meta-analysis, predating Spolsky, winds up stating that the only useful combination is work samples plus a general mental ability (IQ) test. Never mind the order: that smells a lot like "are they smart and can they get shit done?"
For instance: What does 'printf("hello, world\n");' do? Obviously, it prints something, but how does it do that? Pretty quickly you're talking about includes, macros, libc, linking, machine code, system calls... One question can easily fill an entire interview slot.
The fun thing is there's no "right" answer. Nobody is expected to know everything about how software works, but everyone is expected to know something. This format gives the interviewee the opportunity to show off what they know best, and the interviewer gets to pry in to see how deeply they know it.
I'm a low-level guy so that's the direction I tend to probe. Usually someone else asks a similarly abstract high-level question. One of my favorites is: "Design a parking garage". Again, there's no right answer. It's a prompt for the candidate to show what they know. Very quickly they're coming up with functions and class hierarchies and/or data structures for vehicles, spaces, turnstiles, payment kiosks, figuring out how to pass them around, etc. The interviewer has plenty of opportunities to pry into design tradeoffs, add complications, and so on.
The grand idea is to have a deep conceptual discussion instead of just seeing if they can write a few lines of code. This also demonstrates how well they can communicate. The catch is you have to be sure they give some actual concrete answers in a few places, and aren't just fast talkers.
Maybe?
> The fun thing is there's no "right" answer
And this is the key. Keep an open dialogue. Always probe a layer or two down. Don't enter with preconceived notions of what's "right". It turns out very few things in our field are binary (har har).
Instead - can they talk shop? Can they demonstrate that they didn't just read some crap from a blog? Is this from experience? Do you like them? Do you think they'll get along?
We used to do a take-home assignment (which probably LLMs have ruined now) and then extend it further in the pairing interview. There was no one right way to do the assignment. Different approaches (functional, object oriented, tdd/bdd) would all become part of the discussion.
Especially with recent advances in AI assistance, it becomes more and more crucial to learn fast and to be able to apply knowledge to actual problems, no matter where the knowledge originates. That, solid bug-hunting capabilities, and a good understanding of the big picture and the business problem you are trying to solve.
I'd even argue that nowadays communication skills are much more important than any memorized knowledge about algorithms or a given technology. Communication is what makes you a successful developer, ironically - even when prompting an AI assistant.
Fun fact: 10 years ago I had an exam-like test sheet that I handed out to candidates, giving them one hour to fill it out, including paper-based coding. It makes me seriously embarrassed to think about that with today's experience :) ..
If I were doing it all over again today, I'd skip the whiteboard and bring along a laptop loaded with our compilers and toolchain and any supporting libraries needed to solve the problem. I'd mirror it to the screen in the interview room so we could discuss the solution as we went.
In web dev interviews, in the UK, the bar was much lower. It was rare to be asked anything that resembled a leetcode question. Everything was focused around the technology stack you'd be using, the culture of the company, and 'agile' (before people called it that).
But as I remember it, they were always presented as "show you can think" not "know the answer".
Change hasn't been too drastic since I entered the business in the late 1990s.
Salaries used to be less inflated, so hiring interviews were a little more lax, because hiring a $60K/yr developer is much lower stakes than a $150K/yr developer.
You were also competing against fewer candidates. "Back in the day" before WFH they might have had 10 or 20 applicants instead of 200 or 2000. So there was less automation and more human factor.
There were often still coding exercises. Your portfolio/work history mattered, as it does today.
I think there was much less awareness of "software engineering" (sustainable, scalable processes like source control, CI/CD, etc) as opposed to like, "just hire a guy who is smart and writes code good."
A lot of programming jobs at smaller shops were really kind of like hybrid sysadmin/coding jobs. You might also be fixing peoples' printer drivers and shit, in addition to coding reports and data imports or whatever.
The business side was along the lines of 'what value can you add', and the technical side was more about describing problems you had discovered, communicated about, and fixed, plus details about the methodology or APIs involved.
I rejected interview requests with google et al knowing of the time wasting LC approach. I think most developers are there to solve real world business problems, not rewrite an OS, although actually that is currently my hobby project :)
With AI assist, the focus on solving business problems comes back to the forefront, and the LC can be done by the AI for the not-so-great coder. Or, as was mentioned here recently, coders will soon be relegated to AI reviewers; what a horrible thought.
In summary ;) I think I'd be interviewing programmers focusing more based on business domain knowledge at this point.
The short answer to your question (and to try not to start up another HN thread on how good or bad differing interviewing techniques are) is that we did things much like most recruiters and hiring managers still do across the world. I think the big difference is more around getting to the first interview. Technology means there are more applicants, more automated filtering (bad), and more remote interviews than there used to be. The funnels to get candidates may have changed, but in my experience, once in the final stretch, things are pretty much as they have always been.
I find it interesting that there’s an assumption that existing in this world as a professional software engineer, successful by all reasonable measures, somehow presupposes exposure to leetcode.
I’ve never used it for interviews not because I put energy to avoid it, but because I don’t think it’s all that popular, or maybe I just don’t interview often enough. If I want a new job I wait until I’m emotionally done with the one I’m at and none of the places I have ever interviewed have used it. What, is it that if you’re not using it today then you’re somehow “behind”?
I don’t understand this post at all. What a loaded assumption. Does it do something only leetcode can do? Is it some holy grail? I’m just burnt out on this tenor of the community here, as if any of these platforms sit as some kind of hegemony over “the engineering scene”. Yawn.
An interview is and will always be a balance of your technical skill and your ability to present your work and deal with timely feedback. That’s it.
So for people who wonder wtf this post is about, you’re not alone.
An interview is whatever the interviewer wants it to be. Often it’s a probing of your knowledge of data structures and algorithms via Leetcode style questions. Like, very often.
How many times have you interviewed in the last 10 years and for what kinds of positions?
That's the ideal case. However, reality can be very different. Some companies reach for leetcode-style questions because they don't have a clue how to do the interview process the way you describe it.
> What, is it that if you’re not using it today then you’re somehow “behind”?
That is not something you get to decide. The hiring manager will judge you as behind if you can't answer leetcode-style questions. This is not a question of knowledge, but a question of power.
My second job was similar, actually, only this time it was their awful proprietary word processor (a Wang clone, I think) and its macros and PL.
Then for a big-name computer company, a full day of IQ tests, personality tests, 3-4 interviews, whiteboard pseudo-code, team meets, etc - but all candidates processed in a single day and an answer a few days later (I got the job) - not the same as today where it's spread over weeks or months and you might be ghosted at any stage.
Explain, in as much detail as possible, what happens when I click a button.
——
Around 2004, I recall college friends talking about an exam Google published. I think one question was: What is DEADBEEF?
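(For the curious: DEADBEEF works as a question because it is a valid hexadecimal number that happens to spell words, and it has long been used as a "magic" marker value in memory dumps. A quick check in Python:)

```python
# 0xDEADBEEF is legal hex because only the digits A-F appear as letters,
# which is why it shows up as a human-readable marker in memory dumps.
magic = 0xDEADBEEF
print(magic)       # decimal value of the marker
print(hex(magic))  # round-trips back to the hex spelling
```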
——
In 2008, the book "Cracking the Coding Interview" was published; it is like leetcode, but the problems are well formulated.
At least in Brazil, some companies still use this approach. My current company does for high level positions, as well as another one where I was interviewed. They include two or three architecture questions, though.
I know it’s unpopular to say here, but I found studying data structures and algorithms to get ready for job interviews made me a better developer.
"How did you do it?"
"What problems did you encounter?"
"How did you solve them?"
It's not hard to recognize an impostor.
And definitely lots of algorithmic questions.
Better questions than leetcode, though, because they were usually grounded in an actual problem the interviewer had.
(You can approach "how they think" as a question with strict rubrics fwiw, but it's a somewhat uphill battle. And not one you can win as an M1, usually)