In my email I have an "interview prep packet" from them that essentially tells me to brush up on algorithms and read Cracking the Coding Interview to prepare for their interview process.
I'm fairly happy in my job. If they offered more money or a really interesting project I'd consider working for them. But I'm pretty lazy about redoing my college algorithms class during my free time at home just to go work there, so I probably won't.
There's an opportunity cost with interviews like this, where an M.S. and a long career of getting shit done count for very little, while memorization of undergrad-level topics (which you can look up in two minutes in Knuth if you ever hit a problem that requires them) can make or break an interview.
I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews.
That's who they want to hire at the end of the day: coders who don't get too critical about their job and do what they're asked to do, even if it is repetitive, stupid, and doesn't really make sense (such as re-studying algorithm implementation details for two weeks before an interview when they can be looked up super easily online).
I'm pretty much the opposite of what you think, so if my desire to study for the algorithm interview is your litmus test for that, it kinda proves my point.
Not everyone that would be good for Google has a burning desire to work for Google. Google might want to consider that.
Spot on. Certain companies ask silly questions and have candidates perform tedious exercises, then, in the debrief, check whether the candidate complained.
(disclaimer: I worked for Amazon and never saw this pattern in the company)
some coders that don't get too critical about their job
Couldn't be farther from the truth. As someone who works there, I've noticed this general trend of animosity towards Google. It seems to be fueled by people feeling a sort of inferiority complex when they don't clear an interview, or when they feel they wouldn't be able to crack it if they ever gave it a shot. This leads to overcompensation in the other direction: a sort of "sour grapes" narrative where there's an effort to downplay the prospects of working at Google, or to ridicule the people working there, like you did just now.
Well, if that was the goal, boy did we ever fail to deliver on that one ;-)
What happened was that the interview questions inevitably got leaked and accumulated (this is what happens on a highly indexed, centralized internet), and it became a race to the bottom for candidates.
Imho the test may have worked in years 1-4 of Google, but it no longer flies now that college kids spend an eternity studying the questions at home. It's like the SATs all over again for these kids.
I still think these tests are generally a good measure of ability. If you somehow got through CS without being able to even vaguely explain how to 3-color a map (not talking a perfect answer here, just a vaguely correct intuitive explanation), even after 15 years of work, something isn't right. These sorts of questions will definitely weed out your local web-dev baddy, or even dev-bootcamp baddy, which Silicon Valley is starting to be flooded with now.
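For what it's worth, the "vaguely correct intuitive explanation" being asked for boils down to: try a color for each region, and backtrack when neighbors clash. A minimal Python sketch (the adjacency-dict representation is my assumption, not anything from the comment):

```python
def three_color(adj):
    """Try to assign each node one of 3 colors so no edge is monochrome.

    adj: dict mapping node -> set of neighboring nodes.
    Returns a node -> color dict, or None if no 3-coloring exists.
    """
    nodes = list(adj)
    colors = {}

    def ok(node, c):
        # A color is usable if no already-colored neighbor has it.
        return all(colors.get(nb) != c for nb in adj[node])

    def solve(i):
        if i == len(nodes):
            return True  # every node colored successfully
        node = nodes[i]
        for c in range(3):
            if ok(node, c):
                colors[node] = c
                if solve(i + 1):
                    return True
                del colors[node]  # backtrack
        return False

    return colors if solve(0) else None
```

A triangle of regions gets a valid coloring; four mutually adjacent regions (impossible on a planar map anyway) correctly comes back as uncolorable.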
I'm a big fan of not wasting time, which is why I get stuff done at work and have been promoted twice at my current company in the past 3 years. As a sibling commenter suggests, if Google wants employees who blindly do what they're told, then I wouldn't be a good culture fit. I was taught critical thinking skills in school. Respectfully questioning my superiors' plans from time to time has been a valuable skill.
And this is the big thing I've noticed about many Google apps (especially dev tools): they are all incredibly (for want of a better word) hacky. And the documentation... while I can very much appreciate that they are serious about it and work very hard, it's a huge mishmash of marketing speak ("It washes your socks and makes dinner for you!") with the crucial information you need hidden away in secret corners. I literally have to use Google the search engine to navigate any of the documentation.
I don't mean to be so negative. It's not that it's so terrible in reality, it's just that it's not anything particularly great. I literally can't think of a single offering they have that I would aspire towards. There are lots of smaller, hungrier companies making much better products. And by extension, I think if you work at those smaller, hungrier companies you have a better chance of learning more, becoming a better developer and even being happier with your job.
So the thing is, unless I'm alone in my assessment, I think the main things people are looking for from a job at Google are status and money. Additionally, I think a lot of young people believe that talented developers mostly work at famous companies. In my experience, this is the opposite. Back when Microsoft was the powerhouse, I knew a lot of MS developers. Some were amazing. Most were average (as you might expect when a company is hiring thousands upon thousands of developers). However, you had a much better chance of actually working with someone amazing if you got a job at a small shop. I still think this is true.
Personally, I don't complain about the "You have to be this clever to ride" interviews. Yes, they self-select for people who are not like me (bulldogs who work away obsessively at problems until they are solved, without using any particular magic insight). Yes, it means that I'm unlikely to get paid at the very, very top of the payscale (heck, if I wanted to get paid more, I would have been a lawyer -- I want to write code). It just means that it's that much easier for me to find companies that are a good fit for me. I don't really see a problem with that.
Working for Google is the only way to get bugs in Google products fixed, or your feedback even listened to. ;)
with all the android privacy issues researchers keep uncovering, i feel like Google has driven the fraction of the labor pool that cares about protecting user privacy directly away from itself
> they are all incredibly (for want of a better word) hacky.
yeah, i haven't seen much of anything to feel inspired by either (although I do think Google Maps team has put out a really solid product). but i have to deal with the Android SDK and other google libraries for android, and "hacky" seems like the best single word description for that stuff, IMHO, too.
what makes me laugh about the android documentation is that even though Google's mission is to organize the world's information, the best android documentation and guidance i can find is organized by Stack Overflow.
Well, the interviews are marginally effective at identifying new CS grads (who also are cheap and have few outside commitments - companies love this!) but not much else. You're exactly right to view it as an algorithms final exam or an algorithm puzzle contest, but Cracking the Coding Interview is an embarrassingly bad book, in spite of (?) its ostensible purpose of helping people study for algorithm puzzle interviews at Google, Facebook, etc..
I like how you made this bold claim with zero justification as to why.
I love this comment.
I would say they probably made the decision we all make during development work: that fixing this N+1 query, or that bubble sort, or that API package, doesn't have time in the budget.
Getting to market is sometimes more important to managers than optimally `correct` code, when the market isn't willing to pay for the services of `correct`.
Haven’t had a chance to hire any ex-FAANGers yet, do they compare well to the general market?
In short, I think the exact same question is interpreted as a cool algorithm challenge or a recall check, and the interviewer will be fine with you rederiving the answer on the spot if you are quick thinking enough to do so.
They seem to have no lack of talent and it certainly doesn't seem to be negatively affecting them. I wonder if this will change eventually, and Google will become IBM v2.0 (that is, a respected and profitable company which is mostly boring/unexciting).
Once upon a time, skill at doing these sorts of problems might have correlated (imperfectly) with general aptitude as a programmer or software engineer. But the very act of trying to leverage that correlation for hiring purposes probably also made it go away. Now you've got a whole lot of people practicing hard on these sorts of problems, spending huge chunks of their free time grinding away on Project Euler and Advent of Code and HackerRank. That muddies the quality of this stuff as a proxy for what it was originally trying to detect: natural aptitude. I'm guessing having time to level grind like that also correlates inversely with other traits that are desirable in a programmer.
In Silicon Valley I have a fair number of colleagues who expect any dev to spend a fair amount of their free time grinding on even more dev work.
No surprise that there is a lack of diversity in the profession as a result.
The relentless scepticism about people's achievements is to some extent understandable (we've all run into the senior person who can't do fizzbuzz), but it ties neatly in with the idea that every new hire should be 25 at most.
Edit: By "over 50," I mean age demographics in general, which is the main thing that raises your family obligations. The only demographic division that I can think of that would reduce someone's willingness to abandon their personal life would be age.
I am not sure I follow. For people who want to, there is a section where they can find answers even if they have not solved the problems. What was 'wowzers' about HackerRank before you learned that?
It’s really a travesty that we can’t teach it the same way that we teach maths or natural languages.
The end result is we’re left trying to divine whether someone is the programming equivalent of being illiterate. As with illiteracy people find ways to fake it.
As an aside, as someone who also studied linguistics [1] (and a couple of foreign languages at a beginner level), I am very confident in saying that our approach to teaching natural languages relies almost entirely on natural aptitude. It is just that, absent a serious mental disorder, all humans have a very large natural aptitude for natural languages.
[0] Probably because we are absolutely terrible at teaching it.
[1] I assume not what you mean by teaching natural languages, but the topic of language acquisition (including in adults) does come up; plus it gives some perspective on how teaching language would look if we didn't rely on natural aptitude.
If a developer-to-be doesn't understand the framing context of what they're doing, it's like being dropped in a lake with no sense of direction.
It's why all the "naturals" started as geeks who played with computers from a young age. You learned about the environment you'd end up working in, so later on, when you hit the grindstone and actually started creating gears to stick on that machine, you had an idea what the result should look like and knew the tools in the shop when you set out to start building it. Even if you didn't know the steps involved in the process, you were familiar with the environment.
People who haven't spent time engrossed in computers drop out so fast: the myriad youths entering a CS 101 class thinking it's an easy career, when the most exposure to tech they've had is maybe updating their phone, using the Facebook and Twitter apps, and perhaps owning a video game console with no tinkerability, a total black box. Their professors lead them to an anvil and tell them to forge a steel rod without any wink of an idea what a hammer is.
It's just not an answer anyone wants to hear, because the solution amounts to requiring an entire degree's worth of learning before the actual study of programming. But you don't want your brain surgeon to go to medical school having never studied high school biology, or, even more generally, having never learned to read.
edit: now if you'll excuse me, I need to do some dynamic programming problems.
Ultimately, it's just studying for the test, very much like the ACT/SAT in high school. You can be great at taking tests but ultimately a terrible student, or vice versa.
On the other hand, I genuinely do feel theory is really important. While not knowing the minutiae of Timsort doesn't indicate that you'll be a bad engineer, not knowing the runtime efficiency of a sort can lead to some really awful code. Not knowing when to use a hash table instead of a nested for-loop can be a sign that you don't really know what you're doing, and not knowing some rough theory on concurrency indicates that I might be stuck debugging your race conditions or deadlocks.
I try to not be a complete jerk and I won't do stuff like give out an NP-complete problem (which an interviewer gave me once), nor will I ask for intimate details of how one would implement CSP, but I do tend to focus on theory-heavy questions more than my peers, but I try to give a fairly-generous amount of hints so that people don't get too stuck.
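To illustrate the hash-table point above, a toy Python comparison (a made-up duplicate check, not any specific interview question):

```python
def has_duplicate_quadratic(items):
    """O(n^2): compare every pair with nested loops."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n): one pass, remembering what we've seen in a hash set."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Both give the same answers; on a million items, only one of them finishes before you lose interest.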
How do you know?
I understand that FANG have themselves come to the conclusion that brain teasers are not necessarily very predictive for engineering performance.
But CS/programming questions for CS/programming roles? That seems sensible.
You can study lots of vocabulary to achieve better results on the verbal section of the GRE or similar tests. But afterwards, you will, in fact, have better vocabulary, I submit.
I am surprised that you categorically deny that they were better engineers after studying CS/programming questions.
There’s interview cake and leetcode, but I think people would pay $2k for a class that focuses on the questions and in person whiteboard practice.
They could collect information about the interviews at the major companies and then use those to create the program. For payment could also help candidates negotiate and then take a cut of the signing bonus.
If this isn’t part of lambda school already I think it should be.
I already work at a competitive tech company, but I’d sign up for this in a second to help me stay competitive for interviews.
Worse, there's this huge emphasis on "big O" with zero focus on clearly egregious practices (tons of copies, outrageous memory usage, casting between strings and numbers all the time). I've rarely seen clearly bad algorithms get deployed, but I have seen plenty of unperformant code go out.
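A toy Python illustration of the kind of thing that passes a big-O check but copies like mad: both functions below do "n appends," but the first one copies the whole accumulated string on each `+=` in the general case (CPython sometimes optimizes this away in place, which makes it even sneakier to spot):

```python
def build_report_slow(lines):
    """Looks linear; each += can copy the entire string so far (quadratic copying)."""
    out = ""
    for line in lines:
        out += line + "\n"
    return out

def build_report_fast(lines):
    """One pass, one allocation pattern: join does the assembly for us."""
    return "".join(line + "\n" for line in lines)
```

Same output, wildly different memory traffic on large inputs.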
Probably biggest influence on this I think is the culture, if nobody tells you that your code is shit (in a kind way) you'll never learn better. But then again, some people are so hard-headed and full of themselves that they get defensive and never actually admit their faults. Though giving and receiving feedback is not easy, can give you an identity crisis once you realize you've been doing something wrong for years.
And then you have a bunch of Ivy League graduates who have spent years learning algorithms and are burning to use them but there's no actual problems that really need them.
No matter your background, what school you went to, or what randomized experience you got in previous jobs, every person has equal opportunity to study and practice the same algorithms on their own (as opposed to being lucky enough to be able to afford a top-tier $$$ CS education, or to being lucky enough to have the connections or chance to get certain previous jobs).
And thus, when applying to jobs, it becomes something more akin to a raw-ability IQ test, which you can argue is "fairer", especially when management realistically knows developers might be shuffled around all the time, and that the extensive SQL experience they were hired for will mean nothing when project requirements switch to a basic key-value store.
On the other hand, if you are interviewing for a highly specialized position that is fairly certain not to undergo change, then it makes sense that specialized experience could rightly count for far more than any kind of generalized intelligence or ability.
Well, that's just not the case - there are many groups of people who lack the opportunity to study and practice. A couple of examples: people with kids, people working 12-hour shifts, people without access to teaching materials, people without a sufficiently advanced machine to run dev environments, etc.
testing whether someone is willing to prepare for a thing is a relevant work skill test too.
Managing complexity is a valuable skill that should also be screened for at interview. At most top companies you are there for at least 4-5 hours so there should be plenty of time to evaluate that skill.
I think whiteboard questions are good, I want to know that this person is capable of writing difficult code if we need them to.
I also think we probably ask too many of them.
I have been on the hiring side, and my experience so far has been that the feedback is almost always close to identical across multiple whiteboard questions. The questions are also so abstract that asking multiple to "prevent bias" seems ineffectual. What bias could there be? You either solve the problem or you don't. Bias is more likely to come in on the behavioral interviews. There should be multiple of those for sure.
Generally someone is either a good enough coder or not and they will display that consistently across all the interviews. You will see the same stuff throughout (good or bad variable naming, good or bad communication etc) thus asking > 1 coding question by default is a waste of everyone's time.
I've worked for two non-technical companies as a software developer and one highly technical company, and interviewed at a few Silicon Valley companies.
The difference between the interview processes is staggering; my current job's interview was two hours of conversation, no code tests, just a general assessment of "do you know what you're doing" by the hiring manager and a couple other members of the team. The highly technical company had a code assessment then the in-person interviews had zero coding.
The SV companies must have a good reason for this, but golly the amount of coding in those interviews is nuts. I'm a process over code speed kind of coder, and I've failed every SV-level test because of it; my code comes from talking to non-technical users like medical researchers and study operations managers and tossing something together in Python or a cloud service that makes their lives easier. Needless to say, I don't go over algorithm fundamentals on a regular basis, and I generally fall out after the first or second interview.
It's especially odd that interviews are so intensely focused on those couple hours, since I personally don't see any dev, or any resource for that matter, contributing in any meaningful way in so fast a time, or even within 90 days. I'm not sure how this problem could be solved with the limited time companies can dedicate to interviews, though; maybe rely more on portfolios?
A lot of more traditional software development roles do not have much testing. Sometimes they will have a quick online timed test or a simple question on an initial phone screen. That isn't to say that all companies do not have testing. But it's definitely the startup and big tech worlds that have the majority of it.
Where are you learning DP problems from? I am very bad at those and need a few good references so that it sticks in my memory.
Here's a few for you to try. Some of these are pretty hard, but you should be able to find solution sketches online if you google the contests they are from.
https://open.kattis.com/problems/increasingsubsequence
https://open.kattis.com/problems/maximumsubarrays
https://open.kattis.com/problems/tray
https://open.kattis.com/problems/dinnerbet
https://open.kattis.com/problems/hyperpyramids
I've solved all of these as well, so if you get really stuck feel free to reply here and I'll try to guide you through them.
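For anyone following along, the first link is the classic longest increasing subsequence. A minimal O(n log n) Python sketch of the standard patience-sorting approach (one common technique, not the only way to solve it):

```python
import bisect

def lis_length(nums):
    """Length of the longest strictly increasing subsequence, O(n log n).

    tails[k] holds the smallest possible tail value of any increasing
    subsequence of length k+1 seen so far; tails stays sorted, so we
    can binary-search for where each new element belongs.
    """
    tails = []
    for x in nums:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence found so far
        else:
            tails[i] = x      # x gives a smaller tail for length i+1
    return len(tails)
```

The O(n^2) table-filling DP is the version interviewers usually expect first; this one is what you reach for when n gets large.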
The tech industry's preferred form of gatekeeping is asking people to do algorithmic puzzles, which is far tamer and less exclusionary than what other industries do. If you believe that the only thing standing between you and 400k/year is a few dozen hours of practicing leetcode, why are you whining about it instead of taking advantage of the situation to wildly enrich yourself with a fairly modest amount of effort?
1. At least for me, I didn't know the game going in so I wasn't prepared for it. A lot of folks I interview don't either (though its becoming less common... almost everyone is somewhat prepared for it now), and I feel bad for them.
2. I feel like I'm a competent developer, but I'm not very good at the game, even when I practice and study a lot for it. I can barely eke through the process. And at least for me, I don't have a wide, general set of skills... I'm kind of only good at writing code, so when these artificial barriers are erected, I see future job prospects disappearing, and I don't really know what to do about it.
3. Perhaps it goes hand in hand with engineering, but I criticize absurdities and inefficiencies in business all the time. There's something particularly hypocritical about an industry which prides itself on meritocracy developing a process which not only fails to recognize qualified developers but often actively works against them.
4. I think we actually need more engineers. A lot more. Gatekeeping is preventing this from happening, leading to a general population wholly unequipped to understand and maintain the software that's taking over their lives. Programming should be a lot more like reading and a lot less like specialized medicine.
If I wanted to switch companies I'd be interviewed as if I were a recent college graduate.
Anyone experienced and good at this isn't going to have time for that because they're not going to jump through hoops to increase a 275k salary to a 300k one.
So what the interview selects for are people desperate enough to study hard enough to fool the interviewer.
And this is one reason the industry leans heavily toward young privileged male candidates and loves to reinvent the wheel every six months.
>leans heavily toward young
So, it's a problem to interview everyone the same way? But it's also ageist because the interviews are accessible to people with no industry experience yet?
I seem to do OK when an interviewer:
- asks me a question that sounds like a real problem, not a contrived one (although on occasion I'll have fun with a contrived puzzle if the interviewer has a sense of humor & makes the process light hearted)
- doesn't push me down a path that requires me to implement a "simpler" solution I'd never consider (e.g. asking me a question that clearly wants an O(N) solution & then pushing me to try the O(2^n) solution first)
- talks like a person with a problem, and not as someone who clearly knows what they want & simply won't say it
- doesn't try to "see how I think", because I code as much in my head as I do on a screen, meaning most of the code I throw into a text editor is the latest thought in a stream of random ideas until I get to one that works
- doesn't constantly interrupt me
- states their actual expectations, such as "I don't expect you to finish, what I'm really looking for is X"
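On the O(N)-vs-O(2^n) point above, a toy Python illustration using Fibonacci as a stand-in (the commenter's actual interview question wasn't specified):

```python
def fib_exponential(n):
    """The 'simpler' O(2^n) solution an interviewer might push you toward:
    naive recursion recomputes the same subproblems over and over."""
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    """The O(n) solution you'd actually write first: keep the last two values."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Being asked to write the exponential one "as a warm-up" feels exactly as backwards as the comment describes.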
To your final point about expectations: One question that we try to ask ourselves about any particular interview question is "how many bits of information are we getting out of this?" Trivia questions are usually less than 1 - you'll find out if they know some trivia, but you don't honestly really even care. The best questions are those that get you several bits over the course of the interview period, rather than a single yes/no answer at the end. Questions where there is no final end point, or where we explicitly do not expect candidates to finish, are often the most effective; we can explore the pathways that are working, dive into side channels that seem interesting, and get data along the way rather than just a checkmark on some individually-useless problem.
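The "bits" framing can be made literal: a question whose evaluation can only end in one of k distinguishable outcomes yields at most log2(k) bits. A tiny Python sketch (the function name is mine, not the parent commenter's):

```python
import math

def max_bits(distinguishable_outcomes):
    """Upper bound on the information one question can yield:
    log2 of the number of distinguishable outcomes it can end in."""
    return math.log2(distinguishable_outcomes)

# A pure pass/fail trivia check: at most max_bits(2) == 1.0 bit.
# An open-ended question whose exploration can land in one of 8
# meaningfully different states: up to 3 bits.
```

Which is exactly why an open-ended design discussion beats a yes/no trivia gate.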
i had to push him to try something simpler first just so that he would get to somewhere meaningful within the hour available.
so which approach is better depends on the goals. in my case i wanted the candidate to solve the basic problem first: (make every person get to their destination, no matter how long the elevator needs) and then optimize to make it go faster.
For example, asking someone to write a self balancing binary search tree in an interview might be too much. And asking a question that ultimately demands using a self balancing BST might be a bit much. It's like asking a question that really needs an associative array, but then imposing the rule that associative arrays aren't available. How about simply asking the candidate about binary trees, if they've used them, and what they've used them for?
Last week an interviewer asked me if I was familiar with the Egg Dropping problem (https://brilliant.org/wiki/egg-dropping/). I wasn't, but I remember saying something like, "I haven't done this one before, but I'm pretty sure it's going to take O(log n) guesses to find the max floor from which you can safely drop the egg." The interviewer asked me how I'd implement it, and then we got into the weeds when I started asking what the requirements were. He wouldn't share any information about inputs & outputs, so I just started writing the dumbest thing possible, predictably coded myself into a corner, and burned a potential interview win-win on learning how to answer that kind of question for that kind of interviewer.
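For reference, the O(log n) intuition is binary search, which only works if you have unlimited eggs; the classic version of the puzzle limits the egg count, and the textbook answer is a small DP. A memoized Python sketch (a standard formulation, not necessarily what that interviewer had in mind):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def min_drops(eggs, floors):
    """Worst-case number of drops needed to pin down the highest safe floor."""
    if floors <= 1:
        return floors        # 0 floors: nothing to do; 1 floor: one drop
    if eggs == 1:
        return floors        # only option: test floor 1, 2, ... upward
    best = floors
    for f in range(1, floors + 1):
        # Drop at floor f:
        #   breaks   -> critical floor is below: eggs-1 eggs, f-1 floors left
        #   survives -> critical floor is above: same eggs, floors-f floors left
        worst = 1 + max(min_drops(eggs - 1, f - 1), min_drops(eggs, floors - f))
        best = min(best, worst)
    return best
```

With 2 eggs and 100 floors this gives the well-known answer of 14 drops, versus 100 for a single egg.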
So, as a courtesy we figured, why not spend a few extra hours with this applicant in the programming test. We set up a laptop with a clean Ubuntu install and devised a programming test that was quite involved. Not algorithmically hard, just more complex than what can normally be done within a 20-minute whiteboard interview. We expected it to take at least 2-3 hours. Google/Stack Overflow/etc. access was allowed and encouraged. "Just act like you normally would when solving a problem."
We spent like 2x4 hours devising this problem, based on our codebase (cutting out something somewhat easily digestible and making it able to run standalone).
It took like one hour to get productive. Explaining the problem, setting up editors, compilers, etc.
We took turns, but most of the time someone in the interview team (of two) sat next to the guy. We did give him some alone time.
This is probably nothing new in terms of interviewing techniques, but to us it was such a revelation. We learned so much more about the applicant. Perhaps it worked well with this guy because he happened to be a bit more outgoing than our typical successful applicant. We'd never felt so confident about giving someone an offer before.
I'm really looking forward to testing this approach with local candidates to see if we can replicate this "data gathering success".
Last time I was involved with this interview style, it always seemed to take an hour to get set up, which meant a long interview of 3-4 hours, particularly if a candidate went down the wrong path.
In the end we optimised for SOLID principles with a blackbox dll that had a function that slept for 2 seconds and a calling class that had mixed responsibilities (logging and calling the dll). We started folks off with a test or two and hoped they'd inject a mock to get rid of the delay and split logging off into a separate class.
I'm not saying it was a great test, but you could do something within an hour or so, then maybe spend half an hour talking through what techniques they'd use for more complicated scenarios.
If you can afford the time then more realistic testing is great and I do think you should try.
There's also a question about how to mix question difficulty. Should you ask nothing but easy questions, or is it good to throw in a harder question or two to see how the candidate reacts to something they can't answer? I can see a good interviewer getting a lot of signal out of that, but in the hands of a bad interviewer it would not work well.
A good interviewer, in my opinion and experience, will try to get to know the person first. They'll put the candidate at ease. They want to try as much as possible to be talking to the person they're possibly going to be working with, not the anxious candidate that just walked into the room.
yesterdays candidate told me he felt very relaxed, much different from all the other interviews he was doing.
and yes, figuring out if i want to work with this person is my primary goal.
(Of course, you can argue that 3 months of employment isn't enough - but it's a lot more performance data than a bunch of interviews gives.)
Then I really have to question exactly how they did this analysis. I can't imagine most companies would have the same objective function.
i mean what if we did that for math? some people would be good at geometry questions, others at algebra, others at topology, real analysis, etc.
what if some great geometer interviews with a fanatical algebraist and then fails to make the cut? this candidate is a great geometer! were the questions "too hard"?
I recently did something similar, that was of my own devising.
At one point several years back I wrote down some "code" in my own "shorthand" form; it was meant to implement a library and some test code for a microcontroller project I was contemplating at the time. I basically wrote it in such a fashion so that I was quick to get my main ideas down without being too "wordy" (whether in code or otherwise).
Then I put it away, and didn't revisit it again - until recently.
A couple of weeks back I found that code again, and looked at it - worried that I wouldn't be able to recall my shorthand or what I was thinking; in short, worried that my ideas would be "lost".
I looked over the code, walked thru it in my mind - and after a few minutes it all came back to me, and I was able to understand again what I had originally created (and where I could make improvements as well). It both left me feeling optimistic about the process, as well as a bit excited that I was able to remember it and improve on it - that I didn't have to worry about it, and that my shorthand "pseudo-code" was legitimate enough that it could have been real code for all that it mattered.
i didn't expect critical discussion because they were not native english speakers. (in fact they hadn't had an opportunity to even use english outside of talking to their teacher, so when they were able to engage in friendly arguments over how to solve a problem, that was quite an accomplishment, even more so for asian culture which is generally rather submissive (you don't argue with your boss))
I'd be interested to know how many of these interviewers actually think they're able to identify a solid candidate this way? Not to mention, are they even factoring in how many people don't test well but are otherwise superb software engineers?
Ultimately it seems like there is a soft element to interviewing that is being tossed out now, which is: do I think we can work with this guy/gal? Are they someone that can become part of our team on a personal level? Can they get good work done? Fizz Buzz can't tell you that. What can tell you that is experience. It's a hard-to-put-your-finger-on-it X-factor that I think companies think they can ignore.
A colleague offered this simple heuristic for the soft side of [the] interview:
"If they are going to be equal or junior teammates, I ask myself, 'Would I feel comfortable sitting in a conference room with this person and hashing out a design or troubleshooting a problem for two hours?'
If I'm evaluating someone who is going to be senior to me, or my direct boss, I ask myself, 'Would I feel comfortable following this person's technical instructions if they handed them to me in a document?'"
In my personal professional opinion, this measure is more useful to an engineering org. than any "Stump the Chump" style technical-trivia screening.
Sure, I'd love to have a "mind palace" like Sherlock. Alas, I do not. I often admit this as early in the interview process as possible to avoid wasting time.
I hate working on software that was written to be read by an audience with perfect recall, because, as a human, I just don't have that. Give me code that assumes I have the memory of a goldfish, and can't keep track of anything that isn't right in front of my face.
I'm pretty sure that's what half of Dijkstra's papers were trying to say, weren't they?
Devil's advocate: It's also the thing that is very hard to measure without bias, unless you do something reductionist like total years of experience.
> How would you explain the `with` statement to a junior developer?
Then increasingly difficult questions that go into the language runtime/concepts.
One other favourite question of mine is:
> Imagine you've got a standard website that serves data from a database. When a customer types the URL into the browser bar, what needs to happen before the customer sees the website?
> Go as deep as you can in answering the question.
When you get an answer, you'll see frontend engineers explain more about the browser, while backend engineers talk more about the backend.
There was one very senior engineer who actually talked about the Ethernet layer; he went on for more than 15 minutes. Most mid-level engineers are done in 5 minutes. ;)
That's a favorite of mine too. It sets up a broad range of discussion without needing to spend much time setting up hypotheticals and lets the interviewee really delve into topics for which they have expertise and/or interest. It's also easily adapted or sets up follow-up questions for different positions or levels, e.g. "with TLS," or "how would you debug a problem with symptom x?"
It puts a bit of pressure on you as the interviewer, since you need to adapt your questions, but in general I found it quite easy when you have a rough plan for the topics to ask about.
I've toyed around with starting my answer with what happens when the physical return key on the keyboard is pressed. Could spend quite a bit of time on what happens before the browser even knows the return key was pressed :D
I'm not a fan of this because it wastes time. It's better to give an overview of the steps and then, if more information is required, the interviewer can ask for it. They can keep asking for more detail until they are satisfied the person knows enough, or until the person can't answer anymore.
However, I would not skip depth, since depth indicates how far this person can go when necessary to solve problems.
1. A candidate who can solve puzzles but is not willing to do the dirty work with the team: solving production issues, debugging, bug fixing, the usual stuff.
2. Another candidate, who is willing to learn, is ready to work with the team and do the dirty work.
I am on a hiring committee as a Tech Lead, and I always try to weed out 1.
Works great: we hire as interns and then assess them. Someone from Google whom we hired full time was detrimental to the team's morale: grunting, complaining about the code, complaining about the food, and what not.
Another experienced smartass was self-centered about his skills and didn't want to teach junior engineers anything, nor was he willing to admit he needed to update his skills. The moment he realized his skills had no value, he started attending pointless conferences. Now his LinkedIn profile has "aware of block chain technology" and "attended machine learning seminars".
I said no to interviewing at Google and FB because I don't have cycles to spend months on Leetcode. Did I err? Perhaps. But I am sure neither can offer me the same quality of work I do at my current mid-size company. I regret nothing :).
1. The problem is pretty well understood (but does offer room for interpretation).
2. Provides time to cover all key aspects (Frontend, Backend, Database, Networking, Debugging, Testing, Caching, etc) in at least some capacity. In particular, it shows you what areas the developers focus on.
3. Provides a more relaxed/realist environment. It's also more accommodating to developers switching stacks - familiar with good programming patterns but not the specifics of stack (e.g. "Here's how I'd do [some specific task] in [other stack]. How do I do it here").
4. It's clearly a throw away task so there's no concern about "interview labor". It can also be pre-prepped so you don't have to worry about jumping too far in.
5. You can cut short with bad candidates and expand the problem for more complex candidates.
I got bitten in the ass by this one at Triplebyte itself. They asked me to make a tic-tac-toe game and gave me, IIRC, 30 minutes (less?) to do it. Except it wasn't "build a tic-tac-toe game": first it was "draw a board to the console," then "take user input from the console," and so on, a convoluted sequence of instructions that perhaps another engineer would have followed knowing from the outset that the goal was to build tic-tac-toe in 30 minutes, but not me.
So we'd get to a portion where I'd be writing a quick test on user input, or extrapolating something to a function, and the interviewer would say "don't worry about that, just worry about {getting the grid to print to console or whatever}."
Later on I got my feedback and they said they were disappointed with my user input tests and repeated, extractable code in the tic tac toe portion.
Triplebyte is trying to do good things in the interview space but I think they're still learning. All in all my interview with them was about as positive an experience as a harried and bad interview could be, from my perspective.
There were a lot of Googlable boilerplate questions (e.g. "what does malloc return?", "what's a bloom filter?") that, as a product engineer, never come up.
Then there were the classic Big-O notation queries that for most use cases don't come up until much later stage. It felt like the founders were classically trained in CS and over-optimizing for things that aren't practically relevant for the large majority of early/mid-stage startups.
Am I familiar with these concepts—e.g. can I go back and refresh myself when they come up?—absolutely. But often times the skills you'd want in an engineer are:
1. Knowing when to optimize
2. Knowing how to profile and identify bottlenecks
3. Familiarity with the available solutions
4. Ability to dig in and evaluate which is the right tool for the job
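For what it's worth, the "what's a bloom filter?" trivia mentioned above comes down to a few lines. A toy sketch (class and parameter names are my own choices, not any particular library's API):

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: probabilistic set membership, no false negatives."""

    def __init__(self, size=1024, hashes=3):
        self.size = size
        self.hashes = hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k independent positions by salting a single hash function.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))
```

The interesting follow-up in an interview is the trade-off: the filter can say "maybe present" for items never added, but never "absent" for items that were.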
This is particularly pernicious, because it's a trick question, too. On Linux, with the default overcommit settings, malloc will almost never return NULL in practice: ask for 4 petabytes of memory on a 128 MB system and malloc can still hand you back a "valid" pointer, which only blows up when you actually touch the memory.
I had a pretty great experience and would recommend TripleByte.
Me: "I'm going to just put some user input checks here, no guarantees they'll actually input X or O, yea?"
Interviewer: "Oh, don't worry about that, assume for now that you'll get X on X's turn, O on O's turn."
Later:
Me: "Normally I'd extrapolate this to a function, so let me just - "
Interviewer: "For now, just focus on getting the program to respond to the next user input."
Me: "Ok... well the fastest way to do that right now is just copy paste this code down here."
Interviewer: "That's fine."
But, it turns out, it was not fine.
I felt I was bamboozled.
That sounds like a process problem and not an engineering problem. If I've been given requirements, I'm going to trust that my project managers and stakeholders have done the due diligence to understand their request.
Also you've got to realize that different developers tackle issues in different ways. Not every engineer is going to be super talkative while they're in the mud trying to get something to work until they've hit a wall that they don't feel like they have enough information to overcome. I think expecting an engineer to sit there and talk through every aspect of their reasoning WHILE working is fundamentally counter to the way that most engineers perform their day-to-day jobs.
1. Phone screen, which takes 15 or 20 minutes.
2. The candidate fills out an essay, including showing us some code they're proud of.
3. If the essay ticks the boxes, we conduct a 1-hour on-site interview. We use the same set of questions for every candidate, so the investment is easy to manage and our team has a shared set of expectations on what is good or bad.
4. If the interview goes well, we give them a take-home assignment. It takes between 2 and 6 hours, depending on how experienced the candidate is. The problem is in C and/or Python (or both).
5. We wrap up with a 2-to-3-hour on-site interview. We walk through the assignment and have a deeper conversation about culture and fit.
The results have been positive for us: we've made some great hires and weeded out some candidates who weren't a good fit.
We've also been able to scale it down to the process we use for interns.
The 1 hour interview has some typical programming interview questions, but we wrap them into a real-world example. The goal isn't to prove they know how to program, but more about allowing them to show us how they think/work out a problem.
Everybody is different.
More important than question difficulty to me is attitude, and I’d love to see whether attitude is measurable and how it compares to later performance, but curiosity and optimism and communication really do go further than right or wrong on math and engineer questions for me. That point might even be tired already, I know people say it all the time, but I’m going to keep saying it because we still have blog posts on question difficulty, when easy vs hard engineering questions are pretty low on my list of what matters when I’m hiring.
People who complain about memorization and difficulty are kinda missing the point. Just like learning math at school isn't really about knowing how to do trig, but being able to think logically and do problem solving.
The other candidates, after answering some of the harder questions incorrectly, seemed very upset with themselves. They knew they were cracking a bit under pressure, but actually showed that they knew the answers when we chatted further. I hired 3/4 of those people because of how well I felt they'd do given the opportunity. All three became leads within a year and a half.
I think personality has a lot to do with outcomes. If you are someone who shows they are hungry to learn and knows how to improve their skills, I will never dismiss you for screwing up a few coding questions.
If you're hiring for Generic Developer Skills you're going to get generic developers - and much less development than a more flexible approach would give you.
I find it hard to reconcile these two experiences. How can I thrive at a top tech company while failing to solve an 'easy' coding challenge. It makes me concerned about what would be of me if I had to look for a new job now.
Now, the ads ask simpler things like floating point precision and function variable scoping (https://www.facebook.com/triplebyte/ads/?ref=page_internal ); legit problems, but not sure if they are an indicator of how good a developer they'd be in the real world working on a CRUD app.
My worst interview ever was with Facebook when a non-native, new college grad gave me a Leetcode hard problem in half-broken english and went back to his work without even looking up or walking with me through the problem.
Give the candidate a project with 300,000 loc, tell them to make the most local change possible that fixes the reported bug. Update the tests to reflect the new logic.
Bonus: discuss architectural changes that would have resolved the bug and/or improved performance.
The philosophy at Google is that it's better to filter out 3 good engineers than to let in a bad one. The consequence of this is that it's really hard to get kicked out of Google.
The other part (whether it's more important to work on long easier questions to see how the candidate works on a large code base) is orthogonal reasoning, and that part may be true, depending on what type of engineers somebody is looking for.
Corollary: "If I follow (what I've heard to be) Google's hiring practices (despite not having their brand recognition or candidate pool or a comparable engineering environment to offer) -- then my company is on the way to becoming another Google!"
Google's problem nowadays is that they have such strict standards only for grunt developers, not for management. In fact, it looks like the opposite there: better to let in/promote 3 bad managers/directors than to miss 1 good one.
Just fire the bad ones. You're going to get bad ones anyway. At larger companies you might never notice whether someone is good or bad.
From the way they structure their interviews, it seems like they'll still get plenty of bad ones - it's just they'll get bad ones that are great at algorithms, with unknown skill at everything else (like the actual work done).
- Yes, you have to have vetted your contracting companies. In our case that's something we'd already done (roughly 20%-30% of our staff are contractors)
- This is contact to hire, which is different than bringing on contractors for staff augmentation. The intent is to hire and the applicant is made aware of that up-front, to the extent that their employee salary is negotiated and employee benefits package is discussed.
- W2 employees jump at the chance to be given an opportunity to prove themselves in a new domain. Younger and older employees are especially attracted to this option.
- You're still doing an interview, but the nature of the interview is different since you know the contracting firm has vetted their basic skills.
- So far we've only had to let one person go, they simply never could "gel" with the team. We've had two others we had to let go as we were scammed - the person that did the interview wasn't the person that showed up to work. Yes, it happens. What we don't have is a "revolving door" situation where people are continually rotating in-and-out.
The only advantage I can see is you can get rid of them easily by not renewing their contract but that in itself is not a solution to finding good people in the first place. Plus it's likely to be hugely limiting. I'm not going to give up my current job and move to your city for a contract you might cancel in 3 months.
Many companies are "contract to hire" but never hire and keep contractors on as long term employees.
"Right now our roboticists use a hacked together QT based GUI to manage customer robot fleet data. It takes 1-10 minutes to load on a slow network and is hard to add more features to. I know you have far less information than you'd want but walk through your thought process for how you'd replace this system over the next 12 months. We can make assumptions."
And then the next 15 mins can be an organic conversation about the problem space. You can direct the conversation into corners most relevant to their potential role: "you mentioned using web tech because we discussed how all usage is across the internet. Can you talk about the merits of Http vs. websocket?" "How would you ensure that we don't accidentally take every single customer offline if we centralised our data store?" "What kinds of UI technology would lend itself to robot mapping? Can we just use Google Maps?"
If you really need to dig deeper into technical prowess, find something relevant in your conversation and dig deep into it. "We talked about saving changes to floor layout. Can you whiteboard/laptop how you might implement undo/redo for floor elements?"
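The undo/redo follow-up mentioned above is commonly whiteboarded with a two-stack command pattern. A minimal sketch (class and method names are my own):

```python
class UndoStack:
    """Minimal undo/redo via two stacks of (do, undo) command pairs."""

    def __init__(self):
        self._undo = []
        self._redo = []

    def execute(self, do, undo):
        do()
        self._undo.append((do, undo))
        self._redo.clear()  # a brand-new action invalidates the redo history

    def undo(self):
        if self._undo:
            do, undo = self._undo.pop()
            undo()
            self._redo.append((do, undo))

    def redo(self):
        if self._redo:
            do, undo = self._redo.pop()
            do()
            self._undo.append((do, undo))

# Hypothetical usage with floor elements:
elements = []
stack = UndoStack()
stack.execute(lambda: elements.append("wall"), lambda: elements.pop())
```

The discussion then naturally extends to why the redo stack is cleared on a new action, and how command objects scale better than snapshotting the whole floor plan.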
After receiving an offer from a big tech company, the interviewing process has already completely turned me off from the idea of working there.
Now, despite this being a dream position for many, and despite my having no alternative but to take it currently, the smug interviewers have already gotten me into the corporate mindset: no matter the reputation and salary, treat it like any other job, and don't bother being loyal; they will not be.
So the terrible interview process has at least the advantage of reminding future employees what they are signing up for.
I wonder what kind of psychological filtering is at play. Do employees feel loyalty after an interview process that is best described as hazing? Are they projecting the humiliation they experienced when interviewing future employees? That's always been my impression.
Usually, interviewers are just bad at interviewing and aren't really aware of what proper behavior is. It actually isn't easy, nor is it a desirable activity nor do you really get points for it in your performance review.
I know these questions are part of triplebyte's product, so a full, repeatable study isn't in the cards. But if someone from triplebyte could just post a few examples of each, I'd be able to get a lot more out of this result.
Hard - implement a subset of regex match in optimal time+space, find the operations required to turn 1 word into another word given a list of transitory words, find the median of 2 sorted arrays in optimal time, find the next permuted value.
easy.. anything that is not hard :-)
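Of the hard examples listed above, "find the next permuted value" at least has a compact O(n) answer. A sketch of the standard algorithm (function name is mine):

```python
def next_permutation(a):
    """Rearrange list `a` in place into the next lexicographic permutation.

    Returns False (leaving `a` sorted ascending) if `a` was already
    the last permutation; otherwise returns True. Runs in O(n).
    """
    # 1. Find the rightmost index i with a[i] < a[i+1].
    i = len(a) - 2
    while i >= 0 and a[i] >= a[i + 1]:
        i -= 1
    if i < 0:                       # entire sequence is non-increasing
        a.reverse()
        return False
    # 2. Find the rightmost j > i with a[j] > a[i], and swap.
    j = len(a) - 1
    while a[j] <= a[i]:
        j -= 1
    a[i], a[j] = a[j], a[i]
    # 3. Reverse the suffix after i to make it the smallest ordering.
    a[i + 1:] = reversed(a[i + 1:])
    return True
```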
Also, platforms like HackerRank are adding fuel to the fire. I read that the CEO wrote somewhere that he wished the following "were taught in schools:
1) Communicating complex ideas with clarity
2) Systems thinking
3) Grunt work tasks
4) Boundaryless thinking
5) Self-awareness / EQ"
Please note almost all of these are not evaluated on their platform (they profit from coding tests) or during interviews and almost all are soft / intangible skills (skills which are not immediately obvious about the candidate during a typical programming interview). [ Side note : some could say that coding tests are the problem they have chosen to solve - in which case, why are they worried about these skills ? Are the companies seeing coders crack the tests on their platform, while not performing well on the above skills post hiring ? We could only speculate. ]
All good work is done by teams, and to be effective in a team requires a lot of intangibles which aren't even assessed in a typical interview.
A better approach could be from this article: https://leerob.io/blog/technical-recruiting-is-broken/
Or: assign a task, and have the mentor or interviewer look at how the candidate approaches the problem, whether they can solve it within the time constraints (an easy task shouldn't take long, and a hard problem shouldn't be short-circuited into a sub-optimal solution), and other such observable traits.
My two cents!
EDIT: Poor wording above (i.e., "couple of weeks"). A task should be assigned and evaluated after a deadline the interviewer considers reasonable for completing it. No constant interaction with the candidate, and no spending loads of time on them; that isn't scalable when the demand-supply equation is already imbalanced.
EDIT 2 : The idea above is not about spending weeks for recruitment. The idea was about being practical about the kind of questions / tasks that are given during interview (example : code a feature or fix an issue we have, as another user has suggested well in the comments). Took me a while to realize we have missed the point I tried to convey for the logistics of how it should be done.
Do you think the MCATs or the Bar exam are flawed because they have books on how to do well?
Whenever there is any kind of standardization for testing, some ecosystem will evolve around it.
Questions about the standard library of the programming language in question are good. Questions about the dusty corners of said library: bad.
And don't ask about floating point: most likely the candidate won't really know more than the usual things; anybody who really does understand them will probably give answers over the interviewer's head :-).
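The "usual things" about floating point mostly reduce to examples like this:

```python
import math

# Binary floating point cannot represent 0.1 exactly,
# so "obvious" identities fail.
a = 0.1 + 0.2
print(a == 0.3)              # False
print(abs(a - 0.3) < 1e-9)   # True: the classic tolerance-based comparison

# math.isclose is the idiomatic check in Python 3.5+.
print(math.isclose(a, 0.3))  # True
```

Anything beyond that (rounding modes, ulps, error accumulation across algorithms) is genuine numerical-analysis territory, which is the parent's point.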
- A take home (2-3 hours) task for retrieving tabular data from an API and displaying it. Here I'm looking for general framework chops, readability, some design sense.
- An in-person (~45m) not-quite-pair programming task, with a real computer, tools, editor etc., for doing a typical UI operation, e.g. truncating text. Starts simple and gets more complex as time allows: make a function to truncate text to x chars; now add an ellipsis only if truncation occurred; now make sure not to truncate in the middle of a word, etc.
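A sketch of one possible answer to the truncation exercise described above (the function name, and the choice that the ellipsis does not count toward the limit, are my assumptions; the exercise leaves such details open on purpose):

```python
def truncate(text, limit):
    """Truncate `text` to at most `limit` characters, appending "..." only
    when truncation occurred, and avoiding cutting a word in half."""
    if len(text) <= limit:
        return text
    cut = text[:limit]
    # Back up to the last word boundary, if there is one.
    if " " in cut:
        cut = cut[:cut.rfind(" ")]
    return cut.rstrip() + "..."
```

The interesting part of the interview is exactly the ambiguity: should the ellipsis count toward the limit, and what happens when a single word exceeds it?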
It was worded far worse than that. What exactly is that telling you about the engineer?
On the flip side, I'm asked to code full-fledged applications but not to spend too much time on them... okay...
Another time I was asked to code the Luhn algorithm. Oh, and do it while a room of people watches you on a giant screen, because that's what your day-to-day job will look like... I failed miserably and still got the job. What?!?!?
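For reference, the Luhn check itself is short once the rule is stated; a sketch (helper name is mine):

```python
def luhn_valid(number: str) -> bool:
    """Luhn check: double every second digit from the right; if doubling
    yields a two-digit number, sum its digits (equivalently, subtract 9);
    valid when the grand total is divisible by 10."""
    digits = [int(c) for c in number if c.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9        # same as summing the two digits of d
        total += d
    return total % 10 == 0
```

Easy to get right at a desk, much harder with a roomful of people watching a giant screen, which is the parent's point.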
For instance, one of the problems I frequently ask has a structure that really encourages people to try inventing heuristics to solve the problem, even though ultimately all of those heuristics fail. Seeing how people react to "but what if your input looks like this?" questions is often very enlightening- can they rethink their approach? Do they just keep glomming on more special cases? Can they deal with someone pointing out that sort of flaw?
I recently interviewed managers at two different technology companies, both nationally known, for a report I am working on. Most managers admitted their technical interviews were flawed, but didn't know any other way to assess skills. They also admitted that a significant number of people they recruit refuse to even take the technical challenge and end up working elsewhere.
In interviewing a couple dozen engineers, I found most just don't want to waste their evenings and weekends on a technical puzzle for a job, especially when there are a lot of companies out there who don't bother with them, so they end up searching for companies that don't waste their time with technical challenges.
Another funny thing I discovered during my research is that just under half of the employees at both companies I've interviewed so far were not able to successfully complete their own technical challenges.
Another problem with technical challenges is that often times the interviewer knows less about the topic than the interviewee. I recently went through the interview process with a local technology company who uses Elixir and Go (both of which I know). During the onsite interview, the interviewer kept saying things like, "Don't forget to..." or "You forgot..." I kept explaining that I didn't need to do as he was suggesting. In the end, my code worked, my tests worked, and I passed the interview. In spite of this, I was rejected because the interviewer, "Wasn't feeling it."
I still have a lot of research to do, but I haven't found anything, so far, that suggests that technical interviews predictably result in top-talent getting hired. It seems to be the same crapshoot interviewing people without using technical challenges is, because in the end, most people decide within the first couple of minutes if they like someone and hire based on that, regardless of the rest of the interview process.
It's just unfortunate that there are so many prep materials online nowadays that the programming puzzles have become ineffective. It gets worse when interviewers aren't good enough to ask follow-up questions. For instance, addition with big integers is a pretty easy interview question, right? But if a candidate can go as deep as this article: https://bearssl.org/bigint.html, I can be pretty confident that the candidate is really, really good.
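The big-integer addition question in its basic form is just school-book carrying; a toy sketch over decimal strings (real bigint code, as the BearSSL article linked above discusses, works in binary limbs and worries about things like constant-time behavior):

```python
def add_big(a: str, b: str) -> str:
    """Add two non-negative decimal strings digit by digit with a carry."""
    result = []
    carry = 0
    i, j = len(a) - 1, len(b) - 1
    while i >= 0 or j >= 0 or carry:
        s = carry
        if i >= 0:
            s += int(a[i]); i -= 1
        if j >= 0:
            s += int(b[j]); j -= 1
        carry, digit = divmod(s, 10)   # carry is 0 or 1; digit is 0-9
        result.append(str(digit))
    return "".join(reversed(result))
```

The follow-up depth the parent describes is exactly the gap between this sketch and a production implementation: limb sizes, carry propagation without branches, and timing side channels.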
That said, I personally don't find it necessary to join the rat race. Instead, I'd suggest engineers just take time to thoroughly study just one book on algorithm designs. In fact, an introductory book, such as Kleinberg's Algorithm Design or Udi Manber's Introduction to Algorithms, will be good enough. It may not get you into Google, but it will likely get you into another damn good company. The best part of this approach is that passing interview is really just the byproduct of you trying to become a better engineer.
Example:
1. Write function that multiplies two integers.
2. What if these numbers were real numbers, but the computer can only operate on integers? How do we use the same number of bytes as an int to hold a real number?
3. What if I wanted infinite precision? What would the run time and storage complexity of your algorithm be? (Don't insist that the candidate hit the known optimum.)
4. Can I have complex numbers as well?
5. Imagine complex numbers had not only "i" but also "j" and "k". How do we handle this?
It is astonishing how many candidates won't be able to move past #2.
The key is to look at how the candidate approaches handling complexity, creates representations, and uses them to craft clean solutions. Whether they eventually arrive at the known optimal/great answers is unimportant.
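Question #2 above is usually answered with fixed-point representation. A minimal sketch (the Q16.16 format choice and function names are my assumptions):

```python
# Q16.16 fixed point: a real number x is stored in a plain integer
# as round(x * 2**16), i.e. 16 bits of integer part, 16 of fraction.
SCALE = 1 << 16

def to_fixed(x: float) -> int:
    return round(x * SCALE)

def to_float(f: int) -> float:
    return f / SCALE

def fixed_mul(a: int, b: int) -> int:
    # (a/S) * (b/S) = (a*b)/S**2, so rescale the product by one factor of S.
    # (Python's >> is an arithmetic shift, so this also works for negatives,
    # rounding toward negative infinity.)
    return (a * b) >> 16
```

From here the candidate can build toward #3 (arrays of digits/limbs for arbitrary precision) and #4/#5 (pairs and quadruples of these with the appropriate multiplication rules).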
My point is that questions should be tailored to the work the programmer is expected to perform.
That being said, would you offer the definitions of "real numbers" in this context as part of the prompt, when the candidate asked, or not at all?
This was the key sentence if anyone missed it. "Optimal" for whom, exactly? Not for FAANG certainly. They don't need to worry about filtering too many candidates out, because they have a nearly infinite pool of applicants, and infinite money to conduct a search.
They can ask as difficult questions as they want, because they can pass hundreds and thousands of qualified candidates, and still have plenty more where that came from.
Edit: If you consider "optimal" to be the expected cost of a hire compared to the expected profit, it is fully plausible that if your margins are big enough, asking hard questions is the most effective way to ensure low false positives. But as everyone knows, comments on articles about interviews are never about the economics; they're only about the bruised ego of feeling rejected.
Switching to more practical, simpler problems allowed me to really observe how they work and solve a coding problem. As the article said, I was also able to add requirements or features to the problem which let me see how the candidate adapts to changing requirements, or refactors their own solution to handle a new edge case. Simpler is generally better if you are timeboxed to 45 minutes.
Until one finally gave me a fairly straightforward homework-style project. It was probably more apt for a noob, but I threw myself into it, submitted my work, and explained it over the phone. I was in the office the next day, we talked about it, and I had an offer.
The homework style interviews are understandably controversial, but at least I got to show my work, me doing my work, my thought process, outside of a few moments at a keyboard.
At one of my jobs I nailed it as the top candidate, by far, of the 79 people they interviewed in person. The interviewers were looking for competence, freedom from frameworks, experience, and so forth. I have more than 20 years in this line of work and do it as a hobby, so I nailed the interview.
At the job, though, I worked with a bunch of fresh juniors who only knew how to write code the one way they learned in school. According to them, I am a shitty developer because I didn't write code in the one way they understand.
Who qualifies the outcome?
Now, 13 years later, I mostly rely on "homework" type exercises. I think they address most of the issues. They are more "real world", no time pressure, etc. However, even those now are being heavily criticized. What's left to be used?
Perhaps Facebook needs and wants engineers that can bust out A* on the spot, but I doubt Nordstrom or Starbucks needs that level of talent.
This has changed the field: now, to get even an average job at an average company, you need to study to the new normal, whose interview ideas were designed to find the top 1% of the field.
As I gained more experience (300 interviews and counting, baby), I realized that I pick up more on candidates' skills that are not directly related to their performance on a particular question. At this point the question itself is just a conversation starter, so it's better if it's simpler and broader, because that leaves many more avenues for the conversation to go.
This shows you way more than what a technical question can answer.
There are edge cases to this. Expertise positions specifically.
I cannot believe that any interview situation is comfortable.
Questions you can answer immediately simply don't test if you can do this. I have no idea how you would test to see if you can do this.
Would Triplebyte mind sharing the data? I'd love if the numbers could speak for themselves, rather than having to rely on an interpretation of the numbers.
Interesting piece!
"Hello, my name is David. I would fail to write bubble sort on a whiteboard"
* Rejecting far more candidates than you need to -- so you can feel like you're hiring "the top 1 percent"
* Giving yourself the feeling that you have an objective hiring process (when really you don't)
* Making your own team members feel like they're super brilliant and special when really they're not
That's what the modern hiring process is designed to do. And in fact it works quite well, to serve this purpose.
I suspect one of the reasons Google is so open about their process and the need to study is so that everyone follows suit. Thereby forcing people to take days off and do homework for even the most mediocre of positions, causing the switching costs of interviewing anywhere to become higher.
What do you mean by "than you need to"? I'm guessing for many companies (especially the ones with this kinds of interviews), the limit is the number of hires they can do, not the number of candidates that apply... So by definition, you need to reject all but n (the number of open jobs)... Why wouldn't you reject them based on performance on interviews (as opposed to, by their CV or luck or something)?
- Workers are going to be around for 15 months or less and they have domain expertise on 1 stack already and I don't need to screen for how they would hypothetically function across all stacks
- Worker's process and resource finding skills are more indicative of the time they will spend on a task
- Worker's process includes collaborate use of version control and code reviews, if they pass the screening but can't really integrate on these things then thats what will get them booted from the team
It isn't always more expensive to have a not great developer. Look in your organization and see if what I experience is true for you, and you'll save everyone a lot of time.
I'm pretty sure it's either once a week or once every two weeks.