The main things I want to see, though, are: can you communicate well about the parts of the problem that aren't clear to you? Can you analyze and compare solutions? Can you figure out something reasonably efficient? Do you understand your solution well enough to code it?
(Speaking for myself, not my employer.)
Received the green light and moved to the next phase, where I spoke with potential teams over the phone, then settled with Google Maps. Met with one of their Tech Leads, cool. I was really happy and thought that all my effort to prepare for the “Google interview” had paid off.
Then no word back from the recruiter with a final offer. It turns out the VP of eng saw some red flags in my interview and decided to bail.
I felt really frustrated, and when I spoke with the recruiter he apologized and even said the hiring manager was on my side and people overall liked me, but there were two engs that were on the fence. Gosh, I think I met with 7-8 engs, and all the seniors seemed to like me. I remember not having the best conversation with 2 engs who were new to the company and could not relax or communicate well.
Bottom line, prepare but also be prepared for some degree of luck and arbitrary judgements.
Yes, there are great interviewers at Google. Engs who are engs in their minds and hearts, who can see the process is not perfect but work to make it better. But unfortunately, there are also insecure folks who should be better trained before interviewing candidates.
That was 5 years ago, and I'm not sure I would subject myself to this sort of loop ever again. And strangely enough, they contacted me a few months later to reinterview, but this time I could skip the big loop and meet with just 3 engs... I said no since I was already in a new job.
These guides act as optimizations, shortening the path you need to take to get the job, shortening the stuff you need to learn, etc. But in the end, the path is all you get. If you don't like programming and if you don't like learning, then are you really gonna like Google?
I suppose there are people who genuinely like programming who just need a manual to teach them how to play the game. Lord knows I've practiced my fair share of whiteboard problems when I'd rather be reading about compilers. But there's something wrong about having to play a game to get the job.
There are also many people who are great at programming, who love it, and who are terrible at interviewing. After all, these are two related, but ultimately different skills. You talked about it yourself in your last paragraph, ending with:
> But there's something wrong about having to play a game to get the job.
Sounds like the fault is on the employer that makes you play the game, not on "people want to be employed without becoming employable".
I'm scared shitless of whiteboard exercises (and - probably biased by that - see no point in them). There's no way I'd be able to get through them UNLESS I optimize for .. whiteboard programming interviews w/ resources like this site.
In spite of writing code every day, for 15 years plus, and although I LOVE programming, this just excludes me from the list.
(This hits a bit close to home for me because my employer of 13 years just got bought and I'm in the process of looking for interviews again - for the first time since 2006..)
When I interviewed around 2008, there was whiteboarding, but it was only pseudocode-based, which I could do just fine. The majority of interviews focused on questions regarding time/space complexity trade-offs, design choices, etc., not on-the-fly, fully optimized, first-pass implementations. The worst thing I ran into was having to write merge sort as the FizzBuzz of the time.
Now, it's an absolute circus. Most in the industry really don't know what they're looking for and how to adequately assess abilities. They're far more concerned with trivia and memory recall and filtering any remote risk of a false negative than actually accomplishing the tasks for the position at hand.
The current process is very well designed on multiple fronts to attempt to delegitimize professionals and is quite optimized at grabbing fresh grads desperate for work experience or finding cheaper labor without raising red flags on illegal hiring processes. Why people have put up with this practice boggles my mind.
The vast majority of roles don't need Alan Turing, Donald Knuth, or John von Neumann to accomplish some basic business goals, so let's be realistic and stop pretending they do.
No one at Google is going to recommend hiring you if they think you'll be in the bottom 50% of engineers at Google when you join. However, 50% of the people hired by Google end up in the bottom 50% of engineers. And frankly, there are plenty of jobs that don't require a rockstar. So in many ways you need to be better to get the job than to do the job.
These are two totally different skills, and not mutually exclusive at all.
A skill needs to be learned and practiced. Interviewing, for pretty good reasons, is not a skill we usually practice, so when you do need it you lack experience and fail, even though you may have what's needed to do the job.
Is it true that an experienced developer would not be able to pass the interview without studying using a similar guide? If so, then the interview process is... fubar.
In that sense, it's probably a good indicator for Google that the interview advice includes "practice writing code", "make it a habit to validate input", and "learn about data structures", and it's probably a bad indicator for Google that the advice includes "practice writing syntactically correct code on a whiteboard" and "practice solving problems with a 30 minute timer."
> practice writing syntactically correct code on a whiteboard
This probably differs from interviewer to interviewer as to how strictly it's adhered to, but it's not really a hard and fast rule. I'm sure there are some interviewers that will ding you on a forgotten semicolon, but I suspect that most would not.
Personally I look for code that isn't so far from syntactically correct that it's clear you are trying to BS me. I'll even accept pseudocode for the most part. But I've had candidates that try to make up language features, and that doesn't fly with me.
> practice solving problems with a 30 minute timer
I only give my candidates 30 minutes. The whole interview is 45, I spend 5 minutes introducing myself and setting up expectations, 30 on the question, and 10 on answering their questions (after all, they're also interviewing us).
You can tell pretty early whether they're on a solid trajectory, and I'll offer the occasional hint to keep someone on track, or ask tangential questions if they're doing well on time. Not finishing isn't a deal killer, provided you had a solid approach and weren't just running in circles. But a good candidate will finish in about 25 minutes and we can spend some time talking about alternate approaches. Sometimes I'll show them the optimal approach and see how that conversation goes.
Nine times out of ten a candidate scores low because they overlooked an infinite loop, or their code would crash on boundary conditions and they weren't able to realize that even with hints.
Based on their perception of my algorithmic experience from my CV? Google's a weird one.
If the process is so nuanced that there's an entire industry around these types of guides (and Google even highly recommends you buy them!), then the process is fundamentally flawed.
But we already knew that, and as long as others are still playing the game, we are forced to play or miss out.
Even Google suggests to "practice writing syntactically correct code on a whiteboard". This is clearly a useless skill for a software engineer, except in getting a job at companies that do whiteboard interviews. Have you ever tried to refactor code on a whiteboard?
How are they able to find people that are able to efficiently debug problems?
When I interview people I tell them, "Bring your own laptop set up to be able to code and debug". And I give them "Fix this site" or "build this thing" kind of problems.
It looks like it works a lot better at finding "hidden gems" and people that are good at "doing" instead of those that are just good at "telling".
One of the best interviews that I've ever had was a screen-sharing session that was based around much the same mentality: "I have this problem. Script a solution in whichever language you choose in notepad. Now, here's sample data. Run it. O.k. It doesn't work as you intended, so start debugging it."
We (as an industry) focus on developing, when debugging is an equally desirable skill. If you can't understand why your code is breaking and need someone else to assist you, that isn't necessarily a bad thing, but you are consuming another resource that could be better devoted to other things during the time it takes to sort out the problem that you created.
Have you ever actually been on the interviewer end of the process?
Literally over half of the candidates don't know how to program! They can sort of string together a Markov-chain something if you sit them down in front of an IDE and let them copy-paste stuff until syntactic errors go away, but put them at a whiteboard and they don't know where the parentheses go in a function call.
(They'll write crap like "()f" or "f()a" when they want to call a function, stuff like that.)
I'd be a lot more comfortable with "We have a machine set up for you, but you can also bring your laptop," as long as the machine is actually well set up and you don't fall into the implicit expectation that passing candidates will bring their laptop anyway. (Most of them will, in the end.)
Still no access to a compiler or debugger, but baby steps.
Syntax is something novices screw up without fail. I saw this in my own subordinates, who simply would not see syntax errors in their own code on their own laptops, even with the red squiggly underline. Also, I'm sure Google has data on its interviewing process and found, in some important, measurable way, that there's a problem with hires who screwed up the syntax. Correspondingly, none of my decent subordinates ever screwed up syntax once. I think this is the least controversial part of their process because it is dumbfoundingly easy and low cost.
Engineering interviews are overall a mature process and I honestly doubt there's that much innovation in terms of raw skills discovery. TripleByte, for example, doubled its multiple choice question count from 18 to 36, introducing design questions, and now has a brief 45 minute free programming problem. Hardly a huge innovation.
Clearly what's immature is the feedback and communication to candidates that fail. At Google specifically, communication problems don't just crop up in threads full of rejects. Bad communication affects people working there, like product managers, admins and designers, who lack the skills they test for, despite the tests themselves not obviously corresponding to anything the engineer actually does day-to-day.
It's bad for morale to test for stuff that doesn't matter. That may explain why smart people report the engineering org is not welcoming to people of different backgrounds. The empathy (or savvy) needed to throw out stupid tests is the same kind you need to understand people who don't share the same culture or values as you.
Google doesn't really know that because the company only knows the things it measures. It takes insight too to know what to measure and how things are related, especially when dealing with human beings and not software.
That is to say, their interviewing process explores things at sufficient depth and breadth to make knowing "the" solution not a deciding factor.
On a related note, I write Scala on a daily basis, but Scala is not that well suited for coding interviews, I've found.
Kidding aside, why do you feel Scala is unsuited to coding interviews, assuming the interviewer understands the language well?
Keyword research and trends... https://trends.google.com/trends/explore?date=all&geo=US&q=G...
When you search modern interview topics and comments/opinions on the current process, you'll find a few SEs (typically having worked at places like Google at some point) who, on the side, sell training bootcamps, etc. These people will swear in every direction that it's a reasonable process, in comments around the web referring to their side business. Creating problems they provide solutions to: gatekeeping 101.
They don't test for good engineers -- they test for people who practice these style interviews, and for good new graduates.
It makes sense to ask these questions to new grads, but afterwards there is so much more experience that I feel like is much more important than acing data structures questions.
I am amazing at whiteboard questions, but that doesn't make me a good engineer. It's because I found the trick to solving these, and have practiced them. A lot of it is practice: 'ooo this looks like a graph problem, let me use a graph', etc.
That actually does make you "a good engineer". The vast majority of developers wouldn't even be able to recognize that much.
Prepping while working is draining.
That part is not completely correct. At the onsite you can choose to write code on a Chromebook, which will have a lightweight editor with syntax highlighting.
The most obvious change was to accept pseudocode (like in the olden times) to reduce the cognitive load of dealing with syntax, dynamic problem solving, and someone asking you random questions that shift your train of thought or change the problem description on the spot.
This now-common, poorly structured interview process does a lot to discourage high-level conversation. It tends to get hung up on language-specific syntax and data structure recall, and less on designing and analyzing solutions.
BS atop BS... it's BS all the way down.
Honestly this sounds like a thought experiment.
Do you have any ethical hangups about entering an environment where falling out of your favor can leave someone stress-crying in their place of work and/or about their livelihood?
If so, how much money would it take for you to join the system anyway?
How long would you stay in such a system if you found yourself already in one?
Back in the real world, in a business context, it sounds like an abusive workplace and an untenable system. Like, that obviously can't last forever.
Is this your idea of an engineering nirvana? Does engineering happiness have to come at the expense of manager happiness?
A very substantial point actually.
I'm not so sure this is better. I was an individual contributor for over ten years until I became a manager a couple of years ago, so I've seen both sides of this coin. My experience has been that typically engineers aren't interested in understanding all the non-technical things that are necessarily part of the decision making process, or worse, think of these things as beneath them, asinine, or "easy". Much of the feedback from engineers is negative, not useful due to an overly narrow focus on specific tech stacks or solutions, or lacking context. Sometimes the context isn't there because of poor management decisions to not be transparent, but often it's not there because the engineers' bias results in them not seeking it out. Very few things are black-and-white, and certainly it's better to include engineers in the decision making process at some level, but that's a two way street--maybe engineers could work a little harder to overcome their own erroneous biases.
Return question: Do you have a firm grasp on quite how much money it is?
    from collections import defaultdict

    def dups(seq):
        d = defaultdict(int)
        for x in seq:
            d[x] += 1
        return [k for k, v in d.items() if v > 1]
Assuming Python's defaultdict has O(1) average-case lookup/insertion (which I think it does), this algorithm is a proper O(n). Of course, once we start optimizing how the original array is stored, we may have exceeded the limits of this problem as a teaching exercise :)
As for time, it does seem to be O(n) to me; can you clarify why you think it's nlogn? It may not be particularly fast in practice when compared to other O(n) approaches like the bitmap, but I don't think the complexity is wrong.
Your solution is nice - it actually gives you more information (how many appearances, not just T/F >1 appearance), but it does require more additional space and isn't necessarily faster. I think the bitmap approach would be nicer if you're ok with using more space; the bitmap is essentially a very easy to find perfect hash function due to the unique input constraints.
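For comparison, here's a minimal sketch of the bitmap idea being discussed, assuming the input is known to contain only integers in a small fixed range [0, n) (that's the "unique input constraints" part); the function name and signature are illustrative, not from the article:

```python
def dups_bitmap(seq, n):
    """Return the values in seq (all in range [0, n)) that appear more than once."""
    seen = 0  # bit i is set once value i has been seen at least once
    dup = 0   # bit i is set once value i has been seen at least twice
    for x in seq:
        if seen & (1 << x):
            dup |= 1 << x
        else:
            seen |= 1 << x
    return [i for i in range(n) if dup & (1 << i)]
```

As noted, this only answers "seen more than once?" rather than counting appearances, but the two integers are the only extra space, and the bit position acts as the perfect hash.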
> Analyzing the above approach, we have an algorithm that takes O(n) time and uses O(1) space.
Yes, I confirmed that by pasting it into the REPL and verifying that it gives the wrong answer for a one-element list. Apparently the author failed to follow his own advice to always test code that you write.
In Silicon Valley, tech interviewing has become an arms race between applicants cramming to pass tech screens and interviews, and employers coming up with new routines. Sites like Glassdoor and CareerCup are loaded with interview questions that have appeared in those routines, giving savvy interviewees the opportunity to see the questions on the exam and prepare accordingly.
How do you feel about the existence of these sites, and do they affect how interviews are conducted?
This also discourages overly complex or overly familiarized questions. If the question is too complex, chances are it will end up posted online soon after, penalizing the interviewer in time cost. If a new question is recycled frequently, it will also likely end up online at some point and penalize interviewers from using questions they're overly familiar/biased in assessment to based on their own rote learning.
Good for them financially, maybe. Professionally, I didn't see them go beyond the average senior dev. But that's just me being sour.
If you just want to know what the interview for your role would be, the recruiter from Google will be happy to give you an overview.
But to answer your question, yes, absolutely.
This is not a thing. Everyone goes through the same level-adjusted loop.
> Another way is to be a significant contributor to some popular open source project.
LOL no. Google is literally famous for rejecting major open source contributors for not knowing how to invert a binary tree.
AFAIK, you could swap the child pointers and do that recursively for the child nodes. You could also do things O(1) by just changing the comparison function, perhaps by wrapping it to negate the comparison.
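The swap-the-pointers idea described above can be sketched in a few lines; this is my own illustration, with a hypothetical Node class rather than anything from the actual interview:

```python
class Node:
    """A minimal binary tree node, for illustration only."""
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert(node):
    """Mirror the tree by swapping each node's children, recursively."""
    if node is None:
        return None
    # Swap the child pointers, inverting each subtree as we go.
    node.left, node.right = invert(node.right), invert(node.left)
    return node
```

The comparison-function trick mentioned for the O(1) variant works because a mirrored search tree is just the same tree read under the negated ordering, so no pointers need to move at all.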
They probably didn't want the Homebrew author that much; we don't know the details.
As for popular open-source projects, I don't know about Guido van Rossum or Rob Pike, but Max Howell definitely whiteboarded in his interview: https://twitter.com/mxcl/status/608682016205344768
I have a very solid network inside google (ex-coworkers from another FAANG job) and also from xooglers, at IC and management track.
Even in my last interview failure (see my post here), the recruiter told me I had more than enough to support my application. Yet, an exec made the no-go call.