I don't remember the exact problem they wanted me to solve, but the answer involved a dynamic collection and they wanted it to grow with constant time complexity. They were probably looking for a linked list. But I said I'd use a dynamic array because those have constant time when averaged over a series of appends.
I don't know if I remembered the term "amortized complexity" or not, but it was clear that they had never heard of doing amortized analysis across a series of operations and they absolutely did not get my answer. They got hung up on the idea that some appends force the array to grow and get copied. I tried to explain that that was only true for a predictable fraction of them, but they were stuck on the idea that this meant dynamic arrays had O(n) worst case performance. They clearly thought I didn't know what the hell I was talking about.
I'm pretty sure that was the point where they decided to pass on me.
But now I'm a senior software engineer at Google and I just finished writing a textbook that explains dynamic arrays including their amortized analysis, so joke's on them.
Keep two arrays, of capacity n and 2n. Initially the first is half full and the second is empty. Reads go to the first array. On each append, write the new element into the first array and copy two elements from the first into the second. By the time the first array is full, it has been entirely copied into the second, so discard the first array, promote the second to be first, and allocate a new, doubled "next" array.
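A minimal sketch of that scheme in Python (class and variable names are mine, and it's append-only). One caveat: in this sketch, allocating and zeroing the new "next" array is itself O(n); a real implementation would need the allocator to hand back memory in O(1) for the bound to be strict:

```python
class DeamortizedArray:
    """Append-only dynamic array where growth work is spread across
    appends, so no single append pays the full O(n) copy."""

    def __init__(self, initial_capacity=4):
        self.cap = initial_capacity
        self.cur = [None] * self.cap        # active array; reads come from here
        self.nxt = [None] * (2 * self.cap)  # next array, filled incrementally
        self.size = 0                       # number of live elements in cur
        self.copied = 0                     # prefix of cur already mirrored into nxt

    def append(self, x):
        self.cur[self.size] = x
        self.size += 1
        # Mirror up to two elements per append into the next array.
        for _ in range(2):
            if self.copied < self.size:
                self.nxt[self.copied] = self.cur[self.copied]
                self.copied += 1
        # When the active array fills up, the mirroring has always just
        # caught up, so we can promote nxt in O(1).
        if self.size == self.cap:
            assert self.copied == self.size
            self.cur = self.nxt
            self.cap *= 2
            self.nxt = [None] * (2 * self.cap)
            self.copied = 0

    def __getitem__(self, i):
        if not 0 <= i < self.size:
            raise IndexError(i)
        return self.cur[i]

    def __len__(self):
        return self.size
```

Each append does one write and at most two element copies, so the per-operation cost is flat rather than amortized.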
The same trick can be applied to many amortized data structures (but it gets very complicated when you have many operations).
Then there is the fact that you basically make every insertion 3x as costly. You'd better have a good reason to need this, given the additional complexity and caveats.
As for the original interview question, there are systems where an occasional longer pause is not OK. Personally, I think that sticking to your guns may have come across as being stubborn and unyielding, and maybe not ready to admit errors. As an interviewer, and having worked with people who are inflexible and always think they're right, I would have seen it as a red flag.
Of the many scenarios where amortized complexity is not okay, code in a tight loop where predictable performance is key, e.g. code running game logic, jumps immediately to the top of the list. The fact that you were unable to incorporate this into the conversation makes me suspect you were more interested in putting on a show than in finding a practical solution to the task at hand. The worst-case performance for a dynamic array is O(n), but the amortized performance is constant. They aren't the same thing.
You may be a different, more mature, engineer nowadays. But, when I encounter the type of persona your story describes I tend to respond with strong no’s irrespective of whether you have since written a textbook.
Arrays used as backing for lists are faster than linked lists in almost all cases, assuming they are implemented correctly (as is the case in Java, which I'll use as an example).
Linked lists have a lot of huge downsides that are not easily captured in their naive big-O characterization. Big-O tells you nothing about how efficient things are in absolute terms: two algorithms can have the same big-O complexity yet hugely different costs.
For example, you can easily parallelize operations on array lists, but you can't on a linked list, where getting to the next node requires dereferencing a pointer.
Even without that, you get a bonus from data being prefetched into cache when searching through an array list or moving data around. Speculatively dereferencing pointers is nowhere close to a regular prefetch.
Array lists are also denser in memory than linked lists (which carry extra pointers per node). This means a lot of things just work more efficiently: the cache, memory transfers, prefetching, TLB lookups, etc.
Inserting at the end of an array list is as fast as inserting into a linked list (after accounting for the amortized cost), but what not many people appreciate is that finding an insertion point and inserting in the middle of an array list is also the same cost as, or even cheaper than, in a linked list.
That is because to find the insertion/deletion point you must first run a linear search over the linked list, and that costs a lot. In fact, it typically costs more than copying the same amount of array-list data.
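To make the comparison concrete, here's a sketch in Python of both sorted inserts (the Node class and function names are mine). Both are O(n), but the array version's linear part is one bulk shift of contiguous memory, while the list version's is a node-by-node pointer chase into potentially cold cache lines:

```python
import bisect

class Node:
    """Minimal singly linked list node."""
    __slots__ = ("value", "next")
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def sorted_insert_array(arr, x):
    # O(log n) binary search, then one bulk shift of the tail:
    # contiguous memory, prefetch-friendly.
    arr.insert(bisect.bisect_left(arr, x), x)

def sorted_insert_linked(head, x):
    # Must walk node by node; every step is a dependent pointer load.
    # Returns the (possibly new) head.
    if head is None or x <= head.value:
        return Node(x, head)
    node = head
    while node.next is not None and node.next.value < x:
        node = node.next
    node.next = Node(x, node.next)
    return head
```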
I think you would have a hard time finding someone who knows me describe me that way. Maybe the way I related the anecdote here doesn't present me well. I am pretty meek and easily flustered. I don't think I was any more confident back then than I am now which is, alas, not very.
> The fact that you were unable to incorporate this into the conversation
In this case, it was an interview with three other engineers simultaneously (a truly cursed interview format), so it was hard to incorporate much of anything into the conversation. It felt much like what I imagine a thesis examination to be, where the three of them were in charge of pacing and questions. (Also, it was ten years ago, so I'm sure my memory has faded.)
What I recall was them asking me how I'd do some sort of growable collection. I said I'd probably do a dynamic array. I mean, that is the way 99% of growable collections are done today—look at Java's ArrayList and C#'s List. This was a company that made strictly PC games, and I was interviewing for a tools position. Dynamic arrays are the right solution most of the time in that context.
They asked what the complexity was. I said something like "Constant time, across multiple appends." They didn't seem to get that and asked what the worst case was. I said some appends are O(n), but amortized across a series of them, it's constant. When I tried to clarify, they said they wanted to move on. I think in their minds I was hopelessly lost and they wanted to get to the next question so that I didn't embarrass myself further.
I would have been happy to have a more productive discussion about which problems amortized analysis was the right fit for. My impression was that they had never heard of amortized analysis at all, and thought I was confusing average case and worst case analysis (which I was not). From their perspective, I can see how I looked lost or wrong.
Overall, they had a superior tone that I found off-putting. (For comparison, I didn't get that impression from any of the interviews I had at Google the very next day. My Google interviewers were all kind, engaging, and really fun to talk to.)
This flippant remark followed by the egotistical follow up is the kind of art I read hacker news for, thank you.
If they were not dumb, they would definitely understand what he was saying and not sit clueless.
Working in, for example, Unity, I need to never allocate anything per frame if I can help it (while also knowing other gotchas, such as that foreach and/or getting the .Count on a list will also cause memory to be allocated).
(Anywhere you're servicing some kind of interactive request might qualify, depending on the size of your data structures.)
This is actually something I like to work through in interviews:
OK, you've got amortized-constant-time append to your array-backed collection, but now what if we need it to be truly constant time? How could we achieve this? (The same question & trick applies to hashtables.)
(The answer can be found, amongst other places, in the Go or redis source code.)
edit: or in pavpanchekha's comment, they beat me to it :-)
also, just realized you're the author of crafting interpreters. absolutely love that book!
Real time systems including games have hard limits on how long something can take in the worst case, not the average. In games this manifests as: All the stuff has to be done in 16ms to render a frame on time, if we're late the user will perceive a glitch. The consequences can be worse in control systems in any number of real-world systems where a glitch can lead to stability problems.
So getting back to TFA, this is part of knowing what the "right" answer is for the context.
1. Probably not used linked lists (contiguous layout means better cache efficiency)
2. Would try to understand their data requirements and allocate memory up front as much as possible - doing a similar amortized analysis the OP is suggesting rather than a generic "always have O(1) insertion" at the cost of using an inferior data structure (a linked list)
There's also the cost of copying objects, especially if you don't know whether the objects you're copying from your original array to the resized array have an overloaded copy constructor. Why copy these objects and incur that cost if you can choose a data structure that meets the requirements without this behaviour?
If you're holding pointers to these elements elsewhere, re-allocating invalidates those. Yes, you probably shouldn't do that, but games are generally trying to get the most performance from fixed hardware, so shortcuts like this will be taken. It's at least something to talk about in the interview.
I can see why they were confused by your answer, as it's really not suited to the constraints of games and the systems they run on.
You don't have to use a growth factor of 2. Multiplying the current size by any constant factor greater than 1 will give you amortized constant complexity.
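You can check that empirically with a small Python sketch (names are mine) that just counts element copies: for a growth factor g > 1, total copies settle near n/(g - 1), i.e. a constant number of copies per append:

```python
def total_copies(n_appends, growth=2.0, initial=1):
    """Count element copies made while appending n_appends items to a
    dynamic array that multiplies its capacity by `growth` when full."""
    cap, size, copies = initial, 0, 0
    for _ in range(n_appends):
        if size == cap:
            copies += size                     # reallocate: copy everything over
            cap = max(cap + 1, int(cap * growth))
        size += 1
    return copies

n = 1_000_000
for g in (1.5, 2, 4):
    print(g, total_copies(n, g) / n)  # copies per append, roughly 1/(g - 1)
```

Smaller factors waste less memory but copy more per append; larger factors do the opposite. Either way the ratio is a constant.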
> On more limited devices (see games consoles or mobile devices) you'll end up fragmenting your memory pretty quickly if you do that too often and the next time you try to increase your array you may not have a contiguous enough block to allocate the larger array.
If fragmentation is a concern, you can pre-allocate a fixed capacity. Or you can use a small-block allocator that avoids arbitrary fragmentation at some relatively minor cost in wasted space.
> Why copy these objects and incur that cost if you can choose a datastructure that meets their requirements without this behaviour.
We have an intuition that moving stuff around in memory is slow, but copying a big contiguous block of memory is actually quite fast on most CPUs. The cost of doing that is likely to be lower than the cost of the cache misses you'd take by using some non-contiguous collection like a linked list.
> as its really not suited to the constraints of games and the systems they run on.
For what it's worth, I was a senior software engineer at EA and shipped games on the DS, NGC, PS2, Xbox, X360, and PC.
That doesn't smell right to me, assuming you're talking about userspace applications on newer hardware. aarch64 supports at least 39-bit virtual addresses [1] and x86-64 supports at least 48-bit virtual addresses [2]. Have you actually had allocations fail on these systems due to virtual address space fragmentation?
Certainly this is something to consider when dealing with low-RAM devices with no MMU or on 32-bit, but the former hasn't applied to the device categories you mentioned in probably 20 years, and in 2021 the latter is at least the exception rather than the rule.
[1] https://www.kernel.org/doc/html/v5.8/arm64/memory.html
[2] https://en.wikipedia.org/wiki/X86-64#Virtual_address_space_d...
Q: What are mitochondria?
A: [Long complex scientific answer about ATP synthesis]
Grader: Wrong. Mitochondria are the powerhouse of the cell.
That was not the right answer.
Of course, arrays have better cache locality, but they didn't ask about that.
They were probably looking for a linked list with a small array in each node, which gives more consistent write performance and fewer cache misses on read.
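That structure is usually called an unrolled linked list. An append-only sketch in Python (names are mine): appends never trigger more than one small fixed-size allocation, and reads within a chunk scan contiguous memory:

```python
class Chunk:
    """One node of an unrolled linked list: a small block of elements."""
    __slots__ = ("items", "next")
    def __init__(self):
        self.items = []
        self.next = None

class UnrolledList:
    def __init__(self, chunk_size=64):
        self.chunk_size = chunk_size
        self.head = self.tail = Chunk()
        self.size = 0

    def append(self, x):
        if len(self.tail.items) == self.chunk_size:
            # Bounded, constant-size allocation: never an O(n) copy.
            self.tail.next = Chunk()
            self.tail = self.tail.next
        self.tail.items.append(x)
        self.size += 1

    def __iter__(self):
        node = self.head
        while node is not None:
            yield from node.items  # contiguous scan within each chunk
            node = node.next

    def __len__(self):
        return self.size
```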
Maybe so, but I'm just gonna go ahead and say - those interviewers were shit. Most interviewers are, so it's not really a judgement on them, but the problem in that room wasn't you.
For games, it absolutely does matter that only some appends trigger latency, because that causes stuttering in the game play. A linked list may be slower in most use cases...but the performance cost is fixed and can be easily designed around.
Edit: Ah, I just saw munificent linked to this in a reply elsewhere in the thread.
http://craftinginterpreters.com/chunks-of-bytecode.html#a-dy...
One of the main reasons I started writing it was to help my job search. Which, ironically, ended up not being necessary because I left the game industry. What's crazy to think about is that if I hadn't failed this interview, I probably wouldn't have gone to Google.
So this one weird failure to explain the big-O of dynamic arrays may have dramatically changed the course of my career. Or, who knows, maybe they failed me for other reasons.
The OP is basically saying any discussion of 'amortization' or anything past something very simple, was completely beyond them.
And your response, like many others here, is going way off into the weeds, suggesting 'what they were really expecting' etc.
I definitely understand HNers' willingness to go into the weeds for no apparent reason, but the lack of social comprehension here is really odd.
The OP has reiterated over and over the social context and yet everyone seems to be happy to dismiss it whilst providing their bit of extraneous extra input.
It goes on like a comedy.
"But they must have been expecting this niche/novel thing way over here in the corner, and well if you didn't get that ..."
It's as though the respondents are validating the poor ability of us techies to establish context.
Any specific requirements, it would seem, could have been discussed by the interviewer; frankly, without very specific contextual details there's no such thing as a 'right answer', because it always 'depends'.
And finally, why anyone would expect specific correct answers instead of a discussion about the constraints is also odd to begin with.
It'd be a huge plus if you can explain concepts in 5 minutes to people who don't know them.
Asked: "constant time complexity"
Answered: "constant time when averaged over a series of appends"
In an interview, the onus is really on candidate to answer the question as it was asked. Candidate did not answer question, and there was a breakdown in communication. Candidate's interpretation is that interviewers didn't know what candidate was talking about. Interviewers could just as easily have been frustrated that candidate did not see significance of the difference.
In my experience, interview technique requires establishing a common ground. As an interviewee, I find that the first step of answering a question is to first go through a number of possibilities starting with the most obvious first. They were most certainly looking for a linked list. This is a simple question with an obvious answer. It would have been good practice to start with the answer they obviously wanted, and then try to broaden their minds. Instead, candidate willfully derailed the interview.
To me this looks like a classic breakdown of communication with a root cause of hubris (quite possibly on both sides).
While both sides failed in this account, only one has the benefit of hindsight in writing about it here today. Time has offered an opportunity for reflection. What has candidate learned? "[look how clever I am] so joke's on them"
You'd be surprised! I've done probably 200+ interviews across a couple companies. Over time if anything my questions have gotten simpler.
I look for someone who understands the fundamentals of the stuff on their resume, asks good requirements questions, thinks a bit before leaping into the weeds, can explain their thought process, ideally does some of their own double-checking, and (if it comes up) can debug a problem I spot without my spelling out their error.
And last but not least: I don't want to cause a panic attack, which would be mean and tell me nothing. Sometimes my candidates have been too nervous for me to know if I should hire them, and it's not a good experience for anyone.
So I make the problem as simple as I can and make sure we get through the rest of that in as relaxed a fashion as I can manage. I'll ask a simple (often first-year CS) coding problem or a design problem, rarely both in the same interview slot. I never ask for tons of code on a whiteboard in a 45-minute slot.
For what it's worth, I didn't get a very good vibe from them. The impression I got was that their culture was more aggressive and competitive than I like. (Maybe this was to be expected for a company that made a competitive eSports game.) I'm not really what you'd call a brogrammer.
So it was probably for the best that they said no. I ended up at Google, which has been better for me in every possible way.
I would probably try to give a full answer, such as "there's a widely-held concept X, but my understanding is that this is not completely correct, and in fact a better model is Y." A technical lead that can't work with an answer like that is one I probably don't want to work with.
My approach would have been to go with the new idea and see what discussion can come out of it. If they don't have time to get into it, that's one thing but if they are simply not open-minded to new ideas, I don't want to work there.
Certainly when I got my first job I felt like I was there to take a job, not seeing if the company was a good fit.
I would have no problem saying "many people use MVC to mean 3 tier and other people use it to mean XYZ".
It is very difficult to be completely technically correct. It is also rare that one needs to be completely technically correct in conversation.
However, doing that in front of the tech-lead his boss is likely to be different from a fair discussion. Disagreement about technical architecture is fine. Doing so in front of higher management is likely a different ball game.
Getting into a lively technical discussion where you can see how well people control their egos to find the best course of action is very valuable. If I had a chance during an interview to find out how people behave in those circumstances, I would definitely go for it. Who knows, maybe the "boss" would be present in future technical discussions as well; I wouldn't want his presence to change the discourse negatively.
And as I've personally been in positions like that (being the tech lead with the boss on the call), I'm usually thrilled to learn new information and discuss technical things, raising my opinion of the interviewee greatly.
Maybe that's because I come from a very straightforward culture where political subterfuge is frowned upon, but it wouldn't actually look very good to me if a member of my team subtly altered a technical matter for the worse because of political pressure. Would I be able to trust that person? I would always have this doubt that they are "playing a deeper game", regardless of whether it was for my benefit or not.
And if the tech lead can't admit in front of their manager to not knowing everything? That's a sign of a toxic workplace.
I'm not ashamed to admit that I have previously found myself in a situation similar to the tech lead: the interviewee gave a solution that I didn't know about - one that could be considered better - and my manager was in the room interviewing with me. I told the interviewee that this is a new direction I haven't thought about until now. It's important, otherwise we can't continue an honest discussion about the solution.
You cover both bases: give the blogspam answer that most view as the "right" one, but also show that you have a much deeper understanding.
My favorite question when interviewing node.js candidates was around this type of ambiguity: "Is node.js single-threaded or multithreaded?" Most would immediately shout back "single threaded!", but the reality is a bit more complicated than that: it's a single-threaded event loop backed by a thread pool. I wouldn't ever hold a "single threaded!" answer against them, but I'd start asking leading questions about what happens when, say, 5 requests all come in at more or less the same time, to see if they really understood how things work. The point is not to gotcha! them; it's to start a discussion. Though if they reply "well, it's complicated..." then I know they're probably already there.
In the context of MVC though, I feel this is really overly pedantic. MVC is an idea that is never actually implemented in purity, and understanding the basic idea is all that is really needed, outside of maybe very specific roles implementing web frameworks.
Why not say "there are many similarities and some differences, they're similar in these ways..."
> But the problem was, that this guys manager was sitting next to him. If he didn’t know, I would totally humiliate him in front of his boss. So either he would stick to his guns and refuse the correctness of my answer, to save face. Or he needed to agree that he was wrong, and lose face.
I would love to ask the author if their team lead ever confirmed these suspicions. I would doubt it, but happy to amend my position if wrong. Regardless, to me this sounds like the exact type of culture I wouldn't want a new hire bringing in. I don't need to be managed or protected, especially from my own manager. I need honest answers, not what I want to hear. I wasn't in the room, I can't read the vibes like the author did, but I'll consider the omissions here telling-- there's no description of an anxious or frazzled team lead, or an imperious manager peering down. It seems like the only thing the author saved the team lead from were problems in their own head.
So it was actually good that I didn't raise this issue.
One thing I remember at that place was that the team lead was on holiday, and the manager took over. But some release went wrong because the manager made a mistake. He was able to spin the story around: the team lead had messed up right before his holiday, and he, the manager, had to step in to fix the situation. Oh boy.
This is a great cold reading technique that works in magic tricks too.
You have some trick where someone needs to pick from 10 cards. And your patter goes something like "Picture the card in your mind. Ace of hearts. Ace of hearts" If they give a big reaction then you've found their card and performed a miracle. If not then you just continue "Of course, that's just an example..." and continue the patter throwing out other hints. Of course, it's tough to make this your only trick but it can really elevate a good trick to amazing.
Great example at 1.50 here: https://www.youtube.com/watch?v=QI5-NDiY7IM
Every once in a while, purely by luck, the correct card would be at the top. In which case I would just change the second line to "...but I have brought it to the top of the deck for you". My sister begged me to tell her how I did it for years.
Keeping your mouth shut, saying no, and not lying or BSing are three of the most useful skills you can develop. Many people feel pressure to respond quickly, especially in a performance situation like an interview, just as the writer of this article was highly tuned to what his interviewer wanted to hear. Controlling the tempo and direction of your own responses is a way to gain the strategic initiative without being disagreeable.
(It's Monday. Smile a little.)
Wouldn't you have to do this on average five times to get the right one? Wouldn't it be a bit suspicious giving five example cards before arriving at the right one?
* "Does the author really mean what they are asking? Are the mistakes in the phrasing or corner cases intentional, meant to catch me, test my deep knowledge?", or
* "Is the author just not very good with logic / not thinking this through?"
I go with the latter in "soft social" contexts. Never regretted it yet.
This saved my hide recently in June, when I had to undergo a mandatory psychological examination for my gun license permit. Serious official stuff, with my permit on the line… And the "serious" psych test was exactly as "robust" as you'd expect from the field of psychology.
I answered the way I figured the test author construed the questions (= what they likely meant to ask), not what they actually asked. Easy pass.
I don't remember if the wording was specifically like this, but it was something where the logically obvious answer, to me, is "yes", while the correct answer they expect is "no", to show that you are a reasonable driver who won't speed unnecessarily.
And I think throughout the driving exam there were several questions like that. This type of thing really annoys me.
EDIT: I found the actual question - although I still had to translate it - and I think my memory was giving the question the benefit of the doubt with the term "planned" as I had written here. The actual question was set up as follows:
What affects the duration of a planned journey?
a) Length of the journey.
b) Maximum speed in a few isolated sections of the journey.
It uses checkboxes, so you can check both.
They expect you to check only a), but b) is in my view also a logically correct answer: if in any section, even one shorter than a metre, your max speed nears 0, the duration of your journey approaches infinity. But it is worded in such a way that most people will get the hint that b) should not be checked.
Something along the lines of "speeding in most urban areas doesn't actually help get someone to their destination faster. There are always multiple traffic lights and other cars to contend with, so it's not helpful to go over the speed limit and it increases the risk and severity of an accident." It's perfectly solid advice, who hasn't been passed aggressively by an ah-ole with tinted windows only to catch up to them at every traffic light?
The question obviously wants to screen for that tidbit of knowledge, but it's not phrased rigorously enough for the HN-crowd, I guess.
I put false without thinking. I had everything else on the test right and part of the course was the instructor reviewing your test and going over your incorrect answers. He wasn’t a programmer (I think former military tbh) and didn’t really understand why I got it wrong even after I explained it…
The free-response question was something like: “What is the single most important thing you can do to improve your safety as a driver?”
I answered: “Always drive sober.”
The “correct” answer was: “Fasten your seatbelt.”
I can only imagine the grader’s face / thoughts when they had to mark my answer as incorrect.
A vendor offers you season tickets to the Cubs games. Are you allowed to accept?
1) Yes, because neither of you are in the baseball business.
2) No, not under any circumstances.
3) Yes, if you give the tickets to someone else, such as your brother-in-law.
The "right answer" was quite obvious, making it more of an intelligence test than an ethics test.
I suppose in this case it might instead just be a formality? And no one has actually ever failed that test?
That said improvised psychology is as bad as anything improvised, only people are generally more comfortable improvising it (and indeed commenting on it) without knowledge of their lack of knowledge, as opposed to fields that look more like science such as software engineering.
I think they were just trying to elicit the idea that the model defines interaction with the database and that the view defines interaction with the browser client. That there is some relation there between MVC and 3-tier architecture. The Wikipedia snippet that disputes any relationship seems overly pedantic to me.
Though all moot to me since there's very little real world pure MVC or pure "3-tier" anyway. For good reason.
Sorry if my article gave you a wrong vibe.
"There are two views on this. Some people say x, but I would argue that the better view is y."
Giving the wrong answer intentionally is both dishonest - never a quality I would want to have, especially in an interview - and liable to backfire.
Then again, I'd never ask a vanilla architecture question like the company in OP's post, as I've found that the best predictors for job performance are also the very questions that are the hardest to grade. Deep, open questions on a vast field — where there isn't anything close to a correct answer but rather thousands.
I still often go with the good old "what happens when you enter xyz.com in your browser and hit enter?", just because I can take it anywhere I want.
One day, I hit my bonus question "so what happens after hitting enter and before the first request leaving your computer?"
And the guy was like "well, the microswitch pulls the line of the keyboard matrix from high to low, generating a scancode that's then turned into a keycode in the controller..."
5 Minutes later we were all over his self-built keyboard and he had a job offer in hand 30 minutes later. I don't even wait for HR for the good ones.
Unfortunately he got an offer in hardware, and actually I was glad for him.
I wrote something on a whiteboard. What followed was the most surreal discussion I've had in an interview. My function took into account the seconds, minutes and hour for the hour hand, and so on. Just like a normal clock would. The interviewer insisted this was wrong. I tried to tease out of him if we were talking about what to do if the specs made no sense, or if he wanted me to draw an unusual clock face. Nope. He just insisted I had no idea how clocks work. I spent the interview trying to understand what he was really asking of me as politely as I could while he spent the interview insisting I didn't know how something as basic as a clock works.
To this day I don't know what he really expected out of me. I wasn't surprised when I learned they didn't want to hire me. I doubt I would have wanted to work there after that interview.
So I proceed to answer the question, describing what I was doing and writing code, and at one point he stops me to say, "That won't work because X may not be the case." But X was one of the points of clarification up-front, so I called him out on it politely but firmly. I wasn't sure if it was [a] a language thing (he's a non-native English speaker with a fairly heavy accent), [b] an issue with experience (he was being reverse-shadowed in the interview), [c] him trying to test soft skills in some really bizarre way, or [d] some bullshit that won't fly.
I have no real problem with a,b,c, though the more experienced engineer doing a reverse shadow should step in when things go amiss, so I fault that engineer for jumping in. But regardless, I was thoroughly nonplussed by the way Google runs interviews, and I've essentially sworn them off. It wasn't until Facebook that I found a company that I want to work for even less.
Where would you want to work?
Lots of us didn't fully understand the question when we first started giving it, and it caused a lot of problems. I'm guessing you had the minute hand drawn as if it moved continuously, since you mention seconds.
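For what it's worth, the continuous-hands behaviour is easy to pin down in code once everyone agrees on it. A Python sketch (function names are mine):

```python
def hand_angles(hour, minute, second=0):
    """Angles in degrees, clockwise from 12, for hands that move
    continuously like a real analog clock (the hour hand creeps
    forward as the minutes pass)."""
    sec_angle = second * 6.0                                   # 360 / 60
    min_angle = (minute + second / 60.0) * 6.0
    hour_angle = ((hour % 12) + minute / 60.0 + second / 3600.0) * 30.0
    return hour_angle, min_angle, sec_angle

def hour_minute_angle(hour, minute, second=0):
    """Smaller angle between the hour and minute hands."""
    h, m, _ = hand_angles(hour, minute, second)
    diff = abs(h - m) % 360.0
    return min(diff, 360.0 - diff)

print(hour_minute_angle(3, 0))   # 90.0
print(hour_minute_angle(3, 30))  # 75.0, not 90: the hour hand has moved
```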
Let me apologize for either myself or my company :)
I'm fairly sure I left them convinced I didn't understand my own research.
Bear in mind, many TLs are in their current role by virtue of having had a good idea, supportive management, the trust of their team, and good/lucky execution. That combination can easily mean you have a great up-and-coming TL with ~3 years of experience. Management may be bringing in more senior talent as the growing product requires it, but that doesn't mean you start with the same trust, management support, ideas, or in-house knowledge as the TL.
In such a situation, setting out to prove that you can outsmart the junior TL as your first contribution seems suspect...
Bear in mind that successful companies often have many inexperienced leaders due to growth.
I don't really care what you currently know because software development paradigms can be learned with relative ease for a motivated and intelligent individual who already knows something.
So instead I:
1) Poke around their CV/resume, just ensuring that they know the things they list, to a basic level (some people just spam keywords)
2) Get them to a point where they admit they don't know something. Usually this means a deep dive on something common to the role I'm hiring for and that they have experience in.
If you're unwilling to say you don't know something I usually reject the candidate. I know interviews are stressful and you want to impress your interviewer, but if you're willing to try to bullshit me at an interview then you're willing to bullshit me in a post-mortem, and I can't have that.
Honesty - Intelligence - Experience
in that order, always.
Generally I would assume that multiplication and addition are constant time for big-O analysis, until given a reason otherwise. That might be less appropriate for Fibonacci than for most other problems, though.
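To make that caveat concrete (this is my illustration, not the parent commenter's): Fibonacci numbers grow exponentially, so fib(n) has roughly 0.694·n bits, and the "constant-time" additions near the end of the computation actually cost O(n) bit operations each. A quick sketch:

```python
# Sketch: why constant-time arithmetic is a shaky assumption for Fibonacci.
# fib(n) has about log2(phi) * n ~ 0.694n bits, so each big-integer addition
# late in the loop costs O(n) bit operations, making the "O(n) additions"
# algorithm closer to O(n^2) in bit operations.

def fib(n: int) -> int:
    """Iterative Fibonacci using n big-integer additions."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in (100, 1000, 10000):
    bits = fib(n).bit_length()
    # The ratio bits/n approaches log2(phi) ~ 0.694
    print(n, bits, round(bits / n, 3))
```

So for Fibonacci specifically, counting additions undercounts the real work once n gets large.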
I think it was a poor solution. There are ways to respectfully disagree.
"I really think I'm right on this one. Since it's a question of facts, not opinion, we could easily verify it later."
Did the author know they were hired because they didn't rock the boat, or despite it?
If I were hiring someone, I would want to know I'm hiring someone who can argue politely for what they think is right, and not see the argument in the context of "winning or losing", but in trying to find the best way forward.
Is it?
I think it's an opinion.
How would you empirically test if the answer was right or wrong? What's on Wikipedia is also someone's opinion, isn't it? What's in a book that Wikipedia references is again someone's opinion.
If you can't write a program or other set of instructions to test it... it's an opinion.
'How does Dr Foo say that MVC maps to database-server-client?' would be a question of fact though.
That said, I don't think I've seen MVC used with the same meaning twice... which makes the question even worse.
In one case where I was going for a C++ expert job, I got asked a question (that I can't remember now), gave the correct answer, which the interviewer disputed. I asked if they had a copy of TC++PL on hand, which they did, and I pointed out the relevant section. I got the job.
In another case, I was asked something complicated about "const" in C++ (which has a few gotchas), gave the right answer, but was still disputed. I got the job, and on my first day the guy that asked the question came up and apologised to me - he'd read up on it after.
IMHO, telling the truth about technical matters is always best; I might fib a bit about other things.
I've been passed over for a job because I did not recite the right buzzwords (framework names, etc.). Your job in an interview is to please the person across the table from you. Some interviewers are looking for a sharp technical mind unafraid to challenge them. Others view any challenge to what they think is right as a threat.
Once in a technical interview I was asked what data structure I would use for certain functionality "if performance was really critical." I said it would depend on the size and structure of the data that we needed to support, and when the interviewer said "unbounded," I said that the answer to that would go beyond an in-memory data structure, and if performance is critical you need to be able to project the sizes of data you need to support in the near future.
I could tell the interviewer thought my answer was ignorant and sloppy. He started giving me a few "hints," which showed that what he wanted was the data structure with the best big-O performance. So I told him what the best big-O performance for the problem was and which common data structure would provide it.
Then he said, "So you would use that?", wanting to put the question to rest and move on to the next one, and I could have said yes. But instead I said "maybe," and I told him I remembered Bjarne Stroustrup talking about how algorithms classes in computer science education give students the wrong idea about how software engineers choose data structures and algorithms in practice. The university version, he said, is that if performance isn't critical, you just pick a container with the right functionality, and then if it turns out that performance matters, you pick something with the best possible big-O characteristics to get ideal performance.
In reality (according to Stroustrup), when performance isn't critical, you should pick something with good big-O performance, and if it turns out that performance matters, you measure on realistic hardware with the data sizes and characteristics you need to support, and in many practical performance-critical cases you will end up choosing something with theoretically suboptimal big-O performance.
I told the interviewer I liked Stroustrup's approach, and I always used data structures with known good performance by default, but I would measure if it mattered. I didn't get the job, and that was probably for the best at that stage in my career. I've nevertheless ended up working in situations like that, where people living by ideas I knew well thought I was an idiot for not understanding them, when I really just didn't completely agree with them, and those situations did not end well.
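A toy version of that "measure, don't assume" point (my illustration, not the interviewer's question): looking up a key in a small sorted sequence. Binary search is O(log n) and a linear scan is O(n), but for small n the winner is not obvious from the asymptotics — it depends on constant factors, the interpreter, and the hardware, which is exactly why you measure.

```python
# Compare O(n) linear scan vs O(log n) binary search on small sorted lists.
# Which one wins at small n depends on implementation details (e.g. CPython's
# bisect is C-accelerated while the scan below is pure Python bytecode), so
# the clock, not the big-O, settles it on any given system.
import timeit
from bisect import bisect_left

def linear_contains(xs, x):
    for v in xs:
        if v == x:
            return True
        if v > x:        # sorted input lets us stop early
            return False
    return False

def binary_contains(xs, x):
    i = bisect_left(xs, x)
    return i < len(xs) and xs[i] == x

for n in (8, 64, 4096):
    xs = list(range(0, 2 * n, 2))   # sorted even numbers
    probe = n                        # a value that is present
    t_lin = timeit.timeit(lambda: linear_contains(xs, probe), number=20000)
    t_bin = timeit.timeit(lambda: binary_contains(xs, probe), number=20000)
    print(f"n={n:5d}  linear={t_lin:.4f}s  binary={t_bin:.4f}s")
```

Run it on your own machine; the crossover point (if any) is an empirical fact about your environment, not a property of the algorithms alone.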
Rails took a different path: they called their Ruby application code the "controller" and their template files the "view".
With the benefit of hindsight, I'm not at all confident we made the right choice.
I still think we were right from a pedantic point of view, but having to spend over a decade constantly explaining that "no, in Django the view layer is a different thing from the template layer" doesn't feel to me like it added much value for all of the extra effort!
I could plausibly argue that I got the job not by telling the interviewer what he wanted to hear, but by doubling down on the right answer and showing that I had the capacity to prove it.
I basically used the great-circle distance (using, I think, a CPAN module) as an approximation - this was for geotargeting AdWords.
Noting how many comments here are of the form "this is what /I/ would accept" or "what /I/ usually want out of /them/".
And the point was specifically how interviewers insert their personal issues into the questions and make others dance.
The article demonstrates an interviewee skillfully maneuvering around their interviewers. It undermines the dominance display which, judging by the comment section, more than a few folks enjoy but also don't want to admit to enjoying.
I'll go a step further and say the desire for dominance in this scenario stems less from avarice and more from insecurity. A new face who does their job too well, and knows things that the interviewer does not, is someone who can replace the interviewer.
I did not get a callback.
Architecture is more about deciding what direction your dependencies go and where your hard boundaries go
https://youtu.be/o_TH-Y78tt4 (approx 27 mins in)
Until then (and thanks to Smalltalk) it was a composite UI design pattern combining observable and mediator patterns. It was also a recursive pattern, in which the "editor" could itself be a model-view-controller.
Rails designers completely misunderstood it (or deliberately ignored it) and simply reused the terms for something that was only marginally similar, not recursive and was not inherently "active".
Other frameworks adopted Rails' terminology and now we are left with the original pattern having been completely forgotten.
"As an analogy, perhaps one could view..."
Two months later when that person runs across the wiki article:
"Ah, right that's why I said as an analogy perhaps one could view it that way. I thought you knew?"
The author is right: the MVC pattern can be considered from many different angles. It's possible to have MVC just on the client side (e.g. with React and Redux, the store is the model, components are the views, and the router is essentially a controller). React itself also acts as a controller, since it handles DOM diffing and the reactive state-update mechanism and thus serves as the glue logic between all the views.
I wouldn't want to work for a company with such a rigid view of software development.
It's a sign of seniority when you notice inconsistencies with terminology.
For example, I know some senior developers who had totally different ideas about what is 'unit testing' versus 'integration testing'. Both are valid views because the terminology is still currently ambiguous.
Does a unit test have to test one class in complete isolation (stubbing out all calls to dependencies)? Or is it OK to test a class along with its dependencies (no stubbing)? Some developers say that if you include dependencies you're testing more than one class, so it should be called an 'integration test'. Others claim it's still a unit test with respect to that class, and that integration tests must interact with the system from the outside via the API (not via method calls on a class). Either way, I think stubbing out dependencies is a bad idea in most cases (aside from mocking I/O calls to external systems like a database), so if I were to accept the definition of a unit test as one without dependencies, I would very rarely write unit tests... Anyway, this shows that even a simple, widely known term can be the subject of conflicting opinions, and it's wrong to criticize people for choosing a definition that doesn't match your own.
Software development doesn't have much global consensus nowadays and part of the problem is that companies are using bad hiring techniques to interview candidates; companies end up forming tribes of like-minded individuals and completely miss all these nuances and these debates.
Point being, "What is MVC?" is a very expansive question and I'd be very wary of working for any developer who thought there was one right answer to it.
The alternatives I've seen boil down to these trivia-style interviews, which come down to simply memorizing the answers the interviewer deemed important/correct. The most ridiculous cases include obscure questions such as "what would you do if [X debugging tool] stalled [Y application process]?" or "what would you do if you saw a server with high IO time?"
There are many versions of a correct answer to these, but odds are your interviewer has a specific one in mind. In a real discussion of these events there would likely be back and forth on root cause/severity/solution, but you simply can't have that back and forth in an interview. The starting impression will always devolve towards "this person doesn't know what they're talking about".
I couldn’t think of an answer, and then he explains “It’s quite simple - you would use dynamic_cast.”
I needed the job so I just smiled and said “Oh, that’s really cool - I didn’t know about that.”
There's GIAAAAANT focus on patterns, architecture, patterns once again
and even more talk/discussions about it, yet even people with years of experience get stuff wrong.
I feel like systems programming is simpler in that matter.
It gets complicated fast if you want to discuss more than very abstract generalities.
There are some traditional corporate HR-screening questions that have flummoxed tech people since forever. Those who are savvy about people-skills in a workspace know instinctively how to answer these questions, but some techies have a very hard time with them because they're either being radically honest or awkwardly trying to second-guess what the interviewer wants to hear.
The best thing one can do is practice. Interviewing is a skill (for both sides); it doesn't come naturally to most folks.
Not sure I did it right, but I basically walked the two char* pointers looking for a '\0' or a mismatch. The guy asking the question said, "No, first you should compare the string lengths -- since if they have different lengths they will be different."
I was nervous, but I thought he was wrong. I said, "Well, to get the length you need to walk both strings, so that isn't faster."
He got annoyed and said, "They have optimized functions for that!"
I didn't argue. Needless to say, I didn't get the job :)
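A back-of-the-envelope check of the commenter's point (my sketch, modeling NUL-terminated C strings in Python and counting character reads): computing the lengths first requires fully walking both strings before the comparison even starts, while a direct walk stops at the first mismatch or terminator.

```python
# Model C strings as NUL-terminated Python strings and count reads.
NUL = "\0"

def c_strlen(s, reads):
    """strlen must walk the string; count one read per character examined."""
    n = 0
    while s[n] != NUL:
        n += 1
        reads[0] += 1
    reads[0] += 1  # the read that saw the terminator
    return n

def direct_equal(a, b, reads):
    """Walk both strings in lockstep; stop at first mismatch or NUL."""
    i = 0
    while True:
        ca, cb = a[i], b[i]
        reads[0] += 2
        if ca != cb:
            return False
        if ca == NUL:
            return True
        i += 1

def length_first_equal(a, b, reads):
    """The interviewer's suggestion: compare lengths first."""
    if c_strlen(a, reads) != c_strlen(b, reads):
        return False
    return direct_equal(a, b, reads)

a = "hello world" + NUL
b = "hello there" + NUL
r1 = [0]; direct_equal(a, b, r1)
r2 = [0]; length_first_equal(a, b, r2)
print("direct walk reads:", r1[0], " length-first reads:", r2[0])
```

For equal-length strings the length-first version always does strictly more reads; real libc strcmp implementations do use word-at-a-time and SIMD tricks, but those speed up the walk itself rather than vindicate comparing lengths first.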
One could just as easily form this as a question.
“All three are usually at the application layer in my experience. Would you like me to discuss this, or give you a more general mapping?”
In either case, asking questions before answering is something any good interviewer should be looking for as a positive signal.
A general knowledge question like this, though, seems designed for a quick answer, so I’d only ask the question if I was really confused rather than splitting hairs.
Interviews are probably mostly useless to both sides. In the end, it's sink or swim.
I only know this from Indian culture. There, someone can "lose face" if you correct them in front of others.
A friend of mine attended a conference talk by a colleague. She noticed that the speaker had made fundamental mistakes in their statistical analysis when she asked whether they had performed some prior tests. The result was her manager telling her that she shouldn't ask questions in public anymore.
Personally I'd find it very difficult to knowingly give a wrong answer to a client or potential employer, even if it's clearly the one they want. After all, they're hiring me to find the right answer for them, not to agree with them. That's just me though.
This is why, as a blind person, I'm fundamentally screwed when looking for a job. I suspect situations like this contribute greatly to the 70% unemployment rate in the blind community.
Humans are important (and also illogical). If we are to work in a team, then we definitely need to factor humans into the equation.
That also means, that when we evaluate the employer, we should try to find out as much as possible about the human culture there. Even if they are all tech whizzes, if the team is broken, the job will be a nightmare.
I was talking to a guy a couple of days ago, about a job he quit after two days.
He noticed that every time the manager walked onto the floor, everyone put their heads down, and avoided eye contact. It was only a matter of time before the manager cut a victim out of the herd, and humiliated them in front of the others.
During the interview, this same manager was a font of friendliness. But on the floor, he was a tyrant.
I have found it's practical to go on a fishing trip while employed and test the limits, less pressure. Of course, this can backfire and did for me, but overall it was beneficial.
For example, if it's something like a high street bank, don't mess around, these are highly political institutions, similar can be said for large, established companies.
Sometimes, a wizard is required, sometimes a yes person is required.
It's good if you know some hr people personally as friends, listen to their stories.
"One Hacker Way Rational alternative of Agile - Erik Meijer"
Nowadays a Developer has to act a little bit like in this scene from Good Will Hunting.
Make yourself stupid...you will be hired.
That's providing an opinion, a very perilous thing.
Scrum can work, and it can also fail spectacularly. Often it just presses forward as well as any other process. By saying, "Scrum is awful" you're positioning yourself against the current trend in many businesses (similarly saying that about DevOps in many places, or DevSecOps if you go to the US gov't or DoD contractor). And since the value of Scrum is so dependent on the people doing the work and how they actually run things there's no universal statement that can be made about it anyways.
So don't; qualify it instead: I've seen Scrum at a past employer where they insisted on only code-based stories in each Sprint, which led to massive technical debt; eventually the project slowed to a crawl because they paid no attention to refactoring or other cleanup tasks until after they were bitten by it. If the stories are a mix of maintenance and development stories, it seems to be more successful.
Such a statement is hard to dispute (it's about personal experience) and is non-confrontational (you're not entirely dismissing Scrum but you're not endorsing it either, only offering insight into what seems to be a problem feature and a potential solution). You aren't making yourself stupid, but you're opening up a potential discussion. If the Scrum Master is present you can get into a discussion about their specific process as implemented in their organization. That can give you more useful insight than saying something negative as if it's a universal truth, when it's not.
In one case I argued for a while until we moved on. In the other, I was able to show the interviewer why I was right and they eventually saw my side. I'm not sure if this was poor communication on my part (definitely possible!) but I felt helpless. Still, I don't think I'd be able to intentionally say an incorrect answer just to get the job.
"Before I answer, can you clarify if we're talking about MVC as the term was used for Smalltalk applications or as it's used now to refer to web frameworks?"
If they look at you blankly about Smalltalk, you know what kind of answer you're supposed to give. If they smile and chuckle, you get to nerd the fuck out. :-)
"That term/phrase is actually used in a couple of different ways. Originally it referred to X. But it's sometimes also used to mean Y."
Because if I went to say that most projects need neither I would probably never get any job.
If SNL were to call you and offer you a job while you were working for us, would you take it?
I know what they wanted me to say but I thought it was a pretty dumb question. So I said:
Of course I’d take the job with SNL, but I didn’t move to this city for stand-up comedy, that’s just a side hobby.
I did not get the job and I guess they were looking for someone more dishonest.
A whole separate matter is the use of what amounts to a technical glossary being used as an evaluation criteria.
Interviewer was super nervous, visibly shaking! I poured myself, and him, a glass of water and took a sip. He took a sip, and visibly calmed down.
The answer to every "Do you know XYZ?" IT product was a meek "No". But I still got the job.
Their reasoning? You can teach a nice person technical things with proven interest. It is hard to teach a technically knowledgeable person to be nice.
Not to mention, the second answer is correct. Three-tier architecture and MVC don't have to be exactly the same to be related and have major elements that correspond to each other.
Going deeper, if anyone is interested: it's pretty common in MVC for the view to point at elements of the model. To make it concrete, imagine a Circle type composed of x, y, and r, and a CircleView that displays a Circle. The CircleView might point to an instance of Circle that is also in the model. As long as the controller put it in the view, and the view doesn't depend on the fact that it came from the model, that's perfectly valid MVC (IMO anyway; people will argue about this).

Interestingly, a similar situation arises in a three-tier architecture where storage is able to directly serve resources over standard protocols. E.g., imagine an application to store and share images where the images are stored in a KV store that also has an HTTP interface (like S3). When displaying images, the app layer could render HTML to the browser client with <img> elements whose src links directly to the storage. The client then bypasses the app layer and accesses the storage itself. This is perfectly valid for the same reason as the MVC case -- the client doesn't know or depend on the fact that the resource comes directly from the storage, and the middle tier still controls what the client is going to access.
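The Circle arrangement described above can be sketched in a few lines (the Controller wiring is my own filling-in; only Circle and CircleView come from the comment):

```python
# The view holds a reference to a model object, placed there by the
# controller; the view itself never reaches into the model.
from dataclasses import dataclass

@dataclass
class Circle:        # the model object: just data
    x: float
    y: float
    r: float

class CircleView:
    """Renders whatever Circle it was handed; it doesn't know or care
    that the same instance also lives in the model."""
    def __init__(self, circle: Circle):
        self.circle = circle

    def render(self) -> str:
        c = self.circle
        return f"circle at ({c.x}, {c.y}) radius {c.r}"

class Controller:
    def __init__(self, model: list):
        self.model = model

    def view_for(self, index: int) -> CircleView:
        # Hands the view a direct reference into the model -- analogous to
        # the <img src=...> pointing straight at the storage tier.
        return CircleView(self.model[index])

model = [Circle(0.0, 0.0, 1.0)]
view = Controller(model).view_for(0)
model[0].r = 2.0          # mutate via the model...
print(view.render())      # ...and the view sees the change
```

The aliasing is deliberate: the view stays ignorant of where the object came from, which is the property both the MVC case and the S3 case rely on.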
Being a lead does not make you infallible or even least wrong. You must always be learning and that means often being wrong.
Source: me, a lead
[1] - https://www.todaysmilitary.com/joining-eligibility/asvab-tes...
I don't regret it. =)
The better answer is along the lines of: Architecture patterns are abstract concepts whose implementation may vary. You can express an MVC Architecture without a database or without a GUI, however commonly…
No other topic in our design pattern study group would incite such heated debates.
Any more, any time someone refers to MVC, non ironically, I just mentally check out. Bozo bit style.
Now I wonder if it was designed to prevent this specific scenario.
Me: so you just want me to generate O(n lg n) bits of entropy in O(n) time? That's not possible.
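For context (my gloss on the quip): a comparison sort must distinguish all n! input orderings, and each yes/no comparison yields at most one bit of information, so it needs at least log2(n!) ≈ n·lg(n) comparisons by Stirling's approximation. Asking for that in O(n) comparisons is asking the impossible.

```python
# Check how close log2(n!) is to n*lg(n) -- the information-theoretic
# lower bound on comparisons for sorting n distinct elements.
from math import log2, factorial

for n in (16, 256, 4096):
    exact = log2(factorial(n))
    approx = n * log2(n)
    print(f"n={n:5d}  log2(n!)={exact:10.1f}  n*lg(n)={approx:10.1f}")
```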
> When I arrived at the interview, there were 2 other guys present. One would be my direct team lead, which was also the technical lead, and the other was his manager.
> Then they moved on to a follow up question: “How does this architecture relate to the model-view-controller pattern?”. I knew this question was really tricky, because I know a lot of people make the mistake of directly linking the tiers to each of the model-view-controller.
> Normally, I would have given the correct answer, and had a nice discussion if they considered it wrong. But the problem was that this guy’s manager was sitting next to him. If he didn’t know, I would totally humiliate him in front of his boss. So either he would stick to his guns and dispute the correctness of my answer, to save face, or he would need to agree that he was wrong, and lose face. Anyway, there was only one proper solution to this: I had to answer what they thought was correct.
> The moral of the story? Job interviews are not all about your technical skills, it’s about people skills too. And this is good, because you need both in your job.
One "consultant" technique I use in such situations is to say the correct thing, but while giving "credit" to the person who was actually wrong: "I think $JOE is right about $TOPIC, and what he's saying is that <proceed to making the _correct_ case>". I find that many will be persuaded and are more willing to accept the correction in this form, as they're not being told that they're wrong. If they still don't agree, then I can ratchet up to more pointed criticism: "oh then I misunderstood you...but isn't there a problem then because...."; but, if you can keep bringing them in by validating at least some what of they thought, it helps.
At the company where I work we convert all of our data to quaternions when transmitting over the wire or for data storage. In the gaming and robotics industry there is a misguided assumption that quaternions are always better, and at my company we force this assumption onto all engineers by using typed protobufs. We can never send Euler angles over the wire; we must always send quaternions.
This is actually fundamentally bad. Like it's not even a design question. It is by logic worse to store things as quaternions. Quaternions are only good for certain transformation calculations. They are not as good for data transmission or storage. So I made a proposal to offer alternatives but I was shot down even by the CEO (who took the time to personally make his own viewpoint known on the entire slack thread out of nowhere) because all of these people buy into the misguided notion that quaternions are always better.
The person I was talking to about this was so hell bent on believing that quaternions are better that if I pressed the point further I could start an all out conflict that could get me fired so I had to stop and pretend (aka lie) to agree.
The fact of the matter is, quaternions are a higher-entropy form of storage for rotation and orientation. You lose information when converting something to a quaternion, and this is fine for calculation but definitively bad when you choose quaternions for data storage or transmission. If you transmit or store things as Euler angles you CAN always convert them to a quaternion. The conversion is trivial and mostly a non-issue.
The problem is that once you have a quaternion you can't go back to Euler angles without additional assumptions. The back conversion is not one-to-one. So by forcing this format for storage you are limiting the future usefulness of this data by keeping it in a higher-entropy form.
Each quaternion is realized by TWO Euler angles within a 360-degree range of motion across 3 axes. When you convert something to a quaternion you cannot go backwards: you cannot recover the original Euler angle the quaternion came from, because you have two options to choose from.
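The simplest one-axis case of the ambiguity being described can be shown in a few lines (my illustration, not the commenter's code): q and -q encode the same rotation, so a receiver that only sees the quaternion can legitimately decode a 190-degree yaw as -170 degrees — the same orientation, but a physically different motion to realize it.

```python
# Single-axis (yaw about z) quaternion round trip, (w, x, y, z) convention.
from math import sin, cos, atan2, radians, degrees

def yaw_to_quat(theta_deg):
    h = radians(theta_deg) / 2.0
    return (cos(h), 0.0, 0.0, sin(h))

def quat_to_yaw(q):
    w, _, _, z = q
    return degrees(2.0 * atan2(z, w))

q = yaw_to_quat(190.0)
neg_q = tuple(-c for c in q)               # same rotation, opposite sign
print(quat_to_yaw(q), quat_to_yaw(neg_q))  # ~190.0 vs ~-170.0
```

Whether that ambiguity matters depends on the application — for rendering an orientation it doesn't, for commanding a physical gimbal's motion it can.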
For gaming this problem is not so apparent because you're in a virtual world and having everything exist in quaternions is ok because rotational orientations don't have to be realized by actual movement or rotations. The computer simply draws the object at the required orientation.
But real-world rotations HAVE to be realized by Euler angles. You cannot orient something in reality without actually turning it about an axis. Gimbal lock cannot be erased in the real world, and even the Apollo module suffered from this phenomenon despite the fact that the engineers knew about quaternions. People at my company seem to think the issue disappears once you switch everything to quaternions.
Thus, for something as simple as having one robot gimbal imitate another: if the communication protocol between them were exclusively quaternions (with no additional assumptions), the imitating robot can choose an alternative Euler angle to project its motion onto, and the two robots will not be in sync. Total information loss.
So, all in all, this proposal never went through. I was shut down by stubbornness and overconfidence from "robotics experts" who've been brainwashed by false dogma. The people I was proposing this to told me that I should trust the extensive experience of their backgrounds building self-driving cars at Uber and robots at CMU. I respect that, but can you not see the literal logic of the issue here? I don't respect people who aren't able to see logic.
The company culture is just part of the story; these falsehoods are likely held industry-wide, and you'd get these issues everywhere. False dogma is powerful. Try telling a Christian that walking on water is ludicrous when looking at it logically. It's the same issue here. People's brains will fight logic if it goes against their beliefs.
Very likely I'll even get replies to this post from people with so much confidence in quaternions that they'll come up with a retort that doesn't fully understand the problem I illustrated here.