There were certain areas where the interviewer messed up in their evaluation, e.g. they felt I "was a little weak at hashmaps and API design", which is probably because they did not know what I was talking about when I described the details of advanced hashmap implementations. There seems to be a bias to discredit the interviewee if the interviewer lacks knowledge in an area.
Either way, despite getting great evaluations, I was matched with a total of 5 companies, most of which were highly underwhelming early-stage companies with minimal traction. Furthermore, I was matched with full-stack companies despite being evaluated as "weak in API design", which is perplexing. I was able to get higher-quality offers in my own search, and it seems like the TripleByte pipeline consists of many mediocre companies.
If I had known, I wouldn't have wasted my time with this service and would have invested that time in my own job search instead.
I had a common point about hashmaps -- the interviewer seemed to be at a loss, and was asking weird questions that had little connection or relevance to the implementation path I'd chosen -- I politely explained the confusion and we swiftly moved on. They then marked that as a weak/"fuzzy" spot in their evaluation. I made sure to give my feedback about this to the person who shared the evaluation, but did not receive a response.
In the end, after making it to the company-matching phase, they found a whopping 1 company with <5 people, in an area I had clearly said I didn't plan on moving to. I don't know which part of the data-centric recruiting process got this so badly wrong.
Hoping that this process improves, but so far it hasn't lived up to expectations and I ended up finding multiple great matches on my own afterwards.
I ran the TripleByte course once, and it was 100% not worth my time as a professional.
The "coding test" was criminally simple, but got me in the door quickly for an interview. I spent a number of hours building out projects with a paired interviewer, as well as answering questions. This part I enjoyed; it felt like a nice back-and-forth while building an interesting bit of software. It was like an open discussion, and getting to tap away on my laptop was an enjoyable time.
Then came the technical questions. The interviewer asked if I knew anything about a specific Technology X. I'd list the tech here by name, but it's so specific I'm afraid of it being linked back to me. It's not something most engineers would run into.
I responded with "I have not worked with that, though I've heard of it" - it was also never listed on my resume or part of my professional work. The interviewer went ahead and simplified it down for us to discuss, much like "Ok, well it works like this, so let's chat abstractly". I went along with the discussion since I figured it would be fine to chat abstractly about a technology I'd never worked with.
The interview concluded, we parted ways and I thought things went very well.
A few days later I received an e-mail from Triplebyte. They praised my clean code and thought process, but specifically said my weakness in said Technology X - which, I'll call out again, I never worked with professionally nor ever listed as a skill on my resume - was too much to consider me for the next round.
TripleByte literally evaluated, and discounted me, on not knowing an uncommon bit of technology. Just what the hell.
I was shocked at the level of failure it took to reach this point. It was unfair to use that as any kind of benchmark, and unfair to waste a day of my time doing it. It was a smack in the face to an industry vet like myself.
I tell all job seekers to stay far, far away from TripleByte for this reason. They're not really changing the game at all, but like to pretend they are the magic answer.
One footnote: I'm an engineer at one of the giants (Google, Amazon, Apple, Microsoft, etc) who was and is way more qualified than anything TripleByte was or is pushing out.
If it's something commonly used by lots of developers and needed by lots of companies, I think it'd be fair to think of it as a bad signal at least.
The giants (Google, etc.) have great engineers, but they also often use technologies that are very different from what the general public uses.
Being an engineer from one of those companies is great, but it doesn't automatically qualify you for any job on the market.
I'm not sure if the fault lies with Triplebyte or Mixpanel, but it was an overall shitty experience. To be asked to take time out to complete a quiz (which can be scored automatically since it's multiple choice) and then get radio silence is terrible. Even if it was Mixpanel's fault, Triplebyte should ensure their partners actually follow up with applicants. For example, I know that Hired (another platform) will actually ding an employer that doesn't respond to applicants.
Anyways, Mixpanel & Triplebyte are probably on my "never apply to again ever" list.
Edit: fixed the formatting
That reminded me of this classic HN comment: https://news.ycombinator.com/item?id=12701625
The containing discussion is also relevant.
Interviewing and evaluating engineers is an area a lot of people feel passionately about and have strong opinions on. We're continually looking for ways to improve our process, if you've any thoughts or feedback please ping me - harj at triplebyte.
At our company, we try to painstakingly craft our recruiting experience to make sure each candidate we interview has a good experience and ends up with a positive impression of our company regardless of whether or not we end up sending them an offer. At the end of the day, we're all human beings, each with something unique to bring to the table, even if that something might not be what we're looking for for a particular role at the moment.
Maybe past some scale we'll have to start changing our approach and reducing candidates down to data points, "running experiments" on them like lab rats - as Google et al. do, and as these guys here seem to be so proud of themselves for doing - but I'd sooner quit than stay at a company that does that.
Engineering managers MUST do recruiting - never let HR take this from you. HR can do clerical work, but they should have little role in search and outreach. Instead: find new talent at local universities; for mid-level talent, give recruiter-like bonuses for employee referrals and poach from competitors; and for extraordinary talent, go look at the commit logs of the open source software you use and hire people from that list. Easy, cheap, and effective.
It's hard enough internally to track someone's performance over the course of the year or two after they get hired, it would be even harder to do it if you are a recruiting company.
It's especially sensitive because employers are wary of sharing employee performance data with third parties because of the high risk of a lawsuit (there is clear precedent for these lawsuits).
Once that data problem is bridged, it blows the problem right open: the data can be explored to figure out what exactly predicts a top performer, in any field.
Yes, assuming there is some top-level "data problem" to actually bridge here...
How do we know that the concept of "top performer" isn't just a completely divergent idea that means different things to different people and different companies in different industries and different geographic areas?
I would like to test my hypothesis that grads from the top 10 cs schools as determined by US News et al. are generally good, but overvalued.
At the end of the day, value is driven by how much political leverage a hire can give a manager, not by how much value they add to the org.
What I was interested in is whether the questions get harder if I answer well, but they seemed random.
You could use logistic regression to estimate the level of an interviewee and adjust the questions to get the same accuracy in less time (or to improve accuracy with the same number of questions/time).
You're right that tailoring question difficulty to ability level can drastically increase a test's accuracy. But while a logistic regression model works well when you have a fixed quiz or a low number of questions, it isn't flexible enough to work with a fully adaptive system like we have at Triplebyte. Our models are loosely based on the kinds of systems that the GMAT or GRE use, but we've implemented significant extensions on top of those approaches to fit our needs.
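For anyone curious what "tailoring question difficulty to ability" looks like mechanically, here is a toy sketch of the two-parameter logistic model from item response theory - the family of models behind adaptive tests like the GRE. This is purely illustrative, not Triplebyte's actual system; the gradient-step update and the learning rate are simplifications I've chosen for brevity:

```python
import math

def p_correct(ability, difficulty, discrimination=1.0):
    """2-parameter logistic (IRT) model: probability that a candidate of a
    given ability answers an item of a given difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-discrimination * (ability - difficulty)))

def update_ability(ability, difficulty, correct, lr=0.5):
    """One gradient step on the log-likelihood: nudge the ability estimate
    up after a correct answer, down after a miss."""
    observed = 1.0 if correct else 0.0
    return ability + lr * (observed - p_correct(ability, difficulty))

def next_difficulty(ability):
    """Maximum-information heuristic: the most informative next item is one
    the candidate has roughly 50% odds of answering correctly."""
    return ability

# Toy run: the estimate drifts upward on correct answers, downward on misses.
theta = 0.0
for difficulty, correct in [(0.0, True), (0.5, True), (1.0, False)]:
    theta = update_ability(theta, difficulty, correct)
    difficulty = next_difficulty(theta)
```

The appeal of the adaptive loop is exactly what the parent describes: by always serving items near the current ability estimate, you extract more information per question than a fixed quiz does.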
In a prior conversation on HN (link below), I brought up some aspects of my interview (interviewer late, argumentative, smug, etc.). Then the interviewer came on to HN and PUBLICLY SHARED PORTIONS OF MY INTERVIEW. Honestly, they should have been fired on the spot, but nope.
To the interviewer's credit, after I was the number one comment for most of the day he deleted that portion of the comment. I am grateful (looking back now) that it was removed; however, I think it speaks volumes.
The prior discussion is here:
https://news.ycombinator.com/item?id=13830444
My two cents: the idea is good - there is some room for improvement. What's scary is putting one company as a wall between you and the employer. I hope it never comes to pass that they control even 5% of the market. No one should be able to interview better than the company itself, and employees shouldn't use a service which, upon declining them, blocks them from other companies. I don't believe that's the case (yet), so no qualms for the time being.
Given my experience, I hope they've improved and would happily change my view if I had reason to.
EDIT: Added prior interaction for reference
That means the chance of being hired after doing a TripleByte interview is slightly under 1%, if my back-of-the-napkin calculation is accurate.
The first thing every company asked for was my resume; clearly they had not bought into the triplebyte process. Some seemed entirely unfamiliar with triplebyte.
Interviewing can be a sad process. Triplebyte gave me a taste for what things could be like, but didn't give me any advantage in the application process.
The companies triplebyte matched me with resulted in some of my worst interview experiences. Think disinterested CEOs, hostile lines of questioning, and a focus on my previous job experience vs. the things I would have liked to talk about (open source, personal projects).
EDIT: Their website layout is a classic agency layout.
> header with giant "sign up" button
> "top tech companies" in big print as a selling point
> huge section with the most "famous" companies in their client pool
> free cost (you're the product they're selling, so they're not looking out for a best fit - they're looking to get paid for placing you)
> testimonials
> blogroll that reads like it was built solely for SEO
Also, shouldn't we be concerned that giving one company's algorithms control over who gets hired puts too much power in too few hands? And algorithms are not neutral. The people who make the algorithms have biases and discriminate just like regular people do, but at least if your company does its own hiring you can work on figuring out what those biases are and how to address them. How can you do that if you depend on some proprietary algorithm?
And what about disabilities? How does your algorithm handle those? Racial bias? So many unanswerable questions.
I have many issues with the way most companies interview but giving up that process to a proprietary algorithm seems like the worst solution. This is not news to be celebrated.
Couldn't algorithms reduce racial bias by focusing on evaluating candidates independent of their race, and other personal attributes?
"The metric that companies care most about is what percentage of on-site interviews convert into hires, and the industry standard is 20 percent. Triplebyte's placement rate is 40 percent," says Taggar.
The current screening process provides a low signal of competence, and so companies have to rely more on credentials (degrees, previous company brands) during screening, which means that a lot of skilled people still can't get their feet in the door at companies if they don’t “look right”, and companies fight over a restricted talent pool.
Lack of hiring data for smaller companies means they copy larger companies' interview processes, but there's no strong forcing function to drive innovation in larger companies' hiring processes (i.e. their success could be despite a bad interviewing process - because they have a brand and offer a lot of perks, hence attracting the best talent, and so they aren't in a "we have to fix hiring or we will die" mode).
This also really hurts startups - who aren’t in positions to take risks with hiring, and with a lack of good evaluations, have to rely on credentials, which restricts their pool, and makes them compete with the big cos for that talent.
Another important implication of fixing hiring is that it will introduce a powerful forcing function on higher education institutions. If students know that they can get jobs without having “traditional” credentials, but if they can pass, say TripleByte’s, or some other company’s, assessment which is more aligned with what’s required on the job, and is a signal that companies believe in, then students can use money that they would have spent on college to instead actually learn the skills that would be useful on the job.
This movement of money out of higher education, would fund a lot more experiments in learning and education.
I can’t stress how important I think this problem is to solve, and I’m glad companies like TripleByte, interviewing.io, are working on it. We need more companies, more approaches, more experiments in this space.
This is largely just a software/technology problem. In all other professional industries there are means to validate a candidate's competency before they are allowed to interview for a position: licensing, required internships, legal certifications/authorizations, authorized relationships, and so forth.
Technology doesn't have this. The big difference is that in those other professions they are using the interview to actually interview the candidate, as in the person. In software and technology the entire interview is used to gauge basic competency and even then the trust relationship is inherently broken.
Contrary to what technologists will tell you the problem isn't the hiring process or low salaries (preposterous answer unless you live in the bay area). These are symptoms of a broken trust relationship. Hiring companies inherently do not trust the people they are interviewing as basically competent unless they have been told otherwise by somebody they know personally.
Hiring companies shouldn't trust a candidate is minimally competent, because there is no means to a standard baseline on which competency is measured. That is the primary problem. Solve for this problem and the resulting symptoms are easily addressed by the marketplace as a matter of economics.
---
The problem is very clear to see when you have two simultaneous careers: one as a software developer and a different one in an unrelated industry that has professionally addressed these concerns with required professional education and accreditation/licensing.
> there are means to validate a candidate's competency before they are allowed to interview for a position: licensing, required internships, legal certifications/authorizations, authorized relationships, and so forth.
The problems with credentials that you mention:
1. They are often weak signals of actual competence, and in the case that they are decent, there is still a lot of room for improvement through experimenting via a data driven process (current credentialing is, in many cases, outdated, and doesn't map to what actual work is like).
2. They are not accessible by everyone. This is problematic as the means to learning is becoming more accessible (through online education, etc.), but the credentialing is still restricted - since the institutions that hand them out haven't scaled credentialing. There is a lot of opportunity to provide signal for competence that scales... and measures skill that is actually used on the job (which is also changing as technology matures and penetrates other industries - we'll need a credentialing system that can adjust to those changes quickly).
In fact, I'll go as far as to say that this is a bigger problem in non-software industries. At least in software, there is a more objective way to measure a candidate's competence independent from the path they took to gain that competence. This means that people that might not have necessarily had a formal education / credentialing have a sliver of a chance of an opportunity to prove their skill. In other industries, if you don't have the credentialing, you have no shot.
I disagree. They are weak at separating the top 10% from the rest of the qualified people, but they are excellent at removing the people who have no business being there in the first place.
The first two that come to mind for most people are law and medical licenses. These licenses don't exist as job qualifiers; they exist as legal qualifiers. That means gross abuse of the license requirements is cause for lawsuits and serious criminal charges, even though most lawyers and doctors are corporate employees.
If programmers had the realization that gross negligence could land them in jail or cause them to lose their career and property in a lawsuit I suspect they would take their jobs more seriously than merely writing code.
Programmers don't just write code just like doctors don't just prescribe painkillers and soldiers don't just shoot people. They make numerous critical decisions that have real world implications. Examples of gross failure are simplistic known security breaches that allow confiscation of millions of credit card numbers and PII. Other examples include discriminatory and accessibility violating software products.
These are basic foundational qualities of competence. In any other industry, negligence of this magnitude would put you in prison. Since the baseline for hiring developers is so ridiculously low, these are considered advanced qualities, often handed off to third-party firms and only after threats of pending legal action. All we care about when hiring developers is whether they are literate and have a pulse.
Be serious, no change to any hiring process will fix that.
> They are not accessible by everyone.
Don't care. If a person wants access to a given career, they will find a way through their own internal motivation. If the industry wants to make careers more accessible, it will promote a desirable education path. This isn't some secret, legendary, arcane black magic.
I disagree that there are objective measures, and the vast amount of e-ink spilled on interviewing-practice debates is proof of that (tech interviewing also doesn't correlate with the actual work being done!).
I'm on the "tech should be open to all" side but it _is_ harder for companies to filter out potential bad candidates. That's why they make up their own filters like "we only want seniors".
Do you think requiring those credentials would benefit software industry as well? Is that enough for top companies to base their hiring decisions on?
I am thinking of every professional career other than software. Truck drivers without any education are substantially more regulated than a software developer writing life-saving applications for an MRI machine.
Is this why it's so hard to find work by applying through a job portal?
It's far easier to do online screening for software developers than it is for the softer skills - sales, marketing, etc.
I would speculate the total number of people the average HN reader would consider a software engineer is much lower.
https://news.ycombinator.com/item?id=13830444
Further down the comment thread, the interviewer came on and shared information about my interview (essentially because I called them out for being rude and smug - which IMO they were).
Their process is fantastic. I can see them replacing first round interviews entirely at some companies if they can look for all the candidates companies need, not just the most senior.
I'm glad to hear they're expanding.
Not nearly enough people in the hiring chain understand the importance of this.
Whether an employee can get along with co-workers is probably by far the most important metric in most jobs, yet it's mostly ignored because it's hard to test for. I guess companies hope they'll figure out if someone's a bad fit while they're still in a probationary period or something.
The most skilled worker in their field is useless if they can't cooperate and communicate effectively with others.
With the loyalty, willingness to learn and work ethic you can usually expect from an initially lower-skilled employee (along with lower wages/cost), a little training could turn them into a hugely valuable team member in fairly short order if a company makes an effort to ramp them up properly. Giving more of these people a shot will dramatically increase your odds of finding team members who work together well and become greater than the sum of their parts.
This idea of ignoring everyone who doesn't fit a ridiculously narrow criteria causes a whole lot of missed opportunities across the board. You end up hiring a bunch of elites-on-paper all trying to outmaneuver each other into the most possible money who will be gone in 2-3 years, while 'less attractive' candidates who would actually care about the work and tend to stick around get tossed in the garbage without even being seen.
What does this mean exactly? E.g. does the test successfully identify the people who have the best portfolios of things they’ve built previously?
Even with an algorithm doing the identification, any data produced is still going to be highly subjective (based on the "best portfolio" ideas of TripleByte and whoever worked on the algorithm).
After the initial quiz, I had a ~2-hour interview with a human, which included a 1h "pair programming" challenge, random technical questions in my field, general CS questions, and architecture (system design) questions.
Once I passed that step, my talent manager (the person who helps facilitate the discussion with the companies) told me, IIRC, that about 1 in 5 passes the human interview.
Based on my skill set and preferences, the system "offered" some 30-something companies, and I chose to have an introductory call with ~10 of them. Each company has some background information, what they're good for (in TripleByte's opinion), their general size and their engineering size. Some companies (the bigger names) have additional steps before the on-site, like another pair-programming session or take-home exercises.
From those calls, 5 on-site interviews stemmed, and 4 of those resulted in an offer. TripleByte also helped arrange the on-sites all in the same week, so if you're remote you don't have to fly back and forth all the time.
Top notch service, imo
I am a swe with 2-3 years of experience (maybe four, depending on how you count experience), speak at lots of large conferences, contribute to open source, etc. I feel (and have data to back this up) like Triplebyte sold me as a 5-7 years of experience person, so I only got interviews with companies looking for senior people. I really got along great with all the companies, and I have historically been a great judge of how my interviews have gone, but I think Triplebyte overselling me essentially caused me to waste a week of my life. Feel free to handwave and say I am blaming Triplebyte for my failures (of which I have many), but I do think I would have had 1-2 offers otherwise. Although, to be fair, the 130-140k comp most companies mentioned would be a significant pay cut and may not have been doable for me even if I had received an offer.
They put me through the standard interview process along with additional discussion about the interviewer position.
Overall they're much better than most startups but have room for improvement.
The good:
They kept me well-informed throughout the process and set proper expectations. They were prompt with followups and stuck to the schedule they set. My interviewers were knowledgeable, clearly software engineers.
The bad:
Too much focus on algos and CS fundamentals, not enough on higher level concepts and what makes someone a good fit for an available position.
I understand it's really hard to have a generic evaluation that covers multiple potential roles. That said, I believe that these questions do not select for good employees for most roles. They are biased to select for recent college grads and people who are willing to study before interviewing.
For example, they asked me about bloom filters. 95%+ of startup software jobs will never have to deal with bloom filters. Why would you ask about them? Ask something that will actually be encountered on the job.
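(For context, since it comes up: a bloom filter is a probabilistic set that answers "possibly present" or "definitely not present", trading a small false-positive rate for very little memory. A minimal sketch in Python - purely illustrative, and certainly not whatever was asked in the interview:)

```python
import hashlib

class BloomFilter:
    """Probabilistic set: membership queries may return false positives,
    but never false negatives."""

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = 0  # a plain int used as a bit array

    def _positions(self, item):
        # Derive several independent bit positions from salted hashes.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        # If any of the item's positions is unset, the item was never added.
        return all(self.bits & (1 << pos) for pos in self._positions(item))
```

Whether 95%+ of startup jobs need one is a separate question, but the concept itself fits in twenty lines.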
To add a little background I was responsible for our hiring process in my last position, including interviewing, so I'm a bit opinionated about this.
Edit: By the way, you should email them about the bug with your profile, they'd probably want to know about it / give you another shot.
After the initial test, there is a fairly long phone/remote desktop interview that consists of:
* Writing code (on your actual machine with the tools and language of your choice!) to solve a simple problem.
* Debugging (a smallish program with 5 failed unit tests)
* General knowledge questions (databases, the web (both HTML and HTTP), data structures, algorithms)
The phone interview then ended with them giving me a couple of tips on answering the non-technical interview topics that a lot of engineers flub (why do you want to work here, when can you start, compensation).
The next day I got a list of over a dozen positions with the recommendation that I pick at least 5 to move on to phone screening.
The phone screenings went well (3 of them were just variants of "all the candidates triplebyte has sent us were great, so we just want to talk about our company"). This was also my first hint that compensation would be an issue; one company was immediately ruled out because they were early stage and I can't pay a mortgage with equity.
Then triplebyte scheduled the on-sites all in the same week so that I wouldn't have to go back-and-forth to the bay area.
Ultimately Apple was the only company on my list paying enough to get me to relocate, and they passed on me.
Overall I don't think it was a good experience for me.
The interview was pleasant but long. Since they are more of a recruiting firm than anything, they are able to ask questions in a way that an organization looking to hire won't. It allows the candidate to be more candid and detailed with their knowledge and expectations. I thought it went well enough but I did not proceed to the company matching phase.
It is _very_ clear that they are looking for a specific type of developer - a type which I reckon probably doesn't need Triplebyte in the first place. I am not a web/rails/whatever developer, nor am I a senior engineer with a very niche skillset like low-level systems, etc.
The feedback I got was mostly positive, but contradicted itself in odd ways (pro: "we like your DB skills" / con: "work more with DBs") and really just translated to "you aren't marketable to startups and don't have the pedigree / experience needed to throw at our larger clients."
I suppose if you have a popular or incredibly niche skillset that is in demand, but are having trouble getting the attention of companies for whatever reason (it happens), TripleByte is a decent shot. But if you have more general experience / a more general skill-set, it may not be.
Ultimately they are a business and they have to operate this way, so I understand being turned down. I was, however, disappointed that I was 'let down' in a way by the vision I was pitched of what TripleByte claims to do / be.
This is the business model of “elite” universities - admit people who would be successful anyway then claim credit for their success
My only critique is that it seems a bit too front-end oriented these days for someone like me who is basically a deep backend person. They did have long-form infrastructure questions, but as I get asked to take their prototype tests every so often, it seems many of the new tests are just front-end oriented (i.e. language types). I'm guessing this is because that's where the market has taken them.
I think Triplebyte was really useful for me because I didn't have a CS degree and didn't have any work history directly in tech, so I couldn't get through resume screens via direct applications and didn't really know many people who could refer me. So Triplebyte had high value in getting me the actual technical interviews. I'm not sure how valuable it would be for others who don't have trouble with this, but given the minimal time investment (a few hours for the phone interview), it's probably worth a shot regardless.
Does anyone think that social proof could work here? If 15 peers endorse Sally for React Native and those 15 people are likewise found to be credible, could such a network effect be more valuable than a coding test?
But what if you could self-identify your skills (Erlang, Vue.js, etc.) and there was a low-friction way for your peers to +1 those skills?
I'd consider recommendations from CTOs or former Senior or Principal engineers to be the #1 indication that a person can do what they say they can do. Verify a bit more, but honestly the college-level micro-optimization whiteboard-only BS questions are mainly only useful for vetting new grads.
I got to interview with some pretty exciting/interesting companies.
The only problem with Triplebyte, in my opinion, is that I don’t think they track job success AFTER the hire. I imagine this is probably a problem they’re working on. But it’s hard to build a successful recruiting company if you don’t know what happens to the employees once they actually get hired.
The problem with raising the bar for interviewing engineers is that the work they end up doing isn't moving at the same pace. With more frameworks, better languages, and open source, building stuff is getting easier.
I think hiring is a difficult process because we need to work with others and people are different in general.
I have personally worked with people who started programming just because they were interested in it - they had no knowledge of algorithmic complexity but they were very open-minded, had a great perspective on the domain and were a pleasure to work with.
This is very anecdotal of course but I sincerely hope that they would have been able to make it past the online quiz...
(If you're thinking they should be smart enough to be able to game the quiz, then my question would be - why not just screen everyone in person then? Of course, that's not scalable and not worth the 50 million then...)
Thing is, a startup can't do like Amazon and actually dive into every random applicant. It's too much work. So a startup is limited and can only use recommendations to even think about interviewing someone.
Now, what if a company did the grunt work, selected a few of those random applicants, and submitted them to companies? That would bring a shitload of value, because now startups would have a new source of relevant applications to tap into.
I think triplebyte is actually a good investment.
As it happens, stock trading is a bad example of something scientific methods do not improve :)
This is the most exciting thing to me. I would love to use Triplebyte to try to find a position, but relocating is just not an option for me right now.
Marissa = Disaster. I hope she doesn't overemphasize her position as an investor and again run an enterprise into the ground.