This is an interesting quote. From my experience and personal perspective, many of the best researchers and scientists doubt themselves, a lot, and are typically hesitant to make definite statements in general. Research is inherently high risk and prone to failure... that's fundamental to what makes it research. If you work in research for a while, you're wrong so often that it creates an environment of constant self-doubt and constant questioning of ideas.
On top of that, in my experience, the more I learn about an area or subject, the more I realize how little I knew before and how much more there is that I don't know. As the space of your knowledge grows, its surface area also increases: you eventually begin questioning things others fundamentally just accept, and the deeper you dig, the better you know where the current frontiers of uncertainty and knowledge truly lie. Combine that with the memory of where you started (knowing even less but thinking you knew more) and how wrong you were in hindsight, and it leads to lower confidence in your assessments, even if most might consider you an expert.
Isn't it exactly the other way around? Steve Jobs, Elon Musk, Mark Zuckerberg have a reality distortion field. The outliers in entrepreneurship sell a new world that they are going to create. When Elon Musk started talking about electric cars and space travel to Mars 99% of the people thought he was batshit crazy.
"I imagine 99% of your thought process is protecting your self-conception, and 98% of that is wrong."
Quote is at @1:23 (during the last half hour where the interview is mostly philosophical) of https://www.youtube.com/watch?v=Nb2tebYAaOA
In research, this process is usually a personal one (perhaps with a lot of discussion). But in industry, a CEO is giving orders and dragging a lot of people along with what feels wrong, and the CEO isn’t in a position to show deep self-doubt if it exists.
I'm not sure about the message here, but I am sure the wording Sam has chosen is very poor.
1. Bias towards action & clear-eyed => I think that's right, but there is another part of this too, that is more important as a founder – making decisions even under massive uncertainty. In a company, it's not just uncertain technical decisions, but also market decisions, cultural decisions, people decisions, etc. This is stomach-churning, and most researchers can focus on the technical challenges in ways that founders can't. You have to do this in research decisions too, but as a founder it feels like it happens way, way more often, across broader and broader sets of decisions.
2. One of the things that feels very different about founders is that you have to be honest about what the actual problems you have to solve are, and not turn your nose up at seemingly mundane but important tasks like managing a company. Great researchers are focused on their scientific problems over decades; founders are focused on building a lasting organization. These have pretty different consequences for what one chooses to spend their time on.
3. In academia at least, there are some really big differences between running a company and running a lab. In a lab, my main mission is training people, while working on problems I find interesting... slowly moving towards my long-term scientific/technical goals. In a company, it's building a product that people will buy, and slowly moving towards those same goals. Again, this has pretty big consequences for what one spends their time doing and the types of problems you get to solve. There are positives and negatives to both approaches, some of which are quite subtle. For example, reputation games are far more important in academia than industry – I also find authority becomes a lot more pernicious in academia than industry. Anyway, there's a lot here that is very different (but again this might be academia rather than research itself).
Is anyone else put off by the phrase "best people"? I get (or at least hope) it's a shorthand for "best at their respective job of researcher/founder," but it really seems to reduce people's innate worth and goodness to this single dimension in a somewhat unnerving way.
How does one define "best" at all in this context? If you devote all your efforts to researching a problem no one is looking at and still come up with nothing, are you still considered one of the "best people"? What if you are researching a problem many others are investigating and then do find something new? It feels very much like hindsight bias to apply such a moniker.
And it's a one-dimensional quantity, to a first approximation.
It sounds very generic, but I've found it to be true. If I spend time thinking about what's the best way forward, then just do it, relentlessly and persistently, and with a healthy disregard for cynicism and disbelief from others, I get a lot done.
It also reminds me of the concept of "taking ideas seriously": https://www.lesswrong.com/posts/Q8jyAdRYbieK8PtfT/taking-ide...
The most obvious is that researchers care about finding truth for its own sake even if that truth doesn't have any commercial value, and founders care about producing a product for a market, even if that means ignoring some truths.
I also disagree with a lot of how the idea is presented in that text, but the idea itself – that if you get convinced of a basic point, you should extract the second, third, fourth, and fifth order effects of that idea – is profound.
It reminds me of a story of a startup that did cybersecurity for SCADA systems, for factories. They would connect to diagnostics APIs, do anomaly detection, and could then alert on any cyber attacks.
Turns out factories are extremely sensitive to downtime (millions lost per hour of downtime), and a lot of them operate under "if it works don't touch it". So they pivoted - instead of actively tapping APIs, they would passively sniff network traffic, draw a picture of the network and what talked to what, and do anomaly detection on that.
But reality took the passivity idea seriously - and the value to factory operators ended up being visibility into the network topology. The company pivoted away from cybersecurity and into analytics and made a lot of money.
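The passive approach in that story boils down to a simple technique: build a baseline map of which hosts talk to which from observed traffic, then flag any communication pair you've never seen before. Here's a minimal sketch of that idea; the host names and flows are made up for illustration, and a real system would sniff packets rather than take tuples.

```python
from collections import defaultdict

def build_topology(flows):
    """Map each source host to the set of destinations it was seen talking to."""
    topology = defaultdict(set)
    for src, dst in flows:
        topology[src].add(dst)
    return topology

def find_anomalies(topology, new_flows):
    """Flag flows between hosts that never communicated during the baseline period."""
    return [(src, dst) for src, dst in new_flows
            if dst not in topology.get(src, set())]

# Baseline traffic passively observed during normal operation (hypothetical hosts)
baseline = [("plc-1", "historian"), ("plc-2", "historian"), ("hmi", "plc-1")]
topology = build_topology(baseline)

# Later traffic: an unfamiliar host talking to a PLC gets flagged
later = [("plc-1", "historian"), ("laptop-x", "plc-1")]
print(find_anomalies(topology, later))  # [('laptop-x', 'plc-1')]
```

Note that the topology map itself ("what talks to what") is exactly the network-visibility artifact the company ended up selling; the anomaly detection was almost a byproduct.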
https://www.lesswrong.com/posts/QePFiEKZ4R2KnxMkW/posts-i-re...
I've worked with researcher/founders a lot; many of the people from my PhD program (Biophysics, UCSF) went on to start companies (Amyris, Zymergen) and we had strong educational pathways to learn how to start biotech companies. The two groups of people are definitely drawn from a highly overlapping distribution, although many scientists would make poor founders, and vice versa.
I've spent my adult life in research environments (academic, nonprofit, and industrial R&D), and while much of the activity seems entrepreneurial (particularly grant writing), the overarching structural differences between building something for profit vs. for the public good make a lot of aspects of building a business a bit mysterious to me.
There are awfully skinny budgets for most research these days, and so much focus on 'success' (short-term ROI) and '[financial] sustainability' (translating research into products/services in business form).
This is growing ever more true even in basic research, which is IMHO absurd. It's getting to the point where it might as well just be 'D', with higher risks, less flexibility, and lower rewards – which is making entrepreneurship more alluring.
I don't know who is going to fund long-term research if the federal government doesn't. I suppose we can rely on the international market to produce research and hope it's useful. Businesses tend to be highly risk-averse these days.
That is in and of itself interesting, and the work (making earthquake forecasts and seismic hazard/risk models) is generally fun and has a lot more positive human impact than studying earthquakes because they are simply fascinating geophysical phenomena. But there are regularly a lot of great research ideas that go unexplored because we don't have the resources or immediate incentive to investigate them.
I get the sense that the work of almost all founders involves getting things from other people far more pervasively, from funding to hiring to organization building.
(Different attitudes to risk might also be a part of the difference.)
"This job would be great if it wasn't for the fucking customers" --Randal, Clerks
Lack of respect for traditional authority, entrenched interests, and boundaries would be my take. Forgiveness > permission mindset, with a healthy risk tolerance above baseline.
Founders take their research and drive towards profitable exploitation of that knowledge relentlessly.
This is an interesting take. A lot of researchers are pretty anti-authoritarian, at least initially, and the scientific process involves a lot of tearing down existing knowledge and rebuilding. We all really, deep down want to prove everyone else wrong.
However, when the funding comes from institutional sources, there are certainly limits on how rebellious one can actually be.
Furthermore the peer review process encourages a kind of camaraderie and politics where you compete with each other, and are actively tasked with finding fault in everyone else's work, but you are also stuck with them for decades, so you don't want to screw anyone over too hard, because their turn to review your grant proposal will come around soon.
> Founders take their research and drive towards profitable exploitation of that knowledge relentlessly.
Yeah, this latter part is what I've never really gotten. My goal is always to take my research and drive relentlessly towards... more research. Ideally while freely disseminating the products and tools used so that others can do the same, thereby letting everyone share in the fruits of the labor.
This is such an important issue in the startup world. The most common mistake that founders I’ve worked with make is that they focus on the wrong problem or even worse focus on too many problems.
Having good “problem taste” is critical for anyone who wants to start a successful company or publish breakthrough research.
The basic idea is that you need to work on an important problem. But an important problem isn't what you think (e.g. time-travel, teleportation, antigravity, etc.) -- instead it is a problem for which there exists an "attack".
I think it might be important to quantify these terms, but it's pretty hard to do so. If I worked on some idea for 2 months, am I persistent enough? And if I worked on it 10 hours a day, have I worked hard enough?
I guess you just know it when you work hard or are persistent enough, but sometimes you don't know, and you're hurting inside that you're not working hard enough or being persistent enough, as you don't see any success.
The author's brand makes it look very insightful, but if you look closely it's really clichéd. Yeah, no shit, successful people work hard on important problems; they have small-scale laser focus and also large-scale vision.
Seems like the wisdom tree has been plucked, these startup wisdom blogs are getting emptier and emptier (see also Paul Graham...).
Yeah, academic research labs are strikingly similar to seed-stage start-ups (mid-six to low-seven-figure annual burn for 3-20 employees laser-focused on a particular vision). It's not at all surprising that the two career tracks attract similar types of people.
I couldn't comprehend why anybody would watch daily videos from a guy/gal who does nothing but film him/herself doing stuff (Casey Neistat being the first, I believe), but I think I've figured it out.
It's like having an internet friend - if they like you, it no longer matters what you do, the same way you aren't having deep conversations with your friends, you're just 'hanging out'.
Sam Altman is doing the daily video version of hanging out, except he does it in blog format because he's an 'intellectual' or maybe just camera shy and the frequency seems to be a week or two apart.
It used to be that people would actually provide some value – a blog would at least aggregate interesting news stories (Daring Fireball) and provide some insight – but people have realized that all that work of actually reading, thinking, and providing insight is optional: you just need others to want to consume whatever you're providing, and the bar has turned out to be far lower than any intelligent person can readily comprehend.
There's also a great deal of 'ignorance is bliss' when it comes to people like Paul Graham. His posts on Twitter strike me as him sharing what he considers to be insightful or interesting. It's revealing that rather than actually studying people who've come before him and devoted their life to contemplation, he's perfectly content to have 'insights' about his children's latest quip. You can't fault someone for it and I don't think Paul has ever claimed to be an intellectual, so it is perfectly good that he gets to have his simple fun of re-discovering the tried and true, rather than working hard on attempting to discover the novel. It's when he generalizes his personal little joys into theories about the rest of the world without any felt need for diligence (besides editing) or response to feedback, that his simple-mindedness is revealed and catches people who haven't lived a while, off-guard. Sam Altman may fall into this category.
Mark Twain I think said: That man can pack the smallest ideas into the most words of any man I know.
Personal Hate: Essays that follow the NPR style of layering vast amounts of extraneous sub-anecdotes before getting to the point.
Personally I find the born talent view to be lazy and not a little bit creepy.
I was placed into the "gifted" program in the 1st grade of elementary school and told for many years that I was somehow special or "very" intelligent.
I never believed them, of course, because of two observations:
1. The adults who were telling me this did a lot of stupid stuff, which undermined the credibility of their claims.
2. Despite their best efforts to insulate us from the normal students, I knew people my age outside of the gifted program who were as clever as -- if not cleverer than -- my so-called "gifted" peers.
As an adult, I'm glad I never bought their hype. It's a one-way high-speed trip to narcissism, laziness, entitlement, and creepiness.
Not lazy icons these anyway.
If I wrote a post like this, people would very reasonably wonder what sort of experience I had informing these generalizations. Sam's background is very relevant for figuring out whether his thoughts are worth paying attention to here.
Sam, if you're reading this and want to challenge my hypothesis, all you need to do is make a pen name and register a domain name to go with it, then publish your next post of the same caliber under that persona and see how well it sinks/swims on HN.
But for some constructive criticism, there are some actual topics I'd be interested in hearing discussion about, on researchers vs founders. I've been a bit of both, and I'd say the more practical similarities are:
- "unlimited" freedom to work on what you think is important, usually in something you think is different, but with an existential constraint. For founders, it's the business model – your pitch deck needs a convincing business model to survive, regardless of the product (which is what a lot of founders really care about). Whereas in research, you need a long-term vision that attracts funding to survive, which can be deep expertise in something societally relevant, or evidence of success in doing something novel
- the game: there's sort of a game to play for both. With startups, there's the optimization of MAUs and acting like a startup and growing fast; there's the established ways of getting funding from angel investment to series of investments, attorneys and payments, and then different ways to exit. For research, there's the game of publishing, annual cycles of recruiting great students and advising, reputation and finding your niche, and the academic system in general.
- management: in both cases you're managing a small team, usually under 50 people – small enough that you know everyone and can be a bit involved in what they're doing, but big enough that you need a bit of hierarchy.
There's also some major differences:
- Equity vs reputation. Early startup employees work for less pay (moreso in the past) for the chance their equity will be highly valuable. Early stage researchers (PhD students or Postdocs) work for less pay for the chance to discover/invent something amazing to become a tenured professor or leading scientist.
- Formal mentorship credit: researchers get credit for being mentors for people that leave and do well later. PhD students are partly known for who their advisor is. When a student does well at an institution and goes to another one, the first institution is acknowledged indefinitely. Papers credit the authors as well as the institution before a single line of text. In startups, when someone amazing leaves it's a major negative thing. When someone says "GreatProgrammer was previously at Foo startup with HappyCTO" there isn't that same admiration for Foo startup or HappyCTO as if you say "GreatResearcher did their PhD at Foo University in Professor Happy's lab."
I don’t disagree, we should all be kind.
That said, this post gets voted to the top because it has (samaltman.com) next to it. If it had (jonnybeeble.blogspot.com), it’d get maybe a few upvotes and comments and that’d be that. But here it immediately gets upvoted to the front page, therefore receiving intensive scrutiny, and here we are at 80+ comments all kind of saying the same thing.
But I think the post called for examples of specific founders/researchers and their situations, e.g. how they manage going deep in weeds vs. steering long term vision, either practically or emotionally, is there something special in how they manage this? Or on persistence is he seeing people sacrificing weekends for 2 years in a row or mastering deep work practices or...
Twitter is asking you to login because you clicked the “tweet” button to post a tweet linking the article and tweeting requires a twitter account.
Sam Altman is the former president of YC.
Instead, it just kinda stops abruptly.
On a serious note, I beg you to write about things other than research and researchers. Leave them alone, outside of the media spotlight and your writings. You see, the media and its spotlight have a tendency to disrupt and destroy value. If you truly want to do good, leave them alone. Please.
Having met people from both groups, the other word I hear a lot is impact. That's a qualitative metric to define success.
I feel like I almost never have creative ideas - the entirety of my (short) engineering career has been spent working on school projects, contributing to a design team, or set projects at work.
Am I screwed if I want to be successful as a computer engineer? (specifically hardware)
But also keep in mind -- creativity is a muscle that can be flexed. Don't sell yourself short. Work on it.
Being able to do the work is really all it takes to be "successful" in the sense that you can support yourself and pay the bills.
Beyond that, it really depends on what your definition of "success" is. One of the biggest realizations on the path to maturity is that "success" has a different definition for basically every individual.
I mean, you're always welcome to give it a shot if you think you're so good. I can guarantee, though, that the results will surprise you. Most people can't even handle raising a child, the most important work there is.