Worst pay, top-heavy salaries.
When a PhD makes 80k a year and an “ML/AI” data scientist is lucky to make 100k, you won’t see the kind of progress software has.
They need to cut the top-heavy executive bloat and respect the mid tier with better pay.
HR asked what sort of salary range we were looking at; we suggested that we wouldn't get any decent candidates for less than 70k EUR, were laughed out of the room, and they decided on a 50k limit. I've since left, but I'm pretty sure they still haven't managed to hire a software engineer.
Just relocate your lab to the US and associate it with a US university. Then you can hire a Master's/PhD student from Europe's top schools at that salary.
On the negative side, yeah, all the truly talented people will leave because the salaries in Europe can be so laughably low.
Trying to figure out how to move back to my home country now, but short of director level positions, there’s just nothing with a comparable salary range.
Different story in Switzerland, where you can easily get CHF 140k+.
In most of Europe you can find a good young engineer/developer with several years of experience for €50k.
Porting it to Java will speed it up a few x,
and if it uses some silly library like pandas or numpy or Spark, then consider it a great time to rewrite it from scratch properly, with no dependencies :)
It just sounds like completely different systems / ways to run the numbers to me, not at all apples to apples.
Generally the US pays better, and at the top of any field you care to mention (except perhaps finance) it pays far, far better.
[1] https://en.wikipedia.org/wiki/List_of_countries_by_household...
https://www.bls.gov/oes/current/oes151256.htm#st
https://www.bls.gov/ooh/computer-and-information-technology/...
At those income levels, pretty much the only thing you’re really priced out of is a nice single-family home, but I suspect that’s the same in Zurich.
We will see salaries in the UK increase relative to Germany's.
Lower salaries make it easier for businesses to compete.
Shameless plug: I'm CTO at Streamline Genomics, a Canadian biotech startup, and tech is our limit. We're remote-first and we're hiring for a bunch of positions: https://www.streamlinegenomics.com/careers
A parallel way is to start a company that only does one part, say a software based part. Pharma will pay more for tools that solve their problems than they would pay for a couple of FTEs to solve their problem directly. The problem is your customer may not know how to put your product to use, or even understand its value. So sales is no slam dunk.
I loved my time in the pharma business but was happy to go back to what’s called “tech.” Culturally, the life sciences C-suites are full of some dreadful pathological practices, and I prefer the pathological practices I’m familiar with :-(.
Facebook, as an extreme example of profitability, can pay its directors and EMs $1m+ per year in total comp, but it's not top-heavy - it pays its engineers ~$200k out of undergrad and >$350k after a few years.
By contrast, I know of many cool hyped-up hardware unicorns and biotech companies, and none can compare to FAANG in pay because software scales in a way that other businesses can't. (One hardware unicorn pays its new grad SWEs $60-70k).
Unless the biotech company is actually a biotech-focused SaaS company, it's inherently going to have a higher unit cost that prevents software-level comp. It's a disappointing effect of the current system we live in.
Besides the top heavy compensation problem, the other problem I had with biotech is that everything is swamped in patents, NDAs, secret patents, secret NDAs and in the US there's also a lot of spooky secret government bullshit. I worked on information security so I'm not that scared of three letter agencies, but in my short time dealing with biotech they were actually not letting me do my job, which had never happened before.
Speaking as someone who has spent the last six years of their career working on advanced physics in various technology sectors (including biotech) and then trying to make various 2D-xene materials work for semiconductors, I’ll tell you one thing:
They pay you shit, and if you think you’re treated badly in FAANG, hoooboy: at least nobody at FAANG has nearly caused deaths in the lab through negligence!
No one thinks that. The workload at some places may be a little on the high side, but on average, in terms of both money and toll on life, FAANG is probably one of the best jobs.
Citation needed. I'm pretty sure innovation is happening in all fields.
> Software is no longer such a field, our brightest minds should be going elsewhere.
Citation needed. Pretty sure innovation is happening in software too.
This post is... utterly fact-free, and really just reality-free. It literally says nothing besides "biotech excites me". The author had no need to add a bunch of false generalizations on top of that.
I think a fact-based article exploring whether software really does have a slowing (or just slow) pace of ideas and innovation would be much more interesting than this one. But not every article has to be factual reporting and synthesis; some can be of the form "I believe this is how the world looks, and I draw these conclusions starting from that belief".
Personally, I relate to his premise, though I'm unsure it's true. It does seem to me that there aren't many new ideas in the industry. We seem to "just" be doing things we've been able to do for a very long time, but making them more accessible to more people, more efficient (with respect to human time while not necessarily computer time), and more scalable. We seem to me to be doing a lot of streamlining but not a huge amount of innovating.
I have a lot of non-programmer friends who sometimes say that programming must be very dry and boring. My shiny go-to example to convince them otherwise is a nice desktop planetarium app, which you can't develop without first learning how the solar system works. Once you do, you can write that down in code: an executable, computing form of knowledge, a living document that lets you tinker, refine, share, reproduce (a toy version is sketched below). Software truly is pretty neat as a human societal tool with a wide range of applications.
It should be for everyone. The other thing I tell them: if you've ever lain in bed in the morning and planned out your steps for how to get that cup of coffee you need, designing an efficient bed-to-coffee algorithm, you've already been a programmer.
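Just to illustrate, here's a toy version of that planetarium idea in TypeScript. The orbital periods are real; everything else (circular orbits, planets starting at angle zero at the epoch) is simplified for the sketch:

```typescript
// Toy planetarium: roughly where is each planet along its orbit today?
// Circular-orbit approximation; a real app also needs each planet's
// position at the epoch, eccentricity, inclination, and so on.

interface Planet {
  name: string;
  periodDays: number; // orbital period around the Sun
}

const planets: Planet[] = [
  { name: "Mercury", periodDays: 87.97 },
  { name: "Venus", periodDays: 224.7 },
  { name: "Earth", periodDays: 365.25 },
  { name: "Mars", periodDays: 686.98 },
];

// Days elapsed since the J2000 epoch (2000-01-01 12:00 UTC),
// a standard reference point in astronomy.
function daysSinceJ2000(date: Date): number {
  return (date.getTime() - Date.UTC(2000, 0, 1, 12)) / 86_400_000;
}

// Fraction of the current revolution completed, as an angle in [0, 360).
function orbitalAngle(planet: Planet, date: Date): number {
  const revolutions = daysSinceJ2000(date) / planet.periodDays;
  return (((revolutions % 1) + 1) % 1) * 360;
}

for (const p of planets) {
  console.log(`${p.name}: ${orbitalAngle(p, new Date()).toFixed(1)} deg`);
}
```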
As a relative dummy, I guess I'll remain in software.
It takes a long time to develop biotech, test it, approve it, market it, and make money. There is also a limited market (ie the people sick with that condition, specifically in rich countries). The reason tech companies make money, grow/iterate, pay more, is because they are in a field that does not require the same oversight and moral safety obligations (maybe they should to an extent) as well as being marketable to basically everyone in rich countries.
It seems to be equal parts users stuck in a local maximum of computing skill and the lax software engineering standards that enables.
When’s the last time you sat down with a user who isn’t remotely interested in tech and watched them use a computer? Most of the population’s mental model of a computer is starkly different from the average Hacker News reader’s. You can still hear the same complaints about how computers “don’t do what I want them to do” that I remember my parents’ generation making, and they were experiencing the first waves of computerisation in their offices.
The story became that the older generation just couldn’t understand the new generation, but that kids are amazing with computers because they’re growing up with them. Well, some of those kids are just as hopeless. It’s partly an education problem (it’s hard to learn computing from a teacher who doesn’t understand it themselves), and partly because UI design trended towards simplifying everything as much as possible so that users who don’t understand computing can still enjoy and use their devices. Now there’s not much incentive to learn more than you need to use the UI you’re given, and computing skill tends to get stuck in this local maximum.
I won’t go on about my other point in detail, as it’s a perennial favourite for Hacker News discussion: hardware gets faster so quickly, but our software is so hastily thrown together that it eats up all the gains. Users don’t notice that the software they’re using is crap because their mental model of computing isn’t developed enough to know what’s happening. Instead we get this casting of devices as somewhat malevolent entities (“ugh, my stupid computer keeps losing my stuff; I need to buy a new one that isn’t so dumb”).
We used to think this would be resolved with time and generational change, but it seems like there’s just a more-or-less static percentage of the population that doesn’t get computers. (Which is completely understandable; people have different interests, and it’s hard to inculcate an appreciation of something in your entire population. Look at people’s relationships with mathematics.)
Ever written JavaScript? Good luck doing real maths with only floats (see the sketch below).
Ever used Electron? Hope you didn't need that 8 GB of RAM.
I could go on, but most mainstream software is one step above absolute garbage. There are isolated islands of extremely high quality tools, but they tend to be esoteric FLOSS packages that are isolated from market pressures. OpenBSD is a work of art.
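For the JavaScript point above, here are a few lines anyone can paste into a console (written as TypeScript here; the Number semantics are identical):

```typescript
// JavaScript/TypeScript has a single numeric type: the 64-bit IEEE-754 float.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// Integers are only exact up to 2^53 - 1:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
const a: number = 9007199254740992;
const b: number = 9007199254740993;
console.log(a === b); // true (!) since both round to the same float

// BigInt (ES2020) restores exact integers, though not exact decimals:
console.log(9007199254740992n + 1n); // 9007199254740993n
```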
The biotech industry (which is made up of at least three rather different verticals: tools, diagnostics, and therapeutics) is changing quite a bit today. There are certainly companies that have less of a technology or data emphasis or who are still trying to figure out the value those could bring, and those companies are far less likely to pay well in SW/DS roles. There are others that either from their inception or more recently realize the value these approaches can deliver and compensate accordingly. I personally find the new wave of biotech startups that are focused on being hybrids of experimental and computational capabilities extremely exciting (which is why I'm at one) and these are the firms where software and mathematical skill sets are most likely to be valued.
You'll probably still make more on Wall Street than you would in biotech. But you don't have to be _badly_ paid in order to work on a meaningful mission. OP is, IMO, correct that biology is entering a phase in which computational skills are a rate-limiting factor in our ability to make advances (note: not _the_ limit -- experiment is still absolutely critical), and it's a super exciting and impactful field to be in.
Shameless plug: Recursion is hiring a TON of positions in data science and machine learning, engineering, and elsewhere. Check us out: https://www.recursion.com/careers. (Contact info is in my bio.)
I work in "Wall Street".
My firm sells things to those who want to buy them in a highly regulated marketplace with considerable governmental and self-regulatory oversight.
We do this on behalf of investors who entrust us to use their capital as fiduciaries for their, and their clients', best interests.
We solve challenging problems with cutting edge approaches involving non-trivial technical, statistical, and business considerations.
And we are not paid _poorly_ for our efforts.
Please stop pushing a strawman financial industry narrative that we've no meaningful mission to serve your industry's recruiting needs. I don't poo on biotech to source my hires.
(Of course none of this constitutes financial, or personal, advice.)
I understand the point of places like Fidelity or Charles Schwab, that offer the general public effective, low-cost investment vehicles. It's hard to deny that's a meaningful, socially beneficial thing to do. But when people punch at "Wall Street," they usually have in mind places like G-Research, whose only real purpose seems to be making an already rich owner richer.
My point was that _if_ one's primary goal is to get the highest comp possible, it's probably better to go into finance than biotech. No bones about that; we don't pay as well. But that does not mean that biotech universally pays _badly_ for technical skills, as casual readers might assume based on reading this thread. My point was that if someone is both technically-minded and interested in the mission in biotech, it's worth taking a closer look, especially at the wave of companies from the last ten years bringing tech and biotech together.
* Not everything, I'm sure, but neither is everything in biotech!
If you're exploring the world of AI for drug discovery, IMO it's important not to fall for the trap of companies that just do "AI on publicly available data". That's just the same "the fundamental problem of software is a lack of ideas" phenomenon the article was talking about. You want to meld software and data science skills with the fundamental work of whatever industry you want to be the vanguard of.
Seriously, if you're in the market, check out Recursion. My contact is in my bio also, and I'll answer any questions Imran can't (or won't :p).
We've been doing a lot of remote work during covid and I have definitely noticed the decline of water-cooler solidarity. We still do hire remote engineers and data scientists, but only in cases where they are the literal top of their field.
[0] Standard disclaimers: I work at Recursion, have drunk the Kool-Aid, etc etc
Which I think is fair. Innovation does also occur where it doesn't necessarily need to.
the brightest don't spend their time & energy making others wealthy
Fields where you don't make other people wealthy aren't so rosy either. They bring their own drama to the table.
The brightest absolutely do spend their time and energy making others wealthy (ex: I would consider most senior eng at FAANG to be bright and although they are definitely rich, they are not wealthy). I suspect that's because of the cycle of responsibilities and spending most of their incomes.
I've seen millions of dollars' worth of software development waste at FAANG and BigCo. Nobody bats an eye, because it's a bizarro world with no consequences. All you gotta do is not get wrapped up in it and collect checks ;)
As such it isn't mutually exclusive to be a software engineer while working in a field where tech is the limit. (But even so in my opinion, software itself is still just getting its bearings)
"Ideas" were and still are largely worthless. They are absolutely not the bottleneck in software today. There are a billion implementation-level problems that are still unsolved, and there will always be new ones.
Like, I've heard senior leadership at a 'computation-focused' biotech company outright call engineers 'bad people' only to quickly correct this to 'bad at being people', which is so much better!
We have capability-based security as a model for computers that don't get taken out by any flaw anywhere, but... like the doctors who refused to believe that washing hands saved lives, most programmers don't believe in it, or have never heard of it (a sketch of the idea is below).
Computers used to be leading edge because anyone could just get a machine and start hacking away at it, with physical hardware being the only limit. Now our operating environments are about as secure as a forest during high fire season... one little spark, and poof, your house is gone.
We're about 10 years out, not because of a lack of ideas, but because of a lack of adoption of technology that works, instead of the old stuff extended way too far at a bad local maximum.
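A minimal sketch of the capability idea in TypeScript, shown only as a coding discipline (the paths and names here are made up; a real capability system enforces this at the language or OS level rather than by convention):

```typescript
import * as fs from "fs";

// Ambient authority: any function can touch the whole filesystem.
function logLineAmbient(line: string) {
  // Nothing stops this function from also reading ~/.ssh/id_rsa.
  fs.appendFileSync("/var/log/app.log", line + "\n");
}

// Capability style: code can only use the authority it is explicitly handed.
interface AppendCap {
  append(line: string): void;
}

// The capability closes over exactly one file; holders can append, nothing else.
function makeAppendCap(path: string): AppendCap {
  return { append: (line) => fs.appendFileSync(path, line + "\n") };
}

function logLineCap(log: AppendCap, line: string) {
  log.append(line); // no filesystem-wide authority in scope here
}

logLineCap(makeAppendCap("/var/log/app.log"), "hello");
```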
Innovation as a goal sounds noble initially, but in my experience it's like chasing the wind. Faithfully doing what is already known to be good seems better for everyone. It might even be the quicker road to innovation.
As to your second point, we did have people with exceptional sense of social responsibility in software, but multiple factors, including easy/fast money, unspoken agendas, both co-opted and corrupted some, and drew in 'fresh blood' that was motivated purely by the new social cachet & easy fortunes of software work. (These are the ones Alan Kay would say are the 'pop artists' of 'pop software'.)
As to innovation and "limits", they are legion: everything from the hardware, OS, and libraries to the languages is on the table for innovating, to say nothing of theoretical breakthroughs in pure comp-sci.
Software is the closest thing to magic we have in the modern world. The imaginal sky is the limit.
One option is research software engineering, where SWEs team up with researchers to produce better code for models and simulations. Are there any research fields where synthesis of domain knowledge, programming skills, and computational thinking could bring great benefits?
I imagine that technology in any kind of manufacturing or mining field is going to be similar: dominated by one or two big players that haven’t faced an innovative competitor in decades.
Ditto for Wall Street. The “innovation” tends to sit on top of woefully outdated systems rather than replace them.
I think the speed at which they were able to develop mrna vaccines just shows how far along we've come, but also how much more we have to go. Things like protein folding at deepmind definitely requires all these things you mention.
Most of the time when someone is saying something that amounts to "I can't imagine what else we could make", it's a failure of their imagination that's the problem.
Sure, it can be frustrating to be banging your head against the same wall as everybody else, but there are people that thrive in such a setting. The most extreme example might be pure mathematicians.
I'm not even sure what to think of it, honestly.
But I can think of all sorts of jobs that are more important to society than working on some random doomed-to-fail SaaS that won't pay remotely near that.
In the case of research positions, the funding situation has oversaturated the market in most entry level positions - turning negotiation and career advancement into a trial by fire.
One thing I've always wondered about biotech... I imagine there are many non-obvious correlations and interactions in medicine, which would be easily detected using nothing more advanced than Excel-spreadsheet level data analysis.
Making up an example: people with a certain DNA trait/allele who also have a diet with a high amount of XYZ tend to not develop disease ABC as frequently as most people. Even if we don't know the pharmacological reason why that is, it would still massively benefit lots of people, right?
So it always seems to me like tech from 2007 was ready to tackle this problem. Dump in a bunch of anonymized data, find correlations, repeat (sketched below).
But I feel like I never hear anything about this type of work. Is it happening, but not publicized much? Is it actually not as simple as it sounds? Does nature simply not work in this way?
Even if 95% of diseases are just "bad luck", I assume that other 5% is made up of environmental factors we don't yet understand, but could easily learn using well-known data processing techniques?
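The naive version of that pipeline really is simple. A sketch in TypeScript, with made-up field names:

```typescript
// Naive "dump in data, find correlations": for each (trait, disease) pair,
// compare disease rates with vs. without the trait.

interface Patient {
  traits: Set<string>;   // e.g. alleles, dietary factors
  diseases: Set<string>;
}

function riskRatio(patients: Patient[], trait: string, disease: string): number {
  let withTrait = 0, sickWith = 0, withoutTrait = 0, sickWithout = 0;
  for (const p of patients) {
    if (p.traits.has(trait)) {
      withTrait++;
      if (p.diseases.has(disease)) sickWith++;
    } else {
      withoutTrait++;
      if (p.diseases.has(disease)) sickWithout++;
    }
  }
  if (withTrait === 0 || withoutTrait === 0 || sickWithout === 0) return NaN;
  // >1 means the trait is associated with more disease, <1 with less.
  return (sickWith / withTrait) / (sickWithout / withoutTrait);
}

// The catch: scan a million (trait, disease) pairs and chance alone hands
// you thousands of "significant" ratios, so multiple-testing corrections
// (and a lot more statistics) are mandatory.
```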
So... anyway, you're right that this is a natural way to approach the question of understanding the genetic basis of disease and physiology. But it's been beaten to death, essentially as genome-wide association studies [1], and found to drive fewer insights than were hoped.
[1] https://en.wikipedia.org/wiki/Genome-wide_association_study
But finding a correlated gene is only the first step. One issue is that a single protein can participate in hundreds of seemingly unrelated chemical reactions throughout the body, depending on the cell type and environment, so simply tweaking the gene's expression will have many unintended consequences.
For instance, each cell is constantly maintaining a bafflingly complex balance between growing and dying. Any external perturbation has a good chance of either killing the cell or causing cancer.
The important part is “and there is promising tech on the horizon”
I think trying to get a startup going based on space travel at relativistic speeds would be pretty difficult.
Steve Jobs was a master of this. Seeing promising tech trends that were just about ready, and putting them together at just the right time to make innovations that were world changing.
I feel like the author has an overly one-dimensional definition of both innovation and what it means for someone to be one of "our brightest minds".
There are two different goods and two different talents at play here. The first is taking an idea that already has been had and making it possible. The second is inventing new ideas. Both are goods, but each requires very different talents.
Biotechnology desperately needs people who, given a great idea, can break technological barriers and enable it. If we accept that software is bottlenecked by ideas, then software desperately needs people who can radically change paradigms. "Our greatest minds" consist of both types of people.
Because better technology won't get the entrepreneur a satisfying reward.
Your enjoyment of it hinges on whether you can be happy collecting a paycheck doing the bare minimum and satisfying your tech itch outside of work (FOSS, side hustles). For some, that is perfectly acceptable or even ideal, especially if you can get away with working fully remote.
For someone who is relatively better at ideating, I would argue the opposite is true.
There are some important differences that are often overlooked by those coming from the world of computing...
I thought this in the 1980s.
The caveat, of course, is that I'm still young with no ties.
Chomsky was talking about this some 15 years ago.
1) If you are able to bring a major drug to market 3 months earlier, it's worth billions (a blockbuster grossing $4B a year books roughly $1B per extra quarter on the market). Hence the continued interest in computational approaches.
2) Salaries in the pharma/biotech biz are set nationally. Yeah, there are variations by geography, but less than one would expect. Thus, a PhD with x years can look up the salary range per region, etc.
3) The data is confusing and the error range(s) are unknown. So many/most of the models are retrospective rather than prospective, and if the initial guess at the biological target or model fails, everything else is a waste of time. Google all of the failures re: Alzheimer's.
3b) As we can't test on humans (at least not ethically), we're totally dependent on animal models being good predictors of human behaviour. But, while chimps are like 98% similar to humans, the difference has resulted in catastrophic failures in Phase 1 testing. Diseases by the score have been cured in mice...
4) Computational modelling occurs at the start of the process, which is where it's most efficient. I think they had a sequence for the mRNA vaccine a few days after the Chinese published the data. Getting it made, stable, and deliverable is where the time was consumed. And then the various clinical trials are significant costs in time and money. It's hard to trust a model for a new class of disease or mechanism.
5) Computational methodology has been (over)sold since the '60s. Yeah, there have been successes, but they've been way fewer than hoped, and people have grown rather jaded when presented with the latest breakthrough. ML/AI isn't really new; it was studied in the '90s, but there's way more data now. See (3) above.
6) The crystal doesn't always form. The reaction yields brown oil rather than white powder, or doesn't scale. Chemistry is messy. And there's a lot of material design problems that have not been amenable to modelling. There are new ways of gathering information (CryoEM), but we still need more/better.
7) We need newer software and better parameterization. Both of these trace back to academic work on VAXen, maybe SGIs. Visualization software is probably the most valuable tool right now, with broad acceptance in the research stage.
7b) Physics might bite us in the ass. MD software, for example, tries to model explicit protein, ligand and solvent atoms/molecules. Even given revised software and parameterization, entropy or chaos might prevent accurate numbers or what we can calculate might not be pertinent.
I could go on (and on), but I wanted to leave you with an upside... If anybody DOES deliver the goods, they'll be bloody heroes. Fame, fortune, the whole gig - like CRISPR and the other advances that have occurred. So, if you and your buddies are smart and dedicated, it'll beat the snot out of selling ads on handhelds in terms of making a difference.