I think of this in terms of the 'great man' theory of history.
https://en.wikipedia.org/wiki/Great_man_theory
And I feel it's obvious that science works the same way, and that we should focus on "science from below" a lot more than we do.
Charles Fort talked about "Steam Engine Time" and it gets referenced in Discworld:
https://fancyclopedia.org/Steam_Engine_Time_(concept)
There's similarly a book about all the 'genius' people Einstein was surrounded by, whom we don't generally hear about, but I can't recall the name right now.
There's the concept of 'scenius':
https://www.wired.com/2008/06/scenius-or-comm/
And in music/art there's the saying "The secret to creativity is knowing how to hide your sources," which (appositely) is often mis-credited to Einstein.
Also https://en.wikipedia.org/wiki/Stigler%27s_law_of_eponymy
Basically, I don't think you can look seriously into anything and come away thinking that one individual did it all themselves, unless you are very prone to that conclusion. And it seems humans generally are prone to that bias.
Incremental research flourishes again, exploring every nook and cranny of the newly discovered basin.
How do you get the person who makes the final breakthrough? Hire exotic characters who are uninfluenced by society's dogmatic powers: heretics, insane prophets, and grifters. Fill their days with the material so it saturates the subconscious. Finally, add some dogmatic priests to the mix and stand clear of the explosion.
Do you by any chance consider yourself one such? I hear this kind of rationalization a lot, almost always from people who feel that they themselves are candidates for the "misunderstood genius" role. It's a bit of cargo cult or selection bias (attributing outcomes to superficial and barely-related behavior), a bit egocentric, a bit misanthropic. What it's not is a formula for progress or innovation. Life is not an Ayn Rand novel, nor should we wish it to be.
A bit of "thinking outside the box" is certainly necessary for progress to occur, but that can be done in a collegial way - e.g. Feynman, most others involved in the Manhattan Project or early computing. Constantly deriding what others believe based on currently available evidence as "dogma" - five times in three short paragraphs, for example - is a huge red flag that something other than genius is at play.
Your mention of the Newton-Ortega distinction is the first I've heard of it by those names, so I'm not entirely familiar with its scope, but in reading the Wikipedia entry it seems to assume the contributions of the scientists in question are somehow known, and it's a matter of "does science progress with lots of little contributions or a few big contributions?"
You can turn this on its head though, and suggest that "contribution" really means "discussion in the literature," which is a property of the citers and not the cited. So if, say, a Newton comes along and has a brilliant discovery, but no one understands it or it goes into the wrong outlet and isn't read, then there will be no impact. Conversely, if something comes along and there's a rush of recognition of a concept, and someone publishes it first, is that because this "big innovation" was associated with that first publisher, or because the readers all had the same collective idea, and they're just citing the first past the post?
The issue is that scientific development is not actually a property of the discoverer -- the discovery is necessary but not sufficient -- and the "size" of a discovery and who makes that discovery aren't really the same thing either.
I think that e.g., (1) finding that science progresses in big leaps rather than small steps, and that (2) there is a "first post" phenomenon doesn't mean that the big leaps are necessarily due to the first poster.
My personal experience is that all of this bibliometric research is a little distorted, because so much rapid change in scientific practice has happened even in the last 20 years, and much of what actually happens in science and scientific credit is much more complex than bibliometric models allow. It's difficult to study big versus small contributions accurately when political maneuvering and social dynamics are such a big part of what happens.
It's interesting to think about, in any event.
The doc is on YouTube here: https://youtu.be/ObPg3ki9GOI
It was very clear that Newton in particular was building incrementally on work that other mathematicians had done before him. Leibniz was also making incremental improvements, but was apparently somewhat more visionary in his work. The YouTube comments are also interesting, including anecdotes of ancient Egyptian land surveyors using the same "sum of lines" technique (what Leibniz generalized into our modern antiderivative) in order to estimate the areas of unevenly-shaped land plots along waterways for taxation.
Leibniz in particular was fascinating because it shows the incrementality of science continuing after him as well. Apparently Leibniz had built several computing machines, 150 years before Babbage, and was deliberately trying to work towards general and abstract paradigms for solving mathematical problems, 300 years before Hilbert, Church, Gödel, and Turing!
For many achievements, most credit belongs to the role, not the person. In other words, the counterfactual "if we didn't all agree that somebody will do this" would do more to prevent the achievement than "if one specific person didn't have their talent." We have phrases like "the time is ripe" because we know this. When conditions are in place, things happen.
The person in the role is the one society has given the permission and the means to do the thing, whose doing of the thing is accepted and even expected, so it's mostly unremarkable when that specific person is the one who does it. That's who we supported in getting it done.
No one is surprised that every major fire is put out by firefighters, every touchdown pass is thrown by the quarterback, or every US law is passed by Congress and signed by the President.
Maybe Alice is uniquely qualified to be the president, and President Bob is a stooge. He's still the one to sign the law, though, and the law still gets signed. No amount of merit Alice has can change this. Society has still arranged itself to have a person in this role and Bob is just the person.
The arrangement (and the unavoidable path-dependence of history) is what magnifies the tiny contribution of the individual's abilities and makes them seem so necessary to the outcome. Given the prerequisites we've hidden in the role are met, the individual is all that's left.
Newton was a genius who invented calculus, but so did Leibniz. Einstein was a genius who invented relativity, but arguably so did others whose work we lump in with his; it makes a better story. They all had social institutions to support them in achieving whatever they could. Their ability to get it done, provided that support, was clearly not unique.
It's more important that the roles to do the work have people in them, the conditions are in place for someone to do it, than exactly who is in which role.
Does this mean no more awards? Like, will the Nobel prize always go "to the institution" that made the discovery? Should biographies be titled POTUS: Period X-Y?
You see what I mean? Part of why we attach credit to people is that people are inherently interesting. Institutions are inherently boring, and are supposed to be. Storytelling is maybe not as important as the actual achievement, but it's not nothing either. No child will ever want to grow up to be a nameless cog in the system.
Certainly those people worked with a groundswell, but it took their extreme individual interests (or lack thereof) to determine those big outcomes.
I think this is generally true across congress too -- stuff that just happens to have a powerful champion goes far, other equally popular issues languish. My wife works in hospice policy and the selection of members who just happen to have had a good (or bad!) experience with a family member in hospice really changes which bills happen.
The game Civilization was really great at representing this. Tech advancement is a DAG, rather than a chain or a tree. Reality is like this, but much, much more granular.
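To make the DAG point concrete, here's a minimal Python sketch (the technology names and prerequisites are invented for illustration, not taken from the game): a tech with multiple prerequisites, and prerequisites shared between techs, is exactly what makes the structure a DAG rather than a chain or a tree.

```python
# Tech advancement as a DAG: each tech maps to the set of techs it
# requires. "trade" has two parents, so this is not a tree; "writing"
# and "currency" sit on independent branches, so it is not a chain.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

prereqs = {
    "pottery": set(),
    "writing": {"pottery"},
    "bronze_working": set(),
    "currency": {"bronze_working"},
    "trade": {"writing", "currency"},  # multiple prerequisites
}

# Any valid "research order" is a topological sort of the DAG:
# every prerequisite appears before the techs that need it.
order = list(TopologicalSorter(prereqs).static_order())
pos = {tech: i for i, tech in enumerate(order)}
assert all(pos[p] < pos[t] for t, ps in prereqs.items() for p in ps)
```

The same shape shows up in real dependency systems (package managers, build tools), which is part of why the Civilization model rings true.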
For instance, I'm acquainted with the inventor of the electret microphone. I'm sure that mic brought together a whole bunch of advancements, and once I was aware that miniaturizing microphones was nontrivial, it became clear that, for example, pocketable cell phones would not exist if not for tiny microphones.
I see this disconnect all the time in how we think about technology and which innovations actually lead to major cultural changes. We get wowed by impressive inventions, like blockchain, but to date it has largely not penetrated society outside of its own expanding hype bubbles. It's pretty clear to me that blockchain is not, of itself, transformative. Nor is it the final link in enabling a society-changing breakthrough. The question I'd ask is whether that's because other boring, adjacent developments are missing, or because it's a relative dead end. I'm picking on blockchain, but a lot of other innovations could be chosen here.
In trying to predict the socio-technological future, it is important to become attuned to the boring, enabling technologies. And if I were an investor, I would also have to understand whether the economics of a potential enabling technology are attractive for achieving returns. I think of all the fiber that was laid in the 90s. Investors were right that it would eventually become really important, but they were wrong that it would be lucrative for them to invest in.
For the next 50 years Sagnac interferometry was a dead end, a minor curiosity in the history of physics. Then in 1963, Macek and Davis at the Sperry Gyroscope Co. figured out how to build a Sagnac interferometer in a laboratory environment with the recently invented laser. The coherent beam of a laser unlocked the usefulness of the Sagnac effect. Another 30-odd years of work by hundreds of people around the world got us to a situation where fiber-optic gyros are superior to mechanical gyroscopes and capable of things that mechanical gyros could never do. But all sorts of things with scary names like "anti-Shupe winding" had to be invented and then perfected to make these fiber-optic gyros so good, and that was the result of many people, who probably all knew each other through the conference circuit and in meetings, sharing ideas and then improving on each other's ideas. So who gets the credit for fiber-optic gyros? Sagnac? Laue? Macek and Davis? Shupe? What about the ring-laser gyro, which is different in engineering but also based on Sagnac interferometry?
So the Sagnac effect itself was worth nothing, and for a long time afterwards only a few scientists even knew about it. But a century later, and with the hard work of hundreds to thousands more people, the world depends on it.
In an alternate dimension, I got a Ph.D. in this sort of stuff; I am truly fascinated by it. But I decided to try to be one of the engineers of today rather than documenting what the engineers of the past did.
> Jasen would go so far as to argue that [some difficult achievements] makes [Wegener] the Togo
The more apt conclusion from the article is probably that the Balto/Togo theory simply isn't as good a model for the scientific discovery process as the author was hoping.
My takeaway is not that Togo is a better choice for the hero than Balto; it's that there is no correct choice. Balto, Togo, and a lot of other dogs were part of a group effort. Balto became the mascot. But humans need a simple story, so they confuse a mascot with a hero.
America is named that because someone named Amerigo claimed Columbus was wrong in thinking he was in India. People had to name the place something, so they shortened "Amerigo's continent" to America. Was he worthy of having a whole continent named after him? Probably not, but he still has a continent named after him.
Transistors were invented by Lilienfeld, yet everyone knows about the work of Bardeen et al. at Bell Labs. In relativity, Einstein is the Balto to Lorentz and Poincaré, which is why his Nobel Prize mentions the photoelectric effect and not relativity.
Simple formulas of scientific credit tend to stick better; the path of scientific progress is usually anything but. This is ok. The main thing is the knowledge itself, not who came up with it after all.
And yet it is very frustrating!
That doesn’t add up, because no Nobel Prize for relativity was ever awarded to anyone. Searching online, the consensus seems to be a mix of things:
* general confusion that his photoelectric Nobel was for relativity.
* the experimental proof of relativity came at a time when anti-Semitism was on the rise, so "there's disagreement / dispute about it being proven" provided enough of a fig leaf.
* Einstein didn’t attend the ceremony because of a prescheduled lecture tour in Japan (but also possibly out of fear for his personal safety, as his name was on a hit list kept by the perpetrators of a successful assassination). This made it seem like he snubbed the prize committee (which maybe he also did, since he probably should have had three Nobels for all his contributions and felt the prize wasn’t handed out on merit).
Certainly by 1945 special relativity was proven as otherwise there wouldn’t be an atomic bomb and yet still no Nobel for anyone.
Also, this:
> One difficulty is it’s hard to distinguish “ahead of their time beacon shining” from “lucky idiot”
The world is full of lucky idiots who are continually attaching themselves to one "contrarian" idea after another, in hopes that they can claim primacy after somebody else does the hard work of developing or popularizing it. No shortage of them here, for example. It's a gamble, betting that others will ignore or forget the more numerous (and sometimes even harmful) misses accumulated in the process. The world could do with a lot less of that, TBH. There are already more credible theories than people with the knowledge and patience to explore them properly.
"Jasen would go so far as to argue that shining a beacon in unknown territory that inspires explorers to look for treasure in the right place makes you the Togo, racing through fractured ice rapids, social ridicule, and self-doubt to do the real work of getting an idea considered at all."
It was in the discussion of Wegener, which muddies the waters a bit, but that "explorer" role leads to a lot of selection bias in science. That "social ridicule" can extend pretty far, to "career ends" or "ostracism," so you end up in a situation where it's not just that people doing that work get less credit; they might be driven out even when they were on the right path.
There are lots of examples of this from the pandemic. E.g.:
https://www.nbcnews.com/health/health-care/scientists-were-c...
https://en.wikipedia.org/wiki/Gunnar_Kaasen#Last_leg_of_the_...
Seems we messed up by not getting dogs involved in covid vaccine delivery.
> Disney later made a movie about him that makes no mention of Balto for the first 90%
The character appears at the 5 minute mark!
I'm certainly guilty of doing that, so I can't judge.