I'm in compilers and programming languages. I really wanted to create my own language as part of my PhD thesis, but I was basically told that this would be unpublishable. I mean, how can you hope to show numerically that your language is better than everything else around? Plus, it's all been done already; nothing new can possibly be invented in that realm.
Things weren't always like this. In the 1970s, we created things like Smalltalk, ML and LISP, which had a tremendous impact on the programming world. People also had bold ideas about artificial intelligence and nuclear-powered spaceships. In the 70s, people were allowed to just explore ideas, in the hope that these ideas would lead to something good (and some did). Now it's much harder: you bring up an idea and people immediately try to shoot it down, demand proof that it will definitely work, and offer the most asinine suggestions as to why it will definitely fail.
Today, the exploration has been scaled down. It's not because the exploration failed (we invented many great things as a result of it); it's largely, IMO, because we live in different economic times. The USA is no longer in an economic boom; things are no longer expanding. There are cuts to scientific funding, cuts to education. People are being told not to be "wasteful". We live in a much more nearsighted world, in a sense. Being a dreamer isn't considered a virtue.
In Kuhn's view, "normal science" is mostly iterative and incremental. It is so because most people in the discipline agree on most of the big issues, and are mostly refining those understandings. The periods where people don't agree on the big issues in a discipline are around the times of scientific revolutions: the solutions to such big issues are so different from previous approaches that accepting them requires a complete rethink of what the discipline is.
A lot of areas of CS are in the "normal science" part of that cycle, and I think compilers and languages are among them. (The biggest argument against that is concurrency and parallelism.) In the 60s and 70s, programming languages were new, and they changed computer science forever. We were exploring what these things could be.
I also recommend Cristina Videira Lopes's blog post, "The Evolution of CS Papers": http://tagide.com/blog/2014/02/the-evolution-of-cs-papers/
You're trying to compare different times, but it's getting confused because you're actually comparing one incident now to an aggregate impression of the "1970s" (though, from the three specific examples cited, the "1970s" you are talking about is really something like 1958-1973).
I'm pretty sure I've seen some recent theses that did describe the creation of languages, too, though they had less of a theoretical computer science bent and more of a focus on saying "this is a language for X problem domain".
I am inclined to agree with your larger point that we have gotten too conservative, but I also think we are awash in programming languages. Before I would encourage someone to create a new language, I would want to see a good argument that none of the existing ones would suffice for some interesting purpose.
...how much of that is due to the fact that CS is now a much more mature field than it was 44 years ago. It seems like the fields of synthetic biology, biological and chemical computers, and self-replicating nanobots (hey, von Neumann again) would be more open to grand-scale ideas, and there are lots of exciting opportunities that haven't been explored.
http://en.wikipedia.org/wiki/Synthetic_biology
http://en.wikipedia.org/wiki/Biocomputer
http://phys.org/news/2014-01-slime-molds.html
http://en.wikipedia.org/wiki/Chemical_computer
http://en.wikipedia.org/wiki/Self-replicating_machine#von_Ne...
There still are some polymaths, but it's hard for them to make as fundamental a contribution in fields that have already existed for a long(ish) time.
Yes, but it misses the fact that polymaths historically have solved that problem by creating new fields. Richard P. Feynman, as one example, lectured on nanomaterials and nanodevices decades before the technology existed to make his ideas practical. Einstein shaped relativity theory about four decades before there was any way to confirm (in detail) its theses or apply it to practical problems.
http://moreintelligentlife.com/content/edward-carr/last-days...
From the age of six, von Neumann was fluent in Latin and ancient Greek. He also read (and remembered completely) all the major works of antiquity.
As someone who's been struggling with Latin and Greek for over 10 years, it completely breaks my spirit to know that for someone else it was so effortless.
It seems like there's a free lunch somewhere, which is unusual for living beings; evolution rarely misses one.
I think you might be onto something - I suspect it's that genius is not genetic, which indicates it might be nurture, not nature - which means things might be really cool for humans in a hundred years.
http://www.amazon.com/Prisoners-Dilemma-William-Poundstone/d...
In particular, von Neumann was an "amazing human" as long as you don't mind him advising the military to preemptively bomb Russia off the map. What a wonderfully humane move.
"If you say why not bomb them tomorrow, I say why not today? If you say today at 5 o'clock, I say why not one o'clock?"
(And this is not just some cranky scientist rambling -- he was very high up in military circles, an advisor and policy maker.)
Another common target of cringeworthy worship: Feynman. All fun stories (and I'm aware saying this is probably not going to go down well on HN) until you read about the abuse of women, paid abortions, broken marriages of his colleagues...
They're all human.
As a mathematician, I think it's worth pointing out the use of the implication here. "If you are planning on bombing them tomorrow at 5pm, then why wait?" is not equivalent to "Russia should be bombed now."
And indeed, of all the logical fallacies I see on the Internet, this is one of the ones that bothers me personally the most. "A -> B" is not equivalent to "B". If I meant B, I would have simply said B. I suspect the same is true of von Neumann.
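To make the fallacy concrete, here's a minimal Lean sketch (my own illustration, not from the thread): instantiate both A and B as False. Then A -> B is provable, vacuously, while B itself is refutable, so knowing A -> B alone can never license the conclusion B.

    -- (A → B) does not entail B: take A := False and B := False.
    -- The implication holds vacuously: any proof of False already yields False.
    example : False → False := fun h => h
    -- ...while B itself is refutable: ¬False is provable.
    example : ¬False := fun h => h

Read charitably, the quote asserts only the conditional: given that you have already decided to bomb, earlier dominates later.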
"Von Neumann knew that it was only a matter of time before the Soviet Union became a nuclear power. He predicted that were Russia allowed to build a nuclear arsenal, a war against the U.S. would be inevitable. He therefore recommended that the U.S. launch a nuclear strike at Moscow, destroying its enemy and becoming a dominant world power, so as to avoid a more destructive nuclear war later on."
http://cs.stanford.edu/people/eroberts/courses/soco/projects...
You also have to remember that those figures have had pretty much everything they ever said scrutinized, so that tends to distort things -- I'm pretty sure everyone at one time or another has made some unethical remarks, especially as a WWII refugee. Horrible things had happened in Europe, and after that the harshness of Stalinism was also clear. I think it wouldn't have been hard to get caught up in "ultra-Americanism" at the time.
You need to stop putting words in my mouth. Never did I use the word "hero"; never did I use the word "worship". Never did I suggest those things, either.
Mauchly and Eckert did, but von Neumann came along and stole the fame.
http://en.wikipedia.org/wiki/John_Mauchly#EDVAC
ENIAC: The Triumphs and Tragedies of the World's First Computer, Scott McCartney
I always think of that when I ponder my own struggles with life, the human condition, and mortality. I really don't want to be that fearful of my own impending death if I get something like cancer.
Anyhow, a colleague of mine at PaineWebber sadly came down with schizophrenic delusions and concluded that he and I were illegitimate children of Norbert Wiener, while Jack Grubman and Andy Kessler had been sired by John von Neumann. I objected strenuously; as somebody whose PhD thesis was a special case of the min-max theorem for zero-sum games, I thought it only fitting that I be on the von Neumann side of the ledger.
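(For readers who don't know the reference: von Neumann's minimax theorem of 1928 says that every finite two-player zero-sum game with payoff matrix A has a value once mixed strategies are allowed. Writing the strategy simplices as \Delta_m and \Delta_n, in LaTeX notation:)

    \max_{x \in \Delta_m} \min_{y \in \Delta_n} x^{\top} A y
      \;=\; \min_{y \in \Delta_n} \max_{x \in \Delta_m} x^{\top} A y

The max-min <= min-max inequality holds trivially; the substance of the theorem is that mixed strategies close the gap.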
Dyson's Turing's Cathedral is a fascinating account of the development of digital computers. The early ones were amazingly physical, using wave-propagation delays in liquid mercury, and repeatedly repainting rows of dots on oscilloscope screens, as storage.
https://duckduckgo.com/?q=turing%27s%20cathedral
https://en.wikipedia.org/wiki/Williams_tube