https://www.tandfonline.com/doi/abs/10.1080/0969229100360783... https://www.jstor.org/stable/2522141 https://ideas.repec.org/p/ess/wpaper/id2471.html
and there are even dedicated Master's degrees for some of those questions: http://www.lse.ac.uk/study-at-lse/Graduate/Degree-programmes... http://www.lse.ac.uk/study-at-lse/Graduate/Degree-programmes...
conclusion: if you're not an expert on the matter yourself, you should be very careful when claiming that something is under-investigated. maybe it's just harder to come up with usable solutions than you think, and that's why you haven't heard of any. at the very least you should spend 5 minutes to google it.
PS: be generally cautious of anyone who writes or talks as if he were an expert in no fewer than 10 entirely different disciplines.
I think this list makes no sense unless it is made into a wiki of some sort with community contributions from hundreds of people.
Under "Physics", he has a sub-heading "Increasing Iteration Speed of Experimental Physics". This section mentions one random startup. Yet CERN has hundreds of people actively working on this topic for decades.
They invented, built and implemented the world's first capacitive touch screen control system in the period 1972-1976 specifically to answer this need. That's just one example off the top of my head.
You'd just end up with a lot of noise. The person who wrote this list even has a few scientific publications, which is more background than most people who contribute to the wiki would have.
Perhaps we could create a Foundation that pays a group of scientists a modest salary to spend all their time curating the list. The public can send the Foundation any comments they want.
Of course, the Foundation can't pay enough people to do all the work itself. So they could crowd-source some of the work to people from various fields who know enough about the field to judge what's really under-investigated vs. things that are already thoroughly investigated. Those groups -- panels, let's call them -- could get together and make recommendations to the Foundation.
The Foundation employees could form panels, get their recommendations, and then synthesize those into a list.
And if we're going to all this work to create a good curated list, then maybe the government or donors or whoever could even fund some of the ideas that come out of the final curated list. Not a ton of money -- just enough to hire one or two people to work on the idea for a few years. Maybe call the final curated list a Portfolio or something.
IDK what we could call such a Foundation. I guess if the government funds it we could call it the National Science Foundation or something like that. And if it's private, probably focused on more near-term stuff, maybe "VC firm".
Sounds like a really good idea.
Did you mean "for decades", or "decades ago"? I'm not sure a notable result from half a century ago disqualifies any item on the list he's creating from modern research (without regard to the quality of that list).
I was talking to Professor Shriram Krishnamurthi a week ago about a block-based, educational ML dialect I was making, and he told me that he believed that once a language had a type system as sophisticated as mine, its target audience should be using text, not blocks. So perhaps Scratch-style languages are a dead end, or infeasible beyond a certain level.
My personal hope for the "next level" of programming languages is those that use typed holes to interactively help the programmer construct the program.
My impression is that blocks are good for learning concepts without accidentally overfitting on syntax, but for writing real programs, blocks are hopelessly inefficient for input and editing. Then again, letting the user input text but automatically converting it to blocks by continual parsing might be the best of both worlds. Sort of like the paren-handling modes for Lisp in Emacs.
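A minimal sketch of that "type text, see blocks" idea, assuming a toy grammar (just `+` over operands) in place of any real block editor: on every keystroke you would re-parse the buffer and project it into a nested block view.

```rust
// Hedged sketch: continually re-parse a text buffer and render it as
// nested "blocks". The grammar (sums of atoms) is a stand-in for a real one.
fn to_blocks(src: &str) -> String {
    // Split the flat text into operands of a `+` expression.
    let terms: Vec<&str> = src.split('+').map(str::trim).collect();
    // Render each operand as its own [block], wrapped in an outer [+ ...] block.
    let inner: Vec<String> = terms.iter().map(|t| format!("[{}]", t)).collect();
    format!("[+ {}]", inner.join(" "))
}

fn main() {
    // The user types plain text; the editor shows the block structure.
    assert_eq!(to_blocks("1 + 2 + 3"), "[+ [1] [2] [3]]");
    println!("{}", to_blocks("1 + 2 + 3"));
}
```

A real implementation would of course need error recovery so half-typed text still projects to something sensible, which is the hard part of "continual parsing".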
There are easily thousands of people working on building higher-level programming languages/models. A very short and very incomplete list of entire subcommunities of PL working on languages that are higher-level than java/python:
1. The ML family -- OCaml, SML, Scala, F#. I think it's very fair to say that these are "higher-level" than imperative OO languages. And you can definitely get a job writing OCaml or Scala or F#...
2. A whole bunch of programming languages/primitives/paradigms aimed at making concurrency/parallelism/distributed systems easier. Erlang, X10, session types, Manticore, etc. Rust might even belong here.
3. Programming languages that incorporate resource/complexity analysis.
4. Literally decades of work on visual programming languages (which have mostly resulted in modern IDEs and teaching tools like Scratch).
5. behavioral types
6. linear types (again rust kind of fits here)
7. dependent types
8. I would also argue that systems like tensorflow and pytorch are really a sort of programming language -- they have a very different model of computation than the host language. Just because they don't have a parser/compiler/etc. doesn't mean they aren't a programming language, imo.
9. Tons of other stuff that doesn't fit in the major categories above (e.g. netkat).
10. I mean even SQL belongs in this list.
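As a toy illustration of point 8, here is a hedged sketch (in Rust, purely for concreteness) of an "embedded language": programs are plain data built in the host language and run by a separate interpreter, with no parser anywhere -- the same shape as a tensorflow/pytorch computation graph, just radically smaller.

```rust
// A miniature embedded language: expressions are data (a graph), and a
// separate interpreter defines their model of computation.
#[derive(Clone)]
enum Expr {
    Const(f64),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

// The "runtime" of the embedded language: a recursive evaluator.
fn eval(e: &Expr) -> f64 {
    match e {
        Expr::Const(c) => *c,
        Expr::Add(a, b) => eval(a) + eval(b),
        Expr::Mul(a, b) => eval(a) * eval(b),
    }
}

fn main() {
    // Build (2 + 3) * 4 as a graph rather than executing it eagerly.
    let g = Expr::Mul(
        Box::new(Expr::Add(
            Box::new(Expr::Const(2.0)),
            Box::new(Expr::Const(3.0)),
        )),
        Box::new(Expr::Const(4.0)),
    );
    assert_eq!(eval(&g), 20.0);
    println!("{}", eval(&g));
}
```

Swap `eval` for a gradient-computing or GPU-dispatching interpreter and you get the pytorch/tensorflow argument: the host language is only a construction kit for programs in a different language.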
Even for the languages/models listed above that don't have large adoption, the ideas are often incorporated into more mainstream languages in one way or another. So there are significant projects developed in each of these kinds of languages (with the possible exception of behavioral types and session types).
Higher-level programming languages are one of the most explored areas of computer science -- if anything, it's an over-explored field.
This is less a list of "underexplored ideas" and more a list of "over-hyped ideas with over-crowded communities". Every item on the CS list is the sort of thing that an ungrounded undergrad research intern would want to work on.
Some of the descriptions in other fields have a similarly dilettante vibe to them. E.g.,
* Bio: math bio is a huge community, and all those folks are well-trained in chaotic dynamics. You can say it's under-explored, but there are probably hundreds of people working on this right now, and at least thousands have over the past few decades.
* Math: there's a section on subspace packing with a side-story about a proof assistant and the author doesn't even mention Hales...
* physics: Building machines to automate experiments is definitely the sort of thing people get paid to do whenever there's a large enough market (and sometimes even when there isn't). Similarly, nuclear-powered propulsion is under-explored... as long as you don't count the militaries of the major nuclear powers, that is.
It would be worth considering what drives people towards researching particular areas, even if it might seem kind of obvious. It's tempting to say it's visions of fame, loot & prizes, but I think most people know on some level that they personally won't get any prominent position in these fields. I think it's partly a question of discoverability (say I'm a student: how do I learn about these topics, and that there are practical ways to work on them?) and partly of perceived prestige.
Also, getting into a particular PhD or similar currently means thinking years in advance -- not in terms of learning or studying, but connections, bureaucracy, applications, having paper "proofs" that you know something, etc. Say you notice some interesting area towards the end of your studies: it's too late to move even a small distance from what you're doing (e.g. from cognitive science to computational neuroscience) without wasting additional precious years. And all that just to enter the not particularly rosy world of academia.
As for myself (not in academia these days), I find myself searching for a middle ground between overcrowded fields (where I would probably do relative "grunt work" at best) and fields so obscure as to not be viable. The fear of having no steady income is too real.
Tom Mitchell, Never-ending learning (2015): https://www.cs.cmu.edu/~tom/pubs/NELL_aaai15.pdf
Sebastian Thrun, Life-long learning (1995) https://www.ri.cmu.edu/pub_files/pub1/thrun_sebastian_1995_1...
Also this seminar at Stanford on Lifelong Machine Learning (2013): https://www.seas.upenn.edu/~eeaton/AAAI-SSS13-LML/#Schedule
or naive.
It's possible for something to be over-investigated and also not produce results. See also: the build up to AI winters.
The term "technical debt" has always rubbed me the wrong way.
Most technical decisions were sound... at the time!
I agree, people from 1999 didn't predict what would be happening in 2019. But why is that considered to be some sort of debt?
Often, teams cut corners to release a feature earlier/on time, and only make it work for the MVP use case without restructuring the codebase to fully accommodate the change. In this setting the term debt is pretty fitting.
In many modern web companies a given project has a useful life of ~3-5 years. If it's still running by year 8 with a team that's been on KTLO (keep-the-lights-on) duty, a few things are probably true:
A: No one knows how to productively add features.
B: The business need for the project was much larger than the KTLO funding would imply.
Odds are at this point there's a long list of user complaints, year-plus-old feature requests, and excuses being made to the board for why some initiative is facing yet another delay.
Perhaps we should be talking about software depreciation rather than tech debt?
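To make the depreciation framing concrete, here is a toy straight-line schedule. The 5-year useful life and the numbers are assumptions for illustration, not accounting advice, and they echo the ~3-5 year / year-8 figures from the comment above.

```rust
// Toy straight-line depreciation: a project's "book value" declines by a
// fixed amount each year until it hits zero at the end of its useful life.
fn book_value(cost: f64, useful_life_years: u32, age_years: u32) -> f64 {
    let yearly = cost / useful_life_years as f64;
    // Clamp at zero: a fully depreciated project can't go negative.
    (cost - yearly * age_years.min(useful_life_years) as f64).max(0.0)
}

fn main() {
    // A project representing 100 units of engineering investment:
    assert_eq!(book_value(100.0, 5, 3), 40.0); // 3 years in, 40% remains
    assert_eq!(book_value(100.0, 5, 8), 0.0);  // by year 8, fully depreciated
    println!("{}", book_value(100.0, 5, 3));
}
```

Under this framing the year-8 KTLO project isn't "in debt"; it's an asset that was written down to zero years ago and is still being used.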
Many tech-debt traps start with relying on an LTS version of an OS or of libraries. The philosophy behind that is that software behaves like a chair: you buy it once, and then you can sit on it until it is no longer needed.
A much better analogy is a horse: you need to feed it and take care of it daily, and you need to be ready for it to die while you still need it.
No one knows how to run and manage these systems, yet we do it all the time!