The case of Wikipedia shows that efforts like that can scale, but it's hard to find more examples.
A counterexample is the old DMOZ directory, which was a continuation of the tree-style Yahoo web directory.
That was subdivided into sections managed by "experts" (admins appointed to each section), but the admins were lazy, corrupt, or both. Even if they had been doing their jobs, they would have been overwhelmed by barely relevant "content marketing" submissions for web pages that return "200 OK", hit the hot keywords, and are otherwise a scam.
You have to draw from a spam-free pool.
For instance, there is a data dump for Stack Overflow:
https://www.brentozar.com/archive/2015/10/how-to-download-th...
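The dump's Posts.xml is one big stream of `<row>` elements, so you can pick out just the questions that have an accepted answer without loading the whole multi-gigabyte file. A minimal sketch, assuming the published dump schema (attributes `Id`, `PostTypeId`, `AcceptedAnswerId`); the sample XML here is made up for illustration:

```python
# Sketch: stream Posts.xml and pair each question with its accepted answer.
# Attribute names follow the Stack Exchange dump schema; SAMPLE is invented.
import xml.etree.ElementTree as ET
from io import BytesIO

SAMPLE = b"""<posts>
  <row Id="1" PostTypeId="1" AcceptedAnswerId="3" Title="How do I X?" Score="10"/>
  <row Id="2" PostTypeId="2" ParentId="1" Score="2"/>
  <row Id="3" PostTypeId="2" ParentId="1" Score="42"/>
</posts>"""

def accepted_answers(stream):
    """Yield (question_id, accepted_answer_id), streaming so the real
    dump never has to fit in memory."""
    for _, row in ET.iterparse(stream, events=("end",)):
        if row.tag == "row" and row.get("PostTypeId") == "1":
            accepted = row.get("AcceptedAnswerId")
            if accepted is not None:
                yield int(row.get("Id")), int(accepted)
        row.clear()  # free each element after we've read its attributes

pairs = list(accepted_answers(BytesIO(SAMPLE)))
print(pairs)  # [(1, 3)]
```

Swap `BytesIO(SAMPLE)` for `open("Posts.xml", "rb")` to run it on the actual dump.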
Personally, I think Stack Overflow is a junk web site with more wrong answers than right ones. Wouldn't it be nice to pick just the best answer to a question, and not see at the top the broken code sample the original poster was asking about?
If you are a real programmer, I mean one who is shipping, you write a unit test; if it doesn't work, they are going to send it back to you. You may learn the hard way that answer #1 is not right (the compiler said so!) and scroll through a lot of half-baked discussion before you find that answer #7 passes the unit test, the acid test, all the other tests. You are an expert, and your opinion is worth much more than the "crowd".
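The point above is easy to make concrete: treat each candidate answer as a function and let a unit test, not the vote count, decide. Both "answers" and the task (chunking a list) are invented for this sketch:

```python
# Hypothetical illustration: the top-voted answer vs. the buried one,
# judged by a unit test rather than by votes.

def answer_1(xs, n):
    # Plausible-looking top answer that silently drops the tail
    # when len(xs) is not a multiple of n.
    return [xs[i:i + n] for i in range(0, len(xs) - n + 1, n)]

def answer_7(xs, n):
    # Buried answer that actually handles the remainder.
    return [xs[i:i + n] for i in range(0, len(xs), n)]

def passes(candidate):
    try:
        assert candidate([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4], [5]]
        return True
    except AssertionError:
        return False

print(passes(answer_1), passes(answer_7))  # False True
```

The vote ordering says answer #1; the test says answer #7. That disagreement is exactly the expert signal the site never captures.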
If "Road Scholars" like us fed our experience back into Stack Overflow, how different would it look?