We're obviously a long way off from colonizing space and needing the Internet to spread, but we still have the physics problems here on Earth.
I'm not convinced that centralization in its current iteration (cloud operators controlling huge infrastructures) is best in the long run. As we saw with the recent Azure outage in South Central US, even huge infrastructure has problems.
Secure decentralization has seemed like a panacea for a long time - for all things that resemble a public utility. Even things like the power grid.
You might be interested in checking out the InterPlanetary File System [1], which attempts to tackle this among other issues.
I can't find it now but I remember the doc mentioning the need for a future space network to be decentralized, so there is that too.
So, the idea behind IPFS and others (SSB comes to mind, except, yacht-themed) is that it's largely a collection of offline networks, and when the planets align -- quite literally -- those networks will exchange all their new blocks.
It's a neat concept.
But I think it's not particularly accurate, in that, while latency would be extreme, that doesn't necessarily translate to bandwidth - and bandwidth constraints shaped Usenet and especially FidoNet as much as latency.
I think a more likely primary mode of operation would be WWW-like, but only your local part is actually real-time; everything else is synced in bulk as and when possible, with some creative approaches to update conflicts for writable resources.
The clock is not used to say that the time is exactly the same on all nodes; it is used to guarantee that if two events have timestamps whose difference is larger than some threshold, they can be ordered reliably. You don't need an atomic clock to do it - for instance, CockroachDB only requires NTP - but of course, the smaller the error margin, the faster the system is.
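That guarantee can be sketched in a few lines: with a known worst-case clock offset between nodes, two timestamps further apart than the offset can be ordered reliably, and anything closer has to fall back to causal ordering. A minimal sketch (the threshold value is an assumption, not a real CockroachDB constant):

```python
# Sketch: ordering events under bounded clock uncertainty.
# MAX_OFFSET is the assumed worst-case clock error between nodes
# (roughly what NTP can keep you within), not a real system constant.
MAX_OFFSET = 0.250  # seconds

def compare(ts_a: float, ts_b: float):
    """Return -1/1 if the order is certain, None if ambiguous."""
    if ts_a + MAX_OFFSET < ts_b:
        return -1   # a definitely happened before b
    if ts_b + MAX_OFFSET < ts_a:
        return 1    # b definitely happened before a
    return None     # within the uncertainty window: fall back to
                    # causal ordering (e.g. vector clocks)

print(compare(10.0, 11.0))  # -1: more than MAX_OFFSET apart
print(compare(10.0, 10.1))  # None: ambiguous
```

The smaller MAX_OFFSET is, the fewer comparisons land in the ambiguous branch, which is exactly why better clock sync makes such a system faster.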
That being said, given that speed will be limited by the distance traveled by information and the speed of light, I suppose those systems won't have much edge over purely causality-based ones. In other words, CRDTs will rule Space :)
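The reason CRDTs fit this setting is that their merge operation needs no coordination at all; replicas can diverge for months and still converge when a link appears. A minimal example is the grow-only counter (a toy sketch, not a production library):

```python
# Minimal G-Counter CRDT: a grow-only counter whose replicas merge
# without coordination, so they can sync whenever a link is available.
class GCounter:
    def __init__(self, node_id):
        self.node_id = node_id
        self.counts = {}  # node_id -> per-node count

    def increment(self, n=1):
        self.counts[self.node_id] = self.counts.get(self.node_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Pointwise max: commutative, associative, idempotent, so
        # replicas converge regardless of sync order or repetition.
        for node, count in other.counts.items():
            self.counts[node] = max(self.counts.get(node, 0), count)

# Two replicas diverge offline, then sync in either order:
mars, earth = GCounter("mars"), GCounter("earth")
mars.increment(3)
earth.increment(5)
mars.merge(earth)
earth.merge(mars)
assert mars.value() == earth.value() == 8
```

Note there are no timestamps anywhere: the merge is purely state-based, which is why light-hours of latency don't hurt correctness, only freshness.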
However, a total ordering of events seems plausible only from the relative perspective of an observer, and we would need to figure out how things like transfer duration affect each observer's understanding of ordering.
So you would just have to agree that one observer's clock is the "master clock", and then everyone translates their local clock time into the corresponding master clock time (and all timestamps are written with respect to the 'time zone' of the master clock).
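Mechanically, that translation is just subtracting each node's known offset before stamping events. A toy sketch (the offsets here are assumed to be known; in practice they'd come from a sync protocol, and across planets relativistic corrections would matter):

```python
# Sketch: every node tracks its offset from an agreed "master clock"
# (local = master + offset) and stamps events in master time.
OFFSETS = {"earth": 0.0, "mars-rover": -3.2, "orbiter": +1.7}  # seconds

def to_master_time(node, local_ts):
    return local_ts - OFFSETS[node]

stamped = sorted(
    (to_master_time(node, ts), node)
    for node, ts in [("mars-rover", 96.0), ("earth", 100.0), ("orbiter", 103.0)]
)
print(stamped)  # events ordered in master-clock time
```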
That makes for good science fiction, because Quantum Mechanics is so poorly understood by most people, but it’s in no way possible or implied by the theory. Any entangled channel of communication would appear to be random noise without a Classical channel of communication, which effectively limits entanglement to light speed.
The first answer gives a nice, in-depth explanation. https://physics.stackexchange.com/questions/203831/ftl-commu...
The key point is this:
Alice therefore still measures two overlapping bell curves, overall! Where are the interference patterns?! That is very simple: when Bob and Alice compare their measurements in the first case, Bob's 0-measurement can be used to "filter" Alice's patterns...
That comparison is what requires the Classical channel, and we’re back to light speed. If you try to use a Quantum channel to compare you just have two things to compare and a lot of noise.
* The interface is dead simple - share this folder, done.
* It is a read-write browser. Netscape (and other browsers) used to be this way - they had some limited HTML creation tools. Beaker brings this back in the form of making an "editable copy" of a website. It's a choice in the address bar.
* Making an "editable copy" doesn't have to mean you're now editing raw HTML. An editable copy can direct how it is edited through JS. (See the recently released "dead-lite" for an example of this.)
All these attempts are exciting but I'm actually starting to use Beaker because it's so useful even without adoption.
(Also, I'm not even sure how you could p2p private user data, unless you expect everyone to carry around one or more yubikeys, or implant chips into fingers or something; plus all devices need to buy into that. But I haven't given that much thought.)
* You can generate domains freely using pubkeys and without coordinating with other devices, therefore enabling the browser to generate new sites at-will and to fork existing sites
* Integrity checks & signatures within the protocol, which enable multiple untrusted peers to 'host'. This also means the protocol scales horizontally to meet demand.
* Versioned URLs
* Protocol methods to read site listings and the revision history
* Offline writes which sync to the network asynchronously
* Standard Web APIs for reading, writing, and watching the files on websites from the browser. This means the dat network can be used as the primary data store for apps. It's a networked data store, so you can build multi-user applications with dat and client-side JS alone.
I'm probably forgetting some. You do still need devices which provide uptime, but they can be abstracted into the background and effectively act as dumb/thin CDNs. And, if you don't want to do that, it is still possible to use your home device as the primary host, which isn't very easy with HTTP.
The real cost is scale: $20/year will cover a few thousand users, but if you want Google's scale it will cost you in bandwidth and complexity. P2P like torrents radically reduces the cost of bandwidth by distributing it, but more importantly it reduces complexity by standardising it.
Once the complexity is standardised, budget web hosting can provide Google scale dirt cheap, and there are millions of budget hosting companies, too many to shut them all down, which gives you censorship resistance.
The original vision for the web was that editing/creation had the same status as viewing/consumption, and that websites were writable as well as readable. This is what Amaya implemented. It never gained wide adoption, but it served as a reference implementation of the W3C's vision of the web. (In my experience Amaya is not particularly usable because it regularly crashes, but that could be fixed.)
Is Beaker similar to Amaya extended to use transport layers beyond http, such as ipfs?
Wiki markup is different from HTML markup, but it represents many of the same (early) text-formatting and resource-linking concepts, while limiting the excessively powerful features of arbitrary layout, scripting, etc.
How is access control implemented?
It seems like this basically only applies to web content you want to give everyone access to and can have 100% of application logic run client-side.
That's a pretty narrow cross-section of the existing web...
Access control in Beaker is through that private key - you need it in order to edit the 'dat' (name for a synced folder). So, no, there aren't a lot of complex permissions available - but you can also separate an app into several dats and use a master one to manage the permissions of those. Not terribly complex, but it's actually surprising how much you can do. (It's tough to wrap your head around not having a server - but it's actually true.)
But help me out - I think a lot of the Web falls into this category:
* User logs in to edit their data (has private key to their dat).
* User shares their data (blog, photo feed, whatever) with others (who don't have the key).
* Those others merge all incoming feeds into a single master feed.
You could replicate YouTube, Facebook, Twitter this way - usually there are not complex permissions in these apps, are there? (Not that you'd want to replicate them...)
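The "merge all incoming feeds" step is pure client-side work, which is the whole point: no server involved. A toy sketch (this feed format is an assumption for illustration, not Beaker's or dat's actual data model):

```python
import heapq

# Hypothetical feed format: each followed dat exposes a time-ordered
# list of posts as (timestamp, author, text) tuples.
alice = [(1000, "alice", "first post"), (1010, "alice", "second post")]
bob   = [(1005, "bob", "hello")]

def merge_feeds(*feeds):
    # Each feed is already sorted by timestamp, so a k-way merge
    # produces the combined timeline in one pass.
    return list(heapq.merge(*feeds))

timeline = merge_feeds(alice, bob)
print([text for _, _, text in timeline])
# ['first post', 'hello', 'second post']
```

Real apps would verify each feed's signature against its author's pubkey before merging, but the timeline itself never needs a central party.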
Look up Tara Vancil's talk on "A Web Without Servers" if you need a crystal clear explanation - don't know if I'm explaining adequately. And my blog is at kickscondor.com if you're curious why I had to write my own blog warez. Also, there is a resurgence in blogging happening right now with the shakedown of social media. It's great.
Really cool point about extending Twine! That had never occurred to me. Amazing.
(As an aside, I originally didn't like using Webmentions on a static site - I had planned on making a cron to periodically check for Webmentions and republish. But now I really appreciate that it goes hand-in-hand with moderating comments. I look over the incoming Webmentions, nuke any spam, and republish. No bad feelings about comments that sat in the queue for a day - they are still out on the web at their original URL.)
Practically, as pointed out in the article, there are laws to comply with, and those are IMHO the biggest obstacle to decentralization.
The fair middle ground between extreme centralization à la Facebook/Twitter and total network anarchy is something based on federation, like emails and Mastodon. With federations, there are several providers for the same end-user application, with native data exchange and interoperability. The idea is to give the power to anyone with hosting capabilities to compete with the Giants, even if only a few domains will actually survive (like Gmail, Hotmail, etc because of network effects and funds, probably).
What we need is a framework, or a backbone, that allows people to easily create new federated-native apps ("dapps") without thinking about consensus issues, protocols versioning, and with native laws compliance.
Unfortunately, the world decided to go the centralised way. At some point I had to re-work my outstanding papers, because any mention of peer-to-peer or even decentralisation meant immediate rejection. Internet service providers went more greedy, so if you don't build your own global backbone to have some leverage, you need to pay someone who does or you're hosed. Even the laws in place start to strongly reflect an expectation of overpowered centralised platform beneath any communication.
Then, finally, what we ultimately need is to figure out the money flow. People want polished products and that costs money. The centralised platforms we have today have succeeded because they figured some funding. Achieving that in a decentralised world is the main problem we should be looking at. I'm afraid "just slap blockchain on it" is a highly detrimental approach, but I haven't seen anything more serious (not that I looked seriously).
Disclaimer: I'm at Google now, but this comment actually reflects my personal post-INRIA sentiments.
I agree that this is probably the way forward. The only downside is how your identity is tied to the service provider you choose. It was a PITA when Lavabit went down and I lost that email address.
Fully agree with this. The link to identity is not often brought up. I run a university lab focussed on re-decentralisation of the Internet as a day job. We focus on identity & p2p + trust.
Beaker browser is impressive early work focussed on the raw bit transport. It re-uses DNS for global discovery; it's hard to do everything decentralised at once. How do you do global search or spam control on a decentralised Twitter?
The hard issue we need to solve in the coming decade is the governance of such systems. Ideally it would rule itself. A self-governance system could be defined as: a distributed system in which autonomous individuals can collectively exercise all of the necessary functions of power without intervention from any authority which they cannot themselves alter.
Federation is what we have now and it has a tendency toward centralization, as we see with the WWW and the mega sites where users aggregate.
This is really needed. There are endless great tools for centralized apps that make it trivial. I could build a usable forum website in rails in a day. I have no idea how to do that in a decentralized and secure way.
The problem then moved onto being one of _curation_. Companies such as google, facebook, amazon are in the business of providing curation: i.e. taking away the leg-work of what we should attend to.
A de-centralized web doesn't appear to decentralize the problem of curation at all, which means we are going to still end up with centralized curation and the same or similar monopolies on attention that we have now.
...feels, to me, like a huge mistake.
How would one eliminate hate speech and toxic content from it? Or illegal content? Or anything you put there and need removed to keep living your life freely? The technologists developing this tech hand-wave these concerns away citing "freedom of speech" -- but one's freedom ends where another's begins, and hate speech, toxic content, illegal content, not being able to have what you said or did forgotten online, all these things curtail someone's freedom.
And by making it decentralised, they're just making it harder for people who are the victims of these problems to hold the people responsible accountable and to stop them. These technologists want freedom of speech at the expense of everyone else's freedom.
You simply make your own choices and don't follow/subscribe/view all that illegal, toxic, hate content. You know, the same way you do today by not visiting all those illegal, toxic, hate websites. They still exist, though, for those who don't share your views on policing content for other people.
The women whose boyfriends posted private sex pictures as revenge, or the minorities who will be the victims of hate groups organizing on social media, the children who were filmed while being raped and have their video circulating online, the victims of bullying whose bullies are empowered by other people seeing it and not doing anything to stop them...
You can choose to ignore this when you see it, but the victims can't, and it's for their freedom that I'm concerned for.
Just because it can't be dealt with by threatening the odd CEO or two, doesn't mean that it won't be dealt with some other way. Now, it may be the case that governments react to a change like this and just accept that they can't control terrorism, child porn etc etc. But it would be astonishingly naive to assume that that's the most likely outcome.
What is considered toxic for you might not be considered toxic by another person. That's personal choice. If you need to eliminate some sort of content, that inevitably will lead to this https://www.reuters.com/article/us-china-internet/china-laun...
Current web tech is inherently centralizing. Say you want to create an experience like Instagram or Twitter, delivered via HTTP. You have to pay for bandwidth, CDNs, storage, app servers, DB servers, etc etc. At scale, it's millions a month. So only corporations can do it, and with a few exceptions (eg Craigslist, Stack Exchange) they end up monetizing and "growth hacking" in user hostile ways.
The big open question is: can we create an experience as compelling as Instagram or Twitter over the P2P web?
It's a hard technical challenge, and today the answer is no. But if we get there, then internet mass media can be delivered via open source projects over open protocols, with a bunch of competing clients to chose from. No central organization controls and monetizes the thing.
Like BitTorrent, but for applications more complex and interactive than just file sharing.
--
If you're interested, here are imo the most compelling projects in this space:
- Dat
- Beaker
- Augur
- OpenBazaar
- Patchwork / Secure Scuttlebutt
They are working on overlapping subsets of the same fundamental challenges, eg:
- How does a node choose what to download? The BitTorrent answer is "only things the user explicitly asked for". The blockchain answer is "the entire global dataset since the start of time". For something like a decentralized Twitter, both of those are unsatisfactory, you need something in between.
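One plausible "in between" is a follow-graph policy, roughly what SSB does: fully replicate feeds you follow, partially replicate friends-of-friends, skip everything else. A toy sketch (the hop counts and graph are assumptions for illustration):

```python
# Toy download policy for a decentralized Twitter: replicate based on
# distance in the follow graph, rather than "only what I asked for"
# (BitTorrent) or "the entire global dataset" (blockchains).
FOLLOWS = {
    "me":    {"alice", "bob"},
    "alice": {"carol"},
    "bob":   set(),
    "carol": {"dave"},
}

def replication_depth(start, target, max_hops=2):
    """Hops from `start` to `target` in the follow graph, or None."""
    frontier, seen = {start}, {start}
    for hops in range(max_hops + 1):
        if target in frontier:
            return hops
        frontier = {f for n in frontier for f in FOLLOWS.get(n, set())} - seen
        seen |= frontier
    return None

def should_download(feed_owner):
    depth = replication_depth("me", feed_owner)
    if depth is None:
        return "skip"       # outside the social horizon
    return "full" if depth <= 1 else "partial"  # e.g. headers only

print(should_download("alice"))  # full (direct follow)
print(should_download("carol"))  # partial (friend of a friend)
print(should_download("dave"))   # skip
```

The nice property is that storage grows with your social neighbourhood, not with the global network, while still letting your node serve your friends' feeds to others.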
- How do you log in? Current systems either have no persistent identity at all (eg BitTorrent) or they just generate a local keypair, and it's your job to back it up and never lose it (eg SSB, Dat, all blockchain protocols). Both are unacceptable for wide-audience social media. People lose their devices, get new devices, forget their passwords, etc all the time. They expect and rely on password reset, etc.
So there's a lot of hard tech and UX problems left unsolved, but also a lot of recent projects making solid progress
You can make some nice proof-of-concepts with a group of volunteers, but the effort required to provide a UX comparable to centralized services is going to take more than a handful of people working evenings and weekends.
Decentralized services generally do not afford the same monetization opportunities as central services. Decentralized proponents consider this a feature rather than a bug, but it leaves open the question: Who is going to pay for all of this?
> It's a hard technical challenge, and today the answer is no.
This is why I completely dismiss almost every "distributed" solution. If you can show me a business model/design document for a distributed service that can scale to big tech levels, deliver a user experience that matches current solutions, while also incentivizing developers enough with money to get them to build it, I will be swayed. However, every solution I have seen makes massive tradeoffs that negatively affect all 3 criteria compared to current centralized solutions.
Decentralized serving has just as much bandwidth, storage, and iron (if not more). Does it somehow make those resources cheaper?
That's the point.
Disclosure: I'm working in this field.
Featuring Mathias Buus and Paul Frazee from the Beaker project.
This is not only a technology problem, it's (mostly, I'd say) a social one. Humans will always want more power and control, whether it's in real life or online.
Every single type of governance has fallen victim to human greed and ambition, as will any kind of Internet, I believe.
Fix the users - save the Internet! :)
In A Thousand Plateaus, Deleuze and Guattari talk about the opposition between the state apparatus and the "war machine" (their term for a nomadic/decentralized structure). They talk about how it seems like nomadic societies are primitive, but actually a lot of nomadic societies have "collective mechanisms of inhibition" to ward off the formation of a state apparatus, by preventing power from accumulating within any one party and "evening it out" among everyone.
The applicability of D&G's ideas on the war machine to our current problem of platform power is immediately apparent. A centralized platform is exactly like a state apparatus. In our situation the collective mechanisms of inhibition might be something like stronger/more proactive antitrust laws to break up/nationalize entities that become infrastructural components of the society.
But as you've mentioned, I think this problem of "uneven development" is a feature of any marketplace-like structure. In sufficiently large numbers, a power law tends to assert itself with no other checks on power. This is why blockchains by themselves won't solve the problem. The debate, then, shifts to be about whether this is a feature or a bug, which is something that I'm never sure about.
To close, another quote from ATP comes to mind ("smooth space" is another term they use for nomadic spaces):
> Smooth spaces are not in themselves liberatory. But the struggle is changed or displaced in them, and life reconstitutes its stakes, confronts new obstacles, invents new paces, switches adversaries. Never believe that a smooth space will suffice to save us.
We've had all kinds of redundant network topologies that used independent networks for decades. The internet is decentralized, and it works pretty well, all things considered. The web is fairly decentralized, too: DNS is independent of a registrar is independent of a network service provider and all are independent of ISP's, and even those are independent of backbones.
The only thing that isn't very decentralized is the client-server IP/TCP/HTTP model. You can provide decentralized versions of HTTP services, but those are the things that are the most costly and inconvenient to decentralize. It can be done, but it's a huge pain with very little benefit.
A distributed network would depend even more on the ISPs, given their self-serving nature.
Perhaps the 'decentralized' web should also address the very foundation of the network - the network infrastructure and access to it.
Does Internet need to depend on the ISPs?
You could see this play out every time any party has tried to take full control of Bitcoin. So far everyone has failed.
It's all cyclical.
And who will pay for it? Going by experience, Xanadu seems like the only solution. The reason it's in development hell is that the problem it's trying to solve is so hard.
Blockchains do not have a monopoly on decentralization. People who assert this are trying to redefine the term to mean some kind of extreme P2P model that fits their narrative.
Almost all our traffic goes through Google, Amazon and Facebook. It's extremely centralised.
If Amazon servers go down, so does a significant portion of the internet. That's centralisation at work!
Blockchain tech doesn't claim to have a monopoly on the term 'decentralisation', it's just re-popularised the technology.
So is bitcoin. Only 3 or 4 companies own the majority of mining.
Companies on the internet are centralized, but not the internet itself.
This seems wrong to me.
Many of the problems alluded to in this article, in particular the privacy risk of centralized data, are more effectively solved by policy changes and iterated technology (differential privacy as well as bread-and-butter cryptography) rather than furious hand-waving about blockchain protocols.
Disclosure: I founded Namebase which is a registrar for Handshake
It boils down to: what’s the best way to provide services that I want?
I’m working on a project to provide a decentralized marketplace for software and infrastructure services, competing with AWS and Azure. The marketplace itself is blockchain-based: partially decentralized, but with a permissioned blockchain that still allows governance, legal compliance, removal of bad actors, KYC compliance, etc. The kind of things that customers (corporations) need for them to use the marketplace.
I think we need to be pragmatic about it and figure out where technologies like blockchains can help build better services, instead of trying to cram decentralized systems into everything whether it makes sense or not.
Tor simply hides where your web requests originate from - it's up to you to visit HTTPS sites and encrypt your communications.
Also, Tor is quite decentralised, but the existence of directory authorities undermines this, since they present a centralised component.
And most enterprise software barely needs more than a rack of current-gen servers (and almost no individual user needs even that).
So yeah, decentralization will be upon us soon enough.
My router stays online 24/7. It already has a web server built in. I could hack it to make it serve a public website.
But there’s absolutely no way I’m going to do that. The security and maintenance requirements are just too much of a PITA.
It’s much easier, more secure and more reliable (and likely cheaper once you figure in depreciation and opportunity costs) to set up and maintain an instance in the cloud, or a serverless site.
And if you don’t like the big cloud providers, there are many smaller outfits that can do the basics - compute and object storage are all you really need for a small site.
Consumer hardware and software are not really well suited to running publicly facing websites.
The total capacity of infrastructure entities like AWS will increase by 10x at a minimum over the next decade. By comparison, your phone or laptop will modestly nudge forward. Consumers are not going to buy 10x the number of laptops, desktops and smartphones that they do today, ten years out. Most likely, those figures will barely move (the smartphone industry is already stagnating). Most of the incremental spending and investment will go into the centralized infrastructure by the giants.
Network speeds will continue to increase relatively rapidly. We can easily go from routine 50-100mbps home lines to 1gbps over the next decade. We're not going to see a 10x increase in the power of the average laptop (lucky if it doubles in ten years). It's primarily going to be useful for streaming/consuming very large amounts of data from epic scale central systems for gaming, 4K+, VR, etc. Decentralized systems owned by consumers will be far too weak to fill that role.
The AI future isn't going to be decentralized. The very expensive infrastructure that will demand, and its need to run 24/7, will be centralized and owned by extraordinarily large corporations.
It's precisely the typical consumer's home hardware that will act as the ultimate bottleneck guaranteeing decentralized can never take off. This has always been obvious, it won't prevent the fantasy from maintaining its allure of course. That will perpetually draw headlines and hype in tech, for decades to come, with no mass adoption breakthrough.
[0]: https://hacks.mozilla.org/2018/07/introducing-the-d-web/
Which, of course, will not necessarily be connected... but that's a part of decentralization and freedom. "Diamond Age" and its virtual polities come to mind.
ddapp.org
Firefox has been supportive of the effort for some time already, working on libdweb: https://github.com/mozilla/libdweb
Mitra at the Internet Archive and I talked about this when integrating DWeb ( https://news.ycombinator.com/item?id=17685682 ).
I showed him a cryptographically secure method of having passwords (that keys are not derived from) that allows for password resets (without a server).
For a high-level conceptual explanation of this approach, see our 1 minute Cartoon Cryptography animated explainer series:
http://gun.js.org/explainers/data/security.html
This same method can be used for the Shamir Secret Sharing "recover your account from your 3 best friends" approach, which I believe will be the best UX for most users.
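For the curious, a minimal 2-of-3 Shamir sketch looks like this. This is the textbook construction over a prime field, not GUN's actual scheme (see the link above for that): the secret is the constant term of a random degree-1 polynomial, any two shares recover it, and one share alone reveals nothing.

```python
import random

# Textbook Shamir secret sharing, k=2 of n=3.
P = 2**127 - 1  # a Mersenne prime, large enough for a ~128-bit key

def split(secret, n=3, k=2):
    # Random polynomial f(x) with f(0) = secret; shares are (x, f(x)).
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation evaluated at x = 0.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

account_key = random.randrange(P)
friends = split(account_key)               # one share per friend
assert recover(friends[:2]) == account_key # any 2 of 3 recover it
assert recover(friends[1:]) == account_key
```

(A real deployment would use a cryptographically secure RNG rather than `random`, and authenticate the shares; this is only the math.)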
This is an already solved problem.
If you’re only willing to offer me the “Lector” package, I’m going to pass.
I think the UX for a completely keychain-centric auth/authz framework can be much better than what we have today with password managers. A master password + device-entangled PINs protecting per-app/agent keys drastically reduces the possibility of getting locked out of your account AND provides for master password reset by unlocking and re-encrypting your keychain using the local device-entangled key.
I'm in tech and I'm interested in a decentralized web but I also feel that throwing the baby out with the bathwater isn't a great idea. The article says, "The decentralised web, or DWeb, could be a chance to take control of our data back from the big tech firms." To me it sounds like we're basically saying, "ok Facebook/Google/Twitter/Instagram... you're all too big to regulate so we're going to build A WHOLE NEW INTERNET". If they're smart enough to pollute the current system, they're smart enough to pollute a new system. In fact, these corporations are so big that you'll find out eventually that they've funded quite a bit of this decentralized web.
As a parent, I would feel at least a little better seeing some bankers, Pharma bros, tech execs, etc. actually go to jail and have their lives ruined for their blatant disregard of pretty much everything. I don't want to tell my kids, "well, we're too dumb to regulate the internet so we made another one.. and that one got messed up too... herp derp"
Disclosure: I founded Namebase.io which is a registrar for Handshake
When you pick a random TLD like .io, for example, you are not getting the reliability of a .com. .io had a few big issues last year (1/5 of DNS queries were failing, and an ex-Google employee bought ns-a1.io and was able to take over all .io domains).
As more TLDs come from good- and bad-faith actors, people will flock to .com as a known, respected entity. Limiting to .com, .org, .net and country codes and slowly introducing new TLDs made more sense and gave time to establish trust / create brand awareness. 500 new TLDs a year creates noise and forces distrust of any unusual or new TLD.
(1) IPv4 (2) Bandwidth limits
IPv6 makes NAT unnecessary. With IP scarcity gone, IP addresses might become permanent like phone numbers.
ISPs are currently making money off fixed IP addresses. Market forces would change that eventually.
Even if lawsuits didn't kill p2p networks, virus / safety concerns would have. IMO, trust is a bigger issue than discovery, hence the need for curation - centralization.
The reputation system of thepiratebay makes it my primary torrent site.
Laziness and convenience aside, it's more of a trust issue than a search issue.
Many uploads are viruses/adware/ransomware masquerading as movies, books, games...
This necessitates multiple downloads - it's frustrating. I remember downloading several gigs of rar'd and encrypted .avi movie files, only to be greeted with a message asking me to fill in a survey to get the password.
Yify, a reputable source, eliminated this concern for movies.
If decentralization works out, I believe specialized search engines will emerge.
But note, trust is a bigger issue than search or content discovery for decentralization. If it weren't, the iTunes Store, app stores and other walled gardens would have failed long ago.
Perhaps the most decentralized part of the internet today is BitTorrent. It's a very efficient way of sharing files and has had a lot of success. One can see how BitTorrent could become the backbone of a decentralized web. However:
1. BitTorrent "naturally" favours popular files over anything else. Niche items hosted by fewer people will be slower to download => BitTorrent makes a cultural echo chamber.
2. BitTorrent needs some kind of centralized search engine: it's not possible for everyone on the network to host a copy of the entire index of files on the network. The only way is to have a search engine, much like The Pirate Bay. In fact one could say that Google was this in the first place.
3. Decentralized social media would be much more polluted with fake accounts, since no "authority" would be able to fix it.
People have been excited by decentralized web for at least 5 years. The technology has existed for at least 10 years. If it was to happen, I think it would have happened already...
AMP is not any faster to load than the actual site when it's not on Google's CDN. The only reason it appears faster in most situations is because Google is abusing its monopoly search position to pre-load and prioritize AMP results.
AMP gives Google more control, and that's why they push it so hard. This, plus their quest for hiding and/or getting rid of URLs so they can use AOL-style keywords within their AMP walled garden, is multiple steps back. All the way to the late 90s.
The fundamental technologies were designed with decentralization in mind
Mastodon is just peered IRC all over again
The ISPs have pooh-poohed running shared services from home connections.
DNS and the core protocols can run in decentralized ways no problem
It’s the social order that doesn’t enable it