Now if AGI makes people's work redundant and makes the economy grow 100-10,000x... what does that measure mean at all? Can it produce lots of stuff not needed or affordable by anybody? So do we just hand out welfare tickets to take care of consuming the ferocious output of a kind of paperclip-maximizer? I suggest reading the story Autofac; it might turn out prophetic.
Will that "growth" have any meaning then? Actually, the current "print money and give it to the rich" model of economic growth is pretty much this already, so with algorithmic trading multiplying that money automatically... have we already passed that inflection point?
Imagine a list of things many people wish to happen in physical reality. We'll have more of all of that:
-Better healthcare
-Curing most things that destroy quality of life
-Curing aging and age-related death
-Much better treatment for all sources of mental suffering
-Far better, cheaper, and reversible body modification
-More free time to spend at whatever you want
-Everything much cheaper
-Bigger and better homes and living spaces
-Bigger, faster, cheaper transport
-Easier to organize meaningful social interaction
-Better and more immersive entertainment
-More time to spend with close friends and loved ones
Regardless of whether "illness" is or is not a terminological inexactitude, it looks like ageing is a chronic, progressive, terminal genetic disorder. I think "cure" is an appropriate term in this case.
Funny that this kind of ideological conflict will likely be a key fulcrum of the machine intelligence revolution. We will have a very loud minority that attempts to forcefully prevent all other humans from having the voluntary choice to avoid suffering.
Are you in it?
Or paper wasps: https://www.bloomberg.com/features/2017-biological-markets/
I disagree with the underlying presumption. We've been using animal labour since at least the domestication of wolves, and mechanical work since at least the ancient Greeks invented water mills. Even with regard to humans and incentives, slave labour (regardless of the name they want to give it) is still part of official US prison policy.
Economics is a way to allocate resources towards production, it isn't limited to just human labour as a resource to be allocated.
And it's capitalism specifically that tries to equate (or combine?) the economy with incentives, not economics as a whole.
> Now if AGI makes people's work redundant and makes the economy grow 100-10,000x... what does that measure mean at all?
From the point of view of a serf in 1700, the industrial revolution(s) did this.
Most of the population worked on farms back then; now it's close to 1%, and we've gone from a constant threat of famine and starvation to such things almost never affecting developed nations. So 100x productivity output per worker is a decent approximation, even in terms of just what the world of that era knew.
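A rough sanity check on that order of magnitude. The workforce shares below are loose historical assumptions (something like 70-80% of workers farming circa 1700, about 1% in developed nations today), not figures from the comment:

```python
# Back-of-envelope check on the "x100 per-worker productivity" claim.
# Both shares are loose historical assumptions, not precise figures.
farm_share_1700 = 0.80   # assumed fraction of workers farming circa 1700
farm_share_now = 0.01    # roughly 1% in developed nations today

# If food is roughly the same fraction of what each society needs,
# output per farm worker scales inversely with the share of workers farming:
ratio = farm_share_1700 / farm_share_now
print(f"per-farm-worker output ratio: ~{ratio:.0f}x")  # ~80x
```

Add population growth and richer diets on top of that ~80x, and "x100 per worker" looks like a fair order-of-magnitude estimate.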
Same deal, at least if this goes well. What's your idea of supreme luxury? Super yacht? Mansion? Both at the same time, each with their own swimming pool and staff of cleaners and cooks, plus a helicopter to get between them? With a fully automated economy, all 8 billion of us can have that, plus other things beyond it, things as far beyond our current expectations as Google Translate's augmented-reality mode is beyond the expectations of a completely illiterate peasant in 1700.
> Can it produce lots of stuff not needed or affordable by anybody?
Note that while society now has an obesity problem, we're not literally drowning in 100 times as much food as we can eat; instead, we became satisfied and the economy shifted, so that a large fraction of the population gained luxuries and free time undreamed of by even the richest kings and emperors of 1700.
So "no" to "not needed".
I'm not sure what you mean by "or affordable" in this case? Who/what is setting the price of whatever it is you're imagining in this case, and why would they task an AI to make something at a price that nobody can pay?
> So do we just hand out welfare tickets to take care of consuming the ferocious output of a kind of paperclip-maximizer? I suggest reading the story Autofac; it might turn out prophetic.
Could end up like that. There are plenty of possible failure modes with AI; that's what the fields of AI alignment and AI safety are about.
But mainly, UBI is the other side of the equation: it takes care of human needs in a world where we add zero economic value because AI is just better at everything.
We probably can't. I mean, why stop at humans? Let's give every pet the same luxury, or... in the limit, we could give this to every living being. Ultimately, someone is going to draw the line on who gets what and who is useful or not "for the greater good".
It just happens that many living beings don't contribute to the goals of whoever is in charge, and if they get in the way or waste resources, nobody will care about them, human or not.
Human rights and democracy are all well and good, but I think we've witnessed enough workarounds that render them pretty much null and void.
Humans have rights insofar as they're able to enforce them: individually, by withholding their labor (muscle or brain power), or collectively, with pitchforks if need be.
Once labor is a dime a dozen and pitchforks are ineffective (the OP's premise of a "fully automated economy"), human rights and democracy inevitably go the way of the dodo. Nature loves to optimize away inefficiencies.
Although the "fully automated" bit is quite a stretch at the moment. The end-to-end supply chain required to produce and sustain advanced machinery and AI is too complex, a far cry from "LOL, let's buy some GPUs and run chatbots".
Eh.
A line, drawn somewhere, sure.
Humans being humans, there's a good chance the rules on UBI will be tightened to exclude more and more people over time; we already see that with existing benefits systems.
But none of that means we couldn't do it.
Your example is pets. OK, give each pet their own mansion and servants, too. Why not? Hell, make it an entire O'Neill Cylinder each: if you've got full automation, it's no big deal, as (for reasonable assumptions on safety factors etc.) there's enough mass in Venus to make 500 billion O'Neill Cylinders of 8 km radius by 32 km length. That's close to the order-of-magnitude best guess for the total number of individual mammals on Earth.
Web app to play with your size/safety/floor count/material options: https://spacecalcs.com/calcs/oneill-cylinder/
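For a feel of the numbers, here's a back-of-envelope version of that claim. The mass of Venus (~4.87e24 kg), the steel density, and the bare-hull geometry (side wall plus two end caps, no floors, shielding, or internal structure) are all my own simplifying assumptions, not figures from the calculator linked above:

```python
import math

# Back-of-envelope check on "500 billion O'Neill Cylinders from Venus".
# All constants here are rough assumptions for illustration only.
M_VENUS = 4.87e24          # kg, approximate mass of Venus
N_CYLINDERS = 500e9        # the figure claimed above
RADIUS = 8_000.0           # m (8 km)
LENGTH = 32_000.0          # m (32 km)
STEEL_DENSITY = 8_000.0    # kg/m^3, generic structural steel

# Mass available per cylinder if Venus is split evenly.
mass_budget = M_VENUS / N_CYLINDERS

# Bare hull: lateral surface plus two flat end caps.
hull_area = 2 * math.pi * RADIUS * LENGTH + 2 * math.pi * RADIUS**2

# Average hull thickness that mass budget buys at steel density.
implied_thickness = mass_budget / (hull_area * STEEL_DENSITY)

print(f"mass budget per cylinder:  {mass_budget:.2e} kg")
print(f"hull area:                 {hull_area:.2e} m^2")
print(f"implied hull thickness:    {implied_thickness:.2f} m")
```

Under these assumptions, each cylinder gets a budget of roughly 1e13 kg, which works out to an average steel hull on the order of half a metre thick, so the "500 billion" figure is at least not off by many orders of magnitude as a raw mass budget.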
> It just happens that many living beings don't contribute to the goals of whoever is in charge, and if they get in the way or waste resources, nobody will care about them, human or not.
Sure, yes, this is a big part of AI alignment and AI safety: will it lead to humans being akin to pets, or to something even less than pets? We don't care about termite mounds when we're building roads. A Vogon Constructor Fleet by any other name would be an equally bitter pill, and Earth is probably slightly easier to begin disassembling than Venus.