I literally explained it. I straightforwardly applied the technology to our existing social/economic structure.
And changing the social/economic structure is probably harder than developing the technology and requires precisely the kind of power that a successful AGI technology would remove (e.g. workers can't strike to keep their jobs when the boss is planning to lay them all off).
> It's not a fact, pseudo scifi action movies don't count as facts.
Honestly, the "AGI will be so great/everything will be fine" assumption relies less on facts and more on sci-fi fantasy than anything I said.
Yes, both sides of the debate treat sci-fi as fact, I agree. I don't think the other side does it any more than you do, though.
If you think that, the problem's on your end of the connection. The most charitable read of your comment is you're expecting a level of exposition that is not actually required, especially given the common context of what exists now.
Personally, I think you're actually doing more of what you're accusing me of, for instance in your sibling comment:
> The economic system is not set in stone. If everyone is irrelevant to it, the economic system becomes irrelevant to everyone, and a parallel system gradually replaces it.
You're basically hand-waving a future and saying "everything will be fine." And you're also misunderstanding some significant things in a kind of black and white way. E.g. I never said "everyone [would be] irrelevant [to the economic system]," I said labor would be. That's a lot of people, but not everyone.
AGI in an internet-connected world is capitalism's end-game. Once you have AGI, labour (both physical and intellectual) becomes redundant; humans have a "value to the system" approaching zero.
Our economic system is built on a series of assumptions that fundamentally cannot survive AGI, and nobody is really even trying to grapple with that fact.
What do you do when "demand" for human labour drops to zero and "supply" stays at >8 billion?
No amount of tinkering at the edges is going to fix that. The problem is far deeper and more fundamental than you seem to think it is.