But I put the question to myself: if I could magically wish this thing away, would I? I wouldn't. I understand the many who would; this thing completely upends the status quo. Massive swathes of people will watch skills they have built up their entire lives become worthless. But the potential good that can emerge from this could benefit everyone, and especially the people who benefit least from the status quo.
If you think AI is a disaster, think about the potential for medical breakthroughs that can emerge from this. Doctors will spend fewer hours writing charts and more time with patients. Every underprivileged child could have a personalized tutor. The number of discoveries and ideas that can be generated is endless.
I can definitely picture this being a personal tutor to a child
You don’t need to be a “prompt engineer” to get use out of a language model. You just have to recognize what it is, and what it isn’t.
If that can’t be taught to children, there’s no sense fretting over AI. We don’t need it to fail.
Don't bother. People here love misery. There's no way you're getting any positive responses to this. Let the doom and gloom begin.
Really? When I played around with it I was filled with incredible optimism and hope. It was an amazing companion that helped me with my code, answered my questions, and was what I hoped Google Assistant/Alexa/Siri etc. would become a few years in. Sadly, they never did.
This is amazing, and would be an excellent personal assistant when it becomes cheap enough for smaller personalized LLMs.
The “think of the good” argument is overplayed. The same was true for combustion engines, and they will probably drive us extinct.
I’d like it if people just stopped with it. People aren’t idiots, they know good things and very bad things can be achieved.
What? Because we weren't burning fossil fuels before cars?
Besides forgetting about trains, steamships and the coal-powered Industrial Revolution, you're ignoring the billions of lives lifted out of poverty.
!RemindMe 150 years
Climate change is a serious issue, but you cheapen and discredit it by exaggerating its impact. There is zero chance climate change will cause humans to go extinct. If my stance is incorrect, please reply with some evidence.
It seems to me that he's really missing some of the major concerns about "AI" as it stands today. As often happens with new technology, the old rules that secure rights don't quite fit anymore. E.g., if I were an artist who had spent years developing a unique, recognizable style, I'd be furious to have a for-profit company use my work to create something that imitates my art. It's probably not illegal at the moment, but it easily could be down the road, and regardless, it raises real ethical questions. I'm disappointed to see Gabriel fail to grapple with that here, where his cachet as a prominent artist is being used by a for-profit company for their own ends.
Peter is OK with AI consuming his tracks. But other Artists... not so much.
One reasonable concern is that tech supplementation will lead to a deluge of derivative work, nullifying the efforts of the actual creators. That's always happened in some form or another, and does it really lead genuine fans away from artists they care about?
There's a comment in another thread about generating a song that includes Kurt Cobain, which is such a weird example because a computer could not have dreamed that up in a thousand years. A computer couldn't write a punk song, and mean it. It will never replace the open mic, the buskers, the songs passed across generations, the Zappas of the world, and millions of others.
Those genres are safe (for a while), but they're also a puny portion of the market. American pop music is totally going to be replaced by AI. It's been nothing but awful, formulaic crap for the last 25 years, so there's no way that AI-generated music could possibly be worse.
Even out of the context of AI, I think this isn't stressed enough in general copyright discussion, especially around piracy.
People often say that piracy doesn't actually reduce sales (which I fully agree with), but that's not the only concern artists have, especially some indie ones. I have seen both illustrators and musicians explicitly state that they don't care about "their work being enjoyed and known by more people because of piracy"; they only want paying users to get it. I don't even agree with this sentiment, but I respect it, since it's their choice to make, not mine.
"If anyone legitimately feels their copyright has been infringed by this competition, we and Stability AI will work to take down the video until the dispute has been resolved."
StabilityAI and Gabriel are providing (a) but not (b).
https://www.courtlistener.com/docket/66788385/13/getty-image...
If Getty wins, is Gabriel committing contributory infringement?
Even if what happens after text is entered into a prompt is not infringement, mass copying for "training" is done for commercial gain, and it is done without consent. Google gets away with copying websites en masse into a cache for the purposes of running a commercial web search engine. Maybe copying for the purpose of commercial "AI" will get similar treatment.
That said, consider what happened when Google tried scanning books. It seems that some of these training sets have used hundreds of thousands of copyrighted works from "pirate" sources on the web.
IMO, this is just another example of so-called "tech" companies, e.g., Uber, that can only operate if they are free from existing laws and regulations.
Does StabilityAI have a commercially viable plan if "training" requires obtaining consent?
I don't know how any of the proponents can pretend that this isn't an abject disaster on the horizon for anybody who depends on copyright to make their living.
This is the natural progression from the unnatural properties of the shared delusion that ideas are property, or that it ever was natural to keep them artificially scarce. If we lived in an ideal system where ideas were free, copyright didn't exist, and artists and programmers could survive and thrive without the ability or need to hoard their work as if it were physical goods, this would be a non-issue. The system has been antiquated for the needs of the modern world for generations already, and this is the dam breaking.
OK, I'll bite. How is this an abject disaster on the horizon for... let's say, novelists?
It's actually really scary for writers right now. You just have to look at the huge amount of AI generated attempts and think "what do we do when the writing gets really good? What do we do when most novels are mostly AI generated?"
People have spent decades working their ass off to get good and try to get their work sold, and they come out the end of this tunnel right into an era telling them that they're just about to be obsolete.
Graphics designers will have tools and will make movies and interactive fictions. They'll build their own following.
It'll look like YouTube and the rise of the YouTuber, except bigger and broader.
As to AI-generated content: Who knows?
[0] Towards humanity and creative endeavor in general.
So does the world currently only listen to singer-songwriters on acoustic guitar at small local venues?
The humanity in the consumption of art has been subject to mass commodification for centuries at this point.
This process is driven by money and not love. Technology has nothing to do with it.
We're not talking about just more art here, or some people losing jobs who then have to retrain. We're talking, with possibly very near-term advancements, about entire industries emptying out, leaving hundreds of millions of people with nothing to do and nowhere to retrain to.
Other advancements have caused job losses and opened up many more. I don't understand where all the people displaced by AI are supposed to go, or how we're going to balance an economy when you need an order of magnitude fewer employees than ever to pump out more content than anybody could hope to consume.
When technology destroys menial jobs, it's progress.
When technology destroys white-collar jobs, it's a threat.
</irony>
I don't think any profession that requires a license is at risk. Software engineers resisted a licensing regime for decades, and now the profession will pay the price.
I don’t think it will destroy jobs at anything high-profile (including programmers), but grunt work is a different story. If a secretary or, say, a nurse can feed the info through and the ‘head’ lawyer/doctor of the practice only has to sign off on the result, the license is not an issue.
Not there yet, but can’t see this as avoidable anymore.
I'm also not entirely convinced that more software is by default a net positive for humanity, let alone "the earth"
No one really knows the magnitude of AI, but between the two extremes (AI takes all our jobs; AI is just some stats with no real utility), we’ll probably land somewhere in the middle.
Personally, I’m trying to learn these technologies to augment my current work. I’m treating it like going from using Notepad to program Java, over to a full fledged IDE. Not a perfect analogy, I know.
Given its inevitability, I think it’s logical to try and use it to our advantage as workers. If it ends up taking our jobs anyways, at least we tried. If it doesn’t take our jobs outright, then we’ll still be behind those that use the AI products as tools that augment their productivity, leading to a game of catch-up.
Even with the proposed 6-month hiatus, AI versions will still be released by those who refuse to follow the agreement. We’re in an AI arms race against other world superpowers, and the morality of some is quite questionable (not that the US’ morality is perfect by any means).
When there is no more desire to be quenched, when there are no jobs to do, when we have solved all disease, what do we do? Man has been defined so much by his suffering and toil that when we take it away, we are in an environment we are not prepared for in any sense.
Quite the opposite happened. We appreciate human-vs-human chess even more because Stockfish and other AIs demonstrated there are more challenging lines and strategies to pursue.
I'm not willing to pretend that we'll somehow get basic income before the need and economic tradeoffs become very apparent. Democracy rarely ever works like that and the alternative isn't any prettier.
Regardless, like most tech/social evolution, we can't stop it even if we wanted to. Even if we try to slow it down it will probably just slow it down for some and build a bigger moat around the few with connections to power.
On this track, we continue to toil in the dirt, unless we figure out how to automate all the farming too. In fact, it looks like we're automating all the fulfilling jobs and leaving only the toil for humans.
And even if it were true, what's the quality of those new jobs? Sitting in front of a command line engineering "prompts"? Sounds more like a particularly dystopian sci-fi story than a bright future.
They find better things. Maybe agitating for a different economic system. Some countries will fail horribly at this. In any case, LLMs aren't coming for most writers or coders or scientists or medical researchers. They will help with our credentialing fetish.
Better things that an AI can't do, to clarify.
>if AI takes my job
AI is not going to take anyone's job. People using AI will take jobs from people who don't use AI.
>What happens to the doctors
That's completely up to them. Some will use AI to research new treatments, to assist with their daily workload, or even to treat patients.
Some doctors will stick their head in the sand and refuse to work with the new technology. They might lose their jobs if they can't compete with AI-assisted doctors, and it will be entirely their fault.
If your boss fires you because your colleague uses AI sooner or better than you, sure, AI didn’t take your job, but what’s the distinction? If you are in a team of 10 translators and 9 get fired overnight, I would say AI took their jobs. Which is happening.
Also, this is probably shortsighted; going forward, it will be possible for a manager or HR to chuck a resume and typical tasks into an AI and ask whether it can do the job or whether they should hire a human. For now the AI will lie that it can do it, but a lot of work is going into making that better, and its execution of the provided tasks will show whether it's lying.
Not really sure what point you’re making, though. This almost makes it sound like you want the reader to conclude that it’s going to be a one-for-one trade, but the whole concern is that it won’t be. If your manager uses AI to replace the job of you and 9 other people on your team, I think it’s a bit silly to say “don’t worry, AI didn’t take your job, a person using AI just took the jobs of 10 people not using AI.”
An incredible distinction, on the level of “guns don’t kill people, people with guns kill people”. The difference is practically meaningless. By your logic, if everybody uses AI then no one loses their job, but that’s not how productivity gains work, is it?
— Karl Marx in Grundrisse (1857-61)
Literally why we have intellectual property. So you can capitalize your intellectual work.
Correction: if you are in America, *to those who can afford it.
It's easy for me to say, I am young and healthy and can move away from programming after it bought me a house. Maybe I'll work in a brewery...
Either way, the dirty words everyone seems to be avoiding is UBI and socialism.
If AI sincerely destroys every white collar job, well... that will be interesting.
This is a great point, and I think it represents a really good attitude. It seems like there are a whole lot of people who think that if they pretend hard enough, AI will go away. That's obviously not going to happen: it's a useful tool, so let's all try to figure out how it can be useful in a productive way, not a world-ruining way. And that means using it.
He has a great voice, but that's hardly everything. He is somewhat of a "grand old man" now. And he makes "grand old man" pronouncements like this one.