To be honest, I’d struggle to find any human content that is not recycled from the past. I guess we are still better than AI at remixing sufficiently novel combinations of concepts but I think much of this AI content aversion comes from some kind of phobia, or an unwillingness to admit humans are shockingly unoriginal.
In other words, if there is truly no such thing as an "original idea", then how did the ideas that we are pulling from, deriving from and combining come into existence?
If all we are saying is that existing ideas inspire new ones, or that most human generated content is derivative, then I completely agree with you. I don't see how that proposition could be controversial at all.
But it seems to me that some people, at some points in time, somewhere have and will continue to contribute something original at least on occasion. Even if the "original" idea is 1% of the "intellectual product" and 99% is reusing existing concepts. To insist otherwise is to insist that we hit peak human innovation somewhere along the line and there's nowhere left to go.
Art that isn't recycled is almost always an illusion caused by your not being familiar with the things it was remixing. Nobody pulls things out of thin air. Even whacked-out acid trip visions were molded by our cultural experiences and aesthetics. There are a few people who make large shifts in their artform (e.g. Jackson Pollock, Jimi Hendrix, and Antoine Carême), but a) they were still making incremental progress, and b) if you limited yourself to such work, you'd have a pretty limited set of things to choose from for your entertainment.
PS Edit: Right now AI-generated art is novel, but it has far less potential to meaningfully advance art as time progresses. It will mimic what it sees, but it won't see anything that a creative cultural avant-garde doesn't produce first, and as long as real people are doing it first, there will be eyes that want to see it before an AI algorithm waters it down into some conceptual amalgam of its real form. That probably includes commercial entities who are deliberately trying to make themselves stand out.
1) The output, while impressive on the surface, is bland and recycled. It will drag down the general level in the same manner that CGI has destroyed movies.
2) People don't want to consume AI generated content in the same manner that they generally don't watch Stockfish vs. Stockfish.
3) It is not phobia, it is disgust at humans being dehumanized.
The phobia seems to be on the side of AI corporations, who quickly step into any discussion that questions their business model.
It's clear that amidst a deluge of AI-generated content, the audience's urge for authenticity will rise. Attention is in more finite supply than content, and has been since before generative AI; only so much will stick out, and it will perhaps be the most authentic or analog content and goods.
The real question is whether AI/AGI can make it past the "authenticity threshold" and xenophobia to where we also accept AI storytellers and brands as eligible.
So what? There are a handful of truly revolutionary artists each generation. Faulting AI for producing merely good and interesting art is missing the point.
We biologically have a desire to live. Part of living is confronting threats to our survival, and hopefully defeating them. Our minds are what provided us with the ability to survive despite there being stronger, more vicious competition out there. Why would we want to surrender our one competitive advantage?
People aren’t stupid. They know that AI will continue to progress (“technology must progress”, says the technologist) and that it threatens their way of life. Truckers know that AIs will automate their jobs. Artists know it will automate theirs. Everyone knows that AI is coming for them, sooner or later. If not in their career, then maybe in their social life, just as video games and social media have decimated in-person communication.
Those are things we need in order to survive. Our jobs provide us the money we need to meet our needs, and give some of us meaning in life (no, UBI proponents, receiving a check doesn’t solve all the problems). Our social lives are paramount to our health. What will humans be doing all day when AI has taken all our jobs and we are all talking to chatbots all day which cater to our every proclivity? Maybe that is some folks’ ideal worlds, but certainly not mine.
Given the mixed results we’ve seen with technological innovations in the past, rather than giving a negative connotation to the people who are cautious or concerned about AI, why not listen to them? Rather than having a phobia, maybe they have a valid point.
I could make a similarly disrespectful and unsupported claim about the mental state of people who believe most human creative activity is simply recycling learned ideas, and we could spend some time flinging insults at each other, but why?
> The Not By AI badge is created to encourage more humans to produce original content and help users identify human-generated content.
How does this encourage humans to produce more original content? It may help users identify human-generated content - if they care. But perhaps more usefully it helps AI identify human-generated content to avoid training on its own garbage.
The word "recycled" attempts to carry a lot of weight here, and not successfully.
In 1905 Einstein published his famous four papers that could not have existed without recent work done by Planck, Michelson, Morley and Maxwell.
It would be ridiculous to describe those papers as recycling.
But perhaps the process doesn't matter as much as the end result.
Maybe another interesting aspect to consider is: humans create new things (even if by remixing and deriving from old things) all the time, because they want to, because they choose to do so.
Thus far, these AI creations have been made at the behest of humans demanding them.
Perhaps that will, that desire to make something new in the first place, will remain a differentiating factor.
It’s probably not completely accurate to say there’s nothing truly original anymore, but also probably true that the rate of discovery of truly novel and original concepts and ideas has slowed to the point it may seem that way. Possibly because all the low-hanging fruit has been taken, and only the more difficult discoveries remain.
You can retrospectively say we've always had a device that creates light from heat, like this 'new' lightbulb, and metaphysically you'd be correct.
But the properties and design of a lightbulb are different to an oil lamp. That's new enough to be called original and Thomas Edison (and whoever else helped him) are the originators of the lightbulb design and they made it real.
An original thought? Probably not. An original object? Yes. Very much so.
The argument should be about capacity for creation, and to me AI generation just doesn't seem like true creation, but more of a cheap magic trick.
That means AI is not capable of producing meaningful _new_ content like discovering new mathematical theorems, because AI does not understand maths, whereas humans can come up with something meaningful based on _understanding_ of the content they have learned from.
This is why when you ask e.g. ChatGPT about something it has not been trained on, it can only come up with garbage, whereas a human would likely be able to provide a meaningful answer based on looking at the same training data, if that makes sense.
"Hold the newsreader's nose squarely, waiter, or friendly milk will countermand my trousers."
A Bit of Fry & Laurie Concerning Language: https://www.youtube.com/watch?v=3MWpHQQ-wQg
make up a list of fictitious creatures and describe them. make sure the names aren't reused.
"A man with a new horse visited the hardware store in Dagwood last week to try and find a new hitching post for his stable."
That simple sentence has enough entropy that I can say, with confidence, that it has never been uttered by anyone in the history of human civilization.
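A back-of-envelope calculation makes that claim concrete. The numbers below (effective word choices per slot, total humans, sentences per lifetime) are assumed figures for illustration only, but even with generous bounds the combinatorial space of such sentences dwarfs everything humanity has ever said:

```python
import math

# The quoted sentence has 25 words. Assume, conservatively, only
# ~1,000 plausible word choices per slot.
words = 25
choices_per_word = 1_000
sentence_bits = words * math.log2(choices_per_word)  # ~249 bits of "address space"

# Generous upper bound on everything ever said: 100 billion humans,
# a billion sentences each (assumed figures).
total_sentences_uttered = 1e11 * 1e9
uttered_bits = math.log2(total_sentences_uttered)    # ~66 bits

# The sentence's combinatorial space exceeds all human utterances
# by a factor of roughly 2 ** (sentence_bits - uttered_bits).
assert sentence_bits > uttered_bits
```

Under these assumptions the chance that any particular 25-word sentence was ever produced before is vanishingly small, which is the commenter's point.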
https://www.biblegateway.com/passage/?search=Ecclesiastes%20...
For me, I actually really like that human content is recycled. The discussions around AI, art, and humans always talk about art as a separate product produced by the human. I feel like many pieces of art (not all, for example zombie realism) have a piece of the human inside them.
An example I used in a comment in this post is Raymond Carver. His short stories are about blue-collar men in the Midwest. Carver was a blue-collar man from the Midwest. I find that interesting, and I liked that he pulled from his life experience to write stories.
I get very excited when I realize that the work I'm engaging with is recycled in some way. Like a song's chorus sharing lines from a separate artist, or a style that seems similar to a different writer I know. I love that. It makes me feel more connected to the creator and makes me like them more because we share similar interests.
Examples that come to mind for me are the manga Hunter x Hunter and Jujutsu Kaisen. The creator of Jujutsu Kaisen loved HxH and it 100% shows. The crazy powers, the complicated fights, the walls of text explaining what happened in a fight, etc. All of those come from HxH and I love that the creator was inspired by it. Sometimes reading Jujutsu Kaisen makes me feel like I'm bonding with the creator over our mutual love of HxH.
With AI created work, I feel like we're missing the human touch. And as I pointed out in another comment here - I don't think it's hard to bring it back. Show me the prompts and the chat log. Why did you choose those prompts? Why did you ask the AI to write in X style? How does that X style make you feel?
On the other hand, maybe the AI prevents some of the human touch from coming through. I imagine much of the recycling that comes from humans is unintentional. The song you wrote on guitar has that riff not because you're copying a band, but because you've listened to so much 90s indie rock that you just make what you love - without realizing it.
Maybe in the future this changes. I'd love to see someone spend some time with an AI and mold it in such a way that what it produces can emulate the user's love and interests. Don't forget to show and share the chat log!
[EDIT]: To add onto this, I've actually done the above. I've played some text adventure games in the past, and I got ChatGPT to play with me. I told it that I had stat points (Health Points, Magic Points), that I had two types of attacks, and that there are three types of monsters. That experience invigorated me to write paragraphs on world building - just so I can feed it into ChatGPT and live in that world.
I've seen some instances similar to this - but people just end up not sharing the prompts they feed into ChatGPT. I'd love to see those prompts. I'd love to see what someone did to get ChatGPT to act a certain way.
"Soon, asking a writer if they use AI will be like asking a photographer if they use Photoshop" – i.e., it's a foregone conclusion, and the best artists will generally adapt to using the best tools available
It's similar to music. DAWs and samples didn't kill off music; instead, they made it easier than ever for a teenager with a computer and a passing interest in music to create a song and share it with the world. As a consequence, though, the standards for mixing and mastering have gone up massively; people don't really tolerate bedroom recordings with $10 mics any more. I imagine most amateur musicians in the 90s didn't know what a compressor actually did (I certainly didn't).
Seeing the results of talented artists who are experimenting with AI[0] makes you realize that there's still going to be a massive gulf between skilled artists using SD etc as a tool, versus those who think they can be artists just by putting keywords into an image generation AI and calling it a day.
[0] https://twitter.com/jamm3rd/status/1619896080619159553 https://twitter.com/jamm3rd/status/1633758455952703488/photo... (moderately nsfw I guess)
Asking a photographer if they use Photoshop creates a framing where the artist still went out and took a picture to create the work, and then Photoshop only modifies their original creation.
But you could just as easily say "Soon, asking a writer if they use AI will be like asking a photographer if they have a camera". That sounds ludicrous, but that is exactly what generative AI offers: the ability to create content essentially ex nihilo.
The problem is that it automates the best parts and transforms the user into a curator/manager.
Some people want that, but others enjoy the creative part more.
To bring it back to photography, that's the problem with digital, you blast through hundreds of photos and spend most of your time selecting and editing at the computer.
With analog, and especially wet plate or direct positive paper, it takes a while to compose your picture and you only have one chance to take it and develop it. It's very easy to mess something up, which is what makes the process more meaningful. And you're left with a unique copy, not an artificially limited single copy.
Of course, it's not my place to say who should or shouldn't call themselves a writer, but I'll simply personally respect someone more if their stories aren't ghostwritten by the AI.
I don't think "AI" is the way most people would complete that sentence. Perhaps "Word"? Or "LibreOffice" if "Photoshop" were replaced with "Gimp"? Personally I use Emacs or Vim. You could incorporate AI into any of those tools, but how exactly? And would it be a core part of the functionality? And would it be something that the best artists make much use of?
If you used AI to make the spelling/grammar checker more reliable, probably most people would use that (assuming no privacy problems), but if there's an AI-driven autocomplete for sentences probably a lot of people, including the best artists, would turn it off, just like I turn off the autocorrect on a word processor today.
If an AI can complete the sentence, then perhaps the sentence isn't worth completing, at least if we assume that the reader is as clever as the writer and has similarly sophisticated tools. Perhaps the AI-driven tool should instead put some kind of wavy line under the second half of the sentence to indicate that it's boring and obvious and doesn't need to be there.
It’s tremendously useful technology for many domains already and you can see the stepwise refinements that will permeate many parts of our lives. Big money corporations are productizing those elements already.
But to be as fair to the pessimists as the optimists, it's actually still a very big leap from Midjourney and GPT-4 to something that becomes the next camera or typewriter in terms of ubiquity. Because we saw a huge leap recently, we feel close and excited — and we might prove to be in hindsight — but we also might see that there are some hard conceptual limits that we won't see anyone break through for another fifty or hundred years.
For image-making, there is a decision in the prompt and selecting images, but that is very different to making a decision about each color and brushstroke, and working to finish a painting. It's orders of magnitude more difficult, and it's why great masters are celebrated. Creating stuff with AI will suit certain people; I definitely don't think the 'best' will necessarily use it. It does seem to take away a lot of the fun of making images, and truly original work will always push quite far outside the training set.
For literature, it could be interesting how authors use new tools, where and in what sense. Maybe have more conclusive plots? Fewer inconsistencies? Have AI imitate dead poets' style?
We will value the classics more I guess, since they were done the old fashioned way.
New tools open doors, not shut them.
Another argument I keep hearing (most recently from pg), is that we'll always need non-AI training data. That, too, doesn't follow. Training new models on synthetic data does not mean we get stuck in a particular mode or style. We'll continue moving, improving, and trying wildly new things. Bootstrapping with synthetic data doesn't block evolution - it enables faster evolution, even. (I'm using synthetic data to train new models to great effect.)
People are angry that we've lowered the opportunity cost barriers and so they're expressing their frustration.
It's a good thing that life's choices no longer fence us in as much. Everyone should get a chance to learn how to express themselves through art with the new regime of tools. Despite changing economics, there will still be a top 1% that do better than the rest of us.
Like imagine a label on a thing that says "Made without CNC machinery". So instead of finishing the thing on a CNC mill, they instead stuck it on a Bridgeport manual mill, and finished by hand. Or somebody finished the cast with a file.
Okay, and so? In the end, a hammer is a hammer. If it hammers well, what difference does it make how it got that precise shape? It's not like some inherent goodness is being imparted by a hand file.
Now I get some methods have flaws to them, and some form of associated harm. The problem I see with blaming specific tools is that it's simplifying the problem too much. Eg, if the problem is taking jobs, then picking up one particular tool to blame for that allows manufacturers to use a different tool and cause the same sort of issue. If the problem is say, pollution, then it's not at all a given that the replacement method will be ecologically friendly.
IMO the better thing to do is to target the underlying problem. If say, your issue is ecology, then you want to certify that the manufacture is as environmentally friendly as it can be, not merely that the thing isn't made from plastic; there's even a chance that the non-plastic thing will be worse for the environment.
And if someone is going to send me a sales pitch or email that GPT wrote, I'd rather they don't insult me and just send me the prompt.
I think TFA is a gimmick, but I see value in knowing whether I'm relating to a human brain or to a statistical model.
"Hand made" has been a selling point since things were first made by things other than hands. Example: "The factory, known as the Atelier, [...] it is the place where our craftspeople assemble each BUGATTI by hand."
This is basically the selling point of the entire luxury watches industry.
and fwiw, a lot of the assumptions in here are akin to saying at the dawn of photography that nobody would ever paint any more. Photography of course replaced certain entire categories of painting, but didn’t erase painting from the face of the earth altogether. Of course there are far more photographs than there are paintings, but volume alone is not the totality of meaning.
Generated content is strictly culturally regressive anyway. After a hundred years of ChatGPT, will people still be writing prompts with “in the style of [person who produced all of their work before the year 2000]”? That would be a sad and boring future.
I have things I have written that took me two years for 8-10 pages. I wrote them to help me think through certain things.
Could ChatGPT have written it for me? Maybe. Probably not, though - I kind of had to discover what the content was supposed to be.
Could it have written it better than I did in two years? Probably not. Two years leaves you a lot of time to polish the phrases.
Could ChatGPT have taught me what I learned in those two years? No way.
> Art is not what I think when I’m painting. It’s what you feel when you’re looking
Yeah, because the Netflix catalog, for example, is not repetitive and stagnant.
These current advances will enable anybody with a unique idea to produce content. We are right before an immense explosion of human creativity.
I think we’re going to see an explosion of waste.
We are information processors. The input makes the output. What happens when you close that loop?
And I would argue that the current (mostly recent productions) Netflix catalog is indeed repetitive and stagnant. Originality in tv production is currently in a race to the bottom.
Those purists are mostly dead. Everyone uses technology in music today. And music is better for it.
> I thought using loops was cheating, so I programmed my own using samples. I then thought using samples was cheating, so I recorded real drums. I then thought that programming it was cheating, so I learned to play drums for real. I then thought using bought drums was cheating, so I learned to make my own. I then thought using premade skins was cheating, so I killed a goat and skinned it. I then thought that that was cheating too, so I grew my own goat from a baby goat. I also think that is cheating, but I’m not sure where to go from here. I haven’t made any music lately, what with the goat farming and all.
I just want to say that some answers here are like when artists talk about engineering, we simply don't understand the topic and it shows.
2. The same argument could be made for "Not with a Computer" and invalidate this website based on identical principles.
3) The future AI brings is a huge unknown, but as we've seen with every major technological advance so far, it's never been nearly as bad as the most fearful and skeptical predicted at the time.
I generally agree that it will become increasingly harder to distinguish non-AI generated content, and authenticity will suffer a lot. Maybe the solution to this is to connect with other humans directly and provide authenticity as a first-hand experience.
More seriously, even if we are willing to assume that everyone will be honest about how much they use AI, how do you define whether an image is less than 10% AI generated? Number of pixels? Number of objects?
And what about writing? Does getting an LLM to check your grammar afterwards count as cheating?
This is subjective as all get out. If you write a novel and use Stable Diffusion to draw the cover, does that make the whole novel less than 100% human-generated?[0] What if they had used, say, lots of prompt engineering and inpainting runs instead of just typing in what they wanted and grabbing the first thing they saw? If, say, painting programs start using image generators as brush engines, does that lower the percentage more or less than the inpainting case?
[0] Keep in mind that most covers are designed and drawn by publishers, not the original writers. Writers making their own cover art is very much a self-publishing thing.
I have never considered the cover art for a novel (or most books) to be a part of the novel. The thought that it might be never occurred to me. It's the packaging. So, in my view, such a novel would be 100% human generated.
AI art created by talented artists and writers is the future.
Second is art itself. We have high quality art prints that a large portion of people are happy to buy. Artists themselves are happy to sell prints of their own work. That product is entirely machine-made except for the initial knowledge work by the artist. I'm not making an 'art, not art' argument. I'm saying, people are happy to consume manufactured content, and companies are happy to generate it for them.
The argument that 'humans will stop producing and things will stagnate' is totally non-serious and doesn't even deserve a retort, really.
is a precursor to a more existential revelation our culture is going to have:
the extent to which the entire concept of the rational self-aware agent that we carry as our model of ourselves is a confabulated falsehood maintained by our own minds.
Most of what most of us do most of the time is habitual, instinctual, autonomic, pre-conscious, whatever—very few of us are present very much of the time.
Even highly-analytic complex logical reasoning can and does often transpire in something akin to a "fugue state," indeed that is a much-noted aspect of technical work like most here engage in.
That doesn't mean that we are not capable of genuine self-awareness, introspection, and methodical reasoned thought—but I don't think it's controversial (except in our lay conception) whether or not those things are primary or common modes of being in the world.
Record yourself for 24 hours and compare honestly the sequence of your utterances and behaviors with your internal record of agency, and you'll find you are on autopilot in some sense most of the time.
From this I infer that the first "real"-seeming AGI will, like us, be not a monolithic capable system, but a relatively loosely coupled aggregate, with many components serving as analogs to aspects of mind and embodiment, largely distinct from and only loosely coupled to the "executive" function.
However, the items I added to my works have a stronger claim than 90% AI-free: I used "100% AI-free organic content."
I think I'll keep this stronger claim because I can; I have not tried any of these tools. Not even once. I refuse to.
I am betting on the fact that if I don't, I'll start to have more influence, like Paul Graham said a few days ago.
If, by some chance, this actually caught on and was some kind of indicator of quality, it would be in a predictive model's best interest to integrate the logo into any kind of web design it produces. Furthermore, no human-curated content farm would hesitate to include this symbol, regardless of the content's human 'purity'.
Anyway, nice idea. I like it. Kudos to you.
Yours, KodingKitty
I also liked the idea behind the label, but on second thoughts the whole idea behind a label sounds counter intuitive to me. Like there's some kind of truth in sticking a label on something. So, should I trust a label then? Is there some authority in a label? Do I even need authority to point out what's real and what's not? What's reality then? Does it even matter? Anyway, you get the point. A lot of questions.
And then I read further on that site, and it tells me how to use the label. Don't change the label. Don't change the color, don't change the text. I think it's human to break rules, in one way or another. Or at least to push the boundaries. So - to me - it's more human to change this label, and not use it as intended by the creator of the label.
Maybe that's the difference between AI and humans. AI will follow rules (set by its masters); whether it allows for randomization or not, it will follow rules. Human beings - sometimes - break rules.
If the goal is to highlight creation by humans, I’m struggling to see how slapping a big fat sticker that says “AI” in large type on it would be a good idea?
All it does is give AI more attention and associates the work with it.
Of course it does. If you make a purchase based on this badge and you later find out it's false, that's fraud. You don't need to have a certification program before false advertising becomes illegal.
There's a sort of insecurity about this. And I get it, people are freaking out. I feel unsettled in many ways and I truly believe this technology is profound. Wait until it's realtime and embedded in everything. But I don't believe it's a cause for concern. I don't think we need to legislate transformers trained on public data as something illegal, for example.
I think it's a tool, a fantastic tool and a powerful tool, but a tool nonetheless. It will democratize many things and it will lead to increased productivity and creativity. But we, humans, will find more things to do; ways we can't see right now because it wasn't possible and still probably isn't.
Soon, many digital image editors will incorporate some simple AI-powered features to detect edges, faces, various objects, etc. Are content-aware plug-ins to Photoshop AI?
This happened years ago. Agreed with the first part though - there is almost literally no possible route to avoid AI on consumer phones.
This is an absurd thought. AI content is still created by humans using AI, just like photos are created by humans using cameras.
Also who says that AI can't be as creative as humans or better? Why would humans be better at being creative "manually" instead of with machines?
I say AI creativity will be better than simple humans at creating stuff. Humanity will overall benefit from it and the world will be less dull.
I doubt if this concern is valid for humanity. Creating is an innate desire of human beings. We are going to have lots of AI-generated content, but I have confidence that people will continue to generate high-quality content too. Of course, how to find such quality content will be a different matter in the future. Maybe an industry-standard watermark or metadata for generated content is a cheap enough solution.
Well, to me, these anti-AI things are much more like NFT/crypto. It's just trying to create novelty out of nowhere.
To avoid AI content means intentionally avoiding sites that are designed for cheap monetization and nothing more. That includes most blogs today; the ones that aren't personal websites. Most 'review' websites are exactly this; they offer a generic description and add an Amazon affiliate link.
We won't know for sure what sites are using it and which aren't, but knowing the nature of the site (commercial/non-commercial) is a pretty good proxy.
The kicker is the "90% of content will be generated by AI by 2025" quote by an "expert", which links to a Yahoo TV clip of some rando influencer spouting nonsense. Better luck next time, ChatGPT!
It does a lot more than that. It is capable of novelty and striking out in novel directions per the AI artist’s direction. Maybe there’s some soul-grounded capital-t truth forever out of reach, but that’s metaphysics.
People have always wondered if even people are creating anything new ("nothing new under the sun"), so it's quite silly to say that a model which produces data based on its training set and prompt is doing anything other than mix.
Some day AI will be able to create by itself, but the current state of the art isn't doing it.
Perhaps somewhat ironically, if we start identifying all human-created works as such, it creates a new set of pristine training data for future generative AI/LLMs.
I thought I’d have an open mind and read the “why” section.
> it helps your audience recognize your human-first approach
So, no. Refraining from using AI is not a human-first approach, when using AI can actually help significantly with human problems.
The badge is a false and outmoded signal which says more about the user being a virtue signaling type personality.
Other than that, it doesn't seem like a bad idea for people to be able to mark their work as "created by a real person" with a cool little badge like this. It's simple but it works.
Isn’t this already a “made by AI” artwork?
And how much is 10%? If a traditional painter that uses photo reference starts using AI generated photorealistic imagery instead, is that more or less than 10%?
* If there is any kind of real value to having this badge, then people will just use it even if they're using AI
* Rejecting technological advancement is like begging for extinction
No need to make websites or manifestos, do not worry humans will still human!
Anyway, I like the initiative. We'll need much more to make a digital divide that can guarantee human autonomy.
[ Art Too Bad To Have Been Done By AI ]
/S but only a bit.
If there is such a way, someone please let me know.
In the 2000s, if you were talking on the internet, you were pretty sure that you were talking to other humans.
This will quickly reverse as in being sure that one is talking to AI most of the time on anything remote (internet, phone, email, ...)
Which in turn will massively strengthen personal (like non-remote) face-to-face interactions.
phew. need GPT to show me how to properly use the badge.
Of course, a juxtaposition between this and Apple is very silly.
There is a lot of evidence that language drives cognition, and there have already been instances of ANIs creating their own efficient languages. Imagine that the year is 2040 and Euclidean encryption is augmented via transcoding into AI languages to safeguard against quantum computing, or replaced with quantum-resistant algorithms altogether. A Turing-complete quantum processor was actualized a few years ago, or we just finally figured out that ML and the probabilistic nature of quantum computing are a perfect fit. People tend to vastly underestimate the rate of progress over 10+ years.
I can't help but think that social networks and search engines will start to negatively rank AI generated text. In my opinion, this would impact many more people than we think.
I'm not quite convinced humans create original ideas as often as some think
Any text I write is vaguely influenced by everything I've experienced (other media, real-life experiences), even if it's not a conscious process
Depending on where you draw the line, I'd say most artistic work wouldn't qualify as actually new
> displaying the badge on any asset does not guarantee the content is not majorly created by AI.
What a farce.
On the bright side, this is likely also going to play out this way and the market for the prompt "artists" will be nonexistent unless they add any value (which they rarely will).
We are meat based, the new minds are silicon based.
They have different strengths and limitations than us, and will surpass us in all cognitive abilities soon.
Our hardware evolved to help us survive and reproduce but the evolution is a slow process compared to the intelligent design we do.
AI will surpass us and build itself beyond what we can reason about.
We will do art and work because we like to do it, just like playing video games, no other reason will survive.