LLMs are notoriously compute- and energy-hungry to run, and the technology is still developing, if not outright experimental.
So this explanation seems very plausible. I can't see any reason to call it a conspiracy.
How can I get help with this now?
Google result 1: https://stackoverflow.com/questions/77148711/create-a-trigge...
Google result 2: https://dba.stackexchange.com/questions/307448/postgresql-tr...
Something like 90% of my questions of this sort go to ChatGPT these days.
I can figure it out via the docs, but ChatGPT is SO convenient for things like this.
Failing that, read the documentation. Failing that, stand up a quick experiment.
Somehow, we survived before ChatGPT, and even before saturated question boards. Those strategies are still available to you and well worth learning.
And this is what happens when, say, you lose a job you've been doing for 10-15 years. You need to re-learn the world. And a lifetime is not enough to do it the way we used to.
But with it being down, my biggest advice would be to try it and see. Something like dbfiddle.uk is perfect for these kinds of tests.
Also, when I asked it "what if I use PostgreSQL's non-transactional triggers" (a term I thought I had just made up), it told me it wouldn't roll back the first insert: "Non-transactional triggers are executed as part of the statement that triggered them, but they don't participate in the transaction control." So now I don't know what to think.
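For what it's worth, this is easy to check empirically. I don't have PostgreSQL handy here, but SQLite triggers behave the same way for this particular question (the trigger fires inside the statement's transaction), so here's a quick sketch using Python's built-in `sqlite3` module. The table and trigger names are made up for the demo:

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode,
# so we can manage the transaction explicitly with BEGIN/ROLLBACK.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute("CREATE TABLE audit (order_id INTEGER)")
conn.execute("""
    CREATE TRIGGER log_order AFTER INSERT ON orders
    BEGIN
        INSERT INTO audit (order_id) VALUES (NEW.id);
    END
""")

conn.execute("BEGIN")
conn.execute("INSERT INTO orders (amount) VALUES (9.99)")
# Inside the open transaction, the trigger's write is visible...
assert conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0] == 1
conn.execute("ROLLBACK")
# ...but rolling back undoes both the insert and the trigger's side effect.
assert conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0] == 0
assert conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0] == 0
```

In PostgreSQL the same holds for ordinary triggers: they run in the invoking transaction and their effects roll back with it. As far as I know, "non-transactional triggers" isn't a real PostgreSQL feature, which is probably why ChatGPT's confident answer was suspicious.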
Try it and see? Why do you need an AI to help with this?
Wouldn't surprise me if Bard permanently surpasses GPT in the next quarter. Particularly if OpenAI is dialing down quality...
Using this to plug our open-source tool https://github.com/Marvin-Labs/lbgpt which lets ChatGPT consumers quickly load-balance and fail over between OpenAI and Azure models.
What do you use to extract the text from a webpage and how do you handle websites with anti-bot measures?
https://www.gnod.com/search/ai
Looks like they all currently work.
If there are more, let me know.
However, you might want to get used to it, as outages like this will probably keep happening fairly often.