It doesn't even know mildly obscure facts that are on the internet.
For example, last night I was trying to do something with C# generics and it confidently told me I could use pattern matching on the type in a switch statement, and threw out some convincing-looking code.
You can't; it's impossible. It was completely wrong. When I told it this, it agreed I was right, and proceeded to give me code that was even more wrong.
This is an obscure but well-documented part of the spec.
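For anyone wondering what this looks like in practice, here's my hedged guess at the kind of suggestion involved. The original code isn't in this comment, so the `Describe` method below is a hypothetical reconstruction, not the actual session: switching on `typeof(T)` with type or `typeof` case labels doesn't compile, while matching on the runtime *value* does.

```csharp
// Hypothetical reconstruction: the kind of code an LLM confidently suggests.
// Neither case label in this switch compiles.
static string Describe<T>(T value)
{
    switch (typeof(T))           // typeof(T) evaluates to a System.Type
    {
        case int:                // does not compile: a Type can never match the pattern 'int'
            return "int";
        case typeof(string):     // does not compile: typeof(...) is not a constant pattern
            return "string";
        default:
            return "something else";
    }
}

// What the language does allow: pattern matching on the value's runtime type
// (works for unconstrained T since C# 7.1), which may or may not be what you need.
static string DescribeValue<T>(T value) => value switch
{
    int i    => $"an int: {i}",
    string s => $"a string: {s}",
    _        => "something else",
};
```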
So it's not about facts that aren't on the internet, it's just bad at facts full stop.
What it's good at is facts the internet agrees on, unless the internet is wrong. That's not always a good thing, given how confident the language it speaks in sounds.
If you want to fuck with AI models, ask a bunch of code questions on Reddit, GitHub and SO with example code saying 'can I do X'. The answer is no, but ChatGPT/Copilot/etc. will start spewing out that nonsense as if it's fact.
As for non-programming, we're about to see the birth of a new SEO movement of tricking AI models into believing your 'facts'.