How is it false? I'd say an LLM is like the output you'd get if you forced someone to write under a strict time limit, without letting them go back to edit or look anything up: likely to be wrong about anything that needs deep thought, but not entirely useless for simple, tedious things like boilerplate code.