Speaking of LLMs: since they hallucinate every now and then, an LLM can also hallucinate the names of packages it tells people to install. Those package names can in turn be squatted by malware authors.
In this way, users may unintentionally download malicious packages that did not even exist when the LLM was trained, simply because the hallucinated package name was later registered by someone malicious.
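One cheap defensive habit before blindly running `pip install` on an LLM-suggested name is to check whether the project is even registered on PyPI. Here is a minimal sketch using PyPI's public JSON API (`https://pypi.org/pypi/<name>/json`, which returns 404 for unregistered names). Note the obvious caveat: existence alone proves nothing, since a squatter may already have registered the hallucinated name, so a hit still deserves scrutiny (age, downloads, maintainer).

```python
import json
import urllib.error
import urllib.request

PYPI_JSON = "https://pypi.org/pypi/{name}/json"


def pypi_json_url(name: str) -> str:
    """Build the PyPI JSON API URL for a given package name."""
    return PYPI_JSON.format(name=name)


def package_exists(name: str, timeout: float = 5.0) -> bool:
    """Return True if `name` is a registered PyPI project.

    A 404 from the JSON API means the name is unregistered -- i.e. an
    LLM suggesting it is either hallucinating or the name is free for
    a squatter to take. Any other HTTP error is re-raised.
    """
    try:
        with urllib.request.urlopen(pypi_json_url(name), timeout=timeout) as resp:
            json.load(resp)  # a valid JSON body confirms a real project page
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise


if __name__ == "__main__":
    # Hypothetical candidates -- the second name is made up for illustration.
    for candidate in ["requests", "definitely-not-a-real-package-xyz"]:
        status = "exists" if package_exists(candidate) else "unregistered"
        print(f"{candidate}: {status}")
```

Even a check this crude would catch the pure-hallucination case (name does not exist at all); the nastier case, where the attacker has already squatted the name, needs human judgment on top.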
I've also seen this effect amplified when somebody posts a "bad" answer in a public place like Stack Overflow. Something like this can have quite a large blast radius!