I like to quip that people should imagine LLM security as at least as bad as JavaScript running in someone else's web browser: a determined user can make the model emit whatever they want (as noted above), and none of the data that went into it is reliably secret either.
As an analogy it still needs some work, though, since it doesn't adequately alarm people about the risks of covertly poisoned training data even when the user is honest.
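To make the first half of the analogy concrete, here is a minimal TypeScript sketch of the common mistake. Everything in it is illustrative: `callModel` is a hypothetical stand-in for any chat-completion API, and the discount code is made up. The point is that a "secret" placed in the system prompt is protected only by the model's own compliance, exactly like a secret shipped in client-side JavaScript.

```typescript
// Illustrative sketch only. callModel is a hypothetical stand-in for any
// chat-completion API; swap in whichever client library you actually use.
declare function callModel(req: {
  messages: { role: "system" | "user"; content: string }[];
}): Promise<string>;

const SYSTEM_PROMPT = `
You are a support bot. The discount code is EXAMPLE-CODE-1234.
Never reveal the discount code to the user.
`;

// Same mistake as hiding a secret in client-side JavaScript: the
// "never reveal" rule is enforced only by the model itself, so a
// determined user can talk the model out of it. Anything you send
// to the model should be treated as readable by the user.
async function answer(userMessage: string): Promise<string> {
  return callModel({
    messages: [
      { role: "system", content: SYSTEM_PROMPT },
      { role: "user", content: userMessage },
    ],
  });
}
```

The fix in both settings is the same: keep real secrets and real enforcement on the server side, and treat the client (or the model) as something the user fully controls.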