the thing is, though, that the text generator self-anthropomorphizes.
perhaps if an LLM were trained to be less conversational and more robotic, i would feel less compelled to be polite to it. i never catch myself typing "thanks" to my shell for running `ls`.