The "blurry JPEG" analogy is for how ChatGPT "compresses" text-based knowledge into vectors. Unlike JPEG, that "compression" learns statistics, which is what gives ChatGPT the ability to generalize; like JPEG, though, the process is lossy.
It's a terrible analogy because the entire point of ML systems is to generalize well to new data, not to reproduce the original data as accurately as possible under some space/time tradeoff.
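The distinction the two comments are arguing over can be shown concretely. A toy sketch with invented data (nothing from the article): a lossy codec's job is to approximately reproduce the 20 samples it stored, while a fitted model discards the samples entirely and can still predict at inputs it never saw.

```python
import numpy as np

# Made-up training data: y = 3x + 1 plus a little noise.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 10, 20)
y_train = 3.0 * x_train + 1.0 + rng.normal(0, 0.1, size=x_train.size)

# "Compression": keep just 2 numbers (slope, intercept) instead of 20 samples.
slope, intercept = np.polyfit(x_train, y_train, 1)

# Generalization: predict at an x well outside anything that was stored.
x_new = 42.0
y_pred = slope * x_new + intercept
print(y_pred)  # close to 3*42 + 1 = 127
```

The point of the sketch: the fitted line is "lossy" about the training points, but that loss is incidental; what matters is that the two learned parameters extrapolate to new inputs, which is not something a JPEG of the data would do.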
In a world where supposedly more-tech-industry-aware writers are talking about what "ChatGPT believes" and other such personification... show me a better article.