This is exactly backwards: large entropy means large information content [1]. If the circles are all red or all blue, then you need only one bit to say which of the two possibilities you are looking at. If half of the circles are red and the other half blue, in no particular pattern, then you need roughly one bit per circle to describe the arrangement.
[1] https://en.wikiquote.org/wiki/Claude_Elwood_Shannon#:~:text=...,'
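To make that concrete, here is a minimal Python sketch (the choice of n = 8 circles, and the assumption that each circle is coloured independently, are mine, purely for illustration):

    from math import log2

    def entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # All red or all blue, each equally likely: one bit for the whole picture.
    print(entropy([0.5, 0.5]))        # -> 1.0

    # n circles, each independently red or blue with p = 1/2: entropies of
    # independent variables add, so it is one bit per circle.
    n = 8  # illustrative choice
    print(n * entropy([0.5, 0.5]))    # -> 8.0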
Also, the entropy you calculate depends on what you know about the system in question...
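One way to see that dependence in code (again just a sketch, assuming a uniform distribution over whichever colourings are consistent with your knowledge, in which case the entropy is simply log2 of the number of remaining possibilities):

    from math import comb, log2

    n = 8  # same illustrative number of circles

    # Know nothing: all 2**n colourings are possible.
    print(log2(2 ** n))               # -> 8.0 bits

    # Know that exactly half the circles are red, but not which ones:
    # only C(n, n/2) colourings remain.
    print(log2(comb(n, n // 2)))      # -> ~6.13 bits

    # Know the exact arrangement: one possibility left, zero entropy.
    print(log2(1))                    # -> 0.0 bits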