(Pardon the joke, but while the "m" obviously means "M" from context, I really wish people -- especially computer engineers -- wouldn't say bits when they mean something eight times as large.)
What? Sure it is. When you measure the information content (or the entropy) of a message, you very frequently get non-integer numbers of bits per (character/unit/message/whatever).
Written English, for instance, carries roughly 1.46 bits of information per character.
Or 146 cb.
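To make that concrete, here's a quick Python sketch (my own illustration, not a rigorous corpus measurement) that computes Shannon entropy per character from symbol frequencies; for almost any real text it comes out as a non-integer number of bits:

```python
from collections import Counter
from math import log2

def entropy_per_char(text):
    """Shannon entropy in bits per character, estimated from the
    symbol frequencies of the text itself."""
    counts = Counter(text)
    total = len(text)
    # H = -sum(p * log2(p)) over all observed symbols
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(entropy_per_char("abracadabra"))  # a non-integer number of bits
```

A uniform four-symbol alphabet gives exactly 2.0 bits per character, but anything with skewed frequencies (i.e. real language) lands somewhere in between the integers.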
Please don't assume this. I have the great pleasure of working with network engineers, who have apparently collectively decided that bits are a perfectly reasonable measure of throughput, and who react very differently to a speed in Mb/s versus MB/s. I'm not trying to be pedantic or to say that this is how it should be; I'm just saying that people really do use both units, it is horribly confusing, and anything you can do to avoid ambiguity is appreciated.
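For anyone who hasn't been bitten by this yet: the two units differ by a factor of eight, so quoting the wrong one is a large error, not a rounding nit. A trivial sketch of the conversion (the function name is mine):

```python
def mbps_to_MBps(megabits_per_s):
    """Convert a link speed in Mb/s (megabits per second) to
    MB/s (megabytes per second): 8 bits per byte."""
    return megabits_per_s / 8

# A "100 Mb/s" link moves at most 12.5 MB/s of payload:
print(mbps_to_MBps(100))  # 12.5
```

(And that's before protocol overhead, which eats into the payload rate further.)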
So a 1 megabyte file (as reported by the file system) is actually 1048576 bytes, which technically - sorry, I mean pedantically - speaking, is 1 mebibyte.
To make matters worse, disk manufacturers use the decimal prefixes, so our nice 1 terabyte drive holds 931 gibibytes, but is reported by the file system as 931 gigabytes (not GiB).
Finally, memory manufacturers use the binary prefix, so 1 megabyte of RAM is actually 1 mebibyte (1048576 bytes).
A bit of a mess, no?
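The arithmetic behind that drive example, spelled out (a quick sketch; real drives also lose a little space to formatting):

```python
def tb_to_gib(terabytes):
    """Express decimal terabytes (10**12 bytes) in binary
    gibibytes (2**30 bytes), which is what most file systems
    actually count, whatever label they put on it."""
    return terabytes * 10**12 / 2**30

print(round(tb_to_gib(1)))  # 931
```

The ratio 10**12 / 2**30 is about 0.9313, so every decimal prefix step widens the gap between the number on the box and the number on your screen.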
All the above is, IMHO, a consequence of imprecision. If we get used to being loose with our terminology, we risk carrying that attitude over into our work product, with sometimes regrettable results.
So I'll continue to strive to be pedantic (translation: precise).
That amount of sloppiness in any other engineering discipline would just finish you off immediately.
This puts the burden of collaboration in the wrong place: the question shouldn't be whether a reasonable engineer in the industry can be expected to disambiguate this (with some deduction), but whether I can hold down the Shift key when typing the abbreviation for megabytes.
Obviously this depends on the actual audience; don't bother following this in team chat, where speed is more important than clarity.
Just look at all that "technically it's mebibytes bla bla bla" in replies. No one cares. Write some code. Or better - go outside.
In that case it was confusion between metric and certain fantasy engineering units, but an error of 1000/1024 can cause trouble just as easily.
So with that attitude, maybe don't write that code, and better stay inside, or a rocket might fall on your head.
But seriously, that correction has probably taught more than ten people the difference between uppercase B = bytes, lowercase b = bits, uppercase M = mega = 1 000 000, lowercase m = milli, and MiB = mebibyte = 1024 × 1024 bytes = 1 048 576 bytes, or at least made them aware of the important fact that there is a difference. Your pedantry about the nitpicking, on the other hand, has taught nobody anything except to always stay alert, because there are people like you who like to offload mental ballast and use the wrong units, insisting their errors can be inferred and corrected from context... which is an important lesson too, but as a warning, not as a defence of the behaviour.