Not at all, I understand your confusion. Mine is definitely a niche case, perhaps straying a bit into the eclectic.
I have put together a virtual file system for a kids' programming environment, built atop the localStorage/userData APIs.
I support legacy IE back to IE6 as well as modern browsers, and I wanted to maximize the very limited storage space allowed on legacy IE.
Also, I needed some certainty about how much free space remains in the file system. My design organizes the underlying storage into 4K blocks that are pre-allocated when the system first starts; whatever the host system can give me in terms of space, I take.
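To illustrate the pre-allocation idea, here is a minimal sketch (the function and key names are mine, not the author's actual API): keep claiming 4K blocks until the backing store refuses, so the exact block count, and therefore free space, is known up front.

```javascript
// Hypothetical pre-allocation probe. Claims fixed-size blocks from a
// localStorage-like store until it throws a quota error, then reports
// how many blocks were successfully reserved.
var BLOCK_CHARS = 4096; // 4K characters per block

function preallocate(store) {
  // Build a 4K filler string (ES3-friendly, no String.repeat).
  var pad = new Array(BLOCK_CHARS + 1).join(' ');
  var count = 0;
  try {
    for (;;) {
      store.setItem('block' + count, pad);
      count++;
    }
  } catch (e) {
    // Quota exceeded: `count` blocks are now reserved.
  }
  return count;
}
```

From here on, free space is simply (reserved blocks minus blocks in use) times 4K, with no guessing.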
JSON is nice for the general case, but the way it escapes certain characters means the stored size of a string depends on its content, which makes computing my remaining free space unpredictable as new files get written.
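A quick demonstration of the problem: a single control character balloons to six characters once JSON-escaped, so two one-character payloads can have very different stored sizes.

```javascript
// An ordinary character costs 1 character inside the JSON string,
// plus the 2 surrounding quotes.
var plain = JSON.stringify('a');      // '"a"' -> 3 chars

// A control character is escaped as \u0001: 6 characters for 1 byte
// of payload, again plus the 2 quotes.
var escaped = JSON.stringify('\u0001'); // '"\\u0001"' -> 8 chars
```

With block-based accounting, that content-dependent expansion is exactly what you can't tolerate.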
Of course, I can do my own encoding of bytes into Unicode char codes and write those as raw strings using store.js. JavaScript strings are, if you unfocus your eyes and stare at the screen for a while, arrays of 16-bit unsigned values, and I use as many of those bits as I am able in my scheme.
So, I rigged up a byte-to-string encoder/decoder that was universally accepted on IE and modern browsers, and calling through to my getRawString/setRawString API, I just about doubled my storage over the default JSON.stringify/parse!
Surprisingly, performance isn't bad with my encoded strings either. That said, my compy has a retro flair, and I don't think my users will mind much that they have to wait a second or two to load their games from disk. Perhaps it even adds a bit of authenticity. :)