Like, what would be the expected blow-up factor to make up the difference between ternary and whatever 16-bit encoding they were using?
I mean, intuitively I'd expect to need ~10× the symbols to encode the same information. Are they using an order of magnitude more parameters, or is that not how it works?
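(For reference, here's the back-of-envelope behind my ~10× guess, assuming raw information capacity is the right way to count: a ternary symbol carries log2(3) ≈ 1.585 bits, so matching 16 bits per parameter would take 16 / log2(3) symbols.)

```python
# Back-of-envelope only: assumes a ternary symbol carries log2(3) bits
# and a 16-bit parameter carries a full 16 bits of information.
from math import log2

bits_per_ternary = log2(3)        # ≈ 1.585 bits per ternary symbol
blowup = 16 / bits_per_ternary    # ternary symbols needed to match 16 bits
print(f"{bits_per_ternary:.3f} bits/symbol -> {blowup:.1f}x blow-up")
# 1.585 bits/symbol -> 10.1x blow-up
```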