What does amaze me a little is that the rather more complex Huffman algorithm was published and implemented decades before LZ.
The core idea of LZ, however, has been known for centuries: https://en.wikipedia.org/wiki/Iteration_mark
However, that's missing the point: lz4's decompressor is simultaneously simple and the fastest thing around, at least at the moment.
In fact, it is so fast that it can be used to accelerate local data transfers over "slow" links like bonded 10 Gbit Ethernet and arrays of PCIe SSDs.
It wasn't novel when Lempel and Ziv described it in 1977 - the encoding idea itself is almost trivial, and had been described before. However, they did prove the conditions under which this compression is asymptotically optimal, which was NOT at all clear or trivial at the time - and it is therefore named after them.
LZ4 is an implementation of the LZ77 idea that optimizes run time first and compression ratio second. It is elegant and successful - but it has little novelty.
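To make the "LZ77 idea" concrete, here's a minimal sketch of a decoder for a stream of LZ77-style tokens. This is an illustration of the general technique, not LZ4's actual block format (which packs literals and matches into its own sequence layout): a token is either a literal byte or an (offset, length) back-reference into the output produced so far, and overlapping copies give you run-length expansion for free.

```python
def lz77_decode(tokens):
    """Decode LZ77-style tokens into the original bytes.

    A token is either a literal byte (int), or an (offset, length)
    pair meaning "copy `length` bytes starting `offset` bytes back
    in the output produced so far". Copying byte-by-byte lets
    overlapping matches (offset < length) repeat recent output.
    """
    out = bytearray()
    for tok in tokens:
        if isinstance(tok, tuple):
            offset, length = tok
            start = len(out) - offset
            for i in range(length):
                # May read bytes appended earlier in this same copy.
                out.append(out[start + i])
        else:
            out.append(tok)
    return bytes(out)

# Three literals, then one overlapping back-reference that
# expands "abc" into "abcabcabcabc".
print(lz77_decode([ord('a'), ord('b'), ord('c'), (3, 9)]))
```

The decoder's simplicity is the whole point of the thread above: decompression is just sequential byte copies, which is why an LZ77-family format can be made so fast.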
Not sure what you'd consider suitably documented beyond what is already out there. Not being patented or published in a journal look like positives to me.