The entropy compressor used so far only exploits the fact that the samples (or coefficients, depending on the number of layers used), when compressed by byte planes, may contain a significant number of zeros, which increases the compression ratio. Furthermore, we know that the coefficients tend towards zero as their frequency increases, even without perceptual quantization.
Implement an entropy coding algorithm that exploits the following fact: when the coefficients are processed by byte planes, starting with the lowest frequencies, then from a certain coefficient onward all the remaining bytes in that byte plane are zero. To do this:
Using the byte FF as a zero-escape marker, implement the following "byte stuffing" technique:
Example                Original Data      Stuffed/Encoded Data
---------------------  -----------------  --------------------
Normal data            [05, 12, 08]       [05, 12, 08]
Data contains marker   [05, FF, 08]       [05, FF, 01, 08]
Ends with zeros        [05, 00, 00, 00]   [05, FF, 00]
Already-stuffed data   [05, FF, 01, 08]   [05, FF, 01, 01, 08]
Here, the stuffing byte is 01. Obviously, the decoded data exactly matches the original data.
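The rules above can be sketched as follows. This is a minimal, self-contained illustration (not the required zero_coding.py layer itself); it assumes the decoder knows the original frame length, which is reasonable in InterCom because the chunk size is fixed, so the trailing zeros removed by the FF 00 marker can be restored by padding:

```python
MARKER = 0xFF  # zero-escape marker
STUFF = 0x01   # stuffing byte appended after a literal 0xFF

def stuff(data: bytes) -> bytes:
    """Escape literal 0xFF bytes and replace the trailing run of zeros."""
    # Find where the trailing run of zeros begins (if any).
    end = len(data)
    while end > 0 and data[end - 1] == 0:
        end -= 1
    out = bytearray()
    for b in data[:end]:
        out.append(b)
        if b == MARKER:
            out.append(STUFF)  # a literal FF is encoded as FF 01
    if end < len(data):
        out += bytes([MARKER, 0x00])  # FF 00 = "the rest of the frame is zeros"
    return bytes(out)

def unstuff(data: bytes, original_length: int) -> bytes:
    """Invert stuff(); original_length is the length of the unstuffed frame."""
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == MARKER:
            if data[i + 1] == 0x00:  # trailing-zeros marker: stop decoding
                break
            out.append(MARKER)       # FF 01 decodes to a literal FF
            i += 2
        else:
            out.append(data[i])
            i += 1
    out += bytes(original_length - len(out))  # restore the removed zeros
    return bytes(out)
```

Note that the pattern FF 00 can only appear as the trailing marker, because every literal FF in the data is always followed by the stuffing byte 01; this is what makes the decoding unambiguous.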
Also test other dictionary-based data compressors instead of zlib, and compare
them in terms of speed and compression ratio. Do not modify DEFLATE_raw.py.
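A hypothetical micro-benchmark for this comparison, using only standard-library compressors on a synthetic payload with long zero runs (the payload and the choice of codecs are assumptions for illustration; bz2 is block-sorting rather than dictionary-based but is a common baseline, and dictionary coders such as zstandard or lz4 are available on PyPI):

```python
import bz2
import lzma
import time
import zlib

# Synthetic "byte-plane-like" payload: short varied prefix, long zero runs.
payload = (bytes(range(64)) + bytes(1024)) * 256

for name, compress in [("zlib", zlib.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)]:
    t0 = time.perf_counter()
    compressed = compress(payload)
    elapsed_ms = (time.perf_counter() - t0) * 1000
    ratio = len(payload) / len(compressed)
    print(f"{name:5s}  ratio = {ratio:7.1f}  time = {elapsed_ms:7.2f} ms")
```

In the notebook, the same loop should be run on real InterCom chunks (stuffed and unstuffed) rather than synthetic data, since the relative ranking of the codecs depends on the statistics of the input.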
Deliverables: a new InterCom layer named zero_coding.py. Write the code in a
notebook, along with an evaluation of this encoding technique.