Thursday, January 13, 2005

 

Further lossless compression of JPEG images

The makers of Stuffit have software ready for inclusion in their next version that squeezes a further 30% of lossless compression out of JPEG image files. These claims have been independently checked by Jeff Gilchrist, who maintains an archive comparison page. His figure was 25% compression - which is at least close to the claimed figure. Maximum Compression also received a beta version of the software, and achieved 23% compression on their test image.

The whitepaper (in PDF) gives few clues as to what they are doing, but speculation is rife. What seems self-evident is that they have at least replaced the RLE and Huffman encoding stages - the lossless stages that complete the JPEG algorithm - with more efficient methods. This is not novel (and would appear to be unpatentable); after all, the official JPEG standard has always allowed arithmetic coding to be used in place of Huffman, though in practice this is rarely done because of existing patent complications.
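As a rough illustration of how much the Huffman stage leaves on the table, jpegtran (from libjpeg) can re-code a JPEG's entropy-coded data without touching the quantised DCT coefficients at all - provided you have a build with arithmetic coding compiled in, which the patent situation has generally prevented. The Python sketch below just shells out to jpegtran; the filenames are placeholders, and this is only a demonstration of the principle, not a guess at what Stuffit is actually doing.

import os
import subprocess

def recode_arithmetic(src, dst):
    # Losslessly re-code the entropy-coded segment of a JPEG, swapping
    # Huffman for arithmetic coding.  The quantised DCT coefficients are
    # untouched, so the decoded image is bit-for-bit identical.
    # Assumes a jpegtran build with arithmetic coding enabled is on PATH.
    subprocess.run(
        ["jpegtran", "-arithmetic", "-copy", "all", "-outfile", dst, src],
        check=True,
    )
    before, after = os.path.getsize(src), os.path.getsize(dst)
    print(f"{src}: {before} -> {after} bytes "
          f"({100 * (before - after) / before:.1f}% smaller)")

# Placeholder filenames for illustration.
recode_arithmetic("photo.jpg", "photo-arith.jpg")

The gains usually attributed to a straight swap to arithmetic coding are smaller than the figures quoted above, which suggests Stuffit is modelling the coefficients more aggressively than a simple coder replacement.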

There are other well-known inefficiencies in the use of fixed-length codes for the beginning and end of blocks, which mean that at higher (lossy) compression ratios the JPEG algorithm performs particularly poorly - a 30% inefficiency is easily explained. Of course, when the JPEG standard was developed computers were not as powerful, so a bit of inefficiency in the encoding was a fair price to pay to keep the processing requirements down.
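To put a rough number on that: baseline JPEG spends a whole number of bits on every symbol, so the end-of-block (EOB) code - 4 bits in the usual luminance AC table - costs 4 bits even when almost every block ends immediately after the DC coefficient and the symbol therefore carries only a fraction of a bit of information. A back-of-the-envelope calculation (the 90% EOB probability is an assumption for illustration, not a measured figure):

import math

# At aggressive quantisation most 8x8 blocks reduce to a DC value followed
# immediately by the end-of-block (EOB) symbol, so EOB becomes very common.
p_eob = 0.90        # assumed probability of EOB, for illustration only
huffman_bits = 4    # bits the baseline luminance AC table spends on EOB

ideal_bits = -math.log2(p_eob)      # information actually carried (~0.15 bits)
waste = huffman_bits - ideal_bits   # bits thrown away at every block end

print(f"ideal cost  : {ideal_bits:.2f} bits")
print(f"Huffman cost: {huffman_bits} bits")
print(f"overhead    : {waste:.2f} bits per block termination")

An arithmetic coder, or any scheme that codes block terminations jointly rather than one whole codeword at a time, can get much closer to the ideal figure.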

Comments:
Their claim of "lossless compression" is dishonest. What their white paper actually says is that they offer "the ability to reduce the size of JPEG images without further reducing the quality of the image." Since JPEG compression is already lossy, Stuffit compression is too.

Also, LZW showed us what a hornet's nest working with a patented compression scheme can be.
 