Google has open-sourced a new compression algorithm

Mar 1, 2013 09:05 GMT

Google is always looking for ways to speed up the web, from any angle. Now it's releasing a new compression method that should result in smaller files sent over the pipes.

Dubbed Zopfli, the new compression algorithm offers a notable improvement in file size, but at a price: the algorithm is significantly more taxing on computing resources.

"The Zopfli Compression Algorithm is a new, open sourced general purpose data compression library that got its name from a Swiss bread recipe," Google explained.

"It is an implementation of the Deflate compression algorithm that creates a smaller output size compared to previous techniques," it added.

"The smaller compressed size allows for better space utilization, faster data transmission, and lower web page load latencies," it said.

Google estimates that its algorithm should produce files three to eight percent smaller than what the popular zlib library achieves at maximum compression.
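For those curious how such a claim could be tested, here is a minimal, illustrative C sketch that compresses the same buffer with zlib at its maximum level and with Zopfli, then prints both output sizes. The zlib calls are standard; the Zopfli calls assume the zopfli.h interface from the open-sourced repository.

```c
/* Illustrative comparison of zlib level 9 vs. Zopfli on one buffer.
 * Link against zlib (-lz) and the Zopfli sources. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>
#include "zopfli/zopfli.h"

int main(void) {
  const char* text =
      "static web content compresses well when it repeats, "
      "static web content compresses well when it repeats.";
  const unsigned char* in = (const unsigned char*)text;
  size_t insize = strlen(text);

  /* zlib at maximum compression (level 9). */
  uLongf zlen = compressBound(insize);
  unsigned char* zout = malloc(zlen);
  compress2(zout, &zlen, in, insize, Z_BEST_COMPRESSION);

  /* Zopfli emitting the same zlib container format. */
  ZopfliOptions options;
  ZopfliInitOptions(&options);
  unsigned char* out = 0;   /* Zopfli allocates the output buffer */
  size_t outsize = 0;
  ZopfliCompress(&options, ZOPFLI_FORMAT_ZLIB, in, insize, &out, &outsize);

  printf("zlib -9: %lu bytes, Zopfli: %lu bytes\n",
         (unsigned long)zlen, (unsigned long)outsize);

  free(zout);
  free(out);
  return 0;
}
```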

Files compressed with Zopfli can be decompressed with existing methods, with no performance penalty on the decompression side.

"It is a compression-only library; existing software can decompress the data. Zopfli is bit-stream compatible with compression used in gzip, Zip, PNG, HTTP requests, and others," Google further explained.

Three percent may not sound like much, but at web scale it adds up; for files that get requested a lot, every little bit helps.

However, the compression method is best suited to static web content: files that change rarely but generate a lot of traffic. That's because Zopfli uses two to three orders of magnitude more CPU time than zlib at maximum quality.
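That CPU cost is also tunable. The options struct in the released source exposes an iteration count that trades compression time for output size; the sketch below uses that numiterations field (the exact default set by ZopfliInitOptions is an assumption worth verifying against the header).

```c
/* Sketch: trading CPU time for output size. More iterations means more
 * CPU spent searching for a smaller Deflate encoding, which suits
 * static assets compressed once and served many times. */
#include <stdlib.h>
#include "zopfli/zopfli.h"

size_t gzip_size(const unsigned char* in, size_t insize, int iterations) {
  ZopfliOptions options;
  ZopfliInitOptions(&options);
  options.numiterations = iterations;  /* e.g. 5 for speed, 50+ for size */

  unsigned char* out = 0;
  size_t outsize = 0;
  ZopfliCompress(&options, ZOPFLI_FORMAT_GZIP, in, insize, &out, &outsize);
  free(out);
  return outsize;
}
```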

For applications where file size matters more than the one-time CPU cost of compression, Zopfli should be useful. Google is open sourcing the library, written in C and designed for maximum portability, so that others can build on and improve it.