Data compression is the process of reducing the number of bits needed to store or transmit information. It plays an important role in the web hosting field, since data saved on hosting servers is usually compressed so that it takes up less space. There are various algorithms for compressing data, and their effectiveness depends on the content. Some of them remove only redundant bits, so no information is lost; others discard bits considered unnecessary, which results in lower quality once the data is decompressed. Compressing and decompressing data takes processing time, so a web hosting server has to be powerful enough to perform both operations on the fly. A simple example of how binary code can be compressed is "remembering" that there are five consecutive 1s instead of storing all five 1s individually.
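The "five consecutive 1s" idea in the paragraph above is the basis of run-length encoding, one of the simplest lossless compression techniques. A minimal sketch in Python (the function names are illustrative, not from any particular library):

```python
def rle_compress(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decompress(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)

original = "1111101100"
compressed = rle_compress(original)
print(compressed)  # [('1', 5), ('0', 1), ('1', 2), ('0', 2)]

# Lossless: decompressing yields the original data exactly.
print(rle_decompress(compressed) == original)  # True
```

Note that the five leading 1s are stored as a single `('1', 5)` pair, which is exactly the "remembering" described above.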

Data Compression in Shared Hosting

The ZFS file system used on our cloud hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most comparable algorithms, particularly for compressing and decompressing non-binary data such as web content. In fact, LZ4 can decompress data faster than it can be read from a hard disk drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data effectively and quickly, we are able to generate several backup copies of all the content stored in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since ZFS and LZ4 both work very fast, generating the backups does not affect the performance of the web servers where your content is stored.
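A quick way to see why web content compresses so well losslessly is a round trip in Python. LZ4 itself requires a third-party package, so this sketch uses the standard-library zlib module as a stand-in; the principle (highly repetitive markup shrinks dramatically, and decompression restores it exactly) is the same:

```python
import zlib

# A snippet of repetitive web content; HTML markup compresses well
# because the same tags and attributes recur over and over.
html = b"<li class='item'>entry</li>" * 200

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the original bytes are recovered exactly.
assert restored == html

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

The compressed output is a small fraction of the original size, which is why compressed backups of site content need so much less storage space.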