Data compression reduces the number of bits that need to be stored or transmitted. The process is quite important in the web hosting field, as data stored on hard disks is usually compressed so that it takes up less space. There are various algorithms for compressing data, and their effectiveness depends on the content. Many of them remove only redundant bits, so no data is lost; these are lossless algorithms. Others discard unneeded bits, which results in lower quality when the data is decompressed. Compression uses considerable processing time, so a hosting server has to be powerful enough to compress and decompress data in real time. One way binary code can be compressed is by "remembering" that there are, for example, five consecutive 1s, rather than storing all five 1s.
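The "remembering five consecutive 1s" idea is the basis of run-length encoding, one of the simplest lossless schemes. As a minimal sketch (the function names are illustrative, not part of any hosting platform), each run of identical bits is stored once together with its length, and decoding expands the runs back, restoring the original exactly:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a bit string: store each run's symbol once with its count."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous one: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A new run begins.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand the (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("11111000")
print(encoded)                             # [('1', 5), ('0', 3)]
print(rle_decode(encoded) == "11111000")   # True: no data is lost
```

Instead of eight symbols, only two (symbol, count) pairs are kept, and the decode step reconstructs the input bit for bit, which is exactly what "no data can be lost" means for a lossless algorithm.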
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than other widely used algorithms, particularly for compressing and decompressing non-binary data such as web content. LZ4 even decompresses data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backup copies of all the content stored in the shared hosting accounts on our servers every day. Both your content and its backups will take up less space, and since both ZFS and LZ4 are very fast, generating the backups will not affect the performance of the web servers where your content is stored.
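The lossless round trip that LZ4 performs on web content can be sketched in a few lines. LZ4 itself is usually reached from Python through the third-party `lz4` package; the sketch below uses the standard library's `zlib` as a stand-in (an assumption for portability, not the algorithm our platform uses) to show the same two properties: repetitive text like HTML shrinks substantially, and decompression restores every byte:

```python
import zlib

# Repetitive web content, the kind of non-binary data that compresses well.
html = b"<div class='post'>Hello, world</div>" * 200

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the round trip reproduces the original exactly.
assert restored == html
print(f"{len(html)} bytes compressed to {len(compressed)} bytes")
```

On a real ZFS dataset the same effect is enabled transparently with the `compression=lz4` property, so applications read and write files normally while the file system compresses blocks on the fly.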