The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be done with or without losing data: in the lossless case only redundant data is removed, so when the data is later decompressed, the content and quality are identical to the original; in the lossy case, data deemed unneeded is discarded, so the decompressed result is of lower quality. Different compression algorithms suit different kinds of data. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation needs sufficient resources to process the information quickly enough. One simple example of compression is to store how many consecutive positions in the binary code hold a 1 and how many hold a 0, instead of storing the individual 1s and 0s themselves.
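The run-of-1s-and-0s idea described above is known as run-length encoding. A minimal sketch in Python (the function names `rle_encode` and `rle_decode` are illustrative, not from any particular library):

```python
def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters:
    store each bit value together with how many times it repeats,
    instead of storing every individual bit."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Rebuild the original bit string from (bit, count) pairs,
    so decompression restores the data exactly (lossless)."""
    return "".join(bit * count for bit, count in runs)

original = "1111100000000111"
encoded = rle_encode(original)
print(encoded)                        # [('1', 5), ('0', 8), ('1', 3)]
print(rle_decode(encoded) == original)  # True
```

Decoding reproduces the input exactly, which is what makes this a lossless scheme: the pairs carry the same information as the raw bits, just in fewer symbols when the runs are long.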

Data Compression in Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. LZ4 is among the fastest compression algorithms available, and it is particularly effective on non-binary data such as web content. It decompresses data faster than the data can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content stored in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, backup generation does not affect the performance of the web hosting servers where your content is stored.
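For readers administering their own ZFS systems, enabling LZ4 on a dataset is a one-line property change. A minimal sketch using the standard `zfs` command-line tool (`tank/web` is a hypothetical pool/dataset name; substitute your own):

```shell
# Enable LZ4 compression on a dataset ("tank/web" is hypothetical).
# Newly written blocks are compressed transparently from this point on.
zfs set compression=lz4 tank/web

# Confirm the property took effect.
zfs get compression tank/web

# Inspect how much space compression is actually saving
# (compressratio is a read-only property reported by ZFS).
zfs get compressratio tank/web
```

Existing data is not recompressed retroactively; the setting applies to blocks written after the property is changed.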