The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy: lossless compression removes only redundant data, so when the data is later decompressed the information and quality are identical to the original, whereas lossy compression also discards data judged unnecessary, so the restored version is of lower quality. Different compression algorithms are more efficient for different kinds of information. Compressing and decompressing data usually takes considerable processing time, which means that the server carrying out the operation needs sufficient resources to process your data quickly. One simple example of compression is to store how many consecutive positions in a binary sequence hold a 1 and how many hold a 0, instead of storing each individual 1 and 0.
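The counting idea at the end of the paragraph is known as run-length encoding. A minimal sketch in Python (the function names and the sample bit string are illustrative, not from any particular library):

```python
def rle_encode(bits):
    """Run-length encode a bit string: store (bit, run length)
    pairs instead of every individual 0 and 1."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # advance j to the end of the current run of identical bits
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Rebuild the original bit string from (bit, run length) pairs."""
    return "".join(bit * count for bit, count in runs)

data = "0000000011111111110000"
encoded = rle_encode(data)
print(encoded)                      # [('0', 8), ('1', 10), ('0', 4)]
assert rle_decode(encoded) == data  # lossless: the original is restored exactly
```

Because no information is discarded, this is a lossless scheme: decoding always reproduces the exact input, and it saves space whenever the data contains long runs of repeated values.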
Data Compression in Cloud Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. LZ4 is notably fast at both compressing and decompressing data, which makes it well suited to non-binary data such as web content. LZ4 can even decompress data faster than it can be read from a hard disk, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backups of all the content kept in the cloud hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, generating the backups does not affect the performance of the web servers where your content is kept.
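The lossless round trip described above can be demonstrated with a short sketch. Python's standard library does not ship LZ4, so zlib stands in here purely to illustrate the principle; on the hosting platform itself the compression is handled transparently by ZFS, not by application code:

```python
import zlib

# Repetitive web content of the kind mentioned above compresses very well.
html = b"<html><body>" + b"<p>Hello, world!</p>" * 200 + b"</body></html>"

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

assert restored == html                  # lossless: nothing is changed or lost
print(len(html), "->", len(compressed))  # the compressed copy is far smaller
```

The same property is what makes compressed backups practical: each copy occupies a fraction of the original space, yet decompressing it restores the content exactly.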