The term data compression refers to reducing the number of bits of information that need to be stored or transmitted. Compression can be done with or without loss of information: in the first case only redundant data is removed, so when the data is later uncompressed it is identical to the original; in the second case less important data is discarded as well, so the restored data is of lower quality. There are various compression algorithms, and each works better for a particular kind of data. Compressing and uncompressing data normally takes a lot of processing time, so the server performing the operation must have adequate resources to process your data quickly enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should have 1 and how many should have 0, instead of storing the actual 1s and 0s.
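The counting scheme described above is known as run-length encoding. A minimal sketch in Python (the function names here are illustrative, not from any particular library):

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a string of '0'/'1' characters into (bit, run length) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous one: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A different bit starts a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, run length) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

print(rle_encode("1110000"))  # [('1', 3), ('0', 4)]
```

Note that this is lossless: decoding the pairs reproduces the original bit string exactly, which is the behaviour the paragraph above describes for redundancy-only compression.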

Data Compression in Shared Website Hosting

The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can supercharge the performance of any site hosted in a shared website hosting account with us, as not only does it compress data considerably better than the algorithms used by other file systems, but it also uncompresses data at speeds higher than the read speeds of a hard drive. This is achieved at the cost of a great deal of CPU processing time, which is not a problem for our platform, since it uses clusters of powerful servers working together. A further advantage of LZ4 is that it enables us to create backups faster and using less disk space, so we can keep multiple daily backups of your files and databases, and generating them does not affect the performance of the servers. This way, we can always restore any content that you may have erased by accident.
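The compress-then-store round trip that ZFS performs transparently can be sketched in a few lines. LZ4 itself is not in Python's standard library, so this illustration uses the standard zlib module instead; the principle (lossless compression of repetitive data, exact restoration on decompression) is the same:

```python
import zlib

# Highly repetitive data, like many web assets, compresses very well.
original = b"shared website hosting " * 500

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the restored data is byte-for-byte identical to the original.
assert restored == original
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

Real-world ratios depend heavily on the data: text and databases shrink a lot, while already-compressed files such as JPEG images barely shrink at all.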