Data compression is the reduction of the number of bits needed to store or transmit information. Because compressed content occupies less disk space than the original, more data can fit in the same amount of storage. There are many compression algorithms that work in different ways. Lossless algorithms remove only redundant bits, so when the information is decompressed there is no loss of quality; lossy algorithms discard additional bits, so decompressing the data afterwards yields lower quality than the original. Compressing and decompressing content consumes a significant amount of system resources, particularly CPU time, so any web hosting platform that applies compression in real time must have sufficient processing power to support this feature. A simple example of how data can be compressed is run-length encoding: a binary sequence such as 111111 is substituted with 6x1, i.e. the algorithm "remembers" how many consecutive 1s or 0s there are instead of storing the sequence itself.
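The run-length idea described above can be sketched in a few lines of Python. This is a toy illustration of the concept, not a production codec; the function names and the "count x bit" output format are chosen for this example only:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a string of bits: '111111' -> '6x1'."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1' -> '111111'."""
    out = []
    for run in encoded.split(","):
        count, bit = run.split("x")
        out.append(bit * int(count))
    return "".join(out)

print(rle_encode("111111"))        # -> 6x1
print(rle_encode("0001111011"))    # -> 3x0,4x1,1x0,2x1
```

Note that this scheme only pays off when the input contains long runs; for data without such repetition, the encoded form can actually be longer than the original, which is one reason real compressors combine several techniques.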