I don't think blocks are easily compressed. The data is essentially random, since hashes are uniformly distributed, and because every byte must be preserved so the block can be verified, only lossless compression is an option. With so little pattern in the data, lossless compressors achieve a poor ratio on blocks.
Speaking generally, truly random data cannot be compressed at all: by the pigeonhole principle no lossless scheme can shrink every possible input, and data at maximum entropy has no redundancy for a compressor to exploit. In practice, compressing random bytes usually produces output slightly *larger* than the input, due to format overhead.
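You can see this directly with a quick sketch in Python, using zlib as a stand-in for any general-purpose lossless compressor (the sizes here are illustrative, not taken from real block data):

```python
import os
import zlib

# 1 MiB of high-entropy bytes, standing in for block data full of hashes
random_data = os.urandom(1 << 20)

# 1 MiB of highly repetitive bytes, for comparison
patterned_data = b"blockchain" * ((1 << 20) // 10)

compressed_random = zlib.compress(random_data, level=9)
compressed_patterned = zlib.compress(patterned_data, level=9)

# Random input: ratio is ~1.0 (zlib falls back to "stored" blocks,
# so the output is actually a little larger than the input).
print(len(compressed_random) / len(random_data))

# Repetitive input: ratio is a tiny fraction of the original size.
print(len(compressed_patterned) / len(patterned_data))
```

The repetitive input shrinks dramatically while the random input does not budge, which is exactly the situation with blocks: the hash-heavy payload behaves like the random case.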