Recently I had to back up a virtual machine server I had in the US before decommissioning it. As I had a lot of large folders, I realised that compressing them would take some time. Being the lazy administrator that I am, I took some time out to find the fastest compression command on Linux before I began my backup process.
Here is what I found for a 117 MB folder.
dhruba : /backup # time tar -cjf root.home.tbz /root/

real    0m34.600s
user    0m34.040s
sys     0m0.357s

dhruba : /backup # time tar -czf root.home.tgz /root/

real    0m6.255s
user    0m5.672s
sys     0m0.415s

dhruba : /backup # time tar --use-compress-program=lzop -cf root.home.tlzo /root/

real    0m2.624s
user    0m2.061s
sys     0m0.383s
The file sizes for the three commands were as follows.
dhruba : /backup # ls -lShr
-rw-r--r-- 1 root root 81M Dec 14 20:46 root.home.tbz
-rw-r--r-- 1 root root 83M Dec 14 20:43 root.home.tgz
-rw-r--r-- 1 root root 88M Dec 14 20:43 root.home.tlzo
As you can see, lzop is by far the fastest but produces the largest compressed file. If, like me, that is the trade-off you are looking for, then lzop is your answer.
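If you want to repeat this comparison on your own data, the runs above can be sketched as a small loop. This is only a rough sketch: the source directory, output paths, and the list of compressors are placeholders you would adjust, and it skips any compressor that is not installed. It measures whole seconds with `date +%s` rather than the shell's `time` keyword so the results are easy to print in one line.

```shell
#!/bin/sh
# Sketch: time each available compressor against the same source
# directory and report the resulting archive sizes.
# SRC and the /tmp output paths are placeholders for illustration;
# here we generate 1 MB of sample data so the script is self-contained.
SRC="${SRC:-/tmp/bench-src}"
mkdir -p "$SRC"
head -c 1048576 /dev/urandom > "$SRC/sample.bin"

for prog in gzip bzip2 lzop; do
    # Skip compressors that are not installed on this machine.
    command -v "$prog" >/dev/null 2>&1 || continue
    out="/tmp/backup.tar.$prog"
    start=$(date +%s)
    tar --use-compress-program="$prog" -cf "$out" \
        -C "$(dirname "$SRC")" "$(basename "$SRC")"
    end=$(date +%s)
    printf '%s: %ds, %d bytes\n' "$prog" "$((end - start))" "$(stat -c%s "$out")"
done
```

Note that `tar -z` and `tar -j` are just shorthand for `--use-compress-program=gzip` and `--use-compress-program=bzip2`, so the loop exercises the same code paths as the commands above.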