
The algorithms you name are all rather outdated.

Typical gzip decompression speed is somewhere in the 200-250 MB/s region, and compression is much slower. LZ4, for example, tends to compress at ~600-700 MB/s and decompress at several GB/s. zstd is tweakable over a very wide range of ratio-speed trade-offs.
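A quick way to see the ratio-speed trade-off in action is Python's stdlib `zlib` (the DEFLATE algorithm behind gzip), which exposes compression levels the same way zstd does, just over a narrower range. A minimal sketch; the sample data and absolute sizes are illustrative only:

```python
import zlib

# Repetitive sample data; real-world inputs will compress differently.
data = b"the quick brown fox jumps over the lazy dog\n" * 10_000

fast = zlib.compress(data, level=1)   # favors speed
small = zlib.compress(data, level=9)  # favors ratio, costs more CPU

print(f"original: {len(data)}  level 1: {len(fast)}  level 9: {len(small)}")
```

Higher levels spend more CPU time searching for matches in exchange for (usually) smaller output; zstd's levels span a much wider range of that same trade-off.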

LZMA(2) (xz) is a rather troubled format and should not be used any more. bzip2 has always been slower than gzip with usually marginally better compression. It has been irrelevant for a long time.
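For what it's worth, all three older codecs ship in the Python standard library, so the ratio side of the comparison is easy to check yourself (speed, the point above, is where they differ most). A small sketch with made-up sample data:

```python
import bz2
import lzma
import zlib

data = b"example payload with some repetition " * 5_000

# Compare compressed sizes across gzip/zlib (DEFLATE), bzip2, and xz (LZMA2).
for name, compress in [("gzip/zlib", zlib.compress),
                       ("bzip2", bz2.compress),
                       ("xz/lzma", lzma.compress)]:
    print(f"{name:9s} {len(compress(data))} bytes")
```

On typical inputs bzip2 lands only marginally ahead of gzip on ratio while being slower at both ends, which is the practical reason it fell out of use.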



> LZMA(2) (xv) is a rather troubled format and should not be used any more.

You mean xz. Not sure where you got the idea that it’s a troubled format, but if you’re talking about the infamous “Xz format inadequate for long-term archiving”, IMO that’s just bzip2 authors taking a dump on xz for no good reason, and fortunately for us it’s bzip2 that’s basically irrelevant today, not xz.


> It has been irrelevant for a long time.

Funny you say that, my company uses bz2 for compressing pretty much everything.


And many companies use fixed-width record formats for data exchange... what's your point exactly?


I guess that means it's not irrelevant. Deprecated, old, etc., sure, but irrelevant?




