If I can get a proper static copy of MDN, I’ll make a torrent and share a magnet link here. I know I’m not the only one who wants something like this. I don’t think the file sizes will be so bad. My current “build” of the entire site is sitting at 1.36 GiB (only a little more than double the size of node_modules!). So, with browser compatibility data and such, I think it’ll still be less than 2 GiB.
Aggressively compressed with bzip2 -9, it’s only 114.29 MiB. That’s a compression ratio of 0.08, which blows my mind.
2 in the morning is a great time to compare compression algorithms.
Ratio  File size (bytes)  Filename          Command                     Algorithm
1             1458553185  build/
0.451          658022612  ../node-modules/
0.322          469704387  build.tar.Z       compress -k build.tar       Lempel–Ziv–Welch (LZW) (oh, how far we've come)
0.185          269780511  build.tar.gz      gzip -k9 build.tar          Deflate
0.082          119839762  build.tar.bz2     bzip2 -zk9 build.tar        Burrows–Wheeler transform
0.047           68258612  build.tar.br      brotli -kZ build.tar        Brotli
0.047           67989604  build.tar.zst     zstd --ultra -22 build.tar  Zstandard
0.046           67705992  build.tar.xz      xz -zk9e build.tar          Lempel–Ziv–Markov (LZMA)
0.046 is really mind-blowing. I don’t need a torrent; we’re approaching e-mail-attachment file sizes here.
@abucci@anthony.buc.ci I didn’t time all of them (I probably should have), but xz has its own timer. If I remember correctly, it took 7 minutes and 17 seconds on my toaster to compress 1.36 GiB, mostly text, at the highest compression level. I don’t think that’s all that bad.
xz also lets you use multiple threads, which isn’t common among these tools. I didn’t enable them for this test because multi-threading carries an extremely small size penalty and I wanted to go all-out.
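For the curious, the threading knob is xz’s `-T` option (`-T0` means “one thread per core”). A small sketch of comparing single- versus multi-threaded output, on a throwaway `sample.txt` I generate here:

```shell
set -eu

# Throwaway input file, ~6 MB of repetitive text.
yes 'sample text for compression' | head -n 200000 > sample.txt

xz -9 -T1 -c sample.txt > single.xz  # single-threaded
xz -9 -T0 -c sample.txt > multi.xz   # one thread per available core

# Multi-threaded xz splits the input into independent blocks, which can
# cost a few bytes; on a small file it may produce an identical size.
wc -c single.xz multi.xz
```

On inputs this small the file may fit in one block anyway, so you’d need something closer to the full 1.36 GiB build to see the (still tiny) size penalty.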
Here’s a good blog post that shows the differences with multi-threading. The size difference is negligible; that test showed no measurable difference in file size between 2 cores and 32 cores. The speed-up has diminishing returns, though.