I don't know much about compression algorithms. What are the benefits of doing this?
zstd is generally stupidly fast and quite efficient.
probably not exactly how Steam does it, or even close, but as a quick & dirty comparison: compressed and decompressed a random CD.iso (~375 MB) I had lying around, using zstd and lzma with a 1 MB dictionary:
test system: Arch Linux (btw, as is customary) laptop with an AMD Ryzen 7 PRO 7840U CPU.
used commands & results:
Zstd:
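Roughly along these lines - flags are from memory, so treat this as a sketch rather than a transcript. The 1 MiB "dictionary" here is approximated by capping the window log at 2^20 via zstd's advanced `wlog` parameter:

```
# compress at the default level (3), keeping the input, 1 MiB window:
time zstd -k --zstd=wlog=20 -o CD.iso.zst CD.iso

# decompress to a separate output file:
time zstd -dk -o CD.iso.out CD.iso.zst
```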
So, pretty quick all around.
Lzma:
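Again a best-guess invocation, using xz's lzma mode with a matching 1 MiB dictionary:

```
# compress with the lzma1 filter at preset 9 and a 1 MiB dictionary,
# keeping the input file (produces CD.iso.lzma):
time xz -k --format=lzma --lzma1=preset=9,dict=1MiB CD.iso

# decompress to stdout to avoid clobbering the original:
time xz -dkc CD.iso.lzma > CD.iso.out
```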
This one felt like forever to compress.
So, my takeaway here is that lzma's compression time is costly enough that it's worth wasting a bit of disk space for the sake of speed.
and lastly, just because I was curious, ran zstd on max compression settings too:
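Something like this (again, assumed flags; level 22 needs `--ultra` to be unlocked):

```
# max compression level, same 1 MiB window as the earlier runs:
time zstd -k --ultra -22 --zstd=wlog=20 -o CD.iso.max.zst CD.iso
time zstd -dk -o CD.iso.max.out CD.iso.max.zst
```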
~11s compression time, ~0.5s decompression, archive size was ~211 MB.
deemed it wasn't necessary to spend the time compressing the archive with lzma's max settings.
Now I'll be taking notes when people start correcting me & explaining why these "benchmarks" are wrong :P
edit: goofed a bit with the max compression settings; added the same dictionary size as in the other runs.
edit 2: one of the reasons for the change might be syncing files between their servers. IIRC zstd archives can be made "rsync compatible", allowing partial file syncs instead of re-syncing the entire file, saving bandwidth. Not sure if lzma does the same.
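For what it's worth, zstd exposes this as `--rsyncable` (xz/lzma has no equivalent flag, as far as I know). A minimal sketch with a made-up filename:

```
# periodically resynchronize the compressed stream so rsync can match
# unchanged regions between archive versions (needs multithreaded mode):
zstd -T0 --rsyncable -o depot_chunk.zst depot_chunk
```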
Better compression -> faster downloads
Especially since lzma decompression is currently CPU-bottlenecked on most computers with fast internet connections. Zstd uses the CPU much more efficiently.
"Better" doesn't always mean "smaller", especially in this example. LZMA's strength is that it compresses very small but its weakness is that it's extremely CPU-intensive to decompress. Switching to ZSTD will actually result in larger downloads, but the massively reduced CPU load of decompressing ZSTD will mean it's faster for most users. Instead of just counting the time it takes for the data to transfer, this is factoring in download time + decompression time. Even though ZSTD is somewhat less efficient in terms of compression ratio, it's far more efficient computationally.
Bet that'll save Valve on some server costs too. Storage is much cheaper than compute (though I imagine they'll probably keep LZMA around for clients on slow connections).
Doesn't decompression only happen client-side? I don't imagine them compressing the files multiple times.
Hmm, true. I was thinking that Steam has a lot of games and respective builds it has to compress, even if the decompression benefits are client-side only.
Each new game update would also need to be compressed - I have no idea how Steam works out which files need replacing on its end though, which might involve decompressing the files to analyse them.
"most users" have gigabit pipes to the internet?