this post was submitted on 16 Jun 2024
292 points (98.7% liked)
you are viewing a single comment's thread
Hi-resolution audio, especially for streaming. The general idea is that listening to digital audio files with a greater bit depth and sample rate than CD (24-bit/192 kHz vs 16-bit/44.1 kHz) translates to better-sounding audio, but in practice that isn't the case.
For a detailed breakdown as to why, there's a great explanation here. But in summary, the CD format was chosen because its bit depth and sample rate are enough to cover the full spectrum of human hearing.
So while "hi-res" audio does contain a lot more information (which, incidentally, means it uses up significantly more data/storage space and costs more money), our ears aren't capable of hearing it in the first place. Certain people may try to argue otherwise based on their own subjective experience, but to that I say "the placebo effect is a helluva drug."
I've always kinda wondered about this. I'm not an audio guy and really can't tell the difference between most of the standards. That said, I definitely remember tons and tons of 'experts' telling me that no one can tell the difference between 720p and 1080p TV at a typical couch-viewing distance. And I absolutely could, and so could many of the people I know. I can also tell the difference between 1080p and 4K at the same distances.
So I'm curious: is there just natural variance in an individual's ability to hear, and do audiophiles simply have a better-than-average range that does exceed CD quality?
Similar to this, I can tell the difference between 30 fps and 60 fps, but not between 60 and 120, yet some people swear they can. Which I believe; I just know that I can't. It seems like these guidelines are more like averages than hard biological limits.
I think this is a case where certain people simply can't see or hear the difference.
I collect video game and movie soundtracks, and the main difference I can hear between a 320 kbps file and a FLAC in the 1000 kbps range isn't straight-up "clarity" in the sense that an instrument sounds "clearer", but rather the spacing: the ability to discern where instruments are coming from is much better with a hi-res file and some decent wired headphones (my pair is $200). All of this likely doesn't matter much, though, when most users stream via Spotify, which sounds worse than my local 320 kbps files, and listen on Bluetooth headphones running at lower bitrates because they lack support for better codecs like aptX and LDAC.
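For context on those kbps figures, here's a similar back-of-the-envelope sketch, assuming stereo uncompressed PCM; the FLAC compression ratio mentioned in the comments is a typical ballpark, not an exact value:

```python
# Raw PCM bitrate = sample rate * bit depth * channels.
# FLAC is lossless, so it only shrinks these numbers (very roughly to
# 50-70% of raw, depending on the material -- that ratio is an assumption),
# whereas 320 kbps is a fixed lossy target.

def pcm_kbps(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> float:
    """Raw PCM bitrate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

print(f"CD (16-bit/44.1 kHz):    {pcm_kbps(44_100, 16):.0f} kbps raw")
print(f"Hi-res (24-bit/192 kHz): {pcm_kbps(192_000, 24):.0f} kbps raw")

# CD:     1411 kbps raw -> CD-quality FLAC usually lands around 700-1000 kbps
# Hi-res: 9216 kbps raw -> hi-res FLAC stays several times larger again
```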