Hi-resolution audio, especially for streaming. The general idea is that listening to digital audio files with a greater bit depth and sample rate than CD (24-bit/192 kHz vs. 16-bit/44.1 kHz) translates to better-sounding audio, but in practice that isn't the case.
For a detailed breakdown as to why, there's a great explanation here. But in summary, the CD format was chosen because its bit depth and sample rate already cover the full spectrum of human hearing.
So while "hi-res" audio does contain a lot more information (which, incidentally, means it uses up significantly more data/storage space and costs more money), our ears aren't capable of hearing it in the first place. Certain people may try to argue otherwise based on their own subjective experience, but to that I say "the placebo effect is a helluva drug."
I've always kinda wondered about this. I'm not an audio guy and really can't tell the difference between most of the standards. That said, I definitely remember tons and tons of 'experts' telling me that no one can tell the difference between 720p and 1080p TV at a typical couch distance. And I absolutely could, and so could many of the people I know. I can also tell the difference between 1080p and 4K at the same distances.
So I'm curious whether there's just natural variance in individuals' hearing, and audiophiles have a better-than-average range that does exceed CD quality?
Similar to this, I can tell the difference between 30 fps and 60 fps, but not 60 to 120, yet some people swear they can. Which I believe; I just know that I can't. Seems like these guidelines are more averages than hard biological limits.
I think hi-res is for professional work. If you're going to process, modify, mix, or distort the audio in a studio, you probably want the higher bit depth or sample rate to start with, in case you amplify or distort something and end up with an unintended artefact that is audible to humans. But the output can be downconverted back to human-audible levels before final broadcast (rough sketch of the idea below).
Of course, if a marketing person finds out there is such a thing as "professional quality"... See also "military spec", "aerospace grade".
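There's a rough way to see the headroom argument in code (a toy example, nothing like a real mastering chain; the gain values are made up): quantizing to 16-bit after every processing step piles up more error than staying in float and quantizing once at the end.

```python
import numpy as np

# Toy headroom demo: run a 1 kHz tone through a chain of made-up gain steps,
# once quantizing to 16-bit after every step, once keeping float precision
# and quantizing only at the very end.

rate = 48_000
t = np.arange(rate) / rate
signal = 0.5 * np.sin(2 * np.pi * 1000 * t)
gains = [0.3, 2.5, 0.7, 1.8, 0.4, 2.0]          # arbitrary processing steps

def quantize(x, bits=16):
    scale = 2 ** (bits - 1)
    return np.round(np.clip(x, -1, 1) * scale) / scale

per_step, kept_float = signal.copy(), signal.copy()
for g in gains:
    per_step = quantize(per_step * g)           # re-quantized at every stage
    kept_float = kept_float * g                 # full precision throughout

kept_float = quantize(kept_float)               # single quantization at the end
reference = signal * np.prod(gains)             # ideal result

print("max error, quantized each step:", np.abs(per_step - reference).max())
print("max error, quantized once     :", np.abs(kept_float - reference).max())
```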
Yeah, to expand on this: in professional settings you'll want a higher sampling frequency so you don't end up with e.g. aliasing, but for consumer use anything beyond a 44–48 kHz sampling rate is pretty much pointless.
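A minimal sketch of what aliasing looks like, assuming a bare sampler with no anti-aliasing filter (the 30 kHz tone is just an example): anything above the Nyquist limit folds back down into the audible band.

```python
import numpy as np

# A 30 kHz tone sampled at 44.1 kHz (Nyquist = 22.05 kHz) with no
# anti-aliasing filter folds down to 44.1 kHz - 30 kHz = 14.1 kHz.

rate = 44_100
tone = 30_000                          # above the Nyquist limit
t = np.arange(rate) / rate             # one second of sample times
samples = np.sin(2 * np.pi * tone * t)

spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1 / rate)
print("strongest component:", freqs[np.argmax(spectrum)], "Hz")   # ~14100 Hz
```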