Yesterday, 07:20 PM
(Yesterday, 08:16 AM)the_bertrum Wrote: The audiophile cynic in me says it's because most folk actually can't tell the difference between bitrates once they pass a threshold of "enough" (which is surprisingly low) unless they have some other information to guide them, such as a bitrate display.
When I'm less cynical, I'll accept that some folk might be able to hear the difference, but that I've never been in the presence of such skill.
Yeah, I'm usually with you in that I can't tell the difference between, say, 256 kbps (or an even lower characteristic bitrate if VBR) and lossless. Once we're getting down into the double-digit bitrates, though, I can (or at least *could* with the older codecs) hear a distinct drop in quality, characterized by muddiness and a swishing-swirling sound in the sibilants and other higher frequencies. Almost like the audio equivalent of JPEG artifacts. I haven't listened to low-bitrate stuff in a long time, so it could be that the codecs really are that much better now, but I think I'd likely hear a drop in 96 kbps material if it really was playing back at that bitrate.
(Yesterday, 11:06 AM)Tim Curtis Wrote: The bitrate shown for the built-in stations is hard coded in the radio station table. If you edit a station or click Audio info for the station the bitrate is one of the items in the list.
For stations that you add, if a bitrate is not entered then the bitrate shown comes from MPD's snapshot of the stream rate when it starts playing it. This rate can vary.
Makes sense. I'll double-check the hard-coding.
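For anyone curious where that "snapshot" number comes from: MPD's text protocol reports the instantaneous stream bitrate in the `bitrate` field of its `status` response, which is why the displayed value can drift while a radio stream plays. Here's a rough sketch of pulling that field out of a raw `status` response; the sample response text is made up for illustration, and a real client (e.g. python-mpd2, or `mpc status`) would do this for you.

```python
def parse_mpd_status(response: str) -> dict:
    """Parse the 'key: value' lines of an MPD status response into a dict."""
    status = {}
    for line in response.splitlines():
        if line == "OK":  # MPD ends a successful response with a bare OK
            break
        key, _, value = line.partition(": ")
        status[key] = value
    return status

# Hypothetical response captured mid-stream; the bitrate field is just
# MPD's reading at that instant, so it can differ from the station's
# advertised rate and change between polls.
sample = "state: play\nbitrate: 192\naudio: 44100:24:2\nOK"
print(parse_mpd_status(sample)["bitrate"])  # kbps at the moment of the snapshot
```

Polling `status` a few times while a VBR or re-encoded stream plays will show the value wander, which matches Tim's point that the rate MPD reports "can vary."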