m1ghtytexan — Joined: Jul 2 2019 - 5:14pm
Bit Error Rates for Digital Audio


New member here looking to learn from you guys about digital audio. Of course, our goal with digital audio is to feed our speakers an error-free, or as-close-to-error-free-as-possible, representation of our digital media files, which by their nature are simply 1s and 0s on our drives or our streaming services' drives. So, my question relates to Bit Error Rates (BER).

You've undoubtedly seen things like Real-Time Operating Systems and other methods of ensuring bit errors are kept to a minimum. But the BER for HDMI 1.3 is specified to be less than 3x10^-11, so for our media there's a pretty high probability that we'll never hear bit errors; at least as it relates to the HDMI interface.
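To put a rough number on that claim (my own back-of-envelope arithmetic, not anything from the spec — the bit depths and sample rates below are just illustrative), a BER of 3x10^-11 works out to on the order of one flipped bit per hour of hi-res stereo audio, and that flip would most likely land in a low-order, inaudible bit:

```python
# Back-of-envelope: expected bit errors per hour of PCM audio at a given BER.
# The 3e-11 figure is the worst-case HDMI 1.3 BER quoted above.
def expected_errors(bit_depth, sample_rate, channels, seconds, ber):
    """Expected number of flipped bits = total bits transferred * BER."""
    total_bits = bit_depth * sample_rate * channels * seconds
    return total_bits * ber

# One hour of 24-bit/192 kHz stereo vs. one hour of CD-quality 16/44.1 stereo.
hires = expected_errors(24, 192_000, 2, 3600, 3e-11)
cd = expected_errors(16, 44_100, 2, 3600, 3e-11)
print(f"hi-res: ~{hires:.2f} errors/hour, CD: ~{cd:.2f} errors/hour")
```

So roughly one expected error per hour for 24/192, and well under one for CD-quality — which is why "we'll probably never hear a bit error" over HDMI seems like a safe bet.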

Other sources of bit errors could be moving media from a hard drive to RAM, or transfers over a network. Let's just assume the error rate on internal transfers is negligible; otherwise we wouldn't even be able to post on this forum without characters exploding into unreadable character sets. Maybe that's an ignorant assumption on my part, but it seems like a Real-Time OS would be overkill for Hi-Res Audio. Maybe I'm wrong...?

At any rate, our network might be a source of errors, but TCP detects corrupted segments via checksums and retransmits them (it's error-detecting with retransmission rather than forward error correcting), so the effective BER over the network (especially at home over short Ethernet connections) would be quite low and trivial.
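For the curious, the checksum TCP (and IP) use is the 16-bit ones'-complement "Internet checksum" from RFC 1071. A minimal sketch of the algorithm (the test vector is the worked example from that RFC):

```python
# RFC 1071 Internet checksum: sum the data as 16-bit words in ones'-complement
# arithmetic (carries fold back into the low 16 bits), then complement.
def internet_checksum(data: bytes) -> int:
    if len(data) % 2:          # pad odd-length data with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold the carry back in
    return ~total & 0xFFFF

# RFC 1071's worked example: bytes 00 01 f2 03 f4 f5 f6 f7 -> checksum 0x220d
print(hex(internet_checksum(bytes([0x00, 0x01, 0xF2, 0x03, 0xF4, 0xF5, 0xF6, 0xF7]))))
```

A receiver verifies by checksumming the data with the checksum included; an intact segment sums to zero. Any segment that fails is dropped and retransmitted, which is why the bits that reach your player's buffer are effectively the bits that were sent.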

So, my question is: with BERs being so low across internal hardware, HDMI, and networking, where should we actually be concerned about bit errors when playing our audio files? And how do we protect against them?
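One concrete way to answer the "how do we protect against them" part for files at rest: hash your library and compare hashes after any copy or backup, which proves the copy is bit-for-bit identical. A minimal sketch (the file paths here are placeholders, not anything from this thread):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    """SHA-256 of a file, read in 1 MiB chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Hypothetical usage: even a single flipped bit anywhere in the copy
# produces a completely different digest.
# assert sha256_of("original.flac") == sha256_of("backup/original.flac")
```

Worth noting that FLAC already stores an MD5 of the decoded audio inside the file, so `flac -t file.flac` can verify the audio payload in place without needing a second copy to compare against.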
