Audio, Precision, & Measurement: Richard Cabot

Harley: Bill Rasnake?

Cabot: [Nods] He keeps telling me that one of these days he's going to come down and prove to me that [CD Stoplight] is audible. I haven't ever listened to it. I have doubts based on my conviction that when the data coming off the disc is right, it's right. But turning it into audio is a whole different story.

Harley: It's generally believed that CD transports and interconnects between transports and digital processors have different sonic qualities. In theory, all transports should sound the same. What's your reaction to this?

Cabot: I have not seen any evidence of differences between transports myself. I admit I've not spent a lot of time looking. I've seen rather dramatic differences in converters themselves—I'm a firm believer that different converters have different sounds. My gut tells me that the data coming off the transport—if it's correct, if it's what's on the disc—should not be a limitation. It really is a data-recording medium, and you should treat it like that. It's when you get to reconstructing that data that a lot of differences are introduced.

One of the most amazing experiments I ever did was with the Magnavox [CD player] chip set—the Philips SAA7220 oversampling filter and TDA1541 DAC. One of our customers told me they were using that chip set in a professional piece of gear they were making. I had measured the Magnavox CD players and they had horrendous linearity errors. I said, "How can you be using this chip set? I've measured them and they don't work very well." And he told me that you just have to know how to hook them up. They work real well if you know what to do. They are really sensitive to ground-noise problems, and to glitching on the waveforms coming in. I said, "Do you mean the D/A [converter chip]?" He said, "No, the interface between the signal coming in [to the DAC] and the oversampling filter." I was pretty amazed that the logic signals coming into the converter from the oversampling filter could make a difference.

But evidently it does. You have to buffer the lines, trim them up, make them nice and clean without any large amounts of ringing or overshoot or undershoot. They have to be nice, firm, well-controlled squarewaves coming into the D/A converter or you'll get bad linearity errors. This was a guy I had a fair degree of faith in, who would have some reason for saying what he said. When I got back home I took a friend's CD player, measured it, opened it up, got out the Philips data books, and looked at the interfacing. I cut the lines and put in a CMOS buffer and some RC networks that shaped the waveforms so they were nice and clean, and they looked real good on the 'scope.

I measured the player again. The linearity improved by something like 6dB. Instead of being 9dB out at low levels, it was now out only 3dB. I was amazed. I am not a golden ear—I wouldn't swear to you that I heard a difference before and after, but I could swear to you that I measured a difference before and after. But when I tell people that there were differences in the logic lines—all I did was shape the waveforms on the logic lines before the D/A converter—they were amazed.

I knew another person who had a Magnavox CD player and did the same thing. It's real. It's repeatable. So there could easily be differences between two players, at least between converter interfaces—how to hook them up or the care in power supplies or grounding. Because the converter was obviously latching bad data based on what it saw previously with glitches or overshoots. These two chips were designed and made by the same manufacturer and designed to hook to each other, but if you just hooked them to each other they didn't work very well.

Harley: Based on your extensive knowledge of the S/PDIF digital interface, what factors could cause a digital data stream from a CD transport to sound different from another, assuming that the transmitted data is error-free? This is an area that many designers are starting to pay attention to—jitter in the transmitted signal creating jitter in the recovered clock.

Cabot: That would depend on how good the phase-lock circuitry is. You can certainly get a lot of jitter in the received signal; if it's not filtered out by a well-designed phase-lock loop [PLL], then it will result in jitter at the D/A converter. I don't know at what point you can hear the difference. We can create jitter pretty easily on a digital waveform to see what the reaction of the PLL is, but I don't know what the state of commercial converters is.
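[The filtering Cabot describes can be sketched numerically. A first-order PLL acts as a one-pole low-pass on incoming jitter, attenuating jitter components above its loop bandwidth at 20dB/decade; the function and figures below are an illustrative model, not a description of any specific converter's loop.]

```python
import math

def pll_jitter_attenuation_db(jitter_freq_hz, loop_bw_hz):
    # A first-order PLL passes input jitter through a one-pole
    # low-pass response: |H(f)| = 1 / sqrt(1 + (f/fc)^2).
    # Returns the attenuation (in dB, negative) of a jitter
    # component at jitter_freq_hz for a loop bandwidth of loop_bw_hz.
    ratio = jitter_freq_hz / loop_bw_hz
    return -20 * math.log10(math.sqrt(1 + ratio**2))

# A 100kHz jitter component into a hypothetical 1kHz-bandwidth loop
# is knocked down by roughly 40dB; jitter at the loop corner itself
# is only attenuated about 3dB.
print(round(pll_jitter_attenuation_db(100e3, 1e3), 1))  # -40.0
print(round(pll_jitter_attenuation_db(1e3, 1e3), 2))    # -3.01
```

The trade-off Cabot hints at is visible here: a narrower loop bandwidth rejects more of the interface jitter, but the loop must still stay locked to the incoming data rate.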

Harley: The figure most often discussed as being audible is 100 picoseconds of clock jitter at the DAC for 16-bit DACs and somewhat less for 18- and 20-bit units.

Cabot: That would give you an error at high frequencies that would be down around the LSB [Least Significant Bit] region. I guess I have a hard time believing that an LSB error will be audible at high frequencies. I haven't tried to define where that threshold is. There has been work done that has shown that there are non-linearity thresholds in converters that are audible at the 14- to 15-bit level. But you really need a good 16-bit system to get around those. I'm sure that when you start using the full dynamic range of a converter, as in a professional application, you would need potentially more than 16 bits of range to accurately capture that music without a lot of gain riding, without your system clipping from overload from a loud transient. As to how that relates to how much of an error you'll hear at the bottom end of that range of a few bits of distortion, I haven't done the experiments that would let me know that.
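[Cabot's placement of the 100-picosecond figure "down around the LSB region" can be checked with a back-of-envelope calculation. The standard worst-case estimate for jitter error on a full-scale sine is the slew rate times the timing error, e = 2πf·tj; the numbers below are that sketch, not a psychoacoustic threshold.]

```python
import math

def jitter_error_dbfs(signal_freq_hz, jitter_s):
    # Worst-case amplitude error from sampling-clock jitter on a
    # full-scale sine: error = slew rate * timing error = 2*pi*f*tj,
    # expressed relative to the full-scale peak.
    return 20 * math.log10(2 * math.pi * signal_freq_hz * jitter_s)

def lsb_dbfs(bits):
    # One LSB relative to the full-scale peak of a bipolar converter.
    return 20 * math.log10(1 / 2**(bits - 1))

# 100ps of jitter on a full-scale 20kHz sine vs. a 16-bit LSB:
print(round(jitter_error_dbfs(20e3, 100e-12), 1))  # -98.0 dBFS
print(round(lsb_dbfs(16), 1))                      # -90.3 dBFS
```

At 20kHz the 100ps error lands about 8dB below a 16-bit LSB, which squares with Cabot's "LSB region" characterization; with 18- or 20-bit converters the LSB floor drops by 12 or 24dB, which is why the commonly quoted jitter figure is smaller for them.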
