Bits is Bits, Right?

"Never explain, never apologize." But in this month's "As We See It," I intend to do both. First, the apology:

To all those who ordered our Concert double-CD set and had to wait a long time to receive it: we're sorry about the delay. This project was described in the November 1994 issue of Stereophile; at the time of writing, I didn't imagine that we would have problems after the digital masters had been edited and sent off to the CD-pressing plant. Little did I know!

When I transferred the original analog tapes to digital, I used the Manley 20-bit ADC to feed our Sonic Solutions hard-disk editor, which can store up to 24-bit words. I did all the editing and master assembly at 20-bit resolution, then experimented with the various redithering/noise-shaping algorithms offered by the Meridian 618 processor to preserve as much as possible of the 20-bit quality when the word length was reduced to 16 bits on the CD master.
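
For the technically curious, the word-length reduction works roughly like the following sketch. This is only a generic illustration, in Python/NumPy, of TPDF (triangular probability density function) dithering with no noise shaping; the Meridian 618's actual algorithms are proprietary, and the function and variable names here are mine, not Meridian's or Sonic Solutions'.

```python
import numpy as np

def requantize_20_to_16(samples_20bit, rng=None):
    """Reduce 20-bit integer samples to 16 bits with TPDF dither.

    One 16-bit step equals 16 counts on the 20-bit scale, so the dither is
    the sum of two uniform random variables, each spanning half a step.
    """
    rng = np.random.default_rng() if rng is None else rng
    step = 1 << 4                       # one 16-bit LSB, in 20-bit counts
    shape = np.shape(samples_20bit)
    tpdf = (rng.uniform(-step / 2, step / 2, shape) +
            rng.uniform(-step / 2, step / 2, shape))
    dithered = np.asarray(samples_20bit, dtype=np.float64) + tpdf
    q = np.round(dithered / step)       # round to the nearest 16-bit step
    return np.clip(q, -32768, 32767).astype(np.int16)
```

The point of the dither is that the rounding error ends up decorrelated from the music; noise shaping then pushes that noise toward the frequencies where the ear is least sensitive.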

When I sent the master Concert CD-Rs off to the CD plant, therefore, I requested that the mastering engineer who was going to add the track'n'timing PQ subcodes do nothing to my data. In particular, I requested that he not remove DC offset (I had hand-trimmed the ADC for every transfer), not re-equalize, not change levels, etc. Most important, I requested that he not go to analog for any reason. The carefully prepared 16-bit words on my master were the ones I wanted to appear on the commercial CD.

Incidentally, an A/B comparison between the 20-bit data and the same data truncated to 16 bits—the four Least Significant Bits (LSBs) are simply chopped off—reveals the difference to be not at all subtle. The difference may not quite be, in the words of Ivan Berger of Audio, so "gross" that my mother would immediately notice it, but truncation nevertheless seems relatively easy to identify; in short, it sounds like "traditional," "fatiguing" CD sound.
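
Truncation, by contrast, simply discards the four LSBs, as in this hypothetical one-function sketch (again Python/NumPy, my names). The leftover error tracks the signal exactly, which is why it behaves like low-level distortion rather than benign noise.

```python
import numpy as np

def truncate_to_16(samples_20bit):
    """Chop off the four LSBs of 20-bit integer samples: no dither, no rounding."""
    return (np.asarray(samples_20bit, dtype=np.int32) >> 4).astype(np.int16)

# The discarded remainder is always 0-15 twenty-bit counts and is a
# deterministic function of the signal itself, so low-level detail is
# distorted rather than buried under noise-like dither.
x = np.array([100_000, -100_000, 7, -7], dtype=np.int32)  # illustrative values
error = x - (truncate_to_16(x).astype(np.int32) << 4)     # -> [0, 0, 7, 9]
```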

It's also interesting to note that my original's having been analog tape, with a noise floor higher than that of the 16-bit system, wasn't a factor in these comparisons. Although many observers hold that analog noise ahead of the A/D converter effectively dithers the conversion, this is not necessarily true. The noise is treated as part of the original signal, and is encoded as accurately or as inaccurately as the ADC allows. (The correct use of dither at the LSB level is a different issue.)

When I received test CD-Rs from the mastering engineer, who awaited my approval of them before starting the plant's presses rolling, I was horrified to find that the CD-Rs sounded different from my masters. In fact, before I even did any comparisons, I noticed that the CD-Rs sounded like the truncated samples of the Chopin Waltz I had used in my listening tests. I asked others at Stereophile to compare one of the CD-Rs with one I'd made when I did the original reduction to 16-bit word lengths. They heard the same difference. I then compared the pressing plant's CD-R against a DAT I had copied from the 16-bit master. Surprisingly, there was less of an audible difference, probably because my DAT recorder has high measured levels of datastream jitter; but the DAT still sounded fundamentally like what I'd intended the sound to be—in Linn terminology, it "played tunes"; the production CD-Rs didn't.

To make a long story short: I uploaded the production CD-R data into the Sonic Solutions and tried to null it against my original 16-bit data, which had been archived to 8mm Exabyte tape. As described by Bob Katz last December (Vol.17 No.12, pp.81-83), you feed the outputs of the two pairs of stereo digital tracks to the Sonic's digital mixer, then slide one pair of tracks backward and forward in time with respect to the other until you get sample synchronization, then flip that pair's polarity. If the data are the same, you get total—and I mean total—cancellation/silence. If not, the nature of the residue tells you where differences have been introduced. (I'm currently developing this technique as a much more meaningful version of the Hafler/Carver/Walker nulling test for amplifiers.)
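
In software terms, the procedure amounts to the sketch below. It's written in Python/NumPy for illustration only; the function names, the probe window, and the brute-force search for the offset are my own choices, not the Sonic Solutions implementation.

```python
import numpy as np

def null_residual(reference, candidate, offset):
    """Slide `candidate` by `offset` samples, flip its polarity, and mix it
    with `reference`; digitally, that is just a subtraction. Bit-identical
    data cancel to exact silence, and anything left over shows where, and
    at what level, the data were changed."""
    ref = np.asarray(reference, dtype=np.int64)
    cand = np.asarray(candidate, dtype=np.int64)
    if offset >= 0:
        ref = ref[offset:]
    else:
        cand = cand[-offset:]
    n = min(len(ref), len(cand))
    return ref[:n] - cand[:n]

def find_sync(reference, candidate, max_lag=1000, probe=200_000):
    """Brute-force the offset that minimizes the residual over a short probe
    window: the digital equivalent of sliding one pair of tracks back and
    forth until the null locks in."""
    return min(range(-max_lag, max_lag + 1),
               key=lambda off: np.abs(
                   null_residual(reference[:probe], candidate[:probe], off)).sum())
```

A residual of all zeros is the "total cancellation" described above; any nonzero samples can be examined, or listened to, on their own.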

Both my CD-R master and the DAT clone I'd made of it nulled totally against the 16-bit archive data, revealing that I had not made an error in the master preparation. However, not only could I not null the production data against the archive, but the production master was also longer by one video frame (1/30 of a second) for every 20 minutes of program. This appears to show that the mastering engineer either a) used a sample-rate converter, or b) converted my carefully prepared data to analog, then reconverted it to digital using a 16-bit ADC with a sample clock running very slightly slower than mine (by just 1.2Hz!). What was incontrovertible was that all my careful work to preserve as much as possible of the 20-bit original's quality had been for naught.
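
The arithmetic behind that figure is easy to check (assuming the CD's 44.1kHz sample rate and a 1/30-second video frame, as above):

```python
# One extra video frame per 20 minutes of program, at the CD sample rate.
frame = 1.0 / 30                          # one video frame, in seconds
program = 20 * 60                         # 20 minutes of program, in seconds
fractional_error = frame / program        # about 2.8e-5, or 28 parts per million
clock_offset = 44_100 * fractional_error  # about 1.2Hz
print(f"{fractional_error:.2e} of the clock rate = {clock_offset:.2f}Hz at 44.1kHz")
```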

Having already accepted that this would delay our customers' receiving their discs, I could do only one thing. (Please note that we do not process credit-card orders or cash checks until the day we actually put product in the mail.) I prepared new 16-bit masters, sent them off this time to Digital Brothers in Costa Mesa, CA to have the PQ subcode data inserted, and again instructed the pressing plant to do nothing to the data. This time, when I received the test CDs, they nulled perfectly against the archived masters!

How common among CD-mastering engineers is the practice of changing the data when there's no reason for them to do so? Linn Products' Brian Drummond mentioned on CompuServe's CEAUDIO forum that, in the early days of CD, Linn Records sent off a digital master tape to the mastering plant and got back a CD with different numbers on it. "We did a bit-by-bit comparison against the master—they weren't the same."

A conversation I had with Reference Recordings' Keith Johnson at last November's AES Convention (see the report in this issue's "Industry Update") revealed that he initially had the same problem with his HDCD®-encoded recordings. In these, the additional high-resolution data are encrypted in the LSBs in a proprietary manner. Any DSP, even level-changing, will change the words and therefore destroy the encoding.

Keith related to me that the first test disc cut from his HDCD masters not only sounded wrong, it wouldn't illuminate the HDCD LED on his prototype decoder. In fact, it was no longer an HDCD recording. It turned out that that particular mastering engineer routinely converted every digital master to analog, then back to digital! Other mastering engineers never switch off the default input DSP options on their hard-disk editors, not realizing that, with 16-bit data, this will reintroduce quantizing noise. No wonder so many CDs sound bad!

As well as indicating the dangers of entrusting your work to the hands of others with different agendas, this story is interesting in that all the changes made to my data were at such a low level—30dB or more below the analog tape hiss—that proponents of "bits is bits" would argue that, whatever the mastering engineer had done, the differences introduced should have been inaudible. Yet what had alerted me to the fact that the data had been changed was a significant change in the quality of the sound—a degradation that I heard even without having the originals on hand for an A/B comparison!

Those who continually point out in Stereophile's "Letters" column that the differences in sound quality discussed by our writers are imaginary and probably due to the placebo effect should note that I had huge emotional and financial investments in wanting the production CD-Rs to sound the same as the originals. If I were to hear any difference, it would both cost Stereophile quite a lot of money to have the project remastered and delay our customers receiving their Concert CDs. In fact, it took time to work through the cognitive dissonance to recognize that I was hearing a difference when I expected—and wanted to hear—none.

"Bits is bits"? In theory, yes. In practice? Yeah, right!