Jitter in, jitter out...
Although the digital data on a CD can be shown to be an exact copy of the studio master and is rarely corrupted by a CD player, the timing of the datastream coming off the disc can get a little sloppy, hence the term jitter. These small timing errors can then disturb the digital-to-analog conversion process, resulting in noticeable and measurable changes in performance.
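The mechanism is easy to demonstrate numerically. The sketch below is purely illustrative and not from the article: it "plays back" a 10 kHz test tone once with an ideal sample clock and once with a clock carrying 1 ns of RMS random jitter (an assumed figure chosen for illustration), then measures how much error the timing slop alone introduces.

```python
import numpy as np

fs = 44_100.0          # CD sample rate (Hz)
f = 10_000.0           # test-tone frequency (Hz)
n = np.arange(44_100)  # one second of samples

rng = np.random.default_rng(0)
sigma = 1e-9           # assumed 1 ns RMS clock jitter, for illustration

t_ideal = n / fs                                          # perfect clock ticks
t_jittered = t_ideal + rng.normal(0.0, sigma, n.size)     # sloppy clock ticks

clean = np.sin(2 * np.pi * f * t_ideal)
jittery = np.sin(2 * np.pi * f * t_jittered)

# The difference between the two is noise/distortion created purely by timing
error = jittery - clean
snr_db = 10 * np.log10(np.mean(clean**2) / np.mean(error**2))
print(f"{sigma * 1e9:.0f} ns RMS jitter limits a 10 kHz tone to ~{snr_db:.0f} dB SNR")
```

Even a nanosecond of timing uncertainty, far too small to corrupt the data itself, caps the achievable signal-to-noise ratio in the mid-80 dB range on a high-frequency tone, short of what 16-bit audio can otherwise deliver.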
As consumers, we're used to dwelling on the second half of the record/playback process---problems with transports, interconnects, and converters. We can sometimes forget that jitter is also an issue upstream at the recording and mastering stages of a CD.
In the January 1998 issue of Electronic Musician magazine, a letter from Ken Czepelka describes jitter degradation in the studio environment: "a digital signal with accumulated jitter that is feeding even the best multi-effects processor (with the intent of taking analog output from that device to tape or for additional analog processing) will degrade the audio." He goes on: "The circuitry in the multi-effects processor simply is not designed to create a clean clock (which will be used to clock its own converter) from a dirty one."
In a recording or mastering studio, a master clock---or "house clock"---is sometimes used to synchronize the various digital recorders, mixing consoles, effects devices, and mastering machines. Ken suggests using the studio A/D converter to generate the house clock, to minimize further jitter accumulation from other devices down the chain. He also points out that most pro DAT or CD machines have rock-solid internal clocks for their D/A conversion. Thus, when these machines run on their own clocks rather than the house clock, they may be able to reduce the jitter "present in the digital signal that was recorded" when generating an analog output.
As in the tweaky consumer-audio world, a debate rages among recording/mastering folks about where jitter occurs and whether its effects matter. For instance, what about jitter introduced by AES/EBU datalinks during the recording or mastering processes? In the same issue of EM, Michael Cooper describes a test performed to determine whether, under controlled digital mixing conditions, two digital signal cables would yield audibly different results. He theorizes that jitter would be the likely culprit if one cable sounded better than the other.
According to Michael, "A blindfold comparison was made of the two [identical] recordings, using the same exact signal path through 20-bit D/A converters, power amp, and nearfield monitors. The mix printed with the Apogee WydeEye cable was consistently identified as having more clarity, high-frequency detail, and depth [compared to a Conquest Series IIB cable]. It was also more open and had a wider stereo image. Subsequent blindfold tests using different program material yielded the same results."
Cooper wants engineers to know that not paying attention to jitter in the studio will likely lead to problems later on. "Any jitter incorporated into the original master will cause mild distortion at the D/A and will be printed on the remastered DAT recording. When it comes to jitter, it's better to be safe than sorry."
It appears that every link in the audio chain, from instrument to the listener's ear, is subject to jitter-related degradation. Greater understanding of how to reduce this artifact in the studio should generate rewards for all of us at the end of the audio line.