2020 Jitter Measurements

Even as digital/analog processors were becoming a hot product category in the early 1990s, audiophiles were learning that timing uncertainties in the AES/EBU and S/PDIF serial datastreams—jitter—could compromise any improvement in sound quality offered by these DACs. Some companies therefore introduced products to reduce or eliminate jitter. In the November 1994 issue of Stereophile, Robert Harley reviewed three such products: the Audio Alchemy DTI Pro, the Digital Domain VSP, and the Sonic Frontiers UltraJitterbug. I still have Stereophile's review samples of the UltraJitterbug and VSP, along with two contemporary DACs: a PS Audio UltraLink and a Parts Connection Assemblage DAC-1.

As our reviews of these products were published before Paul Miller and the late Julian Dunn developed the "J-Test" diagnostic signal, I performed J-Test jitter measurements to bring that 1994 review into the 21st century. You can see what I found here.
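For reference, the J-Test stimulus is usually described as a tone at exactly Fs/4 combined with a squarewave at Fs/192 that toggles the least-significant bit. Here is a sketch of how such a 16-bit signal could be generated (the exact Miller/Dunn data pattern may differ in its details):

```python
# A J-Test-style stimulus as commonly described: an Fs/4 tone at -3dBFS
# plus an Fs/192 squarewave of 1 LSB peak-peak. Details are approximate.
import numpy as np

FS = 44100                                    # CD sample rate
n = np.arange(FS)                             # one second of samples
tone = np.round(32767 * 10 ** (-3 / 20) *     # Fs/4 (11.025kHz) tone at -3dBFS
                np.sin(2 * np.pi * (FS / 4) * n / FS))
lsb = (n // 96) % 2                           # Fs/192 squarewave toggling the LSB
jtest = (tone + lsb).astype(np.int16)         # 16-bit J-Test-style signal
```

Jitter then shows up as symmetrical sidebands around the Fs/4 tone in a spectrum of the DAC's analog output.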

Enjoy!

Oh, and the heading image? It's the PS Audio UltraLink's output spectrum when the processor is fed TosLink data representing a full-scale, 16-bit 10kHz tone corrupted with 1kHz jitter at an amplitude of 1 nanosecond. The UltraLink can't reject the jitter, throwing up sidebands at ±1kHz.
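As a sanity check on that spectrum (standard narrowband-FM arithmetic, assuming the 1ns figure is the peak jitter amplitude): sinusoidal jitter of peak amplitude dt on a carrier of frequency fc produces sidebands at roughly 20*log10(pi*fc*dt) relative to the carrier.

```python
import math

fc = 10_000   # the 10kHz carrier tone
dt = 1e-9     # assumed peak jitter amplitude: 1ns
# Each sideband relative to the carrier, small-modulation (narrowband-FM) case:
print(f"{20 * math.log10(math.pi * fc * dt):.0f} dBc")   # about -90 dBc
```

So sidebands roughly 90dB below the 10kHz tone are what the heading image should show if the UltraLink passed the incoming jitter through unattenuated.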

COMMENTS
Bogolu Haranath's picture

As a side note ....... maybe JA1 could review the Innuos Phoenix USB reclocker ($3,149) ....... It is a device used between the server/source and the DAC :-) ........

Bogolu Haranath's picture

For a $300 device, the AQ DragonFly Cobalt also has outstandingly low jitter measurements, somewhat similar to the dCS Bartok :-) .........

Archimago's picture

With asynchronous USB and Ethernet (inherently asynchronous), jitter levels have been low for more than a decade. IMO the DragonFly Cobalt is rather expensive for what it does.

The more important question is: was jitter ever important when it came to audibility? Much like the question of whether 24-bit sample depth is ever important given the limitations of the rest of the audio chain and of human hearing (obviously 24 bits are useful in production, but we're talking about domestic listening).
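For context on the 24-bit point (textbook arithmetic, not a claim from the comment): the ideal dynamic range of an N-bit channel is about 6.02N + 1.76 dB.

```python
# Ideal SNR of an N-bit quantizer for a full-scale sine: 6.02*N + 1.76 dB.
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB")   # 16 -> ~98, 24 -> ~146
```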

I won't post the link here since it's to my blog, but if you're interested, do a Google search for "Archimago jitter demo" and have a listen for yourself and consider the J-Test results. Compare the results here with some of the samples.

My sense is that jitter, while a real phenomenon that should be minimized for hi-fi playback, is more an objective finding than an actual problem, except with severely, literally "broken" equipment!

Jack L's picture

"...... it came to audibility; much like the question of whether 24-bit sample depth is ever important given the limitations of the rest of the audio chain and human hearing?" quoted from Archimago.

Agreed. Is hi-tech stuff like jitter, audio bit depth & sampling rate really detectable by our ears thru our home audio????? I doubt it very much. Another marketing scheme to get consumers' money?

Just like the Y2K Millennium Scare, which burnt billions of dollars of commercial institutions worldwide while virtually nothing disastrous ever happened.

A strong case was established by the Audio Engineering Society New York + Hiroshima City University, Japan, on the audible difference among three audio sampling rates, 48kHz, 96kHz & 192kHz, of a 16-bit digital signal encoded in white noise. The lab conditions: an anechoic chamber, seven young males/females in their early twenties (sharp hearing). Audible differences were detected with white noise only; music was not used as the testing medium.

Now back to jitter. I just recently completed an experiment at home, using a no-name DAC of the cheapest-ever basic design (USD7.55 each, available only online) to test whether our ears can detect any jitter or whatever digital crap is generated by that cheapie DAC.

Digital audio signals were provided by my new basic Sony WiFi Blu-ray disc player (USD52 from Best Buy!!!!) via coaxial cable & by my 50" 4K UHD WiFi TV via an optical cable that came free with the cheapie DAC.
The L & R analog audio signals out of the cheap DAC feed directly into my stereo rig.

Very, very surprisingly, the music from my Sony Blu-ray player & my 4K UHD TV sounds pretty good!!!! Being a classical music addict owning 1,000+ vinyl LPs & all-vacuum-tube phono/power amps, I surely know how fine music should sound: fast, see-thru transparency & details. I can't complain at all, as if that DAC had cost me thousands of greenbacks.

Till now, one month already, honestly I still can't detect any harshness due to jitter or whatever digital crap expected to be generated by my 7-buck dirt-cheap DAC!!!!!!!!!!!

Listening is believing

Jack L

Bogolu Haranath's picture

Another $400 device ...... the Pro-Ject Pre Box S2 also has outstandingly low jitter measurements :-) ........

jeffhenning's picture

One of the biggest knocks against USB was the jitter it induced.

John Siau of Benchmark had originally written about how this was not a problem with his DACs, since he'd designed a buffer into his USB system that made the use of an asynchronous USB driver unnecessary.

Well, the term "asynchronous USB" became a mantra throughout the industry, and a year or so later Benchmark's DACs had it too. I'd imagine that was more for marketing purposes than anything else.

I use an Emotiva XMC-1 for my serious listening and viewing. It has two DSPs. The first sorts all of the incoming signals: they are reclocked at the original rate with (hopefully) all of the incoming jitter eliminated, and format/listening-mode decoding is done. Between this DSP and the next is a buffer.

With all the bits in place, the signal then moves on to the second DSP, which handles the outputs and their processing. Only after this does the digital data hit the DACs for output.
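The buffer-between-clock-domains idea is simple enough to sketch (a toy model, not Emotiva's actual firmware): samples arrive in the jittery incoming clock domain, queue in a FIFO, and are read out on a clean local clock, so output timing no longer depends on input timing.

```python
from collections import deque

class ReclockingBuffer:
    """Toy FIFO between a jittery input clock and a clean output clock."""

    def __init__(self, depth=512):
        self.fifo = deque(maxlen=depth)   # depth trades latency for jitter margin

    def write(self, sample):
        """Called in the (jittery) incoming-interface clock domain."""
        self.fifo.append(sample)

    def read(self):
        """Called on the clean local DAC clock; output timing is set only by it."""
        return self.fifo.popleft() if self.fifo else 0   # underrun -> silence
```

A real design must also keep the long-term read and write rates matched, by pulling the local clock or by asynchronous sample-rate conversion, or the buffer will eventually over- or underrun.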

I haven't seen any jitter spectra of it in reviews, but the thing sure does sound pretty fantastic.

There is more than one way to skin the proverbial cat. Proper engineering will always win out in the end.

eriks's picture

The less-than-$2,000 Mytek Brooklyn that Stereophile reviewed in 2017 may be a better benchmark for how much jitter rejection has improved than the $15,000 dCS Bartok, no?

https://www.stereophile.com/content/mytek-hifi-brooklyn-da-processor%C2%96headphone-amplifier-measurements

John Atkinson's picture
eriks wrote:
The less-than-$2,000 Mytek Brooklyn that Stereophile reviewed in 2017 may be a better benchmark for how much jitter rejection has improved than the $15,000 dCS Bartok, no?

The Mytek does have excellent jitter rejection on its serial data inputs. I'll see at what point it gives up with the sinusoidally jittered data stream. And to answer your question in another comment:

eriks wrote:
Here's a challenge for you, JA. Take your vintage DACs and compare the delta between how they play CDs and how they play high-resolution music.

Neither of these vintage DACs will accept data with a bit depth greater than 16 or a sample rate greater than 48kHz. :-(

John Atkinson
Technical Editor, Stereophile

eriks's picture

Then anything before 2000 will do. An old Theta Casanova?

eriks's picture

Here's a challenge for you, JA.

Take your vintage DACs and compare the delta between how they play CDs and how they play high-resolution music.

Then try the same with the Mytek Brooklyn or dCS Bartok.

I think you'll have interesting findings to talk to your readers about. Can we still justify high-resolution music at all?

JRT's picture

A device that worked well, but is not likely to be useful in most modern setups, was (is?) the Monarchy Audio DIP (Digital Interface Processor). The original DIP was clocked at 44.1kHz, and that one was renamed the DIP Classic after they came out with a 96kHz version.

Stereophile reviewed the 96kHz version. Not sure about the others.

https://www.stereophile.com/digitalprocessors/339/index.html

Bogolu Haranath's picture

Also see Stereophile's review of the AQ JitterBug USB noise filter, $50 :-) ..........

JL77's picture

These days, all but the very cheapest audio clocks should be operating in the sub-picosecond jitter range.

Bogolu Haranath's picture

One picosecond is one trillionth of a second :-) ..........

hollowman's picture

Well, jitter entered the audiophile vocabulary (as well as published lab-bench metrics) in the early 1990s, as JA noted.
What is less clear is how aware the original R&D scientists were of the issue, i.e., the Sony/Philips Red Book team back in the late 1970s.
Ditto with "linearity," which entered the audiophile vocabulary a few years earlier.
I have gone through a few early Philips papers and found nothing.

Bogolu Haranath's picture

The same people who invented Jitter also invented Twitter :-) .......

MCK22's picture

I am curious. I have been reading about jitter in DACs for years, but I've never seen any reference to scientific evidence of how much jitter is needed in the signal to be perceptible. Human ears are wonderful things, but they didn't evolve to be jitter detectors, and like all biological systems of perception they have limits of perceptibility. Does anyone have any citations to peer-reviewed journal articles? Different DACs can sound different for reasons other than jitter, and it would be interesting to see how much of the R&D budget should be allocated to jitter reduction as opposed to, for instance, the analog output stage of the DAC.

skris88's picture

This is because jitter is irrelevant!

Jitter exists in ALL digital transmission. And there are lots of error-correction algorithms etc. in the design of digital transmission. The DAC receives only perfect data, identical to what the ADC originally created.

Claiming that jitter in digital audio is an issue is like saying that there are errors in the monies sent between banks in digital transactions!

Yes, there are errors in transmission, but this is exactly why we use digital data and not analogue data. There are checksums and error-correction bits and re-transmissions of blocks of data going on all the time.
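(For what it's worth, the nearest thing in consumer digital audio: each S/PDIF subframe carries one even-parity bit, which can detect, though not correct, a single-bit error; there is no retransmission in S/PDIF. A minimal sketch of that kind of check:)

```python
def parity_ok(bits):
    """Even parity: the count of 1s across data bits + parity bit must be even."""
    return sum(bits) % 2 == 0

print(parity_ok([1, 0, 1, 1, 0, 1]))   # True: even number of 1s
print(parity_ok([1, 0, 1, 1, 0, 0]))   # False: a single flipped bit is detected
```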

In the old days of the first CD players, perhaps the processors used were not fast enough to manage re-transmission of corrupted data into the DAC. When such errors happen, the DAC algorithm determines that there is no more time to lose and so drops or "assumes" the data it needs (e.g., repeats the previous data for that block of data).

As such, Red Book audio can be considered lossy, like MP3s! (And worse: it changes on each playback, unlike MP3s, where data is dropped/"lost" only during the compression process.)

But these days, with even the cheapest of CPUs thousands of times faster than the highest audio data rates being transmitted, digital jitter is a non-issue.

Otherwise, I'd suggest you take all your monies out of your digital bank account NOW (and check for the missing pennies! :-))

navr's picture

The concept of jitter misleads people into thinking that all you need in a digital signal is the correct bits (which is relatively trivial to transmit via TCP/IP's three-way-handshake protocol) with great timing (low jitter), and so all you need is a great clock. This simplistic view is highly misleading. At least three things matter: the clock, noise, and bandwidth.

In the image of a perfect square wave, the horizontal axis is time and the vertical axis is voltage. We will assume the clock is perfect, i.e. the vertical signal lines occur at perfectly spaced intervals (the bit rate). When the signal is representing a binary 0, it is at 0V. When the signal is representing a binary 1, it is at 1V. And we will assume that the receiver of this signal decides that the transition between a 0 and a 1 has occurred when the signal rises through the 0.5V level, and that a 1 has transitioned to a 0 when the signal falls through the 0.5V level.

Now imagine that there is noise added to the signal. If the frequency of the noise is below the bit rate, then this perfect square wave swims on top of a longer and smoother wave. The interesting point is that the timing between the data transitions (where those vertical lines pass through 0.5V) is unchanged. So no problem, yet. If the frequency of the noise is above the bit rate, then the horizontal lines get fuzzy. And if we combine the low-frequency noise with the high-frequency noise, the effect is combined. Again, the interesting point to note is that the timing between the data transitions (where those vertical lines pass through 0.5V) is unchanged, provided the noise is not extremely high. So, again, no problem. Noise on its own (as long as the deviations caused are materially below 0.5V) is not a problem. The reason it is not a problem is those vertical lines, because noise does not change the space between them.
Now imagine there is no noise. Zero noise is impossible, but something else that is impossible is the vertical line on the square wave, since it requires infinite bandwidth. The vertical lines imply the signal can achieve 0V and 1V in more or less the same instant. Whatever tools we have to transmit a signal, the demands of high-bit-rate signals are way beyond what the available tools can deliver. Think about how your analog cables can mess with sound up to around 20kHz, and then think about the enormously wider frequency range required of a digital cable (and optical cables just have a different set of problems, mainly related to reflections). The higher the bit rate, the harder it gets.

When we allow for constrained bandwidth, instead of transitions being instantaneous, the signal goes up a slope when transitioning from 0V to 1V, and down a slope when transitioning from 1V to 0V. If the bandwidth were the same as the bit rate, the signal would be a sine wave. To reasonably square out the signal you need to keep several harmonics of the bit rate (say 7 or more), and that is a lot of bandwidth, even more for higher-bit-rate signals. As harmonics are added, the sine wave begins to square out, as the sketch below shows. Interestingly, in both of these constrained-bandwidth examples, the transitions through 0.5V are still perfectly spaced, even with the sine wave. So still no problem.
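That "several harmonics" claim is easy to check numerically (an illustration, using a unit-period square wave): the more odd harmonics you keep, the steeper the reconstructed edge.

```python
import numpy as np

t = np.linspace(-0.25, 0.25, 100_000)      # window around one edge (unit period)
for n in (1, 3, 7, 15):                    # number of odd harmonics kept
    k = np.arange(1, 2 * n, 2)             # harmonic numbers 1, 3, 5, ...
    wave = (4 / np.pi) * (np.sin(2 * np.pi * np.outer(k, t)) / k[:, None]).sum(axis=0)
    slope = np.gradient(wave, t).max()     # steepness at the transition
    print(f"{n:2d} harmonics: max edge slope ~ {slope:6.1f} (amplitude/period)")
```

The maximum slope grows roughly in proportion to the number of harmonics kept, i.e. to the bandwidth available.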
But as I mentioned, a higher-bit-rate signal (if you think high-bit-rate files must always sound better) requires even more bandwidth to square out the wave, and so in a system that has a finite limit on bandwidth, a lower-bit-rate signal will be more accurately represented than a high-bit-rate one. On top of that, if you ask anything in a music server to work faster, it will work with less precision, and this is a key trade-off to be aware of when you assume higher bit rates must be better just because the numbers are bigger.

These examples only allow us to conclude that there is no problem if we can achieve zero noise or infinite bandwidth. But each of those goals is unattainable, and the problem becomes apparent when there is both noise and constrained bandwidth. So what happens if we add a low-frequency noise component to a frequency-constrained digital audio signal? All of a sudden, the 0.5V points are shifted right or left by the addition of the low-frequency noise that lifts or drops the signal between bits: shifting the slopes up or down shifts the 0.5V points left or right. The greater the amplitude of the noise, and the greater the bandwidth constraint, the greater the effect on timing (jitter).

Now if we add high-frequency noise to a frequency-constrained signal, you can see that the transition timing at precisely 0.5V is now hard to discern for any digital receiver. If the signal is vertical at the transition, then noise does not affect it; but as soon as the transition is not vertical, noise changes the transition point. It is the combination of constrained bandwidth and noise that inevitably creates jitter (variation in data-transition timing), regardless of how great the clock is.
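navr's mechanism is straightforward to demonstrate numerically. Below is a minimal simulation (the tanh edge shape, the per-edge offset noise model, and all the numbers are illustrative assumptions, not navr's): it measures where a simulated 0-to-1 edge crosses the 0.5V threshold for a fast edge with noise, a slow edge without noise, and a slow edge with noise.

```python
# Noise + finite bandwidth => timing jitter at the 0.5V decision threshold,
# even with a perfect clock. Noise is modeled as a random per-edge voltage
# offset, i.e. noise well below the bit rate.
import numpy as np

FS = 10_000_000_000                        # 10 GHz simulation grid (0.1ns steps)
t = np.arange(-1000, 1000) / FS            # 200ns window around one 0->1 edge
rng = np.random.default_rng(0)

def edge(rise_time):
    """Band-limited 0V -> 1V transition centered at t = 0."""
    return 0.5 * (1.0 + np.tanh(2.2 * t / rise_time))

def crossing(v, threshold=0.5):
    """Interpolated time at which v first rises through the threshold."""
    i = int(np.argmax(v >= threshold))
    frac = (threshold - v[i - 1]) / (v[i] - v[i - 1])
    return t[i - 1] + frac * (t[i] - t[i - 1])

# (noise sigma in volts, edge rise time in ns): noise alone, slope alone, both
for noise_sd, rise_ns in [(0.05, 1.0), (0.00, 30.0), (0.05, 30.0)]:
    times = [crossing(edge(rise_ns * 1e-9) + noise_sd * rng.standard_normal())
             for _ in range(1000)]
    print(f"noise {noise_sd:.2f}V, rise {rise_ns:4.1f}ns: "
          f"jitter = {np.std(times) * 1e12:6.1f} ps rms")
```

With these made-up numbers, noise on a near-vertical edge produces tens of picoseconds of jitter, a bandwidth-limited edge with no noise produces none, and the same noise on the slow edge produces more than a nanosecond: exactly the combination navr describes.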

hollowman's picture

I can say with certainty that, of the digital-audio gear I've personally owned, subjective quality is not correlated with jitter metrics.

For example, Stereophile reviewed/measured both the Musical Fidelity A324 (2002) and the Asus Xonar PCI sound card (2010); both measured very well in jitter and other metrics (unsurprising, given their use of sigma-delta DACs). I own both of these units. I also own a few classic (but heavily modified) Philips/Magnavox CD players with their classic SAA/TDA chipsets.
Stereophile has measured some Philips-chipset units (Naim, Marantz, Arcam), but IIRC (the reviews are not online yet) the jitter wasn't great.

HINT HINT to JA to re-measure some classic Philips units, when he has time.

In any case, my modded Philips units sound considerably better than the newer MF and Asus devices mentioned above.

====EDIT===
The Naim CD2 CD player (with the SAA7220/TDA1541A) was reviewed in February 1997, with measurements by Robert Harley. (Not online, so more work for JA, if he wants it!!) The unit had very good jitter for that ancient chipset. Not sure how Harley's instruments/techniques from back then would compare against JA and his AP of today.

Forgot to note that I also own a Theta Chroma 396, reviewed/measured in Stereophile, August 1996. The jitter of that unit was only fair.
