Arcam rBlink Bluetooth D/A processor Measurements
The rBlink is a small, surprisingly heavy, black-finished box made, like all of Arcam's current products, in China. On one end is the jack for the supplied wall-wart power supply, the Bluetooth antenna, and a pushbutton to pair the rBlink with a source; on the other end are two RCA jacks for analog output and a single jack for the S/PDIF digital output.
I tested the rBlink (serial no. EBL 04114) mostly with Stereophile's loan sample of the top-of-the-line Audio Precision SYS2722 system (see the January 2008 "As We See It" and www.ap.com); for some tests, I also used my vintage Audio Precision System One Dual Domain. I used as data sources my iPhone 3GS and iPad 2 (AAC at 256kbps) and MacBook Pro (aptX codec; all MacBooks with OS 10.6.5 or later have aptX), each with its volume control set to the maximum.
I have measured two other Bluetooth DACs: the Chord Chordette Gem and the Musical Fidelity M6DAC. Much of the rBlink's measured performance didn't differ significantly from those two products, being dominated by the Bluetooth codec in use. I ran a complete set of tests from the rBlink's analog outputs; I also repeated many of the tests performing digital-domain analysis on the S/PDIF datastream. In general, the two sets of test results closely aligned.
The S/PDIF output's source impedance was impressively close to the target of 75 ohms, at 74.5 ohms. This will help keep jitter low, provided a digital datalink with a true 75 ohm characteristic impedance is used.
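To put that 74.5 ohm figure in perspective, here is a quick back-of-envelope transmission-line calculation (my illustration, not anything Arcam publishes) of how little energy such a small mismatch reflects:

```python
import math

def reflection_coefficient(z_source: float, z_line: float) -> float:
    """Magnitude of the reflection coefficient at an impedance discontinuity."""
    return abs(z_source - z_line) / (z_source + z_line)

def return_loss_db(gamma: float) -> float:
    """Return loss in dB; larger means less energy reflected back up the cable."""
    return -20 * math.log10(gamma)

gamma = reflection_coefficient(74.5, 75.0)
print(f"|Gamma| = {gamma:.5f}")                      # about 0.0033
print(f"Return loss = {return_loss_db(gamma):.1f}dB")  # about 49.5dB
```

In other words, a 74.5 ohm source feeding a true 75 ohm cable reflects only a tiny fraction of the signal, which is why the measured figure bodes well for jitter.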
The rBlink operated at sample rates up to 48kHz. Its maximum analog output level at 1kHz was to specification, at 2.11V, and the output preserved absolute polarity (ie, was non-inverting). The output impedance was a fairly low 464 ohms at 20Hz and 1kHz, dropping slightly to 455 ohms at 20kHz. The frequency response with 44.1kHz data was flat from 20Hz to 20kHz, with the two channels matching to within 0.05dB (fig.1). The channel separation at 1kHz was an okay 73.5dB.
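The measured source level and output impedance interact with the preamplifier's input impedance as a simple voltage divider. A sketch of the expected level loss into various loads (illustrative only: the passive divider alone does not predict the actual clipping into 600 ohms described later):

```python
import math

def loaded_level(v_source: float, z_out: float, z_load: float) -> float:
    """Output level after the divider formed by source and load impedances."""
    return v_source * z_load / (z_out + z_load)

# 2.11V source, 464 ohm output impedance, into representative preamp loads
for z_load in (600, 5_000, 10_000, 100_000):
    v = loaded_level(2.11, 464, z_load)
    print(f"{z_load:>7} ohms: {v:.2f}V ({20 * math.log10(v / 2.11):+.2f}dB)")
```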
The blue and cyan traces in fig.2 show a wideband spectral analysis of the rBlink's analog output while it decoded aptX-encoded, 44.1kHz-sampled data representing white noise. The signal rolls off rapidly above 21kHz, and the series of low-level ultrasonic peaks very closely resembles the plot published in TI's datasheet for the PCM5102 DAC chip. The red and magenta traces in fig.2 show the rBlink's analog output spectrum with an aptX-encoded full-scale tone at 19.1kHz. The aliasing image of the tone at 25kHz is suppressed by 54dB, and the third harmonic of the tone can be seen at -66dB. The ultrasonic noise floor is higher than it is with white-noise data, the lower-frequency audioband noise floor lies at -70dB, and there is a rise in the noise floor to either side of the 19.1kHz tone.
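That 25kHz component is exactly where the arithmetic puts it: a DAC's ultrasonic images of a tone at frequency f lie at n*fs +/- f, and the reconstruction filter must suppress them. A minimal check:

```python
def dac_images(f: int, fs: int, n_max: int = 1):
    """Ultrasonic images of a reconstructed tone: n*fs - f and n*fs + f,
    which the DAC's reconstruction filter must remove."""
    imgs = []
    for n in range(1, n_max + 1):
        imgs += [n * fs - f, n * fs + f]
    return sorted(imgs)

print(dac_images(19_100, 44_100))  # [25000, 63200]
```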
One thing that has been puzzling me since this review was published in the magazine is that the first null in the reconstruction filter's stop-band lies at 24kHz rather than at 22.05kHz (indicated by the vertical green line in the graph below), which is where it should by rights occur with 44.1kHz data. I re-performed this measurement, making sure that the MacBook Pro's AudioMIDI utility was correctly set to transmit 44.1kHz data to the rBlink. The result was the same as before, however.
Sending, from my iPad to the rBlink, AAC-encoded data representing a dithered 16-bit tone at -90dBFS gave the spectrum shown in fig.3. It is significantly cleaner than the spectrum of the Musical Fidelity M6DAC's output decoding the same data (fig.8 at the link above), and the spike that represents the 1kHz tone peaks at exactly -90dBFS, implying excellent DAC linearity. Repeating the analysis with aptX-encoded data gave the spectrum in fig.4. The 1kHz tone is missing, and all the graph actually shows is the rBlink's analog noise floor. That the aptX codec becomes deaf to very low-level signals is graphically shown in fig.5, which plots the linearity error with aptX data against absolute level. When the low-level information is accompanied by high-level data, plotting the linearity error with aptX indicates that, below -80dBFS, the signal is transformed into random noise (fig.6).
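For readers curious what that test signal looks like, here is a sketch of how a dithered 16-bit tone at -90dBFS is typically generated (my illustration; the Audio Precision uses its own generator). At this level the sine's peak is only about one LSB, which is exactly why it exposes codec and DAC linearity:

```python
import math
import random

def dithered_tone(freq=1000.0, level_dbfs=-90.0, fs=44100, n=44100, bits=16):
    """Quantise a low-level sine to a 16-bit grid with TPDF dither -- a sketch
    of the standard linearity-test signal, not Arcam's or AP's exact generator."""
    amp = 10 ** (level_dbfs / 20)  # peak amplitude re full scale (~1 LSB at -90dBFS)
    lsb = 2 ** -(bits - 1)         # one LSB re full scale
    out = []
    for i in range(n):
        x = amp * math.sin(2 * math.pi * freq * i / fs)
        tpdf = (random.random() - 0.5 + random.random() - 0.5) * lsb  # triangular dither
        out.append(round((x + tpdf) / lsb) * lsb)
    return out

samples = dithered_tone(n=4410)
# The quantised signal toggles between just a few codes; the dither is what
# keeps the underlying tone recoverable at all.
print(max(abs(s) for s in samples) / 2 ** -15, "LSB peak")
```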
Turning to high-level signals, fig.7 shows the rBlink's analog output spectrum with an aptX-encoded 50Hz tone at 0dBFS into 100k ohms. The components of the noise floor lie at around -115dB, which implies resolution between 13 and 14 bits; and though the third harmonic is the highest in level, it is still at a very low -93dB (0.0025%). Dropping the load impedance to 600 ohms drove the rBlink into clipping, meaning that it should be used with preamplifiers having an input impedance of at least 5k ohms. Changing to an aptX-encoded tone at 1kHz gave the spectrum in fig.8. The rise in the noise floor is almost 20dB higher than with the low-frequency tone, the aptX codec's limited bit budget becoming more of a problem with the higher-frequency signal. Lowering the signal level by 20dB dropped the noise floor by the same 20dB (fig.9). But with data streamed from my iPad (fig.10), while the noise floor lies at the 16-bit level, there is a significant amount of spectral spreading of the spike that represents the 1kHz tone, as well as a "skirt" of spurious tones above the frequency of the tone. The spectral spreading suggests some uncertainty in the frequency of the decoded tone; when I listened to the tone, its level was not quite steady.
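The "between 13 and 14 bits" estimate follows from the textbook relation between an ideal quantizer's signal/noise ratio and its resolution, SNR = 6.02N + 1.76dB. A sketch (the 83dB overall S/N used below is an assumed, illustrative figure, consistent with spectral components near -115dB power-summed across an FFT's audioband bins):

```python
def bits_from_snr(snr_db: float) -> float:
    """Effective number of bits from the ideal-quantizer relation
    SNR = 6.02*N + 1.76 dB, solved for N."""
    return (snr_db - 1.76) / 6.02

print(round(bits_from_snr(98.0), 1))  # 16.0 -- the textbook 16-bit figure
print(round(bits_from_snr(83.0), 1))  # 13.5 -- an assumed overall S/N for illustration
```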
This is not necessarily jitter. In fact, with the aptX codec, sending 16-bit Miller-Dunn J-Test data to the rBlink gave a narrow spectral spike at 11.025kHz, but with a noise floor around the 10-bit level (fig.11).
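For reference, the J-Test stream itself is easy to describe: an undithered 16-bit tone at exactly one quarter the sample rate (11.025kHz at 44.1kHz), with the LSB toggled as a square wave at fs/192; the toggling is what provokes data-correlated jitter. A sketch (the tone level of 0.5 of full scale is my assumed, illustrative value):

```python
import math

def jtest_16bit(n: int, fs: int = 44100):
    """Sketch of a Miller/Dunn-style J-Test stream: a tone at exactly fs/4
    plus the LSB toggled as a square wave at fs/192. The tone level here
    (0.5 of full scale) is an assumption for illustration."""
    lsb = 2 ** -15
    out = []
    for i in range(n):
        tone = 0.5 * math.sin(2 * math.pi * (fs / 4) * i / fs)
        lsb_square = lsb if (i // 96) % 2 == 0 else -lsb  # period = 192 samples
        out.append(tone + lsb_square)
    return out

samples = jtest_16bit(192)  # one full period of the LSB square wave
```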
As I explained in my 2008 article on lossy codecs, it is most revealing to test the codec with a signal that simulates music. The spectrum of this signal, played back from CD and analyzed in the digital domain, is shown in fig.12. The signal comprises groups of tones spaced 500Hz apart, with clear gaps in the spectrum. Across the bottom of the graph, the background noise is uniformly spread out across the audioband. This noise results from the 16-bit Linear Pulse Code Modulation (LPCM) encoding used by the CD medium. Each frequency component of the noise lies around 132dB below peak level; if these are added mathematically, they give the familiar 96dB signal/noise ratio that you see in CD-player specifications.
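That "added mathematically" step is simply a power sum: many uncorrelated noise components, each around 132dB down, combine into one overall noise level. A sketch (the 4096-bin count is assumed for illustration; it is the bin count, not the per-bin level, that sets the gap between the two figures):

```python
import math

def total_noise_db(per_bin_db: float, n_bins: int) -> float:
    """Power-sum n_bins uncorrelated noise components, each at per_bin_db,
    into a single overall level in dB."""
    return per_bin_db + 10 * math.log10(n_bins)

# Components ~132dB below peak, summed over an assumed 4096 audioband bins:
print(round(total_noise_db(-132, 4096)))  # -96, i.e. the familiar 16-bit S/N
```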
Fig.13 shows what happened when I streamed this signal to the rBlink from my MacBook Pro using aptX. Peculiarly, the levels of the 500Hz-spaced tones have increased slightly. Below 5kHz, the noise floor drops to -100dBFS, implying 11-bit resolution, but is 30dB higher in the top octave, where the ear is less sensitive. With data streamed from my iPad (fig.14), again the levels of the individual tones are all a little higher than in fig.12, but the noise floor drops to the 16-bit level in the gaps between the tone clusters. This is so even in the top octaves, where preserving absolute resolution is not as important, given human hearing's lack of sensitivity in this region.
These measurements suggest that the rBlink's sound quality will very much depend on the codec used to stream audio data to it. The AAC codec appears to attempt to preserve resolution at the expense of noise-floor modulation and the introduction of enharmonic spuriae (though it is fair to point out that the latter might be masked by the music). By contrast, aptX throws away absolute resolution in favor of preserving a random noise floor, presumably because this will be less annoying with music. But the true test of a lossy codec is to listen to it.

John Atkinson