Wadia 2000 Decoding Computer

The Wadia 2000 Decoding Computer executes over 72 million instructions a second to achieve the required accuracy in the output signal. Resampling at 64x, at a sampling frequency of 44.1kHz, results in 2,822,400 calculated points every second (3,072,000/s for DAT). The Wadia 2000 solves a 12th-order polynomial 44,100 times a second.
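
As a quick arithmetic check of those figures (a trivial Python snippet):

```python
# Derived output points per second at a 64x resampling rate.
print(64 * 44_100)   # 2,822,400/s for CD's 44.1kHz rate
print(64 * 48_000)   # 3,072,000/s for DAT's 48kHz rate
```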

Please bear with me as I attempt to simplify some of the more basic concepts. For now, take my word for it: when Wadia calls their processor a Decoding Computer, they are not resorting to hype.

I owe many thanks to Donald Wadia Moses, the CEO of Wadia Digital, for working with me to keep matters simple. Besides providing me with the design details of the 2000, he also acted as my translator—and a very patient one at that—as I struggled to decode the technical jargon of digital into the more user-friendly realm of everyday language.

Not surprisingly, it was sampling that started our dialogue. If we are going to talk about resampling, we have to know what sampling is.

To begin with, let's forget about speed. Sampling is much easier to visualize if we disassociate it from a fast rate like 44,100 times a second, the CD standard. Think of sampling in the exact terms of what it says—taking a series of "glances"—snapshots, if you will—of a situation at regular intervals. Like a strobe light at a disco.

Pick a dancer, any dancer. The motion of the arm—yes, let's watch only her arm now—is broken up into a sequence of instantaneous positions each time the strobe exposes the flesh—oops, I mean exposes a flash—even though we know full well that the motion is continuous.

This process of converting what in nature is a continuous motion into a succession of still frames is known as "sampling." Provided that the sampling rate is more than twice the highest frequency of interest in the continuous phenomenon—the Nyquist criterion—no information is lost. Where digital processes do lose information is in the step that follows sampling: the physical quantity of interest is measured, then converted into digital code. Because there are only a finite number of values the measurement can take, dependent on the digital word length, this process, referred to as quantization, always loses information. We can only hope that the digital word length is long enough for our needs. Think of the dancer's arm again. If we measure the distance the arm rises above the floor, record that distance at each strobed moment, then convert that accurate analog measurement into a digital code, we have quantized those distances.
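
To make the word-length point concrete, here is a minimal sketch in Python; the arm heights, strobe rate, and bit depths are invented for illustration. We "strobe" a continuous motion at regular intervals, round each measurement to the nearest available code, and watch the error shrink as the word grows longer.

```python
import numpy as np

def quantize(values, n_bits, full_scale):
    """Round continuous measurements to the nearest of 2**n_bits code levels."""
    levels = 2 ** n_bits
    step = 2 * full_scale / levels           # spacing between adjacent codes
    codes = np.round(values / step)          # integer code words
    codes = np.clip(codes, -levels // 2, levels // 2 - 1)
    return codes * step                      # reconstructed (quantized) values

# "Strobe" a continuous arm motion at regular intervals, then quantize it.
t = np.arange(0, 1, 1 / 100)                           # 100 samples over one second
arm_height = 1.2 + 0.3 * np.sin(2 * np.pi * 2 * t)     # metres above the floor

coarse = quantize(arm_height, n_bits=4,  full_scale=2.0)   # short word
fine   = quantize(arm_height, n_bits=16, full_scale=2.0)   # CD-length word

print("worst-case error, 4 bits :", np.max(np.abs(arm_height - coarse)))
print("worst-case error, 16 bits:", np.max(np.abs(arm_height - fine)))
```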

Similarly, once the sound pressures of music are converted into electrical impulses by the microphone, the signal is electronically "strobed" at precise intervals. Each time the strobe hits, that instantaneous value is recorded and converted into digital code. Because music involves rates of change much higher than those produced by the Solid Gold Dancers, the rate of strobing in the case of CD takes place at 44,100 times a second. For music, the quantized signal values end up stored on tape in digital form, and are later transferred to CD.

Of course, once we play the CD we retrieve our digitized samples, but the name of the game is to get back to the original. That's where the digital processors take over.

Precisely because we are dealing with digital signals operating in a very rigorous manner, every step of the processor can be analyzed. Let's slow the process down and take a closer look by invoking another analogy.

Suppose we are working with data of a person's net worth spanning a period of 60 years. Since the IRS yearly exercises its privilege to examine a person's financial status, we have a sampling rate—once a year, a net worth statement is produced. We then take these values and plot them. The vertical axis denotes net worth in dollars, the horizontal axis time in years. Quantization can be likened to equating the wealth of an individual in terms of such currency as discrete dollars, even though the possessions consist of real estate, gold, cash, etc. Should the dollar values be rendered even more approximate by rounding off to the nearest $1000 for the sake of the plot, it is obvious that we have created an even larger quantization error.
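
A rough numerical illustration of that rounding (the net-worth figure is invented): the coarser the step we round to, the larger the quantization error.

```python
net_worth = 123_456.78                          # the "true" analog value, in dollars
to_dollars   = round(net_worth)                 # quantize to whole dollars
to_thousands = round(net_worth / 1000) * 1000   # round to the nearest $1000 for the plot
print(net_worth - to_dollars)                   # error of about -$0.22
print(net_worth - to_thousands)                 # error of about  $456.78
```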

In terms of this analogy, here's what Wadia does when it resamples: They take the net worth values of 12 consecutive years at a time and interconnect these plotted points with a smooth curve. The shape of the curve is meticulously calculated to come up with the best-fitting contour of the 12 sample points by invoking a proprietary algorithm. In deriving the curve, not only does Wadia look at the values of each of the points, but they also predict the direction that the curve will take—ie, the slope at each point—in order to approximate the conditions of the original events.

What results is a curve, derived to be sure, of a person's net worth history for the last 12 years. But once we have this finished curve, it is a simple matter to divide each year into 12 segments and interpolate what the net worth was at each month, or, for a finer look, repeat that procedure for each week. Well, what we have just done is to resample the annual data at 12x and 52x rates, respectively.
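
Here is a small sketch of that procedure in Python. The annual figures are invented, and an off-the-shelf cubic spline stands in for Wadia's proprietary curve-fitting; it passes smoothly through the points with a continuous slope at each one, which is the spirit of the description above, though the actual Frenchcurve algorithm surely differs in detail.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Twelve annual net worth "samples" (arbitrary illustrative figures, in dollars).
years = np.arange(1978, 1990)                    # 12 consecutive years
net_worth = np.array([52, 55, 61, 58, 64, 70,
                      69, 75, 82, 80, 88, 95]) * 1000.0

# Fit a smooth curve through the 12 points, then read values off the curve
# at finer intervals: months (~12x) or weeks (~52x).
curve = CubicSpline(years, net_worth)
monthly = curve(np.linspace(years[0], years[-1], 11 * 12 + 1))
weekly  = curve(np.linspace(years[0], years[-1], 11 * 52 + 1))

print(monthly[:6])   # interpolated net worth for the first few months
```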

The Wadia 2000 is designed to resample at a 64x rate and is able to fill in the gaps to a very fine degree. It is important to remember that each of the 63 resampled values, or the weekly values in the above example, is derived. They are based on the actual sampled values, but are calculated only for the middle portion of the 12-year span used in the calculation.

As the next sample is added to the 12-sample grouping for the calculation, the first sample is discarded. The whole calculation is repeated using carefully selected instructions, and a new "best fit" contour is produced.
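
In code, the sliding window might look something like the sketch below. The 12-sample window and the decision to interpolate around its centre follow the description in the text; the cubic spline itself is again only a stand-in for Wadia's algorithm.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def resample_64x(samples, window=12, oversample=64):
    """Slide a 12-sample window along the data; for each step, fit a smooth
    curve to the window and emit 64 derived values spanning one original
    sample interval near the window's centre."""
    out = []
    x = np.arange(window)
    for start in range(len(samples) - window + 1):
        curve = CubicSpline(x, samples[start:start + window])
        centre = window // 2
        fine_x = centre + np.arange(oversample) / oversample
        out.extend(curve(fine_x))
    return np.array(out)

# 44.1kHz samples of a 1kHz tone in; roughly 64 times as many derived points out.
audio = np.sin(2 * np.pi * 1000 * np.arange(100) / 44100)
print(resample_64x(audio).shape)
```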

The algorithm is what determines just how successfully the original will be replicated. And who determines the algorithm? The designer. That's why the chosen algorithm is of paramount importance.

The "best fit" algorithm is the Frenchcurve trademark used by Wadia. Some of you may be familiar with a drafting aid of the same name used to draw smoothly fitting curves. The algorithm also contains instructions for calculating the direction, or slope, at each sample point. That's referred to as the "CSpline" portion of the algorithm.

Wadia is in the process of patenting their algorithm (footnote 4). They feel the mathematical manipulations are unique and therefore deserve patent protection.

But why does the algorithm have to bother to calculate 63 other intermediate points for each original sample? That has to do with filter byproducts. We can well imagine that if we have to connect a series of dots describing a waveform where the dots are spaced far apart, it is a more difficult task than when the dots are closer together. Small gaps signify a higher frequency at which the dots appear. And the higher the frequency, the easier it is to filter with simple techniques, without affecting the original spectrum of the musical content.
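
A back-of-the-envelope check of that spacing argument, taking 20kHz as the top of the audio band: at the raw CD rate the nearest unwanted image of the music sits just above 24kHz, only a fraction of an octave away, while after 64x resampling it sits beyond 2.8MHz, more than seven octaves clear of the audio.

```python
audio_band = 20_000          # top of the audible range, Hz
fs_cd  = 44_100              # raw CD sample rate
fs_64x = 64 * fs_cd          # 2,822,400Hz after resampling

print(fs_cd  - audio_band)   # nearest image without resampling: 24,100Hz
print(fs_64x - audio_band)   # nearest image after 64x resampling: 2,802,400Hz
```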

Wadia claims success for this approach because the resampling operations take place in the time domain. The sharp-cutoff, "brick-wall" analog filters used in earlier CD players were optimized for frequency-domain performance and introduced severe phase-shift and ringing anomalies in the audio band. The conventional FIR (finite impulse response) digital filters used in other contemporary players and processors, too, have been optimized for frequency-domain performance; indeed, they must ring in the time domain in order to work at all (footnote 5). The technique of resampling at a 64x rate coupled with the digital signal processing used by Wadia and Krell eliminates these harsh filtering considerations. Though the Wadia's "Frenchcurve" processing has a slow rolloff rate, the resampling byproducts are centered on 2.82MHz, a frequency far enough from the audio range that very simple analog filters can eliminate them from the audio outputs.
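
To see why so distant a byproduct is easy to remove, consider a single-pole RC low-pass filter with a hypothetical 100kHz corner (an assumption for illustration, not Wadia's actual output circuit): it barely touches 20kHz yet attenuates anything near 2.8MHz by roughly 29dB, and each additional pole buys about as much again.

```python
import math

def rc_attenuation_db(f_hz, cutoff_hz):
    """Magnitude response of a single-pole RC low-pass filter, in dB."""
    ratio = f_hz / cutoff_hz
    return 20 * math.log10(1 / math.sqrt(1 + ratio ** 2))

cutoff = 100_000   # hypothetical gentle analog filter
print(rc_attenuation_db(20_000, cutoff))      # about -0.2dB in the audio band
print(rc_attenuation_db(2_822_400, cutoff))   # about -29dB at the resampling byproducts
```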



Footnote 4: Moses received the IEEE Canadian Life Member Award for his paper "Improved Signal Processing for Compact Disc Audio System," presented in November 1987 at the MONTECH '87 conference in Montreal.

Footnote 5: The Wadia handbook provides information for those who would like to experiment with programming the 2000's DSP chips to implement a conventional 192-tap FIR low-pass filter. The program is held in eight 16k EPROMs.

COMPANY INFO
Wadia Digital Corp.
1556 Woodland Drive
Saline, MI 48176
(734) 786-9611