Audio, Precision, & Measurement: Richard Cabot

And I'm also admittedly not as critical a listener as some people are. I have a friend here in town who does live location recording, and he can consistently hear things that I don't hear on a first pass. But the more I listen with him, the more I can later find those things he heard the first time through but I just didn't notice. I guess that's maybe a lack of training in hearing those differences. Most of the experimental psychoacoustic work I've done has related to sound localization, so I tend to be pretty good at picking out things related to imaging and localization.

Harley: How much research is going on in correlating measurements with human musical perception?

Cabot: The most promising work I've seen related to that has been done by Louis Fielder at Dolby Labs. There's also some work that's been done by people in Germany relating to low-bit-rate digital coding. They're trying to squash more and more audio into a lower and lower bit rate. It's clear that you're abusing the audio: you're butchering the signal something fierce when you chop down the data rate, and you're definitely losing information. But they take the approach that, to make the result still sound good, you have to assess what the ear is actually going to hear, and from that figure out how to measure the circuit and optimize its tradeoffs in terms of what information it throws away.

So when Louis Fielder at Dolby Labs is trying to build a Direct Broadcast Satellite [DBS] digital audio system and the guys in Germany are trying to build a digital audio system for direct broadcast of telephone communications or whatever, they know they're going to have to throw away large amounts of information in the audio signal to make it fit. They're going to have to do some very ugly things to the signal. You just can't say, "We're going to measure this so it looks OK on our equipment on the bench." You have to ask, "What can we throw away that the ear won't notice, and what do we have to keep?" They've approached it from the "What can you hear?" viewpoint because they have no choice.
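As a rough sketch of the "what won't the ear notice?" approach Cabot describes, the fragment below analyzes one block of audio into bands, derives a crude masking threshold from the strongest band, and spends bits only where the signal rises above it. The band count, the flat masking offset, and the bit budget are illustrative assumptions, not the actual Dolby or German coder designs.

```python
# A minimal sketch of perceptually driven bit allocation (illustrative only).
import numpy as np

FS = 48000          # sample rate, Hz (assumed)
N = 1024            # analysis block length (assumed)
N_BANDS = 32        # coarse analysis bands; real coders use critical bands

def band_levels_db(block):
    """Per-band RMS level in dB from an FFT of one windowed block."""
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    bands = np.array_split(spectrum, N_BANDS)
    return np.array([20 * np.log10(np.sqrt(np.mean(b**2)) + 1e-12) for b in bands])

def allocate_bits(levels_db, masking_offset_db=-20.0, total_bits=256):
    """Give bits to bands in proportion to how far they exceed a crude
    masking threshold; bands judged inaudible get none."""
    # A flat offset below the loudest band stands in for a real masking model.
    threshold = levels_db.max() + masking_offset_db
    audible_margin = np.clip(levels_db - threshold, 0, None)
    if audible_margin.sum() == 0:
        return np.zeros(N_BANDS, dtype=int)
    return np.round(total_bits * audible_margin / audible_margin.sum()).astype(int)

# Example: a 1kHz tone buried in low-level noise gets nearly the whole budget.
t = np.arange(N) / FS
block = np.sin(2 * np.pi * 1000 * t) + 0.001 * np.random.randn(N)
print(allocate_bits(band_levels_db(block)))
```

A real coder would replace the flat offset with a frequency-dependent masking model, but the tradeoff it expresses, throwing away what the ear will not notice, is the one described above.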

That kind of work will result in new approaches to measuring equipment that give us a handle on what to look for. The problem is that those kinds of things don't have a lot of funding. Specific cases like Louis Fielder's work and the German bit-rate compression schemes have funding because they're going after extremely large dollar markets. If you can save a few kilohertz of bandwidth on a satellite, you're talking about a lot of money saved in satellite costs.

They've got the money to pay for the research that they need to solve their specific problems. But there's not the same budget for people who want to assess the audio quality of a piece of equipment for generic home use. You market it on numbers and things the consumer already understands. The common denominator among most consumers is power output; if you're lucky, they can understand the concepts of distortion and frequency response. But the first thing they ask is how many watts it is. If you talk about anything more esoteric—the masking effects of the ear or the sidebands due to jitter of some signal or dropouts due to missing samples—you'll lose 99% of the people out there. You don't find companies like Sony or Matsushita funding research into better ways to look at that stuff. If they did, they'd have to spend hundreds of millions of dollars trying to teach people that it's important in the first place. They would much rather just make something on a mass scale.
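For readers wondering what "sidebands due to jitter" means concretely, the small demonstration below samples a pure tone at instants perturbed by a sinusoidal timing error; FM-like sidebands appear spaced at the jitter frequency around the tone. The tone frequency, jitter frequency, and 2 ns jitter amplitude are arbitrary values chosen for illustration.

```python
# Illustrative demo: sinusoidal sampling jitter produces sidebands at f_tone ± f_jitter.
import numpy as np

FS = 48000
N = 1 << 16
f_tone, f_jitter, jitter_amp = 10_000.0, 1_000.0, 2e-9   # 2 ns of sinusoidal jitter (assumed)

n = np.arange(N)
ideal_t = n / FS
jittered_t = ideal_t + jitter_amp * np.sin(2 * np.pi * f_jitter * ideal_t)
x = np.sin(2 * np.pi * f_tone * jittered_t)   # the tone, sampled at the wrong instants

spectrum_db = 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(N))) + 1e-12)
freqs = np.fft.rfftfreq(N, 1 / FS)
for f in (f_tone - f_jitter, f_tone, f_tone + f_jitter):
    k = np.argmin(np.abs(freqs - f))
    print(f"{f:7.0f} Hz: {spectrum_db[k]:6.1f} dB")   # sidebands flank the carrier
```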

Harley: They'd rather leave it to the American high-end manufacturers to address the more sophisticated segment of the market.

Cabot: They're interested in making money. That's all. They know that money is made with truckloads or boatloads of stuff. I'm not sure where you find the resources to investigate that stuff. It's just a labor of love when people try to find measurements to correlate with what they hear.

I did a literature survey for the 1990 AES conference in Washington, DC, on the perception of distortion, both linear and non-linear, in audio: what had been written about it and what experiments had been done. Surprisingly little has been done that tries to analyze what you can hear. The more interesting material in the literature, which has yet to be fully understood, concerns delayed resonances and their audibility, especially as it relates to digital audio.

There's a paper by Roger Lagadec of Studer, in a preprint for an AES Convention about five years ago, on the audibility of ringing in filters used in digital audio applications. They had devised a noise-reduction system that took the audio signal and chopped it up into 512 bands. The system measured the level in each of those 512 bands and put a compressor on each band, trying to filter out the noise and improve the sound quality of the signal in order to resurrect old, noisy recordings.
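A minimal sketch of the kind of multiband processing described here (not Studer's actual system) might look like the following: an STFT splits the signal into on the order of 512 bands, the level in each band is tracked, and a per-band downward expander attenuates bands that sit at or below an assumed noise floor. The band count, noise-floor level, and expansion ratio are placeholder assumptions.

```python
# Rough multiband noise-reduction sketch (illustrative assumptions throughout).
import numpy as np

def multiband_noise_reduction(x, n_bands=512, noise_floor_db=-60.0, ratio=2.0):
    frame_len = 2 * n_bands           # rfft of this length yields n_bands + 1 bins
    hop = n_bands                     # 50% overlap; Hann windows then sum to roughly unity
    window = np.hanning(frame_len)
    out = np.zeros(len(x))
    for start in range(0, len(x) - frame_len, hop):
        frame = x[start:start + frame_len] * window
        spec = np.fft.rfft(frame)
        level_db = 20 * np.log10(np.abs(spec) + 1e-12)
        # Downward expansion: bands below the assumed noise floor are pushed
        # further down; bands well above it pass unchanged.
        under = np.clip(noise_floor_db - level_db, 0, None)
        gain_db = -under * (ratio - 1.0)
        spec *= 10 ** (gain_db / 20)
        out[start:start + frame_len] += np.fft.irfft(spec)
    return out
```

A practical system would estimate the noise floor adaptively in each band rather than assuming a fixed value, which is where the per-band compressor behavior Cabot mentions comes in.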
