terrerowicca
Speaker evaluations

This is my first post so I hope that I am doing it correctly.

I realize that there is a lot of controversy regarding the evaluation of loudspeakers by Stereophile, and it is not my intent to add to that.

However, I would appreciate it if Stereophile would measure a quality loudspeaker in the listening room of each reviewer. It would help me compare loudspeakers if I knew how each reviewer's room (primarily), and also their reference equipment, affected the measurement of each loudspeaker. Secondly, I wonder if Stereophile has ever considered using a loudspeaker evaluation methodology published many years ago in Audio magazine: record selections of music through a quality microphone/tape-recorder system, then repeatedly play the tape through the loudspeaker and re-record it, in order to identify any aberrations in the loudspeaker's response.

John Atkinson
Re: Loudspeaker Measurements

terrerowicca wrote:
I realize that there is a lot of controversy regarding the evaluation of loudspeakers by Stereophile and it is not my intent to add to that.

I am not aware of any meaningful controversy. What gives you this idea?

terrerowicca wrote:
However, I would appreciate it if Stereophile would measure a quality loudspeaker in the listening room of each reviewer. It would help me in comparing loudspeakers if I knew how each reviewer's room, primarily, but also reference equipment affected the measurement of each loudspeaker.

I have regularly measured speakers in some of our reviewers' rooms: Wes Phillips (when he was active), Art Dudley, and Michael Fremer, for example. You can find these measurements in their speaker reviews.

terrerowicca wrote:
Secondly, I wonder if Stereophile ever considered using a loudspeaker evaluation methodology that was published many years ago in Audio magazine, which consisted of recording selections of music on a quality mic/tape-recorder system and repeatedly playing the tape and re-recording in order to identify any aberrations in the loudspeaker's response.

Simply doing this with a microphone, as Audio used to, does not produce very meaningful results, as the ear/brain behaves very differently from a microphone. However, Sean Olive at Harman has done some great work investigating the use of a dummy-head microphone to record binaurally the sound of a pair of speakers in a room. That looks like a promising technique.
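As an aside on why the generational technique from Audio works at all: each record/playback pass applies the loudspeaker/microphone chain's frequency response again, so magnitude deviations expressed in dB simply add with each generation, making subtle errors audible. A minimal sketch of that arithmetic (the function name and figures are illustrative, not from Audio's article):

```python
# Sketch: why repeated re-recording amplifies response errors.
# Assumption: each record/playback pass multiplies the signal by the
# chain's frequency response, so deviations in dB add per generation.

def cumulative_deviation_db(deviation_db: float, generations: int) -> float:
    """Total deviation at one frequency after N record/playback passes."""
    return deviation_db * generations

# A subtle +0.5 dB ripple grows to an obvious +5 dB after ten generations:
for n in (1, 5, 10):
    print(n, cumulative_deviation_db(0.5, n))
```

The same compounding applies to the microphone's and recorder's own errors, which is one reason the raw result conflates the whole chain with the loudspeaker under test.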

But personally, I believe that the manner in which our reviewers evaluate loudspeakers, by listening to them in a familiar system with familiar ancillary components, using recordings that reveal the positive and negative attributes of the speakers being tested, is fine. Certainly, I don't know of any magazine or website that does a better job at this than Stereophile. And none publishes a set of measurements as thorough and consistent as ours.

John Atkinson

Editor, Stereophile

Peter Duminy
Re: Speaker evaluations

I think we can all rest assured that John's measurements and comments are held in the highest regard by manufacturers and listening enthusiasts worldwide.

Managing Director | CEO
DLC Loudspeaker R&D Group

michael green
Interesting

I wish the OP had come back and gone a little deeper into this. I'm not sure he was doubting (I don't want to put words in mouths); I got more the sense that he wanted a feel for how things are done, a view into the reviewer's listening world. This hobby comes in layers and levels, and with those layers we as listeners often find ourselves going beyond what a particular reviewing standard can reach. For example, I'm involved in extreme listening, where a typical living room couldn't come close to the purity of the acoustics we would test speakers in. Therefore our results would differ from those of a review done in a basic room situation.

I think reviews should be done not only in a typical room setting but also in extreme settings, with the two blended together to give a balance. It has always seemed a little weird to me that an item with a $25,000-and-up price tag would be reviewed in a basic setting unable to reveal what the product is really doing. I agree that Stereophile and TAS and maybe a couple of others are at the top of the game when it comes to living rooms, but there is another level, and I feel that level is more in line with the high-end audio standards that have been set. After doing what I do for so many years, I couldn't imagine throwing a million-dollar system, or even a $75,000 one, into a typical room setting. It doesn't make sense to me. That listener is never going to hear what the system is or isn't doing.

I have had reviewers in my test rooms, and they sit there with their jaws on the floor: a high-end room built to play high end. To me, at the higher end of the scale, this makes more sense than a super-high-end system being reviewed in an average room. I think, and this is just my 5 cents' worth, that any product reviewed as "high end" should be evaluated in an average room, a test lab, and a variable high-end listening room.

Why variable? That's easy: so the product could be tested in the absolute best setting for that product. Then you would have the average, the specs, and the max performance to pull from.

Basically, you don't test a high end car on a dirt road. Well some do.

michael green
MGA/RoomTune

audiophile2000
Double Blind Reviews

Personally, I think it would be interesting to see some reviews done on a double-blind basis, with a few people listening. For better or worse, people have biases, and the truth is, it's hard to overcome them. While I'm sure none of this is intentional, it's very easy to inject one's own bias into something that is so qualitative in nature.

The interview clip below describes this far better than I ever could. Again, I'm not saying this is happening, but it would be interesting to see how some of the top products were ranked if we didn't know which brand they were. For many, I'm sure, this would just be a confirmation, but it would be interesting to see nonetheless.

http://www.youtube.com/watch?v=yiWHqf_K43o

ChrisS
Did you hear that?

Unless I'm the one being tested, with whatever component is under evaluation inserted in my stereo system, in my living room, playing the kind of music I like to listen to, the results of using someone else's ears with someone else's stereo system in someone else's living room playing someone else's music are rather meaningless.

The way Stereophile reviews stereo components is fine by me.

commsysman
Reviews

As far as I am concerned, the reviews Stereophile does on equipment of all kinds are the best in the business.

I have used their reviews as a guide to the gear I investigate for many years, and have seldom found any fault with them.

VALID double-blind tests are very difficult to set up and execute in a way that gives reliable results.

The few that CLAIM to prove something never seem to meet a proper standard for methodology or number of subjects, so to those who prattle on about DBT, I say they are pretty much full of it.

I get really sick of people who go on and on about DBT and obviously have no idea what the f they are talking about.

audiophile2000
Interesting Response

commsysman,

I'm not sure I get your post. Are you rejecting the premise of DBT, or its application to audio? If it's the latter, I agree there would be a lot of questions, especially in a review context; but if it's the former (outright rejection of DBT), can you suggest a better way to remove bias, given that DBT is employed in a number of major statistical trials and tests?

With the above said, I do think it would be interesting to see how the top-performing flagships stack up against each other, as well as how they compare to more reasonably priced products. That is, I'm not surprised a $200k loudspeaker gets a good review; if it didn't sound good, something would be very wrong. The more interesting question to me is how it compares to $100k speakers, $50k, $25k, sub-$10k/$5k speakers, i.e., what improvements you are getting. Honestly, I think the same is true of any speaker over $10k: I can't imagine such a product not performing well, and if it doesn't, that is a huge red flag in my book. The question I'm more interested in is how it compares to other, lower-priced speakers.

To me this is a better and more interesting question, as performance is all relative to what is possible. I think this is the point I was trying to make. The only reason I brought DBT into the post is that this comparison is hard to make if you know the brands and reputations, since personal bias is in everything we do.

With that said, I didn't say Stereophile was skewing the reviews or that their reviews were not helpful. Quite the contrary: after listening to a number of the products they have reviewed, I find they do a great job of communicating in text the sound they heard (which is by no means an easy feat). I just wish there were more comparisons and further classifications.

For instance, say you listened to speaker A but it was missing something: where would you go? Comparisons might shed light on this, as you could see which speaker, compared to speaker A, showed improvement in the area you were looking for. Further classifications would also be useful, since it is my humble opinion that speakers have different design goals and, to that end, different end markets. To put it another way: if someone is looking for a reference monitor, that speaker should be nothing more than a window; it should add nothing to the recording. In contrast, there are speakers that have their own sound and seek to color the sound in a way that is more pleasing, obviously a harder category to rank since there is no true reference, but separating speakers into these two categories would nonetheless, in my opinion, be helpful. These are two categories I have seen, but I'm sure there are more. It's no different from the way car magazines separate SUVs, sports cars, sedans, etc.; I think there are also clear lines in audio. Notice I didn't say which one is better, since that's not the point, but rather that there are differences.

michael green
Can never be

There can never be too many listening tests. Whether someone thinks they are good or bad tests, they're going to help someone. There are so many different types of products, and so many different types of hobbyists, that for someone who purchases equipment blindly, now more than ever, testing and different opinions of the sound are more useful than in the past.

I personally don't think any short-term testing has as much value as long-term, review-type testing. I also don't think spec testing does much, but I do think that different personality types respond to things they feel more comfortable with, based on how their brains are wired, and in this hobby you're going to find all different types of comfort levels, whether they make sense to others or not.

I have seen many lower-priced products smash higher-priced ones, so for me the playing field is completely level. If you guys saw what we, as testers of equipment, see, I think you would be more than a little surprised.

michael green
MGA/RoomTune
