I would like to comment on the frequency response of that speaker in the reviewer's room. How much of that horrible response was the room and how much the speaker? Is the reviewer's room really that bad? Is there a room-treatment or setup issue? Is it fair to vendors or to readers to have reviews done in such deplorable conditions? Isn't it incumbent upon the reviewer to get the room and setup right, at least to a point where the speakers have a shot and the positive and negative results can be attributed more to the speaker than to the environment?

Now, I suppose one can make relative judgments with everything else being equal, meaning every speaker has to play in the same environment. But come on. That in-room response, especially in the low end, is one of the worst I have seen in the magazine, and I have been a subscriber for 15 years or so. If the speaker weren't so expensive, would it have been slammed for that?

When cars are tested, we expect professional drivers, right? We expect the performance testing to use appropriate test tracks, don't we? Given your bias against A/B testing, would you expect a reviewer to be able to walk into two different rooms with the same equipment used in that review, one set up well and the other like the reviewer's, and instantly hear the difference? Isn't this a non-qualified review situation?

(Funny, that chart looks like most hotel rooms sound when hosting audio shows. I have been to a couple, and everyone says they understand the issue and tries to ignore it. In some respects that is understandable. Having said that, not trying to use some method, active or passive, to get those rooms as good as they can get is wrong as well.)