I wanted to start a thread, separate from the current ART system thread, to make a point about testing and methodology.
Ted Denney took it on the chin by admitting that he made a gross error in his measurements, which means the data in his graphs is erroneous. OK, so he'll go back, correct his data collection procedures, and hopefully we'll see a new set of graphs shortly.
Everyone makes errors, even gross ones, and I wouldn't want to crucify him for that. What I do find objectionable is that Ted and Synergistic Research are scrambling AT THIS POINT to develop testing methodologies and measurement techniques, and to collect data proving that their products work.
They should have had the testing methodology developed and all the data readily available long ago, not to prove to others that their products work, but as tools for developing the products in the first place.
It seems obvious that the ART system was developed out of thin air: no thought was given to identifying the parameters and problems the system was meant to address, to developing a testing and measurement methodology to ensure the product would work CONSISTENTLY in different environments, or to identifying the modifications (if any) needed in larger, smaller, or unusual venues.
At best, this shows that Synergistic Research is run by amateurs whose products are hit or miss. At worst, it's run by nefarious professionals who knowingly produce substandard products to cheat the unsuspecting. Either way, the end result is the same: inconsistent products, no set standards for product design, no proper testing methodologies, and no data collection to ensure that products work as stated.
Folks, please understand that it is THIS that makes manufacturers look like charlatans, not the fact that the products look like Tibetan bowls. There is no reason why SR should be scrambling right now. None at all. All of this data collection should have been done eons ago, at the product development stage, and the testing methodologies and test data should have been completed during the QA phase. In fact, this tells me there was no QA. It was all done off the cuff.
It is time that high-end manufacturers and reviewers recognized and embraced well-established product development principles, testing methodologies, and data collection techniques. Otherwise, high-end manufacturers will keep looking like charlatans and audiophiles like gullible fools.