If anyone has heard better interconnects than Oritek X-2's, I'd be interested in hearing from them. I'd like an explanation of why they think so, also.
I am not familiar with the ones you mentioned, but I can make a few general interconnect comments.
I have made a lot of my own from many different kinds of cable, and tried a lot of commercially-made ones.
My experience is that in many instances there is very little change in sound quality between the cheapest and the most expensive or exotic, while in other instances the difference is extreme and clearly audible.
My experience is that the sound difference, when it is there, is very specific to the two pieces of equipment being connected.
I think this is due to variations in grounding configurations between the two pieces of equipment, resulting in the signal current in the shield being masked by a ground loop of sorts in the cases where the difference is pronounced...but after many calculations and experiments with sophisticated electronic analysis equipment, I still do not really know.
The most extreme case I have experienced was when I bought my Sony SCD777 and hooked it up to my Audio Research LS2-B preamp, using the cables Sony supplied, which seemed reasonably well-made. The unit sounded terrible, to a distressing degree. I had just listened to it in the store, and I had no reason to think my system was inferior, yet it sounded awful. I thought it was defective (a $3000 unit, we are talking about here...).
I decided to try some different cables, and it immediately became obvious that some of the cables I tried made it sound somewhat better, though not what I expected of it.
The guy who sold it to me suggested I try the Audioquest Viper cables, and this changed the picture totally; it sounded absolutely wonderful. I have since tried 4 or 5 other cables, and never found any that sounded as good.
WITH THIS PARTICULAR PAIR OF COMPONENTS, NO ONE COULD POSSIBLY NOT HEAR THE DIFFERENCE IN SOUND THAT OCCURRED!!
The Sony cable made it sound downright grungy, with virtually no bass and gritty, ugly, thin sound throughout. I am sure the cheapest CD player made would have sounded no worse; it was really that bad. With the Audioquest cable, it was Class A all the way; wonderful sound. With other cables, it was poor but not as bad as the Sony cables.
Why? I taught electronics for thirty years, and I can only give you an educated guess.
When people say that cables are cables and no sonic difference is possible, I want to demonstrate this case to them. They could not help but be convinced.
I design antennas and am familiar with transmission line theory, and yet every calculation and experiment I have tried gives me no satisfactory theoretical answer to this.
There is also no doubt about the changes in sound quality I have sometimes heard.
I have looked at many so-called technical papers on audio cables and I find only pseudo-science and outright B.S. I have never read anything that I find relevant to my actual listening experience, and it bugs the hell out of me!
I sure would be interested in some answers, but no engineer I have talked to has a clue either.
Welcome to the boards, Commsysman.
Like you, I've heard night-and-day differences in cable. With only a basic understanding of cable theory, I simply accept cables as low-pass filters. Some component combinations are obviously more responsive to this than others.
Actually, I wanted to expand on this just a bit. If you consider the output impedance of a component and the input impedance of a partnering component, it's not difficult to predict a very marginal match between the two. Tossing in cables that have either a very high or very low resistance character could certainly explain a substantial difference in sound quality.
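To put some rough numbers on the impedance-matching idea, here is a back-of-envelope sketch. The cable and the two impedances form a simple voltage divider; all the component values below are assumed purely for illustration, not measurements of any real gear.

```python
import math

# The interconnect sits between the source's output impedance and the
# preamp's input impedance, forming a resistive voltage divider.
def divider_loss_db(z_out_ohms, r_cable_ohms, z_in_ohms):
    """Level reaching the preamp input relative to the source, in dB."""
    ratio = z_in_ohms / (z_out_ohms + r_cable_ohms + z_in_ohms)
    return 20 * math.log10(ratio)

# Typical line-level case: low output impedance into a high input impedance.
print(divider_loss_db(100, 0.1, 47_000))    # a few hundredths of a dB
# Pathological case: high output impedance into a low input impedance.
print(divider_loss_db(2_000, 5.0, 10_000))  # well over a dB of loss
```

With typical line-level impedances the cable's series resistance is swamped, which is why a mismatch between the two components matters far more than the cable itself in this simple picture.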
The series inductance of a typical audio cable is too low to be significant at any frequency of interest, as is the parallel capacitance. I have used computers to model the electrical characteristics of the cable, the typical input impedance of the terminating preamp, and the source impedance of the CD player or whatever.
The models certainly show that the cable is never a low-pass or high-pass filter at any frequency below 100 kHz, and that only the most improbable extremes of input or output impedance could affect anything.
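The point about the cable not acting as a filter in the audio band can be sketched in a few lines. Treating the cable's parallel capacitance and the source impedance as a first-order RC low-pass, the corner frequency lands way above audio; the figures below (200-ohm source, about 100 pF per metre of cable) are assumed round numbers, not measurements.

```python
import math

def corner_freq_hz(source_ohms, cable_pf):
    """-3 dB corner of the first-order RC low-pass formed by the
    source impedance and the cable's total parallel capacitance."""
    return 1.0 / (2 * math.pi * source_ohms * cable_pf * 1e-12)

# Assumed values: 200-ohm source driving 2 m of cable at ~100 pF/m.
fc = corner_freq_hz(200, 200)
print(f"{fc / 1e6:.1f} MHz")  # roughly 4 MHz, far above the audio band
```

Even with pessimistic values for either number, the corner stays orders of magnitude above 20 kHz, which is consistent with the modelling result that the cable alone cannot act as an audible filter.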
I have spent hundreds of hours up those roads with no hints emerging.
In any case, the distortion levels I have heard in some cases do not suggest a problem in the frequency domain, but perhaps rather a masking of the signal by inaudible high-level signals (perhaps high-level oscillations due to a circuit resonance at an inaudible frequency..?).
The lack of results in testing could be explained by the fact that inserting the test equipment alters the loading enough to cause the parasitic oscillations to be damped out and disappear...but that is only supposition.
This has been driving me crazy for 15 years...yoicks.
You should consider an article about what you've learned, or maybe a talk at a hi-fi club or convention.
Or even a huge post about it here!
I like your theory, too.
Here is a link to a guy after my own heart. There is some seriously good reading on the topic and a host of measurements.
One of the things that I found particularly interesting is his belief that frequency modulation occurs in cables, which can have an effect similar to jitter in the digital domain.