Elliott Sound Products | Cables, Interconnects & Other Stuff - Part 3
All well designed interconnects will sound the same. This is a contentious claim, but is regrettably true - regrettable for those who have paid vast sums of money for theirs, at least. I will now explain this claim more fully.
The range (and the associated claims) of interconnects is enormous. We have cables available that are directional - the signal passes with less intrusion, impedance or modification in one direction versus the other. I find this curious, since an audio signal is AC, which means that electrons simply rush back and forth in sympathy with the applied signal. A directional device is a semiconductor, and will act as a rectifier, so if these claims are even a tiny bit correct, I certainly don't want any of them between my preamp and amp, because I don't want my audio rectified by a directional cable.
Oxygen free copper (or OFC) supposedly means that there is no oxygen and therefore no copper oxide (which is a rectifier) in the cable, forming a myriad of micro-diodes that affect sound quality. The use of OFC cable is therefore supposed to improve the sound.
Try as I might (and many others before me), I have never been able to measure any distortion in any wire or cable. Even a length of solder (an alloy of tin and lead) introduces no distortion, despite the resin flux in the centre (and I do realise that this has nothing to do with anything - I just thought I'd include it). How about fencing wire - no, no distortion there either. The concept of degradation caused by micro-diodes in metallic contacts has been bandied about for years, without a shred of evidence to support the claim that it actually happens, let alone that it is audible.
At most, a signal lead will have to carry a peak current of perhaps 200uA with a voltage of maybe 2V or so. With any lead, this current, combined with the lead's resistance, will never allow enough signal difference between conductors to allow the copper oxide rectifiers (assuming they exist at all) to conduct, so rectification cannot (and does not) happen.
What about frequency response? I have equipment that happily goes to several MHz, and at low power, no appreciable attenuation can be measured. Again, characteristic impedance has rated a mention, and just as with speaker cables it is utterly unimportant at audio frequencies. Preamps normally have a very low (typically about 100 Ohms) output impedance, and power amps will normally have an input impedance of 10k Ohms or more. Any cable is therefore mismatched, since it is not sensible (nor is it desirable) to match the impedance of the preamp, cable and power amp for audio frequencies.
Note: There is one application for interconnects where the sound can change radically. This is when connecting between a turntable (and its associated phono cartridge) and your preamp. Use of the lowest possible capacitance you can find is very important, because the inductance of the cartridge coupled with the capacitance of the cable can cause a resonant circuit within the audio band. Should you end up with just the right (or wrong) capacitance, you may find that an otherwise respected cartridge sounds dreadful, with grossly accentuated high frequency performance. The only way to minimise this is to ensure that the interconnects have very low capacitance, and they must be shielded to prevent hum and noise from being picked up.
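The cartridge/cable resonance can be estimated with the standard LC resonance formula, f = 1 / (2π√(LC)). The figures below are assumptions for illustration only (a moving-magnet cartridge of roughly 500mH is typical; neither value is taken from the text), but they show how easily the resonance lands inside or near the audio band:

```python
import math

def lc_resonance_hz(inductance_h, capacitance_f):
    """Resonant frequency (Hz) of the cartridge inductance and
    cable capacitance: f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Assumed values (not from the article): 500mH moving-magnet cartridge,
# compared with a low-capacitance and a high-capacitance interconnect.
low_c = lc_resonance_hz(0.5, 100e-12)   # short, low-capacitance lead
high_c = lc_resonance_hz(0.5, 400e-12)  # long or high-capacitance lead

print(f"100pF: {low_c / 1000:.1f} kHz")   # ~22.5 kHz - just above audibility
print(f"400pF: {high_c / 1000:.1f} kHz")  # ~11.3 kHz - well inside the audio band
```

Note how quadrupling the capacitance only halves the resonant frequency (it scales with the square root), yet that is enough to drag the peak from the edge of audibility down into the midst of the treble region.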
At radio frequencies, Litz wire is often used to eliminate the skin effect. This occurs because of the tendency for RF to try to escape from the wire, so it concentrates on the outside (or skin) of the wire. The effect actually occurs as soon as the frequency is above DC, but becomes noticeable only at higher frequencies. Litz wire will not affect your hi-fi, unless you can hear signals above 100kHz or so (assuming of course that you can find music with harmonics that go that high, and a recording medium that will deliver them to you). Even then, the difference will be minimal.
In areas where there is significant electromagnetic pollution (interference), the use of esoteric cables may have an effect, since they will (if carefully designed) provide excellent shielding at very high radio frequencies. This does not affect the audio per se, but prevents unwanted signals from getting into the inputs or outputs of amps and preamps.
Cable capacitance can have a dramatic effect on sound quality, more so if you have long interconnects. Generally speaking, most preamps will have no problem with small amounts of capacitance (less than 1nF is desirable and achievable). With high output impedance equipment (such as valve preamps), cable capacitance becomes more of an issue.
For example, 1nF of cable capacitance with a preamp with an output impedance of 1k will be -3dB at 160kHz, which should be acceptable to most. Should the preamp have an output impedance of 10k, the -3dB frequency is now only 16kHz - this is unacceptable.
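The figures above come straight from the first-order low-pass formula, f = 1 / (2πRC), where R is the preamp's output impedance and C is the total cable capacitance. A minimal sketch of the calculation:

```python
import math

def f3db_hz(source_impedance_ohms, cable_capacitance_f):
    """-3dB corner frequency (Hz) of the low-pass filter formed by the
    preamp output impedance and the cable capacitance: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * source_impedance_ohms * cable_capacitance_f)

print(f"1k, 1nF : {f3db_hz(1e3, 1e-9) / 1000:.0f} kHz")   # ~159 kHz (the article's ~160kHz)
print(f"10k, 1nF: {f3db_hz(10e3, 1e-9) / 1000:.1f} kHz")  # ~15.9 kHz (the article's ~16kHz)
```

Since R and C multiply, a tenfold increase in either output impedance or cable length (hence capacitance) lowers the corner frequency by the same factor.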
I tested a couple of cable samples, and (normalised to a 1 metre length) this is what I found ...
Parameter   | Single Core | Twin - One Lead | Twin - Both Leads | Twin - Between Leads
Capacitance | 77pF        | 191pF           | 377pF             | 92pF
Inductance  | 0.7uH       | 1.2uH           | 0.6uH             | Not Tested
Resistance  | 0.12 Ohm    | 0.38 Ohm        | 0.25 Ohm          | Not Tested
These cables are representative of medium quality general purpose shielded (co-axial) cables, of the type that you might use for making interconnects. The resistance and inductance may be considered negligible at audio frequencies, leaving capacitance as the dominant influence. The single core cable is obviously better in this respect, with only 77pF per metre. Even with a 10k output impedance, this will be 3dB down at 207kHz for a 1 metre length.
Even the highest inductance I measured (1.2uH) will introduce an additional 0.75 Ohm impedance at 100kHz - this may be completely ignored, as it is insignificant.
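Both of those figures follow directly from the measurements in the table, using the same -3dB formula as before for the capacitance, and the reactance formula XL = 2πfL for the inductance:

```python
import math

# Per-metre figures from the cable measurements above
single_core_c = 77e-12   # F - single core cable capacitance
worst_case_l = 1.2e-6    # H - highest inductance measured (Twin - One Lead)

# -3dB point of 77pF driven from a 10k output impedance
f3 = 1.0 / (2.0 * math.pi * 10e3 * single_core_c)
print(f"-3dB at {f3 / 1000:.0f} kHz")          # ~207 kHz

# Inductive reactance of 1.2uH at 100kHz
xl = 2.0 * math.pi * 100e3 * worst_case_l
print(f"Reactance: {xl:.2f} Ohms")             # ~0.75 Ohms
```

Against a typical 100 Ohm preamp output impedance and a 10k power amp input impedance, 0.75 Ohms of series reactance is plainly negligible, which is the point being made.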
The only other thing that is important is that the cables are properly terminated so they don't become noisy, and that the shield is of good quality and provides complete protection from external interfering signals. Terminations will normally be either soldered or crimped, and either is fine as long as it is well made. For the constructor, soldering is usually better, since proper crimping tools are expensive.
The use of silver wire is a complete waste, since the only benefit of silver is its lower resistance. Since this will make a few micro-ohms difference for a typical 1m length, the difference in signal amplitude is immeasurably small with typical pre and power amp impedances. On the down side, silver tarnishes easily (especially in areas where there is hydrogen sulphide pollution in the atmosphere), and the tarnish can become an insulator if thick enough. I have heard of some audiophiles who don't like the sound of silver wire, and others who claim that solid conductors sound better than stranded. Make of this what you will.
The use of gold plated connectors is common, and provides one significant benefit - gold does not tarnish readily, and the connections are less likely to become noisy. Gold is also a better conductor than the nickel plating normally used on 'standard' interconnects, although the difference is negligible in sonic terms. Not all 'gold' is actually gold - in some cases it may only refer to the colour!
There is no reason at all to pay exorbitant amounts of hard earned cash for the 'Audiophile' interconnects. These manufacturers are ripping people off, making outlandish claims as to how much better these cables will make your system sound - rubbish! Buy some good quality audio coaxial cable and connectors from your local electronics parts retailer, and make your own interconnects. Not only will you save a bundle, but they can be made to the exact length you want.
Using the cheap shielded figure-8 cable (which generally has terrible shields) is not recommended, because crosstalk is noticeably increased, especially at high frequencies. That notwithstanding, for a signal from an FM tuner even these cheapies will be fine (provided they manage to stay together - most of them fall to bits when used more than a few times), since the crosstalk in the tuner is already worse than the cable. With typical preamp and tuner combinations, you might get some interference using these cheap and nasty interconnects, but the frequency response exceeds anything that we can hear, and distortion is not measurable.
However, the above must be tempered somewhat by simple reality. It is possible (although rather unlikely) that some digital interconnects may have the wrong characteristic impedance, and under some conditions there might be some degree of degradation that increases the BER (bit error rate) beyond what can be corrected by common error correction schemes.
For example, HDMI expects a BER of no more than 10⁻⁹ (one error bit in 1 billion bits), and is a one-way protocol, so the receiver can't tell the transmitter that there was an error. However, you don't need to pay hundreds of dollars per metre, as perfectly good HDMI cables are available for sensible prices. By all means spend a little more if it makes you feel better, but shelling out for snake oil is just silly. If you get a good picture with a cheap HDMI cable, an expensive one will not make it better.
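To put that BER figure in perspective, a quick back-of-envelope calculation shows how often a worst-case link would actually produce a bit error. The 10.2Gb/s figure below is an assumption for illustration (the aggregate rate of an HDMI 1.4 high speed link), not a number from the text:

```python
# Illustration only: expected bit errors per second at the BER limit.
bit_rate = 10.2e9   # bits per second (assumed HDMI 1.4 aggregate rate)
ber = 1e-9          # one error per billion bits - the worst case allowed

errors_per_second = bit_rate * ber
print(f"{errors_per_second:.1f} bit errors per second")  # ~10.2
```

Even at the maximum permitted error rate, roughly ten errored bits per second out of ten billion transmitted is a vanishingly small fraction, and most of those land in pixel data where a single wrong bit is invisible.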
Bear in mind that the receiver reconstitutes the digital signal's wave shape, the data is usually buffered, and some form of error correction is often applied as well. As for claims that the difference is audible ... it's possible, but generally unlikely (in a blind test).
Crosstalk is all but eliminated by the use of good quality shielding, which will generally also reduce interference. Keeping lead lengths to the minimum needed will also help reduce any possible negative influences.
I know that this is heresy to some, but I really don't care. This is factual, and I can prove my claims, while the makers of these fancy cables can't.
I have seen home-made cables, braided from multiple strands of wire-wrap wire. The shielding on some of these can be mediocre (at best), so experiment, but don't expect miracles.