kpk021 wrote: Yes and no. Digital signals are sent over these cables in an analog form. Interference and signal degradation can still play a role. Yes, it's digital at both ends, but you can still run into problems if the signal gets degraded enough to cause issues with whether a given signal level is read as on or off (1 or 0). Otherwise, HDMI cable length limitations wouldn't vary so greatly depending on cable conductor (28 AWG vs. 24 AWG, etc.).
With that said, I haven't read up on any studies to know what the real-world differences in performance are (if any) between the various cable sources and quality levels. So for most purposes, the money saved on cheap cables may well be worth it. And obviously, running out to Best Buy to buy a $90 HDMI cable is never worthwhile.
Digital signals are sent over these cables in an analog form.
...will simply mislead more people. I'm assuming you're talking about how a digital signal is effectively encoded on an analog carrier? Well, all digital signals are sent this way; it isn't something unique to HDMI. The reason digital is less prone to interference is twofold:
1. Error-correcting protocols. If some bits are lost, they can be reconstructed up to a point (see the sketch after this list).
2. The quality of the output is not tied to signal strength in a linear fashion.
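Just to make point 1 concrete, here's a minimal Python sketch of the simplest possible error-correcting scheme: a 3x repetition code with majority-vote decoding. Real broadcast and data links use far stronger codes (Reed-Solomon and friends); this toy just shows how a corrupted bit can be reconstructed on the receiving end:

```python
# Minimal error-correction sketch: 3x repetition code with
# majority-vote decoding. Illustrative only; real links use
# much stronger codes.

def encode(bits):
    # Send each bit three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    # A majority vote over each group of three recovers the
    # original bit even if one copy was corrupted in transit.
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

original = [1, 0, 1, 1]
sent = encode(original)
sent[2] ^= 1                      # flip one bit to simulate interference
assert decode(sent) == original   # the corrupted bit is reconstructed
```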
Regarding point 2, consider a conventional TV with rabbit ears. As I move the rabbit ears, I'm changing their ability to pick up a given frequency. In one position I may have a strong picture, and in another I could have snow and ghosts. Because it's an analog signal, there are effectively infinite gradations of signal quality between no signal at all and the strongest possible signal, and between those two extremes the quality of the picture changes continuously along with the signal.
In the digital world it doesn't work like this. The signal is really a stream of 1s and 0s. Consider picking up a digital channel over the air today with a UHF antenna. As you move that antenna, the raw signal from the transmitter varies just as in the analog example above. However, as you move the antenna away from the ideal position, the picture and sound won't degrade at all until the signal is so poor that bits are dropped and can't be recovered by the error-correcting mechanism. At that point, your picture and sound will stutter, freeze, or drop out altogether. There will be no ghosts, snow, or other intermediate visual or audio degradation.
Let's say, for the sake of argument, that the following raw signal strength readings were taken at five antenna positions:
Position 1. 100
Position 2. 85
Position 3. 60
Position 4. 45
Position 5. 20
With an analog tuner, at position 2 you might start seeing snow in the picture. At position 3, the vertical sync starts to go and the screen rolls intermittently. At position 4, there's barely a discernible picture at all. At position 5, there's just snow.
Now consider digital. At position 2, you still get a perfect picture and sound. At position 3, there are enough bits, along with error correction, that you still have a perfect picture. At position 4, you start to notice occasional stuttering and freezes. At position 5, there's no signal at all.
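If it helps, here's a toy Python model of those five positions. The cutoff numbers are made up purely to illustrate the cliff effect, not real tuner specs:

```python
# Toy model of the five antenna positions above. Analog quality
# tracks signal strength continuously; digital stays perfect until
# error correction fails, then falls off a cliff. Thresholds are
# invented for illustration.

positions = {1: 100, 2: 85, 3: 60, 4: 45, 5: 20}

def analog_quality(strength):
    # Picture quality degrades in proportion to signal strength.
    return strength  # 100 = perfect, 0 = pure snow

def digital_quality(strength):
    # Hypothetical thresholds: above 50, error correction recovers
    # every bit; between 30 and 50, occasional stuttering and
    # freezes; below 30, no usable signal at all.
    if strength > 50:
        return 100  # bit-perfect picture
    elif strength > 30:
        return 60   # stuttering and freezes
    else:
        return 0    # no picture

for pos, strength in positions.items():
    print(f"Position {pos}: analog={analog_quality(strength)}, "
          f"digital={digital_quality(strength)}")
```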
Signals traveling through a wire act the same way. Many forces can attenuate a signal over a wire: mainly resistance/impedance (cable length, material, and gauge) and interference (shielding). Analog signal quality degrades continuously as attenuation increases, and the analog receiving circuitry cannot replace the missing information; it's gone for good. In the digital realm, all the receiving circuitry has to do is faithfully reproduce the bitstream. If the attenuation isn't enough to drop more bits than the receiving chips can recover, the signal carries on in a perfect state, bit for bit the same as the original.
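One last sketch to show why the receiver can reproduce the bitstream exactly even from a degraded voltage: it only has to decide whether each symbol is above or below a threshold. The voltage levels and noise amounts here are invented for illustration; HDMI's actual signaling (differential TMDS) is more involved than a single-ended threshold:

```python
# Why attenuation alone doesn't hurt a digital link: the receiver
# just thresholds each symbol back to a 1 or a 0. Values here are
# illustrative, not real HDMI electrical specs.
import random

bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Transmit: 1 -> 1.0 V, 0 -> 0.0 V, then apply attenuation and noise.
attenuation = 0.6
received = [b * attenuation + random.uniform(-0.2, 0.2) for b in bits]

# Receive: anything above the midpoint threshold is read as a 1.
threshold = attenuation / 2
recovered = [1 if v > threshold else 0 for v in received]

# As long as the noise never crosses the threshold, the output is
# bit-for-bit identical to the input despite the degraded voltages.
assert recovered == bits
```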
This is the beauty of digital, and it's why you can get away with cheaper cables when dealing with digital signals.
Sorry for going all Cliff Clavin on you guys.