HDMI Cable comparisons


I'm retiring my XBR CRT and installing an Elite plasma very soon. I've gotten mixed suggestions regarding HDMI quality and brand. I'm interested in hearing comparisons of cables you may have upgraded and the differences. Are there any HDMI cable reviews?

It seems like there is a lot going on in such a small package.
vicdamone

Showing 9 responses by jkalman

Don't waste money on an expensive HDMI cable...

My Bachelor's degree is in Computer Science with a specialization in networking, which included studying electrical signals and their propagation along electrical lines. Passing signals across HDMI is a form of handshake networking.

I can assure you different cables will not change the video or audio quality of a digital signal, especially since the signal is buffered into memory along the way. When a signal is buffered, it is stored temporarily in cache memory as a charge and then re-propagated, either to be transmitted somewhere else or to be processed into analog. To sum up, the cable just carries the digital bitstream temporarily, and the signal is reformed at the other end. So even if something were happening to the signal in between, it would be completely taken out of the bitstream in the rebuffering process...

Digital signal passing is also an all-or-nothing proposition. If the cable were malfunctioning past the point of error correction, the signal would stop working completely.
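To put a number on both points, here is a minimal Python sketch (my own illustration of receiver thresholding, not HDMI's actual TMDS coding, and the attenuation/noise figures are made-up): under heavy attenuation the bits are still distinguishable and the recovered stream is bit-for-bit identical, but past the cliff the link fails outright instead of "gradually sounding worse."

```python
import random

random.seed(0)

def transmit(bits, attenuation, noise):
    """Model a lossy cable: scale the driver voltage down and add noise."""
    return [(1.0 if b else 0.0) * attenuation + random.gauss(0, noise)
            for b in bits]

def receive(voltages):
    """The receiver re-quantizes: anything above the midpoint is a 1."""
    threshold = (max(voltages) + min(voltages)) / 2
    return [1 if v > threshold else 0 for v in voltages]

bits = [random.randint(0, 1) for _ in range(10_000)]

# Heavily attenuated but still distinguishable: the bitstream is regenerated
# perfectly, so a "better" cable cannot improve on it.
good = receive(transmit(bits, attenuation=0.2, noise=0.02))
print(sum(b != g for b, g in zip(bits, good)))  # expect 0 errors

# Past the cliff (noise swamps the levels): the link fails outright.
bad = receive(transmit(bits, attenuation=0.01, noise=0.02))
print(sum(b != g for b, g in zip(bits, bad)))   # expect thousands of errors
```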

People who think they are hearing and seeing differences with different types of digital signal cables are suffering from the placebo effect. Don't believe the bologna...
"Where exactly is this cache residing? Not in the cable. So if it exists at all, it would be beyond the input receiver and would have nothing to do with the cable."

It has everything to do with the cable. As long as the signal gets to the receiver where the cache resides with the bits "distinguishable," even if highly attenuated, the signal is regenerated. In other words, you don't need to spend 100s of dollars to get the signal to the receiving end when an $11 to $20 cable will give the exact same result, as long as it is in line with standard specifications. You don't have to spend a lot of money to buy an in-spec cable.

"If shielding is poor (or the cable is too long) and the signal has degraded enough that the input receiver is unable to recognize it, you are hosed."

And why do you think this is contrary to any of the points I am making? Read what I wrote...

There are plenty of cheap cables that are built to proper specifications. Spending 100s and 1000s won't get you any more in-spec than the standard demands.

"I have no clue whether or not HDMI supports any error detection or correction. Interference can be transient, so that dropouts are the result. If it happened frequently you might notice it."

Whatever the case may be, you still don't have to spend excessive amounts of money to get a well built, in-spec cable. People that think so are either delusional or trying to sell you something...
BTW Bob, I would consider those Blue Jeans cables to be in line with appropriate pricing and in-spec cabling...
Bob,

Some people argue that you are getting a better-maintained signal when you use a better cable. Some people also argue that bits aren't bits and that signal degradation is going to affect the sound even if the bits are still recognizable as ones and zeros. My point is, it doesn't matter even if you believe those arguments. If the cable follows HDMI standards, which it doesn't have to be expensive to do, then it will get the signal there unless the cable is broken. The signal is regenerated by the processor chip cache and/or memory buffers, so the attenuation issue some people consider a problem is a non-issue as long as the cable follows the standards (sizing/gauge/etc.).

If the bits aren't getting to the receiver, then the cable is broken or is not adhering to standards... Paying more for a cable isn't going to prevent the possibility of getting a malfunctioning cable. If you have a broken cable, return it.

Perhaps there isn't an implicit error correction for HDMI itself. I do know that the signal isn't processed when the cable is malfunctioning with certain media, but with other media some kind of error detection/correction must be occurring. I've had one cable malfunction and then break on me due to mishandling - I tried to snake it through the wall one too many times. Ironically enough, it was a relatively expensive cable (over $60). Signal passing worked on that cable with some material and not with other material.

I know with CDs there are multiple error checks when the disc is read. If you are sending data via bitstream to a preprocessor to have it decode certain compression algorithms, I believe the decoding process error-checks the stream as well when converting it to PCM. Am I 100% certain of which algorithms perform error correction or not? Not off the top of my head. There are kinds of error correction that don't require a resending of information (redundancy checks); they are usually built into the stream in the form of some kind of checksum, or the stream itself is framed a certain way to provide a checksum. These checks take place via decoding software.

CD-ROMs have multiple redundancy checks, and so do Internet protocols (at more than one layer of the TCP/IP stack, as well as between larger ISP trunks using proprietary signaling frames).
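For anyone curious what a built-in redundancy check looks like in practice, here is a minimal sketch in Python (a generic CRC-framed packet of my own invention, not the actual CD or HDMI coding): corruption shows up as a detected error, not as a subtly "worse-sounding" stream.

```python
import struct
import zlib

def frame(payload: bytes) -> bytes:
    """Append a CRC-32 checksum so the receiver can verify the payload."""
    return payload + struct.pack(">I", zlib.crc32(payload))

def check(framed: bytes) -> bool:
    """Recompute the CRC over the payload and compare with the stored one."""
    payload, crc = framed[:-4], struct.unpack(">I", framed[-4:])[0]
    return zlib.crc32(payload) == crc

pkt = frame(b"a block of audio samples")
print(check(pkt))                               # True: intact

corrupted = bytes([pkt[0] ^ 0x01]) + pkt[1:]    # flip one bit "in transit"
print(check(corrupted))                         # False: error detected
```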

I used bad terminology by referring loosely to the underlying digital signal and HDMI as one and the same...

If you are looking for more detail than that, you are going to have to research it yourself. I am taking four engineering classes ATM as prerequisites for an MS program in an engineering field I want to pursue (I had a test on Monday in Calculus 3, a test last night in Engineering Physics, and a test this morning in Statics...). This leaves me little time for leisure. I don't feel like wasting too much of that time researching topics in which I have only a passing interest. I would be interested in what you find out though. :D
"There is a well-known problem of interface jitter from transmitting the clock signal along with the data. I would agree that in the scheme of things this form of distortion is usually pretty small nowadays (compared to other problems like speaker distortion), but nevertheless it provides an example of why a digital cable might make a difference."

The jitter issue you are referring to is completely independent of the cable and symptomatic of the chosen bitstream framing format (tying the video and audio signals together in a certain way). Changing cables from a cheaper brand to a more expensive brand has no effect on the jitter issue you are referring to in your post. Given the same exact lengths of cable and gauge, a $20 cable and a $2000 cable have the same exact jitter issues...
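A toy model of why (my own illustration, with made-up numbers): the jitter rides on the source clock embedded in the stream itself, while a cable of a given length and gauge just adds an essentially fixed propagation delay. Shifting every edge by the same delay leaves the edge-to-edge timing, and therefore the jitter, untouched:

```python
import random

random.seed(1)
T = 1.0  # nominal bit period (arbitrary units)

# Edge times as produced by the source clock: an ideal grid plus source jitter.
source_edges = [n * T + random.gauss(0, 0.01) for n in range(1_000)]

def through_cable(edges, delay):
    """A cable of a given length/gauge adds a fixed propagation delay."""
    return [t + delay for t in edges]

def period_jitter(edges):
    """Standard deviation of the edge-to-edge intervals."""
    periods = [b - a for a, b in zip(edges, edges[1:])]
    mean = sum(periods) / len(periods)
    return (sum((p - mean) ** 2 for p in periods) / len(periods)) ** 0.5

cheap = through_cable(source_edges, delay=5.2)   # hypothetical $20 cable
pricey = through_cable(source_edges, delay=5.2)  # hypothetical $2000 cable,
                                                 # same length and gauge
print(period_jitter(cheap) == period_jitter(pricey))  # True: identical jitter
```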

Likewise, the jitter levels introduced are imperceptible - except by people with overactive imaginations, and they are hearing their imagination, not jitter... :D
Shadorne,

In other words, what you are referring to is a product of HDMI in general (though no one has scientifically proven they can actually hear that small a level of jitter). It is an issue with the HDMI format itself, independent of cabling.
Here is an older thread, but it has good technical info on jitter and HDMI. I'll post a link to page 3, which contains the most relevant info, but it is a great thread overall for info on jitter in general.

http://www.avsforum.com/avs-vb/showthread.php?t=908665&page=3
"I was only trying to point out that in some cases of borderline equipment performance an audible difference 'might' occur."

Not in the context of $100+ (or even $50+) HDMI cables vs. $20 HDMI cables of the same length and gauge. If the cables have the same gauge and length, they will behave the same... You don't have to pay excessive amounts of money to get the same gauge and length as some companies that charge $200+. The jitter is going to be the same amount in both cables, yet you are insinuating someone should pay more for one digital cable than the other because it "might" be different. It isn't different... So don't pay more for a cable to get an imaginary benefit.

That thread I pointed out to you also mentions that jitter is eliminated at several places in the receiving end before the signal is turned into analog (though there was some questioning of the buffer size needed to prevent overrunning the buffer). Plus there were links to pages that provide DBTs with jitter. It really is a non-issue nowadays at the levels these preprocessing units function at...
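The reclocking idea that thread describes can be sketched in a few lines (my own illustration, not any particular chip's design): jittered arrivals go into a FIFO, and a fixed local clock reads them back out, so the output timing no longer depends on the input jitter at all - subject to the buffer never running dry or overflowing, which was exactly the buffer-size caveat raised in that thread.

```python
import random
from collections import deque

random.seed(2)
T = 1.0  # output sample period, set by the receiver's own local clock

# Samples arrive over the link with jittered timing.
arrivals = [(n * T + random.gauss(0, 0.05), f"sample{n}") for n in range(100)]

fifo = deque()
out_times = []
clock, i = 10 * T, 0  # start reading once the FIFO has some headroom

while len(out_times) < 90:
    # Everything that has arrived by 'clock' goes into the buffer...
    while i < len(arrivals) and arrivals[i][0] <= clock:
        fifo.append(arrivals[i][1])
        i += 1
    # ...and the local clock pulls exactly one sample out per tick.
    fifo.popleft()
    out_times.append(clock)
    clock += T

# Output intervals are exactly T: the arrival jitter is gone.
print(all(abs((b - a) - T) < 1e-12 for a, b in zip(out_times, out_times[1:])))
```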