Being transmission lines for high-frequency digital signals (megahertz range), all digital cables must first meet the "minimum" requirement of a 75-Ohm characteristic impedance. Some expensive digital cables unfortunately do not even meet this simple standard, and most consumers have no way of knowing that.
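To put a rough number on what a mismatch does, here is a quick back-of-the-envelope sketch in Python. The impedance values are purely illustrative assumptions, not measurements of any particular cable:

    # Reflection coefficient at a junction between two impedances:
    # gamma = (Z_load - Z_line) / (Z_load + Z_line)
    def reflection_coefficient(z_line, z_load):
        return (z_load - z_line) / (z_load + z_line)

    # Hypothetical example: a nominally "75-Ohm" cable that is really
    # 50 Ohm, feeding a proper 75-Ohm input.
    gamma = reflection_coefficient(50.0, 75.0)
    print(f"Reflection coefficient: {gamma:.2f}")     # 0.20
    print(f"Power reflected: {gamma**2 * 100:.1f}%")  # 4.0%

So even that crude mismatch bounces a measurable chunk of the signal back down the cable instead of delivering it cleanly to the receiver.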
Having the correct 75-Ohm impedance alone is of course no guarantee of perfect signal transmission. As Albertporter already mentioned, connectors, termination, length, etc., can also affect the signal.
I am lucky enough to have a friend, an EE who worked in data transmission and has the equipment to measure the basic performance of digital cables. He showed me on his scope the signal problems caused by impedance mismatch and reflections in the cables. The test signals became distorted with overshoots and other ugly noise spikes. That is not acceptable, especially from cables costing upward of $500. He did some simple things to the connectors/terminations, and suddenly the test signal emerged perfectly undistorted. Listening tests with and without the fix confirmed the clear sound improvement.
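If you want a feel for where those scope artifacts come from, here is a minimal bounce-diagram sketch, assuming an idealized lossless line with purely resistive source and load (again, the numbers are illustrative assumptions, not my friend's actual measurements):

    # A step launched into a mismatched line bounces between source and
    # load; the load voltage settles in a staircase of reflection steps.
    # The ringing/steps you see on a scope come from these arrivals.
    def load_voltage_steps(v_source, z_source, z_line, z_load, round_trips=5):
        gamma_s = (z_source - z_line) / (z_source + z_line)
        gamma_l = (z_load - z_line) / (z_load + z_line)
        wave = v_source * z_line / (z_source + z_line)  # initially launched wave
        v_load = 0.0
        for n in range(round_trips):
            v_load += wave * (1 + gamma_l)  # incident + reflected at the load
            wave *= gamma_l * gamma_s       # wave after one full round trip
            print(f"after arrival {n + 1}: {v_load:.4f} V")
        return v_load

    # Illustrative numbers only: 1 V step, 75-Ohm source and load,
    # but a 50-Ohm cable in between.
    load_voltage_steps(1.0, 75.0, 50.0, 75.0)

With a true 75-Ohm cable between the same source and load, the first arrival already lands at the final value and the staircase disappears, which is the clean edge he got back after fixing the terminations.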
And these are only the problems that we understand pretty well. Engineering variables, however, are rarely fully defined and understood. There are often hidden parameters that we fail to account for on the first, second, or even third pass.
So as a consumer choosing a digital cable, the first thing is to get absolute assurance of a 75-Ohm impedance. Then find out about the quality of the connectors and terminations. If these issues are resolved, you will have avoided the first-order problems. Your ears will do the rest. It's pointless to try a digital cable that isn't even 75 Ohms.
Sometimes I wish there were stricter regulations for digital cables, for example a 75-Ohm label, like a food nutritional-facts label, based on actual measurements, not just wishful thinking. It would not be a guarantee of great sound, but it's a good start.