Digital XLR vs. Analog XLR - Balanced Cables


What is the difference between a digital XLR/balanced cable and an analog XLR/balanced cable?

What if I used an analog XLR/Balanced cable to carry a digital signal from the digital output of one device to the digital input of another device?

Any risks, damage, etc.?
ckoffend
Ckoffend - Maybe this whole thing is too technical, but it is important to
understand that a 192kHz signal is really transmitted at about 25MHz. Maybe
part of the article below will explain better why digital cables exist. I
don't want to engage further in the discussion here since it's becoming
counterproductive, and I'm signing off.

"This article is from the Audio Professional FAQ, by with numerous
contributions by Gabe M. Wiener.

5.8 - What kind of AES/EBU or S/P-DIF cables should I use? How long
can I run them?

The best quick answer is which cables you should NOT use!

Even though AES/EBU cables look like ordinary microphone cables, and
S/P-DIF cables look like ordinary RCA interconnects, they are very
different.

Unlike microphone and audio-frequency interconnect cables, which are
designed to handle signals in the normal audio bandwidth (let's say that
goes as high as 50 kHz or more to be safe), the cables used for digital
interconnects must handle a much wider bandwidth. At 44.1 kHz, the digital
protocols are sending data at the rate of 2.8 million bits per second,
resulting in a bandwidth (because of the biphase encoding method)
of 5.6 MHz.

This is no longer audio, but falls in the realm of bandwidths used by
video. Now, considerations such as cable impedance and termination become
very important, factors that have little or no effect below 50 kHz.

The interface requirements call for the use of 110 ohm balanced cables for
AES/EBU interconnects, and 75 ohm coaxial unbalanced interconnects for
S/P-DIF. The use of the proper cable and the proper terminating
connectors cannot be overemphasised. I can personally testify
(having, in fact, looked at the interconnections between many different
kinds of pro and consumer digital equipment) that ordinary microphone or
RCA audio interconnects DO NOT WORK. It's not that the results sound
subtly different, it's that much of the time the receiving equipment
is simply unable to decode the resulting output, and simply shuts
down."
The last explanation mentions slew rate and bandwidth but ignores the relationship between the two. A little background in Fourier theory may clear up the confusion about the relationship between signal shape, frequency, and slew rate.
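To make that relationship concrete, the usual single-pole rule of thumb ties an edge's 10-90% rise time to the bandwidth needed to pass it, BW ~ 0.35 / t_rise. A quick sketch (the 25-30ns rise times are the typical figures quoted in this thread; the 5ns case is just an illustrative fast edge, not a measurement):

```python
# Rule-of-thumb link between edge rise time and required bandwidth
# (BW ~ 0.35 / t_rise for a single-pole response). Illustrative only.

def bandwidth_hz(t_rise_s: float) -> float:
    """Approximate -3 dB bandwidth needed to preserve a given rise time."""
    return 0.35 / t_rise_s

for t_rise_ns in (30, 25, 5):
    bw_mhz = bandwidth_hz(t_rise_ns * 1e-9) / 1e6
    print(f"{t_rise_ns:>2} ns edge -> ~{bw_mhz:4.1f} MHz bandwidth")
```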

Your home audio equipment is not going to miss pulses because of the changes caused by sending a pulse with fast rise and fall times through a path with the bandwidth typically available through analog interconnects.
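As a toy illustration of why, model the path as a single-pole low-pass and ask how long a step takes to cross the receiver's 50% decision threshold, compared with the shortest biphase half-cell at 44.1 kHz. Everything here is an assumption for the sketch (single-pole model, 50% threshold, the pass/fail criterion); a real cable is a transmission line, not an RC pole:

```python
import math

# Toy model: interconnect as a single-pole low-pass; how long does a step
# take to reach the 50% decision threshold at the receiver?

BIT_RATE = 2.8224e6                # S/P-DIF at 44.1 kHz, bits/s
HALF_CELL_NS = 1e9 / BIT_RATE / 2  # shortest biphase cell, ~177 ns

def threshold_delay_ns(bw_hz: float) -> float:
    """Time for a single-pole step response to reach 50% of its swing."""
    tau = 1 / (2 * math.pi * bw_hz)  # pole time constant
    return tau * math.log(2) * 1e9

for bw_mhz in (0.05, 1.0, 5.6, 25.0):
    d = threshold_delay_ns(bw_mhz * 1e6)
    # Toy criterion: the crossing must land within one half-cell.
    verdict = "decodable" if d < HALF_CELL_NS else "edges smeared"
    print(f"{bw_mhz:5.2f} MHz path: 50% crossing after {d:7.1f} ns "
          f"(half-cell {HALF_CELL_NS:.0f} ns) -> {verdict}")
```

On this toy criterion a genuinely audio-band (50 kHz) path fails, but anything past roughly 1 MHz, which ordinary wire interconnects easily exceed, still delivers decodable edges.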

As to T-line effects, the explanation misses the forest for the trees. The ultimate problem caused by standing waves is the rounding of the pulses.

The text proffered in the last few posts was simply lifted from elsewhere and offered as an explanation. But it is out of context and inapplicable to the discussion at hand, which is whether digital vs. analog ICs make any difference in a home audio system. If the person who initially posted the question is using the cables to transfer audio data in a typical fashion, e.g. from a CD transport to an outboard DAC, he doesn't need 110 ohm interconnect to do so - which was my original statement.

What is amazing is that these posts from Kijanki started out with a statement that bandwidth makes no difference when it comes to jitter, yet now offer quotes that refer to the importance of bandwidth. The reason for this contradiction appears to be the lack of a firm grounding in the meaning of the terms and effects discussed, i.e. slew rate, frequency, bandwidth, and t-line effects.
Ok... So in a pistachio shell, say you have the option to use either an AES/EBU digital 110 ohm cable OUT from a USB to SPDIF converter to a WYRED4SOUND DAC2 AES/EBU input, OR a 75 ohm digital SPDIF interconnect from the same USB to SPDIF converter into the same W4S DAC2. I keep hearing two tales: 1) that there's no difference since both are digital, and 2) that there is a difference, with the claim that the AES/EBU connection is superior.
Any opinion or truth is truly appreciated.

Thanks.
I've used an analog XLR in place of a digital XLR temporarily several times with no problems.
Renato13 - It is a system thing. XLR protects from induced noise by using a twisted pair and a differential signal of much higher amplitude, but at the same time it might slow down transitions if the drivers have limited slew rate, because they swing a higher voltage. In addition, the shield is grounded on both ends - a possible source of ground loops. Any jitter creation is always system dependent. It is usually wise to use a 1.5m cable because the signal travels forth and back (reflection) in about 30ns (5ns/m propagation), just clearing the original transition, which typically lasts 25-30ns. A longer cable adds to noise pickup.
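A quick check of that timing argument - a sketch using only the typical figures quoted above (5ns/m propagation, 25-30ns transition), not measurements of any particular cable:

```python
# Reflection timing for a digital interconnect (sketch; assumes the
# ~5 ns/m propagation delay and ~25-30 ns edge time quoted above).

PROP_NS_PER_M = 5.0

def reflection_return_ns(length_m: float, round_trips: int = 1) -> float:
    """Delay, after the incident edge, before a reflection that has made
    `round_trips` full passes of the cable arrives back at the receiver."""
    return 2 * length_m * PROP_NS_PER_M * round_trips

for length_m in (0.5, 1.0, 1.5, 2.0):
    r1 = reflection_return_ns(length_m)
    r2 = reflection_return_ns(length_m, round_trips=2)
    print(f"{length_m:3.1f} m: 1st return {r1:5.1f} ns, "
          f"2nd return {r2:5.1f} ns (edge lasts ~25-30 ns)")
```

On these figures a reflection on a 1.5m cable first returns to the receiver 15ns after the edge and again at 30ns, which is presumably where the "about 30ns" figure comes from, landing just past a 25-30ns transition.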