Will the difference really be that great between 19 and 100?
Unlike analog signals or clock timing signals, there should not be the slightest difference between digital signals if the cables are within tolerance and the bits are getting through. The whole point of digital is to eliminate the dependence on analog signal accuracy by sending a stream of two widely separated states with clearly identifiable positions that stay distinguishable despite any vagaries in transmission, power, noise, etc.; i.e. a clearly discernible stream that can be accurately identified as ones and zeros.
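To illustrate the idea, here is a toy sketch in Python. It assumes a simple single-threshold receiver and two made-up noise levels standing in for a "cheap" and an "expensive" cable; as long as the noise stays inside the decision margin, both recover exactly the same bits.

```python
import random

def recover_bits(analog_levels, threshold=0.5):
    # Threshold each sampled level back to a clean 1 or 0.
    return [1 if v > threshold else 0 for v in analog_levels]

def transmit(bits, noise_amplitude):
    # Send each bit as a nominal 0 or 1 level plus random analog noise.
    return [b + random.uniform(-noise_amplitude, noise_amplitude) for b in bits]

random.seed(42)
original = [random.randint(0, 1) for _ in range(10_000)]

# Two hypothetical cables, modelled only by different noise amplitudes,
# both comfortably inside the 0.5 decision margin.
for label, noise in [("cheap cable", 0.35), ("expensive cable", 0.05)]:
    received = recover_bits(transmit(original, noise))
    print(label, "bit-perfect:", received == original)
```

Both print True: the noisier link still delivers an identical bitstream, which is the sense in which "the bits either get through or they don't."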
Of course a very poorly manufactured cable may cause problems, but that should be the exception, not the rule.