I believe that apart from what has been said about running power through the USB cable, and cable length affecting transmission reliability, the quality of the cable "might" affect SQ as well. I know it sounds like voodoo, but there is actual science behind it: "jitter", i.e. transmission timing errors. Transmitting 0's and 1's over a digital connection is not as simple as most of us might imagine, when it comes to audio data...
Unlike regular data transmission, such as between a PC and a USB hard drive, which transfers data in bulk at the maximum possible speed, transmission from a PC to an external DAC is "real-time", so the source and DAC communicate with a certain "timing". For USB this relies on a "clock", and there are 2 general approaches: one is "synchronous", i.e. the timing is controlled by the clock on the source (e.g. your computer), while the other is "asynchronous", i.e. the timing is independent of the source's clock (so the DAC's clock controls the timing). In the real world this "clock", whichever of the above approaches is used, is never perfect. So, particularly with synchronous USB, there is already jitter right at the source (called "transmitter jitter"). At the other end of the connection, i.e. the DAC, there is normally some form of jitter rejection, but how effective it is varies. So here the design of the DAC matters, and obviously asynchronous USB has its advantages.
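To give a feel for why clock ownership matters, here's a toy back-of-the-envelope sketch (my own illustration, not actual driver code, and the 100 ppm clock error is an assumed figure): if the source clock runs slightly fast relative to the DAC's clock, samples pile up at a steady rate. In synchronous mode the DAC has to track (and inherit jitter from) the source clock to stay in step; in asynchronous mode the DAC's own clock rules and it simply asks the source to send a little more or less data.

```python
# Toy model (not real driver code): how much audio data piles up when
# the source clock runs 100 ppm fast relative to the DAC's clock.
FS = 48_000        # nominal sample rate (Hz)
PPM_ERROR = 100    # assumed source-clock error: 100 parts per million fast
SECONDS = 60       # observe for one minute

source_sent = FS * (1 + PPM_ERROR * 1e-6) * SECONDS  # samples the source pushes
dac_consumed = FS * SECONDS                          # samples the DAC plays
surplus = source_sent - dac_consumed

print(f"surplus after {SECONDS}s: {surplus:.0f} samples "
      f"(~{surplus / FS * 1000:.1f} ms of audio)")
```

In synchronous mode that surplus is why the DAC must slave its conversion clock to the incoming stream; in asynchronous mode it just becomes a small buffer-level correction.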
But what about the USB cable? There is such a thing as "line-induced jitter", which is caused by the transmission medium, in this case the USB cable: its limited bandwidth and other electrical characteristics deform the waveform to some degree. Remember, at the end of the day, those "0"s and "1"s are still transmitted as waveforms, just like your analog signals. So the quality of the USB cable matters in minimizing jitter (unless you have a DAC with perfect jitter rejection, which I don't think has been invented yet?).
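Here's a minimal sketch of the mechanism (my own toy model with made-up time constants, not a cable measurement): model a received USB edge as an exponential rise whose time constant stands in for limited cable bandwidth, then see how far a small amplitude disturbance shifts the instant the edge crosses the receiver's logic threshold. The slower the edge, the bigger the timing shift, i.e. more jitter from the same disturbance.

```python
# Toy model: a bandwidth-limited edge converts amplitude noise into
# timing error.  Edge shape: v(t) = 1 - exp(-t/tau), plus a small offset.
import math

def crossing_time(tau, offset, threshold=0.5):
    """Time at which v(t) = 1 - exp(-t/tau) + offset crosses `threshold`."""
    return -tau * math.log(1.0 - (threshold - offset))

FAST_TAU = 1e-9   # assumed "good" cable: 1 ns edge time constant
SLOW_TAU = 5e-9   # assumed "bad" cable: 5 ns edge time constant
OFFSET = 0.02     # small amplitude disturbance (2% of the signal swing)

fast_shift = crossing_time(FAST_TAU, OFFSET) - crossing_time(FAST_TAU, 0.0)
slow_shift = crossing_time(SLOW_TAU, OFFSET) - crossing_time(SLOW_TAU, 0.0)

print(f"edge-timing shift: fast cable {fast_shift * 1e12:.1f} ps, "
      f"slow cable {slow_shift * 1e12:.1f} ps")
```

Because the timing shift scales with the edge's time constant, the slow cable shows exactly 5x the shift of the fast one in this simplified model.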
But how does jitter degrade the analog signal that results from the D/A conversion? The timing errors in the digital audio translate into the analog domain as what I think they call "sidebands": besides the frequencies actually contained in the musical waveform, adjacent frequencies also appear at varying (but much lower) amplitudes. You can imagine how this can change the sound quality.
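You can actually see those sidebands in a quick simulation (my own sketch, with a grossly exaggerated jitter amplitude so the effect is visible): sample a 1 kHz tone with a clock whose timing wobbles sinusoidally at 100 Hz, then measure the spectral energy at the tone and at the tone ± 100 Hz. Without jitter, the sideband bins would be essentially zero.

```python
# Sketch: sinusoidal sampling-clock jitter puts sidebands around a pure tone.
import cmath
import math

FS = 48_000      # sample rate (Hz)
F0 = 1_000       # audio tone (Hz)
FJ = 100         # jitter modulation frequency (Hz)
JITTER = 5e-6    # peak timing error in seconds (hugely exaggerated for clarity)
N = 48_000       # 1 second of samples -> 1 Hz frequency resolution

def bin_magnitude(samples, freq, fs):
    """Magnitude of a single DFT bin at `freq` (simple correlation)."""
    acc = sum(s * cmath.exp(-2j * math.pi * freq * k / fs)
              for k, s in enumerate(samples))
    return abs(acc) / len(samples)

# Each sample is taken at the ideal time k/FS plus a wobbling timing error.
jittered = [math.sin(2 * math.pi * F0 *
                     (k / FS + JITTER * math.sin(2 * math.pi * FJ * k / FS)))
            for k in range(N)]

carrier = bin_magnitude(jittered, F0, FS)
lower = bin_magnitude(jittered, F0 - FJ, FS)
upper = bin_magnitude(jittered, F0 + FJ, FS)
print(f"tone: {carrier:.4f}   sidebands at +/-{FJ} Hz: {lower:.5f} / {upper:.5f}")
```

The sidebands sit at the tone frequency plus and minus the jitter frequency, which is exactly the "adjacent frequencies at much lower amplitude" effect described above; real-world jitter is far smaller (picoseconds to nanoseconds), so the sidebands are correspondingly tiny, but the mechanism is the same.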
I'm not a subject expert on electronics (though I know computers), so the above may not be as exact as I'd hoped, but hopefully it gives the gist.