What I get from Stanley B's post is that jitter is often blamed for degrading sound, yet that blame cannot be substantiated by listening tests or jitter measurements.
Modern-day DACs have less than 50ps (picoseconds) of jitter and most of them are immune to what happens on the USB bus.
Listening tests by owners of the many Wyrd/Regen and other 'decrapifiers' would make you believe the jitter of a DAC becomes much lower, which then 'explains' the heard differences.
Most measurements, however, seem to suggest that jitter is neither lowered nor altered into a 'less audible' kind of jitter.
It is true that older receiver chips (not only USB) had higher amounts of jitter, sometimes into the microsecond range, where jitter 'byproducts' could reach audible levels.
I too suspect 'jitter' is often blamed for heard differences simply because it is the only 'aspect' left to pin the perceived differences on.
Granted, 'real' jitter may differ from the 'inferred jitter' that all tests are based on.
Jitter is sort of like 'flutter' in the old cassette/tape world, but less random, thousands of times less severe, and in a higher frequency range.
It thus has different 'spectral' properties.
Audibility depends on the frequency of the recorded sounds, on the 'spectrum' of the jitter, and on whether the jitter is correlated with the recorded (digital) signal or not.
This is very complex to measure 'completely', so most of the time peak-to-peak measurements are done (and those numbers are given), which say nothing about any of the other aspects.
Do keep in mind that listening tests have revealed that human hearing can detect pitch changes, which is what jitter actually is: a very fast (thus modulating, not constant) pitch change.
HERE you can test how small a pitch change you can hear (a listening test).
From what I understand from such tests, test subjects can detect deviations in pitch down to about 5ct (cents), which works out to about +/-0.3% on a 440Hz A4 tone (+/- 1.3Hz).
Don't know if musicians can do better. Willing to bet most of them will have a tuning aid which can show frequencies or deviations.
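For what it's worth, here is a quick sketch of that cents-to-Hz conversion, using the standard equal-temperament formula; the 5ct figure is just the threshold mentioned above:

```python
# Convert a pitch deviation in cents to a relative and absolute frequency deviation.
# A cent is 1/100 of an equal-tempered semitone, so f_detuned = f * 2**(cents / 1200).

f_a4 = 440.0    # reference tone (Hz)
cents = 5.0     # roughly the smallest detectable deviation, per the listening tests above

ratio = 2 ** (cents / 1200)      # ~1.0029
relative = (ratio - 1) * 100     # ~0.29 %
absolute = f_a4 * (ratio - 1)    # ~1.27 Hz

print(f"{cents} ct on {f_a4} Hz is about {relative:.2f}% or {absolute:.2f} Hz")
```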
One should realise that even if we have a really crappy DAC with 1 microsecond (= 1,000ns = 1,000,000ps = 1,000,000,000fs) of 'jitter', the pitch change of a 440Hz tone would be about 0.05%, i.e. 0.2Hz (that is, if I did my math right), which is already 6x smaller than what would be audible.
As most modern DACs are easily capable of having jitter well below 100ps, which is 10,000x less than the primordial DAC calculation example, I too wonder about the audibility.
Can we truly 'detect' a pitch change of 0.00002Hz on a 440Hz tone in some form of 'degraded sound', yet not as a pitch error?
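A minimal sketch of that same back-of-the-envelope estimate: treat the jitter figure as a worst-case timing error on a single period of a 440Hz tone, which is the same simplistic model used above, not a proper jitter analysis:

```python
# Treat the jitter as a timing error on one period of a 440 Hz tone
# and see what pitch error that implies.

f = 440.0           # tone frequency (Hz)
period = 1.0 / f    # ~2272.7 microseconds

for jitter in (1e-6, 100e-12):              # 1 us 'crappy DAC' vs 100 ps modern DAC
    relative = jitter / period              # timing error as a fraction of one period
    df = relative * f                       # resulting pitch error in Hz
    print(f"{jitter * 1e12:>10.0f} ps jitter -> {relative * 100:.6f}% -> {df:.7f} Hz")
```

This gives roughly 0.2Hz for the 1 microsecond case and 0.00002Hz for 100ps, matching the numbers above.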
Another giveaway is the fact that with inferred jitter we see 'poles' in the plots as low as -120dB or lower relative to the maximum output signal. Knowing that human hearing's usable 'dynamic' range is about 70dB, this would mean that any audible 'jitter-based distortion products' are at least 20dB below even the average levels of the most dynamic recordings out there, and well below the noise floor of even the best microphones/amplifiers in real-life recordings.
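A rough sketch of that level bookkeeping; the -30dBFS average level for a very dynamic recording is my own assumption for illustration, not a measured value:

```python
# Rough level bookkeeping for the -120 dB sideband argument above.
# The -30 dBFS average level of a very dynamic recording is an assumed value for illustration.

sideband_dbfs = -120.0       # jitter-induced sidebands seen in the plots (vs 0 dBFS full scale)
avg_level_dbfs = -30.0       # assumed average playback level of a very dynamic recording
audible_window_db = 70.0     # usable 'dynamic range' of hearing in a normal listening situation

audible_floor_dbfs = avg_level_dbfs - audible_window_db   # ~ -100 dBFS
margin_db = audible_floor_dbfs - sideband_dbfs            # ~ 20 dB

print(f"Jitter products sit about {margin_db:.0f} dB below the ~{audible_floor_dbfs:.0f} dBFS audible floor")
```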
So it would seem that, measurement-wise, jitter is hard to blame for modern DACs.
In essence, when you have a modern DAC, jitter does not seem to be a real practical problem, and the blame for any sonic degradation probably needs to be looked for in other aspects.