Analogue glitches well. There's a great deal of nostalgia for it. Steampunk. Our digital devices do not glitch nearly so well: usually they either work or they don't. The noise level on a channel determines how much data can be pushed through it, the Shannon limit. Try to push more than that through and reliable communication fails entirely. Basic information theory. Except it doesn't need to be this way. We can imagine digital devices of the same complexity as the ones we have today that don't behave like this. Take digital TV. If the signal is good enough, the picture is perfect; otherwise it's unwatchable. In reality, different receivers see different noise levels, and the current design ignores this. It is therefore a poor design. There is nothing stopping us boosting the redundancy of one part of the signal and reducing the redundancy of another, so that every receiver can at least get part of the signal.
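A toy sketch of the idea, under loud assumptions: repetition coding stands in for a real error-correcting code, and random bit flips stand in for channel noise. A "base" layer gets heavy redundancy, a "detail" layer gets none, so on a harsh channel the base survives while the detail degrades, rather than everything failing together.

```python
import random

def encode(bits, r):
    """Repetition-encode: send each bit r times."""
    return [b for bit in bits for b in [bit] * r]

def decode(coded, r):
    """Majority-vote decode each group of r received bits."""
    out = []
    for i in range(0, len(coded), r):
        group = coded[i:i + r]
        out.append(1 if sum(group) * 2 > len(group) else 0)
    return out

def noisy(bits, p, rng):
    """Flip each bit independently with probability p."""
    return [b ^ 1 if rng.random() < p else b for b in bits]

rng = random.Random(1)
base = [rng.randint(0, 1) for _ in range(200)]    # must-have part of the signal
detail = [rng.randint(0, 1) for _ in range(200)]  # nice-to-have part

p = 0.2  # a harsh channel: one bit in five arrives flipped
base_rx = decode(noisy(encode(base, 9), p, rng), 9)      # heavy redundancy
detail_rx = decode(noisy(encode(detail, 1), p, rng), 1)  # no redundancy

base_err = sum(a != b for a, b in zip(base, base_rx)) / len(base)
detail_err = sum(a != b for a, b in zip(detail, detail_rx)) / len(detail)
print(base_err, detail_err)
```

At this noise level the base layer comes through with only a percent or two of residual errors while the unprotected detail layer loses roughly a fifth of its bits: a degraded picture instead of no picture.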
Lossy coding is easy, and receivers can continue to improve after the protocol is specified. A smart receiver could take a degraded signal and make a best guess: degrade toward a cartoon, hallucinate to an acceptable degree. In other words, glitch. Like analogue, only far more wonderfully.