Why don't we count how many electrons are going past the cable to make the numbers even more impressive?

Your software player is not in charge of serializing the bits and sending them over the USB bus. A dedicated piece of hardware does that. The role of the software is to provide a chunk of data from which the hardware siphons the bits. It is like you driving a car. You press the accelerator. The computer in your engine takes that input and modulates the fuel injectors with extreme accuracy, measured in microseconds. The fact that the injectors do that does not mean you need to move your foot with the same microsecond precision, or that if you did, you would change the way the injectors work.
This is a fundamental way non-real-time systems are made real-time. We allocate pools of memory and dump lots of data in there when we get a chance. The hardware then meters the bits out at precise intervals. We get to be sloppy and non-real-time while the system as a whole stays real-time.
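To make the mechanism concrete, here is a toy simulation of that idea (all the numbers are hypothetical, chosen just for illustration): a FIFO buffer that "hardware" drains at a steady metered rate, while "software" refills it in big, infrequent, sloppily scheduled bursts. As long as the buffer is large, the bursts never matter.

```python
from collections import deque

# Hypothetical figures for illustration only:
FIFO_CAPACITY = 16384      # bytes the hardware buffer can hold
HW_DRAIN_PER_TICK = 176    # bytes consumed per 1 ms tick (~176 KB/s, roughly CD-rate stereo)

fifo = deque()

def software_fill(chunk_size):
    """Non-real-time side: dump a big chunk whenever we happen to get scheduled."""
    free = FIFO_CAPACITY - len(fifo)
    for _ in range(min(chunk_size, free)):
        fifo.append(0)  # dummy audio byte

def hardware_drain():
    """Real-time side: bits metered out on a precise interval. Returns True on underrun."""
    underrun = len(fifo) < HW_DRAIN_PER_TICK
    for _ in range(min(HW_DRAIN_PER_TICK, len(fifo))):
        fifo.popleft()
    return underrun

# Software is sloppy: it only shows up every 40 ticks, dumping 8 KB at a time.
underruns = 0
for tick in range(1000):
    if tick % 40 == 0:
        software_fill(8192)
    if hardware_drain():
        underruns += 1

print(underruns)  # 0 — the big buffer absorbs the sloppy scheduling
```

Software delivers 8192 bytes per 40-tick cycle while hardware only consumes 7040, so the buffer never runs dry despite the software showing up only 25 times a second.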
From the last post by JPlay, it seems that he has undone the above mechanism by making the buffer small. Once he did that, he created a world of hurt for himself, since he now has to service the hardware much more frequently and under tighter timing requirements. He took an easy job and made it hard. His motivation is that people said smaller buffers sound better. If I were him, I would have tested that concept with some measurements before running off and making a ton of changes to the playback pipeline to keep the system from falling behind.
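A quick back-of-the-envelope shows how punishing this gets. Assuming 44.1 kHz playback (a standard CD rate; the exact buffer sizes below are made up for illustration), the service deadline shrinks in direct proportion to the buffer:

```python
SAMPLE_RATE = 44100  # samples per second, standard CD rate

def service_interval_ms(buffer_frames):
    """How often software must refill a buffer of this many frames, in milliseconds."""
    return buffer_frames / SAMPLE_RATE * 1000

# A comfortable 4096-frame buffer: refill roughly every 93 ms.
print(round(service_interval_ms(4096), 1))  # 92.9

# A tiny 32-frame buffer: refill every ~0.73 ms, over a thousand deadlines per second.
print(round(service_interval_ms(32), 2))    # 0.73
```

Missing even one of those sub-millisecond deadlines on a general-purpose OS means an audible glitch, which is exactly why the rest of the pipeline then has to be contorted to keep up.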
As an analogy, he is saying that in winter you can just put on a t-shirt and go outside. But to not freeze, you now need to run all the time to stay warm. Me? I would question why we would want to put on a t-shirt in winter.