Hi Orb. None of that matters. The answer is still very much the same.
I mentioned the TCP bit as a clarification to your answer, not as a proof point that network jitter is immaterial. UDP has an advantage over TCP when streaming A/V over the Internet in that it doesn't force retransmissions the way TCP does. On a home LAN, that won't matter, because latency is very low and reliability is much higher (relative to the Internet). So this distinction is not material to anything we are talking about.
The key thing here is that your music player has *no* error mitigation. It has no mechanism to change the sound the way VoIP does. If it runs out of data, it simply stops playing. In other words, your data is either 100% there or not. Lots may happen on your network, but as long as the player's buffer has data in it, that buffer will be drained using the player's high-precision clock oscillator, not the network arrival times. What happens on the network side simply decides whether you run out of data and hence glitch. It cannot affect audio fidelity directly.
We call the above pseudo-real-time. The process of playing audio is indeed real-time. The network, however, is by definition not real-time. So buffering is used to decouple network glitches -- dropouts, jitter, whatever -- from the real-time end of the pipe.
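To make that concrete, here's a toy simulation (hypothetical code, not any real player's implementation) of the decoupling: the network side deposits chunks into the buffer at jittery, irregular rates, while the playback side drains it at a fixed rate set by the local clock. Note that jitter never alters the audio itself; the only possible failure mode is an underrun, i.e. the buffer running dry and playback stopping.

```python
import random

def simulate(ticks=1000, prebuffer=10, seed=42):
    """Toy jitter buffer: jittery network fill, fixed clock-driven drain.

    Returns the number of underruns (ticks where playback had no data).
    """
    rng = random.Random(seed)
    buffer = prebuffer  # chunks pre-buffered before playback starts
    underruns = 0
    for _ in range(ticks):
        # Network side: arrivals are jittery -- 0, 1, or 2 chunks per tick.
        buffer += rng.choice([0, 1, 1, 2])
        # Playback side: drains exactly 1 chunk per tick, driven by the
        # player's local clock, never by network arrival times.
        if buffer >= 1:
            buffer -= 1   # data delivered bit-perfect: full fidelity
        else:
            underruns += 1  # buffer empty: playback stops (a glitch)
    return underruns

# A deep buffer absorbs the jitter; with no buffer, jitter surfaces
# directly as audible dropouts.
print(simulate(prebuffer=50))
print(simulate(prebuffer=0))
```

The design point: the buffer depth only changes *whether* you glitch, never *how the audio sounds* when it plays.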
I have to run but happy to explain more when I get more time.