Thank you Kalahaan

Amir, just in response.
If using wireless, and as an example Apple AirPlay (which I think is the most likely wireless implementation Devialet will use), the file is still read by the application on the PC/iPad/etc. and then streamed to the receiver. So in this scenario it is still comparable to VoIP: the client is the PC/iPad and it must stream the file to the far end, so the receiver is not reading the file remotely.
http://support.apple.com/kb/HT4437
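To make the client-push point concrete, here is a rough Python sketch of the model I am describing; the receiver address, port, chunk size and pacing are placeholders of mine, not anything AirPlay actually uses.

```python
# Sketch of the client-push model: the sender (PC/iPad) reads the file
# locally and pushes it to the receiver in timed chunks. The address,
# port, chunk size and pacing below are illustrative only, NOT
# AirPlay's real parameters.
import socket
import time

RECEIVER = ("192.168.1.50", 5004)   # hypothetical receiver address/port
CHUNK = 1408                        # hypothetical payload size in bytes
INTERVAL = 0.008                    # pacing between chunks (seconds)

def push_file(path: str) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            # The sender pushes; the receiver never "reads" the file itself
            sock.sendto(chunk, RECEIVER)
            time.sleep(INTERVAL)        # real-time pacing, like a VoIP stream
    sock.close()

if __name__ == "__main__":
    push_file("track.wav")              # illustrative filename
```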
I think AirPlay is a good example because Apple approach the issues in a way similar to DLNA but provide probably the best end-to-end architecture. AirPlay is also a good example to use because it employs RTSP, which is specific to streaming audio/video.
In further detail, it seems AirPlay uses a combination of both TCP and UDP, although to be sure someone would need to be bothered to capture how port 554 (RTSP) is used (it does not matter whether that is UDP or TCP, as this port is normally used to establish and control sessions). More importantly, ports 16384-16403 are used, which are UDP and carry the audio and video RTP streaming delivery.
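If someone does want to be bothered, a quick capture along these lines would show the split. This is just my own sketch (it assumes Scapy is installed and you have capture privileges); the only real values in it are the ports Apple documents.

```python
# Capture sketch to see how AirPlay splits control vs delivery:
# port 554 for RTSP session setup/control (TCP or UDP), and
# UDP 16384-16403 for the RTP audio/video streams.
# Requires Scapy and root/admin privileges.
from scapy.all import sniff

# BPF filter: the RTSP control port plus the documented RTP port range
FILTER = "port 554 or (udp portrange 16384-16403)"

def show(pkt):
    # A one-line summary per packet is enough to see which transport is used
    print(pkt.summary())

if __name__ == "__main__":
    sniff(filter=FILTER, prn=show, store=False)
```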
However, if it is doing actual streaming, then what is outlined as challenges for VoIP is also applicable to the D-Premier/AirPlay/etc.
http://support.apple.com/kb/TS1629
Again, RTP is associated with streaming and also with VoIP, so my use of it is generally correct, but it depends upon the specific implementation, where as I mention we are all making assumptions.
http://en.wikipedia.org/wiki/RTSP
http://en.wikipedia.org/wiki/Real-time_Transport_Protocol
And specifically about the use of both TCP for control and UDP for streaming:
http://en.wikipedia.org/wiki/AirTunes
Note it mentions:
The AirPlay protocol was reverse-engineered by Jon Lech Johansen in 2004.[6] It uses UDP for streaming and is based on RTSP.[7] The streams are encrypted with AES, requiring the receiver to have access to the appropriate private key to decrypt the streams.
But any solution needs to be comparable to the behaviour of RTCP and RTP, so here is a basic summary from the wiki.
The Real-time Transport Protocol (RTP) defines a standardized packet format for delivering audio and video over IP networks. RTP is used extensively in communication and entertainment systems that involve streaming media, such as telephony, video teleconference applications and web-based push-to-talk features.
RTP is used in conjunction with the RTP Control Protocol (RTCP). While RTP carries the media streams (e.g., audio and video), RTCP is used to monitor transmission statistics and quality of service (QoS) and aids synchronization of multiple streams. When both protocols are used in conjunction, RTP is originated and received on even port numbers and the associated RTCP communication uses the next higher odd port number.
It is one of the technical foundations of Voice over IP and in this context is often used in conjunction with a signaling protocol which assists in setting up connections across the network. RTP was developed by the Audio-Video Transport Working Group of the Internet Engineering Task Force (IETF) and first published in 1996 as RFC 1889, superseded by RFC 3550 in 2003.....
RTP was developed by the Audio/Video Transport working group of the IETF standards organization. RTP is used in conjunction with other protocols such as H.323 and RTSP.[1] The RTP standard defines a pair of protocols, RTP and RTCP. RTP is used for transfer of multimedia data, and the RTCP is used to periodically send control information and QoS parameters.[2]
RTP is designed for end-to-end, real-time, transfer of stream data. The protocol provides facility for jitter compensation and detection of out of sequence arrival in data, that are common during transmissions on an IP network. RTP supports data transfer to multiple destinations through multicast.[3] RTP is regarded as the primary standard for audio/video transport in IP networks and is used with an associated profile and payload format.[1]
Real-time multimedia streaming applications require timely delivery of information and can tolerate some packet loss to achieve this goal. For example, loss of a packet in audio application may result in loss of a fraction of a second of audio data, which can be made unnoticeable with suitable error concealment algorithms.[4] The Transmission Control Protocol (TCP), although standardized for RTP use,[5] is not normally used in RTP application because TCP favors reliability over timeliness. Instead the majority of the RTP implementations are built on the User Datagram Protocol (UDP).[4] Other transport protocols specifically designed for multimedia sessions are SCTP and DCCP, although, as of 2010, they are not in widespread use.
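To make the jitter/out-of-sequence point concrete, this is roughly what a receiver has to work with in every RTP packet: the fixed 12-byte header defined in RFC 3550. The parsing code below is my own sketch (and the example bytes are made up), not taken from any AirPlay or DLNA implementation.

```python
# Sketch of parsing the fixed 12-byte RTP header (RFC 3550). The
# sequence number lets the receiver detect loss/reordering and the
# timestamp lets it schedule playout and measure jitter.
import struct

def parse_rtp_header(data: bytes) -> dict:
    if len(data) < 12:
        raise ValueError("too short to be an RTP packet")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", data[:12])
    return {
        "version": b0 >> 6,            # should be 2
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,
        "sequence": seq,               # detect out-of-order / lost packets
        "timestamp": ts,               # playout timing / jitter calculation
        "ssrc": ssrc,                  # identifies the stream source
    }

# Made-up example packet: version 2, payload type 96, sequence 0x1234
example = bytes([0x80, 0x60, 0x12, 0x34, 0x11, 0x22, 0x33, 0x44,
                 0xDE, 0xAD, 0xBE, 0xEF])
print(parse_rtp_header(example))
```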
So by definition sending high quality audio is real-time in the same way as VoIP; both use TCP for establishing and managing sessions. While TCP can be used for streaming delivery, I know from experience it usually is not, but I cannot comment for AirPlay, so the specific port would need to be monitored. However, if the previous quote is correct then it is UDP anyway for AirPlay.
I think DLNA is pretty comparable, where HTTP is used for establishing and controlling/managing, as I know the Digital Media Players must use UDP for streamed MPEG/H.264/etc.
Also it seems that UPnP uses HTTP over UDP for some aspects.
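The "HTTP over UDP" part is the SSDP discovery step in UPnP: an HTTP-formatted M-SEARCH request sent by UDP multicast to 239.255.255.250:1900, with responses coming back as HTTP headers over UDP as well. Here is a minimal sketch; the SSDP values are the standard ones, but treat the code itself as illustrative.

```python
# Minimal sketch of UPnP's "HTTP over UDP": SSDP discovery sends an
# HTTP-formatted M-SEARCH request via UDP multicast to
# 239.255.255.250:1900 and listens for HTTP-formatted responses.
import socket

MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: ssdp:all\r\n"
    "\r\n"
)

def discover(timeout: float = 3.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))
    try:
        while True:
            data, addr = sock.recvfrom(65535)
            # Responses are plain HTTP headers carried over UDP
            print(addr, data.split(b"\r\n", 1)[0].decode(errors="replace"))
    except socket.timeout:
        pass
    finally:
        sock.close()

if __name__ == "__main__":
    discover()
```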
Need to log off for now unfortunately, but hopefully there is enough there for chatting.
Thanks
Orb