AppleTV X - I am playing with something new

@Xymox I haven't personally compared 4K vs HD in the last year, and I know Apple has revised their system considerably in that time.

Do you still prefer HD over 4K with the current ATV OS?
 
A TV locks to an HDMI signal, and so does a surround processor. The HDMI source's clock and jitter propagate into the rest of the TV and surround decoding, so the HDMI source device is, if you will, your master clock. Feeding a super clean HDMI signal downstream means a lot of those systems end up with less jitter.

There is a LOT more going on besides HDMI signal cleanup. On an ATVX the CPU, RAM, HDMI chip and Ethernet are all dejittered, and the power supply to each subsystem is vastly cleaner and more tightly regulated. This results in much cleaner clocking of data through the whole system. One measurable, visible result is video frame decoding jitter: each frame sent out is more periodic, and each decoded frame ends up being very evenly spaced in time.
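To make the frame-timing point concrete, here is a minimal sketch (Python, with made-up timestamp values standing in for a hypothetical HDMI analyzer capture) of how one might quantify frame-pacing jitter from decoded-frame output times. It illustrates the measurement idea only; it is not the instrumentation actually used on the ATVX.

```python
import statistics

def frame_pacing_jitter(timestamps_ms):
    """Return (mean interval, standard deviation) of frame-to-frame intervals.

    A perfectly periodic 23.976 fps output would show ~41.71 ms intervals with
    near-zero deviation; a larger deviation means more frame-pacing jitter."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return statistics.mean(intervals), statistics.pstdev(intervals)

# Hypothetical frame output times (ms) for two sources:
noisy  = [0.0, 41.2, 83.9, 125.0, 167.6, 208.4, 250.9]
steady = [0.0, 41.7, 83.4, 125.1, 166.8, 208.5, 250.2]
print(frame_pacing_jitter(noisy))   # wider spread -> more jitter
print(frame_pacing_jitter(steady))  # tighter spread -> cleaner frame pacing
```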

If you're using a surround receiver/decoder, you can plug the ATVX directly into it. If you're doing surround, this is a vastly better way to decode sound than using the TV's optical or eARC output. eARC is a whole additional set of issues and a whole additional HDMI link that degrades sound versus hooking the ATVX directly to the receiver.

You gain a lot of control over the source material. For example, there was an issue with some Amazon series content, and I was able to switch the ATVX to 4K/23.98, which mostly fixed it.

You can choose SDR/HDR/DV, or even things like 2K vs 4K.

So having a separate streaming source device has a lot of advantages over a built-in app AND will have better picture and sound, for a list of reasons only some of which I covered.

An app running inside a TV has a lot going on. The CPU in a TV is doing other things besides just streaming, so those interrupts cause unavoidable jitter in various aspects of the process. The TV is a terrible electrical environment: the power supply rails are shared with other systems and are noisy and poorly regulated, and the RF noise means the data bus and other linked systems pick up noise which ends up as jitter. The ATV is also a much more powerful hardware platform than what you have in a TV, so apps can do more demanding things; for example, I can play 700 Mbps video streams off a local server. So TV apps are a lot less suited to high-performance use.
This is incredibly helpful and very insightful. I overlooked a few obvious details, but learned a bunch just from this. Thank you for the info and I look forward to purchasing one soon!
 
@Xymox I haven't personally compared 4K vs HD in the last year, and I know Apple has revised their system considerably in that time.

Do you still prefer HD over 4K with the current ATV OS?

You mean HDR vs SDR? Or actual HD (2K) vs 4K, or even 8K?

Both are good questions..

The HDR vs SDR vs Dolby Vision question is quite topical as it turns out..

I recently redid part of a section of my lab/production area. I had been using a computer monitor and a 4-port HDMI switch to check incoming ATVs and do burn-in once I mod them. The computer monitor was fine because I don't judge picture on that setup; I just need to see them working, update firmware, etc. I got a Sony 42" A90J and swapped out the computer monitor. After setting up the unit, the ATV asked if I wanted to "Try out Dolby Vision".. OK, OK, sure... mainly just to test the new 4-port HDMI 2.1 switch. So it flipped into Dolby Vision mode. This looked surprisingly good. I have another A90J with older firmware; I factory reset it and looked at DV on it. Sure enough, the newer firmware looked better. I updated its firmware and yep, it looked better.

So: factory reset on the TV, then the ATVX to Dolby Vision. This defaults to Dolby Vision "bright", which in my lab makes sense. After a lot of playing with SDR and DV, DV turned out to be reasonable, at least so far on the material I have tried. This is with the ATVX set to match frame rate but with match dynamic range OFF, meaning it's always in Dolby Vision. I made minor changes to the A90J, the normal stuff like motion settings.

SDR can still look better, but it's gotten a lot closer. It's also likely that the 18.x Apple firmware updated DV too. Dolby Vision can make things look a bit unnatural: colors look more "cartoonish" rather than natural. Dolby is doing some gamma manipulation to make it more appealing to the eye, even though that is not the way it ACTUALLY looks. So if you're a post-production person keeping tone maps out of the picture, since each one is different for every TV make and model, then accurate SDR beats pretty DV, and SDR is still best. But it's much closer now and many people might actually prefer DV. At least on the Sony A90J/K.

It's getting harder and harder to stay in SDR. DirecTV 4K channels, for example, REQUIRE HDR/DV, not just 4K. Some other apps do this as well. It's sorta like: OK, open-reel tape is better, but it's impractical.

A quick refresher: HDR/DV does not carry any more data; it's the same bitstream. HDR and DV take the SDR data and use metadata to spread the existing data out over a bigger brightness range. However, there is no such thing as a truly HDR TV; the technology can't do that yet, as it requires far more contrast range than any current display can deliver. The TV takes the HDR/DV metadata and then uses a tone map specific to that set to guess where best to map the brightness. There is no standard for tone maps, so this is all over the place. TVs today are effectively SDR devices, so giving them straight SDR, without the SDR > HDR/DV > tone map > TV math and conversions, can produce really good pictures.
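As a rough illustration of what a tone map does, here is a minimal sketch in Python of a generic soft-knee roll-off that squeezes mastered HDR luminance into a display's real peak brightness. This is a hedged, generic example with made-up parameters (display_peak, knee); the actual Dolby Vision and per-manufacturer tone maps are proprietary and far more sophisticated.

```python
import math

def soft_knee_tonemap(nits, display_peak=700.0, knee=0.75):
    """Map mastered luminance (nits) into a display's real range.

    Values below knee*display_peak pass through unchanged; everything brighter
    is compressed into the remaining headroom with a smooth roll-off. Two TVs
    with different peaks (and different curves) will render the same HDR/DV
    stream differently, which is why results are all over the place."""
    knee_point = knee * display_peak
    if nits <= knee_point:
        return nits
    headroom = display_peak - knee_point
    return knee_point + headroom * (1.0 - math.exp(-(nits - knee_point) / headroom))

for scene_nits in (100, 500, 1000, 4000):
    print(scene_nits, "->", round(soft_knee_tonemap(scene_nits), 1))
```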

So for the purest picture, seeing it the way it was encoded without extra processing, 4K SDR is the way to go.

DV is eye-catching but not accurate, will vary by make and model of TV, and on a badly mastered HDR/DV movie or show can look horrendous, so dark you can't even see it.

So for reference work on video material, SDR. For casual viewing that is not accurate but eye-pleasing, DV.

___________

2K HD vs 4K

I know there are people out there doing HD who swear it's way better than 4K. While that sounds crazy, it's not. Your bitstream from a service is going to be about the same for both, because content owners are hyper-focused on costs, and bandwidth to stream content is one of the biggest costs. Mbps = $/sec. What REALLY controls picture quality is the amount of compression.

A 4K picture is 8,294,400 pixels in monochrome, at a minimum of 8 bits per pixel: 66,355,200 bits per frame. At 24 frames per second that's 1,592,524,800 bits per second, about 1.6 Gbps, without color. 4:2:2 chroma will at least double this. So uncompressed 4K is about 3 Gbps, conservatively.
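The same arithmetic written out as a quick back-of-the-envelope script (Python; the 30 Mbps delivered bitrate is the rough figure used a couple of paragraphs below):

```python
width, height = 3840, 2160       # UHD "4K" frame
bits_per_sample = 8              # minimum bit depth, luma only
fps = 24

luma_bits_per_frame = width * height * bits_per_sample   # 66,355,200
luma_bps = luma_bits_per_frame * fps                      # ~1.59 Gbps, no color
uncompressed_bps = luma_bps * 2                           # 4:2:2 chroma roughly doubles it

delivered_bps = 30e6                                      # typical streamed bitrate
print(f"uncompressed ~{uncompressed_bps / 1e9:.1f} Gbps")
print(f"compression ratio ~{uncompressed_bps / delivered_bps:.0f}:1")
```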

So compression starts right at the camera. Then more compression for storage, then cascading compression at various stages along the way.

Final playout to your device averages around 30 Mbps. That roughly 100:1 compression takes its toll, and the cascading decompression and recompression takes its toll too.

So the pipe at the end, the stream to you, is REALLY important to quality, and it's not going to get bigger because the costs climb so quickly with bitrate.

OK, if our pipe into the house is fixed at 30-50 Mbps, is it better to have fewer pixels and less compression, or more pixels and more compression? There is a good argument that keeping compression lower beats having more pixels.
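One way to see the trade-off is bits per pixel: at the same delivered bitrate, a 2K encode has roughly four times as many bits to spend on each pixel as a 4K encode. A minimal sketch (Python, assuming the 30 Mbps figure above):

```python
def avg_bits_per_pixel(bitrate_bps, width, height, fps=24):
    """Average encoded bits available per pixel per frame at a given stream bitrate."""
    return bitrate_bps / (width * height * fps)

stream = 30e6  # same 30 Mbps pipe for both resolutions
print("4K:", round(avg_bits_per_pixel(stream, 3840, 2160), 3), "bits/pixel")  # ~0.15
print("2K:", round(avg_bits_per_pixel(stream, 1920, 1080), 3), "bits/pixel")  # ~0.60
```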

Compression works partly by lowering the resolution of things in motion; that's just one aspect of compression, but a big one. So a 4K picture with a lot of motion drops in effective resolution because of compression. In fact, VERY little actual material is REALLY full 4K. Studios have figured out that people can't really see 4K, and they can still say it's in 4K even though after compression it's mostly 2K or less.

So 4K is pretty much never actually 4K.

So 2K, HD, needs less compression to fit through the same size pipe. With lower compression it can look pretty darn good, and its actual resolution stays closer to a true 2K.

So watching things in 2K is not as crazy as it sounds...

BUT.....

What the TV does with 2K/HD becomes critical: upconverting and scaling. There are a lot of patents on this, and it costs money for a TV maker to do it well. LG spends little on this and their upconversion looks terrible; you need a Lumagen to play good 2K/HD on an LG. A Sony 4K, however, can look really good, as they spent the money and engineering on upconversion. I can easily get someone to go "OMFG, that's the best picture I have ever seen" using a highly modded Oppo playing 1080p Blu-ray on a Sony GTZ380. They all look shocked when I tell them it's 1080 HD. So watching things in 2K vs 4K COULD make sense and even be better IF your TV handles upconversion well. It varies.
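For a sense of how much the scaling algorithm alone matters, here is a minimal sketch using Pillow to upconvert a hypothetical 1080p frame grab (the filename is made up) two different ways; a TV's internal scaler is making a far more elaborate version of this same choice in real time.

```python
from PIL import Image  # pip install Pillow

src = Image.open("frame_1080p.png")          # hypothetical 1920x1080 frame grab
target = (3840, 2160)

crude  = src.resize(target, Image.NEAREST)   # cheapest possible upconversion: blocky edges
better = src.resize(target, Image.LANCZOS)   # windowed-sinc resampling: noticeably cleaner

crude.save("upscaled_nearest.png")
better.save("upscaled_lanczos.png")
```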

On an ATVX you can go to video settings, pick any resolution, and try these things out. The list is incredibly long, and some of the more esoteric options are way down the list.

_______



There is another way to watch video. But I am fairly insane on this, and it's extremely impractical.

I have an insanely modded Panasonic ST60 plasma that uses boards from a VT and has a ton of really extreme mods. It's so modded it's illegal, because it radiates RF so intensely it's effectively a 200-watt radio jammer. No joke: I knock out AM radio for a one-block radius with it on. But OH MAN, the picture is insanely good.

While plasma has serious limitations like bit depth (some banding) and flicker, because it REALLY displays the frames as cleanly defined flashes, like film through a shutter, its other qualities are just stunning.

This was before a TV could really process a video signal, so it just takes the pixels and displays them. It's 2K, so I can feed it a 2K signal and it matches the panel's pixels exactly, no scaling required. 4K gets downconverted, which is MUCH easier and better quality than upconversion.

I have done countless blind A/Bs for people, including lots of my SMPTE friends who are HIGHLY technical and VERY versed in picture quality, and EVERYBODY prefers the plasma over an OLED. Well, with a caveat: they all complain, of course, about some banding from the bit-starved display panel. BUT after doing an A/B, they still prefer the plasma.

So while I have a GTZ380 and a few generations of Sony OLED handy, the best picture comes from a highly modded plasma that doubles as a radio jammer.

I like using the plasma for development work because I can see subtle picture differences on it that I can't see on a 4K device.
 
Thank you for that! This may be one of the "top 10" posts I've ever read about HT video quality.

I must have misinterpreted something you wrote previously, and it may not even have been in these forums. Personally I don't like HDR because it's so stinkin' bright it hurts my eyes. But... I could've sworn you advocated for HD rather than 4K, and I've actually been using that on my ATV-X. Call me oblivious, but I've been quite happy with it.

I guess I'll try some more experiments...
 
So: factory reset on the TV, then the ATVX to Dolby Vision. This defaults to Dolby Vision "bright", which in my lab makes sense. After a lot of playing with SDR and DV, DV turned out to be reasonable, at least so far on the material I have tried. This is with the ATVX set to match frame rate but with match dynamic range OFF, meaning it's always in Dolby Vision. I made minor changes to the A90J, the normal stuff like motion settings.
But if you leave it on Dolby Vision always, then it's converting non-Dolby Vision HDR and also SDR video to Dolby Vision. Isn't this BAD, picture-quality-wise?
 
But if you leave it on Dolby Vision always, then it's converting non-Dolby Vision HDR and also SDR video to Dolby Vision. Isn't this BAD, picture-quality-wise?
You would think so, huh? BUT in *theory* Dolby could just pass the SDR through to the SDR TV.

I watched some Pluto TV: Star Trek: The Original Series, Mission: Impossible, other very SDR and 2K things. They looked OK. Was SDR better? For me, yes, but after 25 years of picture evaluation in front of the best displays and picture sources ever made, I am jaded and can find LOTS of faults with ANY picture. Well... a good print of Baraka in 70mm on a small screen with the right glass and xenon, that is pretty damn good. I suppose carbon arc would make it better, but I doubt that will ever happen. I am a fan of spectral reproduction, so I think all these displays based on tristimulus color are lacking in ways we don't have good science for yet. And the razor-thin spectra of LED/laser sources are, I feel, a long way from real spectral reproduction. IMHO. So I think ALL image reproduction is lacking, and maybe somewhere in the future we might figure out how to do full-spectrum reproduction, not just razor-thin tristimulus.

So far, for me, 70mm film, all made and processed decades ago, just leaves all electronic reproduction in the dust.

IMHO...
 
Switch Xs getting ready to ship. I run 20 TERAbytes through them before they leave.

I'm thinking this is my next system move. Do you have any APs available to go with them? I assume Bill at GT Audio is still the right contact for a purchase...

Edited to add: I believe we have to source our own modules for the SFP ports, correct? What's the current "preferred" version and are there any recommended suppliers?
 
I'm thinking this is my next system move. Do you have any APs available to go with them? I assume Bill at GT Audio is still the right contact for a purchase...

Edited to add: I believe we have to source our own modules for the SFP ports, correct? What's the current "preferred" version and are there any recommended suppliers?
Bill can give you all of this info.
 