Achieving the optimal video output settings on the Xbox One has been a topic of debate for quite some time, with controversy raging across various internet forums over which settings ensure the highest image quality. Thanks to the use of signal-capturing devices, the debate has finally been settled.
The Xbox One console should be set to output at an 8-bit color depth. This probably seems counterintuitive to those with 4K HDR TVs, whose panels support a 10-bit or higher color depth. However, it has been demonstrated that the Xbox One will still automatically output all HDR content in 10-bit or 12-bit color depth, even when the console’s video output settings are set to 8-bit. The advantage of setting the Xbox One’s color depth to 8-bit is simple: the console renders all non-HDR (SDR) content in 8-bit, and setting it to output in 10-bit or 12-bit degrades the image quality of SDR content by introducing additional, inherently flawed video processing.
HDMI 2.0 is only capable of transmitting data at 18 Gbps. This constraint does not allow the Xbox One to output full 10-bit or 12-bit 4K 60 Hz video in YCC 4:4:4. As such, the console is only capable of outputting 10-bit and 12-bit 4K 60 Hz video in YCC 4:2:0 or 4:2:2. All SDR games render on your Xbox One in 8-bit, a signal that can be transmitted in full to an HDMI 2.0 TV at 60 Hz in 4K. This is one of the reasons why setting the Xbox One’s color depth to 10-bit or 12-bit causes SDR games to lose color accuracy: it forces the video to output in YCC 4:2:0 or 4:2:2.
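The arithmetic behind that 18 Gbps ceiling can be sketched with a rough back-of-the-envelope calculation. The figures below assume the standard CTA-861 timing for 4K at 60 Hz (4,400 × 2,250 total pixels including blanking, giving a 594 MHz pixel clock) and HDMI’s TMDS encoding, which carries each 24-bit pixel as three 10-bit characters:

```python
# Rough HDMI 2.0 bandwidth check for 4K 60 Hz signals.
# Assumes the standard CTA-861 timing for 3840x2160 @ 60 Hz:
# 4400 x 2250 total pixels including blanking -> 594 MHz pixel clock.

PIXEL_CLOCK = 4400 * 2250 * 60   # 594,000,000 pixels per second
HDMI_2_0_LIMIT = 18e9            # 18 Gbps total TMDS link rate

def tmds_bit_rate(bits_per_pixel):
    """Total TMDS link rate in bits/s: the clock scales with bpp/24,
    and each pixel is carried as three 10-bit TMDS characters."""
    return PIXEL_CLOCK * (bits_per_pixel / 24) * 30

# HDMI carries 4:2:2 (up to 12-bit) in a fixed 24-bit container, and
# 4:2:0 at 4K 60 Hz runs at a proportionally lower clock, which is why
# those formats squeeze under the limit while 10-bit 4:4:4 does not.
formats = [
    ("8-bit RGB / 4:4:4", 24),
    ("10-bit 4:4:4", 30),
    ("12-bit 4:4:4", 36),
    ("12-bit 4:2:2 (fixed 24-bit container)", 24),
    ("10-bit 4:2:0 (15 bpp average, lower clock)", 15),
]
for label, bpp in formats:
    rate = tmds_bit_rate(bpp)
    verdict = "fits" if rate <= HDMI_2_0_LIMIT else "exceeds HDMI 2.0"
    print(f"{label}: {rate / 1e9:.2f} Gbps -> {verdict}")
```

An 8-bit 4:4:4 signal lands at 17.82 Gbps, just under the 18 Gbps cap, while 10-bit 4:4:4 needs 22.28 Gbps, which is exactly why the console has to fall back to 4:2:2 or 4:2:0 for deeper color at 4K 60 Hz.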
Another video output option you’ll want to enable while playing games in HDR on your Xbox One is “allow YCC 4:2:2.” This will force your Xbox One to output 10-bit and 12-bit HDR content at YCC 4:2:2 instead of 4:2:0. Disregard Microsoft’s suggestion that you should only enable this feature if your console is having problems.
Although Ultra HD Blu-rays are encoded in YCC 4:2:0, setting the system to output at YCC 4:2:2 will not negatively affect the image quality of the video. As Daniel Bastian correctly pointed out in the comment section, YCC 4:2:2 should be enabled at all times if your TV supports it. But remember, even if you’re only using your Xbox One to watch movies, you should still set the color depth to 8-bit. This will ensure the best video quality possible for your collection of SDR DVDs and Blu-rays.
YCC 4:2:0 and 4:2:2 refer to chroma subsampling. Basically, all you need to know about chroma subsampling is that YCC 4:2:2 is superior to 4:2:0. While YCC 4:4:4 would be ideal, the signal cannot be transmitted via HDMI 2.0 due to the limitations of the technology. The Xbox One console renders SDR games in 8-bit RGB, not YCC, and the RGB signal does not employ chroma subsampling. HDMI 2.0 is capable of transmitting the 8-bit RGB signal in 4K at 60 Hz, which is exactly why you should set your Xbox One to output in an 8-bit color depth. Setting your Xbox One’s color depth to 10-bit or 12-bit won’t render SDR games in a higher color depth; it will simply force the console to output at a higher color depth by introducing additional video processing that will negatively affect the image.
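To make the notation concrete, here is a small sketch of what the J:a:b numbers mean. They describe a J-pixel-wide, two-row block of pixels: “a” chroma samples are kept in the top row and “b” in the bottom row, while luma (brightness) is never subsampled:

```python
# Average samples stored per pixel for each chroma subsampling scheme.
# In J:a:b notation (over a J-wide, 2-row block of pixels): 'a' chroma
# samples in the first row, 'b' in the second; luma is full resolution.

def samples_per_pixel(j, a, b):
    luma = 1.0                       # one luma (Y) sample per pixel
    chroma = 2 * (a + b) / (2 * j)   # two chroma planes (Cb and Cr)
    return luma + chroma

print(samples_per_pixel(4, 4, 4))  # 4:4:4 -> 3.0 (no subsampling)
print(samples_per_pixel(4, 2, 2))  # 4:2:2 -> 2.0 (2/3 the data of 4:4:4)
print(samples_per_pixel(4, 2, 0))  # 4:2:0 -> 1.5 (half the data of 4:4:4)
```

In other words, 4:2:2 throws away half the color detail and 4:2:0 throws away three quarters of it, which is why 4:2:2 sits between 4:2:0 and full 4:4:4 in quality.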
Realistically, most people would never notice the difference. Then again, if you’ve already purchased a 4K HDR television and an Xbox One, why not get the most out of them? It’s also worth noting that not all games look best with HDR enabled. Some games that “support” HDR do not actually render in HDR. Red Dead Redemption 2, for example, does not properly render in HDR.
While playing games like Red Dead Redemption 2, which do not accurately render in HDR, you will need to disable HDR support in the Xbox One’s settings. Even when playing games that offer the option of disabling HDR through their in-game settings, it’s best to disable it via the console to prevent any conflicts between the game and the system. It’s fairly common for games to render improperly in HDR, so it’s worth researching each of your games to ensure they will render correctly.
Let’s explore the concept of HDR a little further. The 10 in HDR10 refers to 10-bit color. While it is possible to render SDR content above an 8-bit color depth, the Xbox One does not. The Xbox One will only render above the 8-bit color depth when outputting in HDR10 or Dolby Vision. Dolby Vision supports up to a 12-bit color depth, although few 4K TVs support this feature. Remember, your Xbox One will still automatically output HDR content properly in HDR10 or Dolby Vision when those options are enabled, even though the console’s color depth is set to 8-bit.
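For a sense of scale, each extra bit doubles the number of shades per channel, and the shade count is cubed across the red, green, and blue channels to get the total color count:

```python
# Shades per channel and total colors at each common bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits      # shades per channel: doubles with each bit
    colors = levels ** 3    # combinations across three color channels
    print(f"{bits}-bit: {levels:,} levels per channel, {colors:,} colors")
```

That works out to roughly 16.8 million colors for 8-bit SDR, about 1.07 billion for 10-bit HDR10, and about 68.7 billion for 12-bit Dolby Vision, which is where the extra gradation headroom of HDR comes from.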
The Xbox One X will support HDMI 2.1, and Belkin recently released an HDMI cable capable of transmitting at 48 Gbps. Unfortunately, these new cables won’t allow HDMI 2.0 televisions to receive 60 Hz 4K 10-bit or 12-bit YCC 4:4:4 video signals, due to the limitations of HDMI 2.0 technology. New TVs featuring HDMI 2.1 are coming soon, but it’ll likely be at least a year before the Xbox One X receives an update allowing it to transmit 10-bit and 12-bit YCC 4:4:4 4K video at 60 Hz and above to these new devices. That said, the Belkin 48 Gbps HDMI cables are allowing Apple TV users to transmit 4K 10-bit YCC 4:2:2 video at 60 Hz, as the Apple TV has a strange defect that renders 18 Gbps HDMI cables insufficient for transmitting the high-bandwidth signal.
The above video is one of the best on YouTube at explaining these concepts. While this article currently provides the best advice for achieving optimal video quality from your Xbox One, we will update this content to reflect any future changes to the console’s settings by Microsoft. If you have any questions or comments about the information presented here, be sure to let us know in the comment section below.