It will downgrade the image to 8-bit 4:2:0 HDR instead of the full 10-bit 4:4:4 RGB. That's how you get 4K/120.
Admittedly it looks OK if you aren't aware the colors are being downgraded, but once you notice the color detail being lost, you can't unsee it.
Since you have a C4, you can press the green button on the remote about 10 times in a row and it will show the signal info.
You should see something like RGB420 8b along with some other numbers.
At full bandwidth it would say RGB444 10b HDR.
I ran some new tests, taking the Linux PC (RX 7600) from my office to my living room and connecting it to the LG C4:
CachyOS, RX 7600
- 120.0Hz
- FIXED (that is, VRR was not active, even though it was selected in the settings).
- 3840 x 2160P@120
- YCBCR420 8b TM HDR10
I asked Grok for details about this:
YCBCR420 - At 4K + HDR10, the RX 7600 can only send 4:4:4/RGB 10-bit up to about 98 Hz. Above that (100–120 Hz) the AMD driver automatically forces 4:2:0 to fit within the HDMI 2.1 bandwidth (48 Gbps).
8b - When forcing 4:2:0 chroma subsampling to reach 120 Hz with HDR, AMD also limits output to 8 bits + FRC (dithering) to save even more bandwidth. That's why it reports 8b and not 10b. In practice, with LG's well-implemented FRC and WOLED panel, the visual difference from native 10-bit is almost zero.
TM - LG's active tone mapping (probably HDR Game or HGiG).
HDR10 - Static HDR10 working normally.
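Grok's bandwidth claim is easy to sanity-check with some rough arithmetic. This is my own back-of-the-envelope sketch: it counts active pixels only and ignores blanking intervals and FRL line-coding overhead, so real link requirements are noticeably higher than these numbers.

```python
# Rough uncompressed video data rates at 3840x2160 @ 120 Hz.
# Active pixels only -- real HDMI signals add blanking and FRL
# encoding overhead on top of this.

def data_rate_gbps(width, height, fps, bits_per_channel, chroma):
    # Bits per pixel: first channel at full resolution,
    # chroma channels reduced by the subsampling scheme.
    subsample = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    bpp = bits_per_channel * subsample
    return width * height * fps * bpp / 1e9

full = data_rate_gbps(3840, 2160, 120, 10, "4:4:4")   # what we'd want
forced = data_rate_gbps(3840, 2160, 120, 8, "4:2:0")  # what AMD sends

print(f"10-bit 4:4:4: {full:.2f} Gbps")   # ~29.86 Gbps before overhead
print(f" 8-bit 4:2:0: {forced:.2f} Gbps") # ~11.94 Gbps
```

Note that dropping to 8-bit 4:2:0 cuts the raw video payload by more than half; it is small enough that the mode even fits within HDMI 2.0's 18 Gbps TMDS limit, which is why it's the usual fallback when a link can't negotiate full bandwidth.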
Out of curiosity, I decided to create a partition on my gaming PC that's connected to the TV and installed CachyOS. This PC has a 3060 Ti.
Windows, RTX 3060 Ti
- 118.80Hz~120Hz
- GSYNC
- 3840 x 2160P@120
- YCBCR444 10b 4L12 HDR10
YCBCR444 - Chroma subsampling 4:4:4 → maximum color quality and super sharp text (same as RGB). NVIDIA has a more efficient HDMI 2.1 controller and optimizes bandwidth better.
10b - 10 native bits per channel (1.07 billion colors) → perfect gradients in HDR. AMD forces 8b + FRC at 120 Hz HDR to fit within the bandwidth.
4L12 - 4 lanes at 12 Gbps each → using the full 48 Gbps of HDMI 2.1. This confirms the cable and port are running at the standard's maximum.
HDR10 - Static HDR10 working normally.
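The 4Lxx figure in the TV's readout is just lanes × per-lane rate, so you can decode it mentally. A tiny helper (the function and its name are my own, purely illustrative) shows the arithmetic:

```python
# Decode LG's "<lanes>L<rate>" link readout into total HDMI FRL
# bandwidth in Gbps. (Helper is my own; the TV just shows e.g. "4L12".)

def frl_bandwidth_gbps(readout: str) -> int:
    lanes, rate = readout.split("L")
    return int(lanes) * int(rate)

print(frl_bandwidth_gbps("4L12"))  # 48 -> full HDMI 2.1 link
print(frl_bandwidth_gbps("4L10"))  # 40 -> the rate seen on Linux below
```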
CachyOS, RTX 3060 Ti:
- 39.59Hz~120Hz (The number varied considerably depending on whether I opened a window or moved the mouse.)
- GSYNC
- 3840 x 2160P@120
- RGB 10b 4L10 HDR10
RGB - Full RGB color format (equivalent to 4:4:4) → maximum possible text and color sharpness. On Linux, the NVIDIA driver currently does not force YCbCr at 4K 120 Hz HDR as it does on Windows. It sends native RGB even at 120 Hz.
10b - 10 native bits per channel → True HDR with perfect gradients. Same thing: the Linux driver allows true 10-bit at 120 Hz without downgrades.
4L10 - 4 lanes at 10 Gbps each → using 40 Gbps (HDMI 2.1 FRL 5). The Linux driver is running at a slightly lower link rate than Windows' 12 Gbps per lane (4L12), but there is still enough bandwidth for 10-bit 120 Hz RGB.
HDR10 - Static HDR10 working normally.
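Whether a 40 Gbps link really fits 10-bit RGB at 4K 120 Hz is worth a quick margin check. Again a rough sketch of my own: only active video is counted, and the remaining headroom is what blanking intervals and FRL line coding have to fit into.

```python
# Sanity check: does a 40 Gbps (4L10 / FRL 5) link carry
# 3840x2160 @ 120 Hz RGB 10-bit? Rough figures: blanking and
# FRL coding overhead eat into the margin shown here.

active_gbps = 3840 * 2160 * 120 * 30 / 1e9  # 10 bpc x 3 channels
link_gbps = 4 * 10                          # 4 lanes x 10 Gbps

print(f"active video: {active_gbps:.2f} Gbps")  # ~29.86
print(f"link budget:  {link_gbps} Gbps")
print(f"margin:       {link_gbps - active_gbps:.2f} Gbps")
```

The raw payload leaves roughly 10 Gbps of headroom, which is consistent with the TV reporting the mode as working at 4L10.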
Initially, the difference in image quality isn't that noticeable. I took some photos with my cell phone and zoomed in, and I could see a slight difference: in 10-bit, the colors are more saturated.
I took the opportunity to run some benchmarks in Shadow of the Tomb Raider to see how the NVIDIA card was performing (and because the game has native HDR). I don't know if it was a configuration issue, but the HDR output was dim on Linux, which I found strange; on the AMD GPU it was very bright. In SDR mode (testing another simple game, Marlow Briggs), KDE was doing tone mapping and the colors were more vibrant. I assume it's some kind of gamescope bug.
Performance in games, as usual, is very different.
Playing in 4K, everything on ultra, RT enabled, DLSS Performance
Windows - 56 fps
CachyOS - 11 fps
I tested another version of Proton and switched from the open-source driver to the proprietary one, but performance remained poor.
When I disabled RT, the average performance was 53 fps.
Out of curiosity, I also tested RT on the RX 7600 without upscaling: it averaged 8 fps, and around 25 fps without RT.
I also tested RPCS3 with God of War 3, upscaling the resolution to 4K. I have a 5700X3D.
Windows: 39 fps
CachyOS: 36.65 fps
Although the numbers fluctuate quite a bit, I consider it a tie. A few months ago, Windows performance was far behind Linux, but it seems the developers have fixed the problem with the Windows build.