AMD GPUs may FINALLY have a legit HDMI 2.1 workaround in Linux. Must read if you plan on building a SteamOS living room PC connected to a TV

If you have an nvidia GPU, you don't have this issue. You have other issues. Also, if this is on a desktop, it doesn't matter, since AMD can just use DisplayPort.

Long story short: if you hook up a Linux PC with an AMD GPU over HDMI (like to an OLED TV), you are limited to HDMI 2.0 speeds, meaning you can get 4K/120, BUT you are limited to 8-bit 4:2:0 color instead of full 10-bit 4:4:4 at 144Hz. This is due to AMD's drivers being open source on Linux and the HDMI Forum refusing to allow the HDMI 2.1 spec in an open source implementation. Whether you agree or not, that is the situation. I blame the HDMI Forum and would love to see the Linux community file a lawsuit...but whatever.
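For anyone who wants to sanity check that claim, here's the back-of-envelope math (my own sketch: it assumes the standard CTA 4K timing of 4400x2250 total pixels including blanking, plus TMDS's 10/8 encoding overhead; HDMI 2.1 actually uses the more efficient FRL encoding, so the 4:4:4 figure is only ballpark):

```python
# Rough HDMI link-rate check (sketch; constants are assumptions, not spec text).
TOTAL_PIXELS = 4400 * 2250   # 3840x2160 active + CTA blanking
TMDS = 10 / 8                # TMDS carries 8 data bits in every 10 link bits

def link_gbps(refresh_hz, bits_per_channel, chroma):
    # 4:4:4/RGB sends 3 full channels per pixel; 4:2:0 averages half of
    # that (full-rate Y, quarter-rate Cb and Cr).
    bpp = bits_per_channel * (3.0 if chroma == "444" else 1.5)
    return TOTAL_PIXELS * refresh_hz * bpp * TMDS / 1e9

print(f"4K/120  8-bit 4:2:0: {link_gbps(120,  8, '420'):.1f} Gbps")  # ~17.8, just under 18
print(f"4K/120 10-bit 4:4:4: {link_gbps(120, 10, '444'):.1f} Gbps")  # ~44.6, HDMI 2.1 territory
```

So 8-bit 4:2:0 squeaks in under the 18 Gbps cap, and the full-fat modes are simply out of reach without HDMI 2.1.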

To be fair, most people could run an AMD GPU under those bandwidth limitations and it would still look fine. I, myself, didn't notice the degraded color until I was made aware of the limitations. Sadly, once I noticed it, I couldn't unnotice it... :-(

The workarounds people tried were DP to HDMI cables that required custom firmware. They would sometimes work, but often lost HDR, VRR, or both. It just wasn't stable or reliable.

However, it appears that UGREEN, which gets advertised on a lot of YouTube tech channels, might finally have the perfect cable.



I know we're gonna get the "fuck linux" and "fuck amd" users coming into this thread to assure us that Windows will rule the rest of our lives and that nvidia will always be the winner...please just fuck off and be grateful that other options are out there. Linux still isn't the perfect gaming setup, but it is in a state where most newcomers can use it just fine. I've installed it for several novice PC gamers and they love the ease and simplicity.

If this ends up being a legit cable that can get 144Hz (the LG OLED C4 and higher support 144Hz) on an AMD GPU with no bandwidth limitations, I will sell my 5070 Ti, buy a 9070 XT, and install Bazzite on my living room PC.

I felt that this was worth sharing. HDMI 2.1 can legit become an Achilles heel for AMD, and that issue needs to get resolved, as nvidia is not that far off from having a viable living room Linux experience.

I have no idea when this will be available in the US, but this is absolutely something to keep an eye on.
 
Great news for people thinking about getting a SteamBox, because this issue would have almost certainly impacted them.
Still, getting the issue resolved with the HDMI Forum is far and away the most desirable option.

It still requires quite a bit of tweaking and user work to get it working properly, but in time it will hopefully become a non-issue.

I was using the Cable Matters DP to HDMI 2.1 cable, and while I was able to get it to work, I was not able to keep it working, and I got tired of having to fiddle with it over and over.

AMD really needs to get in gear and find some sort of workaround.

The biggest leg up AMD has on nvidia is Linux performance. Nvidia is slowly starting to take this issue seriously, which is a VERY good thing for those of us fed up with Windows.
 
You can't install SteamOS on anything other than a Steam Deck or, presumably, the upcoming GabeCube.

If you were building a living room gaming PC and wanted to install some flavor of Linux out of the thousands of variants out there, it might be Bazzite or something, but it won't be SteamOS.
 
You can't install SteamOS on anything other than a Steam Deck or, presumably, the upcoming GabeCube.

If you were building a living room gaming PC and wanted to install some flavor of Linux out of the thousands of variants out there, it might be Bazzite or something, but it won't be SteamOS.
You actually can, but Bazzite is an infinitely better option.
 
The biggest leg up AMD has on nvidia is Linux performance. Nvidia is slowly starting to take this issue seriously, which is a VERY good thing for those of us fed up with Windows.
I use Nvidia with Linux and it's almost flawless at this point. I don't even notice a performance difference. Granted, I have a 5090, so I may be brute-forcing my way through it, but I can't see switching to an AMD card unless they offer something in the high-end space akin to the xx90 series from Nvidia.
 
I use Nvidia with Linux and it's almost flawless at this point. I don't even notice a performance difference. Granted, I have a 5090, so I may be brute-forcing my way through it, but I can't see switching to an AMD card unless they offer something in the high-end space akin to the xx90 series from Nvidia.
I'm getting there with my desktop PC and my 4090.

Getting there...but I am going to wait until DX12-to-Vulkan can offer comparable performance to Windows.
 
Outside of this issue that's now fixed, HDMI 2.0 does not allow 4K@120Hz; even when a lower colour bit depth and chroma subsampling are used, it's limited to 18 Gbps, which allows 4K@60Hz max.

I saw someone write this in another thread and was confused there; if you're getting 4K@120Hz out of the card, then the bandwidth of the port/cable must have exceeded 18 Gbps.
 
Outside of this issue that's now fixed, HDMI 2.0 does not allow 4K@120Hz; even when a lower colour bit depth and chroma subsampling are used, it's limited to 18 Gbps, which allows 4K@60Hz max.

I saw someone write this in another thread and was confused there; if you're getting 4K@120Hz out of the card, then the bandwidth of the port/cable must have exceeded 18 Gbps.
No, you can get 8-bit 4:2:0 4K/120 in Linux. That is right around 18 Gbps.

Believe it or not, 8-bit 4:2:0 doesn't look too bad, and most people wouldn't notice unless they know what to look for.

 
Nice! That's awesome to hear from AMD folks, as I know the conversion cables could be hit and miss on certain setups. Seeing that reddit thread show a video of the person getting HDMI CEC support to work, holding the Xbox button down and both waking the computer from sleep and turning on the TV, was a flex.

It's annoying this even needs to be done, though, and DisplayPort should've always been the TV port standard, because the HDMI Forum's license fees are among the worst of any connector.

Another bonus for me today was seeing Zen Browser add support for fractional scaling in Wayland on Linux.
 
That's the crazy thing about protocols these days. You can do everything correctly, but then some kind of copy protection (that's all this is: not certified = you're out) gets in your way and cripples your good work.

Is there any reason why HDMI is still in use instead of DisplayPort? Only backwards compatibility, as far as I know, and that can be achieved via adapters. Or can HDMI do something DP can't?
 
I don't even have a 4K TV, let alone a 120Hz HDR one, but I'm oddly still tempted. Maybe it's FOMO, or fear of a future price hike, because this is currently the one solution. So far I've found UGREEN stuff to be solid.
 
I think I'm going to pick the adapter up. I'll dual boot Bazzite and run it as my main OS connected to my LG TV.
 
How does it do that? With DSC?
No. It falls under the 18 Gbps limit of HDMI 2.0.

A device that is artificially limited to HDMI 2.0 bandwidth isn't necessarily limited to the resolution and refresh rate of HDMI 2.0. 4K/120 8-bit 4:2:0 falls under 18 Gbps. Just barely, but it does.

The Steam Machine very likely is HDMI 2.1 ready, but due to the issue, Valve has to call it a 2.0 device.
 
Keep in mind that within 6 months (or less) the performance gap between AMD and Nvidia with DirectX12 on Linux will likely be closed. Nvidia knows what is causing the issue and has promised a fix.

As I said earlier in this thread, I run Linux with an Nvidia card right now, and everything runs perfectly fine.

Unless you are thinking about a Steam Box or are absolutely beholden to AMD's GPU offerings, there's no reason not to use Nvidia with Linux now.
 
Nice! That's awesome to hear from AMD folks, as I know the conversion cables could be hit and miss on certain setups. Seeing that reddit thread show a video of the person getting HDMI CEC support to work, holding the Xbox button down and both waking the computer from sleep and turning on the TV, was a flex.

It's annoying this even needs to be done, though, and DisplayPort should've always been the TV port standard, because the HDMI Forum's license fees are among the worst of any connector.

Another bonus for me today was seeing Zen Browser add support for fractional scaling in Wayland on Linux.
HDMI made sense back in the day when people actually connected DVD players to their TV. Hollywood wanted DRM, and remember, these are TVs, not monitors.
 
I retreated from the living room TV to a desk with a gaming monitor over ten years ago. I have not been paying much attention to TV technology outside of "OLED good". I guess I figured TVs would have at least one DP port as a standard feature by now. I just searched Best Buy and filtered by TVs and Projectors with at least one DP port. Only a few projectors come up.

Not to be a conspiracy theorist, but is this an agreement between all the TV OEMs and the IP owners in some way? Do they not want DP on TVs because it would make piracy easier or something? It just seems like at least a few OEMs would include it on a few TVs as a way to differentiate their products from the competition... :pie_thinking:
 
HDMI made sense back in the day when people actually connected DVD players to their TV. Hollywood wanted DRM, and remember, these are TVs, not monitors.

Yeah, I'm just surprised it has persisted so much. DisplayPort is an open standard, and they could've baked in whatever else they wanted on top of it without paying a royalty fee just to include the connector at all.
 
Have I read this correctly that they enabled RT in their testing? Really?

Yeah, the article fails to point out the difference RT makes. Without RT, the performance difference is negligible.

"Valve claims that this shouldn't be an issue with the Steam Machine's APU, but testing suggests otherwise. In its benchmarks, Ars found that games like Returnal are playable on an RX 7600 with just 8GB of RAM at 1440p and max settings in Windows. But switch to SteamOS, and the performance is less than a third. The same issue is not present with the 16GB RX 7600 XT."
 
Going through so much hassle only to get worse performance.


I learned this recently in tests I did on both systems, where I noticed the difference in VRAM usage in Shadow of the Tomb Raider: while on Windows the game was using almost 9 GB of VRAM, on Linux it was restricted to the 8 GB of the RX 7600.

In Windows, the GPU driver is more liberal with "overcommit", meaning it allows applications to allocate buffers that exceed 8 GB of physical VRAM, using shared memory (part of the system RAM) as a fallback via PCIe.

In Linux, the behavior is stricter and more conservative: the dedicated VRAM is rigidly 8 GB. The driver allows "spillover" to GTT (Graphics Translation Table, memory mapped to system RAM), but applications need to be designed to use separate heaps (VRAM as priority, GTT as secondary).

Many apps, such as Proton, do not allow overcommit to avoid excessive allocations and prevent instability.
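If you want to watch the spillover happen on your own machine, here's a rough sketch of how I'd check it (assuming the amdgpu driver and that the dGPU is card0; the mem_info_* files are amdgpu's sysfs memory counters, reported in bytes):

```python
#!/usr/bin/env python3
# Crude VRAM/GTT monitor for amdgpu (sketch; adjust card0 if you also
# have an iGPU). Run it in a terminal while the game is running.
import time

BASE = "/sys/class/drm/card0/device"

def read_mib(name):
    with open(f"{BASE}/{name}") as f:
        return int(f.read()) / 2**20  # bytes -> MiB

while True:
    vram_used = read_mib("mem_info_vram_used")
    vram_total = read_mib("mem_info_vram_total")
    gtt_used = read_mib("mem_info_gtt_used")
    # If gtt_used climbs while vram_used sits pinned at vram_total, buffers
    # are spilling into system RAM over PCIe -- the stutter you see on 8 GB cards.
    print(f"VRAM {vram_used:7.0f}/{vram_total:.0f} MiB | GTT {gtt_used:7.0f} MiB")
    time.sleep(1)
```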

Yeah, I'm just surprised it has persisted so much. DisplayPort is an open standard, and they could've baked in whatever else they wanted on top of it without paying a royalty fee just to include the connector at all.

Because TV manufacturers are part of the HDMI Forum.
 
Keep in mind that within 6 months (or less) the performance gap between AMD and Nvidia with DirectX12 on Linux will likely be closed. Nvidia knows what is causing the issue and has promised a fix.

As I said earlier in this thread, I run Linux with an Nvidia card right now, and everything runs perfectly fine.

Unless you are thinking about a Steam Box or are absolutely beholden to AMD's GPU offerings, there's no reason not to use Nvidia with Linux now.
People prefer AMD due to its open source nature.

I have read that nvidia has actually hired many Linux developers.

It would indeed be something if nvidia swept right in and ended up taking away AMD's biggest advantage.

I do think AMD needs to get serious about overcoming this HDMI 2.1 issue.
 
I retreated from the living room TV to a desk with a gaming monitor over ten years ago. I have not been paying much attention to TV technology outside of "OLED good". I guess I figured TVs would have at least one DP port as a standard feature by now. I just searched Best Buy and filtered by TVs and Projectors with at least one DP port. Only a few projectors come up.

Not to be a conspiracy theorist, but is this an agreement between all the TV OEMs and the IP owners in some way? Do they not want DP on TVs because it would make piracy easier or something? It just seems like at least a few OEMs would include it on a few TVs as a way to differentiate their products from the competition... :pie_thinking:
DP is a feature that would be used by so few. It never made much sense to include it...until now.

However, I do feel it is starting to make sense now, as TVs include more and more gaming features. HDMI 2.2 has been announced with 96 Gbps of bandwidth.

Unless somehow 8K video really takes off, we have pretty much reached the max of what we can do in terms of video quality.

But gamers will always have a demand for higher framerates, and the HDMI Forum and TV makers know that gamers are one group they can extract more money from.

LG is one company that always seems to go first on gamer features with their OLEDs.

They were the first with HDMI 2.1. First with G-Sync. The first to go 144Hz. They should also go first by offering 4K/240 with DP 2.1.

I bought the C4 last year, but I would upgrade to that TV in a HURRY.
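Rough numbers on why 4K/240 needs those next-gen links (back-of-envelope math that ignores blanking intervals and link-encoding overhead, so the real requirement is somewhat higher):

```python
# Uncompressed 4K/240 at 10-bit RGB (30 bits per pixel), active pixels only.
raw_gbps = 3840 * 2160 * 240 * 30 / 1e9
print(f"~{raw_gbps:.0f} Gbps")  # ~60 Gbps: past HDMI 2.1's 48 Gbps without DSC,
                                # but within DP 2.1 UHBR20 (80) or HDMI 2.2 (96)
```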
 
Unless somehow 8K video really takes off, we have pretty much reached the max of what we can do in terms of video quality.

Funny, I have been doing some non-scientific testing lately. As far as I can tell, in most games, 3200x1800 seems to be the limit of my eyes for my TV. I would need the exact same scene, sitting still, side by side, to tell if it's that or 4K. Makes the thought of 8K kind of a joke for me. It would probably be pretty cool on a large monitor, though.
 
Funny, I have been doing some non-scientific testing lately. As far as I can tell, in most games, 3200x1800 seems to be the limit of my eyes for my TV. I would need the exact same scene, sitting still, side by side, to tell if it's that or 4K. Makes the thought of 8K kind of a joke for me. It would probably be pretty cool on a large monitor, though.
I'm in my 40s, I wear glasses/contacts, and I game at 4K on an 83-inch OLED. I can't see myself getting much benefit from 8K, other than bragging rights.

Sitting on my couch right now, I'd be hard-pressed to tell the difference between 1440p and 4K, and depending on the game, between 1080p and 4K.
 
Funny, I have been doing some non-scientific testing lately. As far as I can tell, in most games, 3200x1800 seems to be the limit of my eyes for my TV. I would need the exact same scene, sitting still, side by side, to tell if it's that or 4K. Makes the thought of 8K kind of a joke for me. It would probably be pretty cool on a large monitor, though.
Buying a really nice 1080p monitor was the best thing I ever did for my sanity. A mid-range PC crushes everything, and I'll make the jump to 4K when this whole DRAM thing blows over, yeah?
 
You can't install SteamOS on anything other than a Steam Deck or, presumably, the upcoming GabeCube.

If you were building a living room gaming PC and wanted to install some flavor of Linux out of the thousands of variants out there, it might be Bazzite or something, but it won't be SteamOS.
Good to know I didn't actually install it on my generic Ryzen Mini PC.

The official distro works on pretty much any PC with a Radeon GPU.
 
Buying a really nice 1080p monitor was the best thing I ever did for my sanity. A mid-range PC crushes everything, and I'll make the jump to 4K when this whole DRAM thing blows over, yeah?

I remember the first time I saw a 1080p TV in full glory playing a Blu-ray. I remember thinking, "wow, that is incredibly sharp."

And while I appreciate higher resolutions, 1080p is still a lot of pixels. I can see a character's fucking fingernails.... I think I'm good as far as actual information from the graphics goes. And when you look at the total visual impact of a really good-looking game, there's just not a ton in it. I think Horizon Forbidden West on PS4 is a real unsung hero. It looks crazy good for a last-gen game. 1080p doesn't manage to hurt it very much.

And then consider that you can run all of today's games at 1080p on a $200 GPU.
 
Good to know I didn't actually install it on my generic Ryzen Mini PC.

The official distro works on pretty much any PC with a Radeon GPU.
I guess I'm surprised the SteamOS drivers work on any old Ryzen APU, but maybe I shouldn't be, since it's not like the Ryzen APU inside the Steam Deck is some super special chip.
 
I guess I'm surprised the SteamOS drivers work on any old Ryzen APU, but maybe I shouldn't be, since it's not like the Ryzen APU inside the Steam Deck is some super special chip.
It's not just any Ryzen APU: any Radeon GPU works, because the open source drivers are what Valve compiled in. The good NVIDIA drivers are closed source, which is why they aren't included. Supposedly some work on NVIDIA support is also underway.
 
I remember the first time I saw a 1080p TV in full glory playing a Blu-ray. I remember thinking, "wow, that is incredibly sharp."

And while I appreciate higher resolutions, 1080p is still a lot of pixels. I can see a character's fucking fingernails.... I think I'm good as far as actual information from the graphics goes. And when you look at the total visual impact of a really good-looking game, there's just not a ton in it. I think Horizon Forbidden West on PS4 is a real unsung hero. It looks crazy good for a last-gen game. 1080p doesn't manage to hurt it very much.

And then consider that you can run all of today's games at 1080p on a $200 GPU.
I mean, with DLSS, is 1080p what most people are still getting anyways? Just cut out the middleman and grow your own 1080p. Fuck the scaling industry.
 
I can easily tell the difference between 1440p and 4K, even more so if the screen is 4K native. In those cases, and whenever I can't run 4K for whatever reason, I'd rather set it to 1080p so it upscales better.

I'm not sure I'll be able to tell the difference between 4K and 8K, but 8K screens will be paired with other features that will probably make it worthwhile, like faster network connections. We can't seriously think screen tech has peaked.
 
I can easily tell the difference between 1440p and 4K, even more so if the screen is 4K native. In those cases, and whenever I can't run 4K for whatever reason, I'd rather set it to 1080p so it upscales better.

I'm not sure I'll be able to tell the difference between 4K and 8K, but 8K screens will be paired with other features that will probably make it worthwhile, like faster network connections. We can't seriously think screen tech has peaked.
DLSS4 and FSR4 look so great these days that I almost always put them on Balanced even if I can handle Quality with ease. You still get the benefits of reduced power draw. Less power, less heat, less fan noise.
 
Bazzite is more tuned for desktop PCs.

I can't say that there is a performance difference between the two, but Bazzite basically gives you all the bells and whistles and the appearance of SteamOS (as long as you install Decky Loader).
I'm pretty sure that at this point, the performance difference between the distros is negligible. They all use the same drivers, etc.
 