Intel at 0%.
> Also the 9070xts are absolutely massive cards which puts me off.

The PowerColor Reaper, Gigabyte Gaming OC, and ASRock Steel Legend Dark aren't that massive, tbh.
Must be an old monitor, as almost all monitors these days that support G-Sync also support FreeSync.

3070 RTX. Happy with it for what I play. Gives me 1440p and 80 to 100 fps in the games I play, and 144 fps for Dota and Deadlock.
I really want to give AMD my money but I have a g sync monitor.
> Not even Leonidas...

He's a giga troll; doesn't mean he believes what he posts xD
> Must be an old monitor as almost all monitors these days that support GSync also support FreeSync.

You'd be surprised, but there are still monitors releasing with G-Sync only. Don't they contain some Nvidia proprietary tech?
> I bought a 9070Xt for the sole purpose of running a Bazzite HTPC in my living room and it worked great...except for the lack of HDMI 2.1 and FSR4 not being ready until the SDK is released, so ultimately I decided to switch to Windows 11 Enterprise LTSC (the debloated version for businesses)

I have both: multiple Nvidia cards (4090 & 5090) and an AMD 7900 XTX that I purchased for an HTPC Linux gaming build.
I ditched the Bazzite-based Linux gaming build in late 2024 because it was more trouble than it was worth, and now the 7900 XTX sits in my (mostly) non-gaming office PC.
Honestly, performance-wise, I saw very little difference between the 7900 xtx and 4090 (minus a few fps), but once I ditched Bazzite and switched back to Windows on the HTPC there really wasn't a reason to run AMD anymore.
I will say that, for all its faults, I STRONGLY prefer the Nvidia app/control panel to AMD Adrenalin.
> You'd be surprised but there are still monitors releasing with g sync only. Don't they contain some Nvidia propriety tech?

G-Sync no longer requires proprietary tech; that was removed way back in, like, 2018. However, your monitor, being from 2017, likely doesn't have FreeSync support. So you are most likely correct.
Mine is old now, I guess: Acer Predator IBS, 1440p 144Hz, G-Sync. I don't think you can buy it any more. Got it in 2017, or maybe '18.
> 4090. But I also use it for work. And I hate that Nvidia's got me by the balls (I need CUDA).

Believe it or not, I actually appreciate what Nvidia has done: they made me hold on to my 4090, as the 5090 is not remotely attractive to me at all. Sure, it offers better performance, but between the stability issues, increased power draw, bad connectors, missing ROPs, bad drivers, and the complete and utter disdain for gamers, I'm no longer obsessed with running games at over 200 fps. 120-180 fps is just fine.
> I bought a 9070Xt for the sole purpose of running a Bazzite HTPC in my living room and it worked great...except for the lack of HDMI 2.1 and FSR4 not being ready until the SDK is released, so ultimately I decided to switch to Windows 11 Enterprise LTSC (the debloated version for businesses)

As far as I know, the HDMI issue is "solved" by using a DisplayPort-to-HDMI adapter. That's what I did, and things seemed to work well.
> As far as I know, the HDMI issue is "solved" by using a displayport to HDMI adapter. That's what I did and things seemed to work well.

Yes and no. Cable Matters had one that worked with a custom firmware; I got full 4K/120 with VRR and 10-bit HDR 4:4:4. It was perfect... until it wasn't.