AMD or NVIDIA, what do you own, GAF? (PC only plz)

What video card do you own?

  • AMD

    Votes: 74 23.7%
  • Nvidia

    Votes: 238 76.3%
  • Intel

    Votes: 0 0.0%

  • Total voters
    312
Intel 0
 
Have a 1080 Ti in my old PC, but I'm hoping to get a new build later this year (undecided if I'll go with Nvidia or AMD for the GPU). I know for sure, though, that the CPU will be AMD.
 
I have both. Multiple Nvidia cards (4090 & 5090) and an AMD 7900 xtx that I purchased for an HTPC Linux gaming build.

I ditched the Bazzite-based Linux gaming build late 2024 because it was more trouble than it was worth, and now the 7900 xtx sits in my (mostly) non-gaming office PC.

Honestly, performance-wise, I saw very little difference between the 7900 xtx and 4090 (minus a few fps), but once I ditched Bazzite and switched back to Windows on the HTPC there really wasn't a reason to run AMD anymore.

I will say that for all its faults, I STRONGLY prefer the Nvidia app/control panel to AMD Adrenalin.
 
I prefer AMD, but my last AMD card was a 5700 XT, which I managed to sell for £200 profit during the COVID GPU madness. The only card I could get hold of quickly to replace it was a 3060 Ti, which I then upgraded to a 4070 Super when I got a good deal on it. I would like to go back to AMD, but currently they are not competing enough on cost and performance. The 9070 XT looks like a decent card, but it's not enough of an upgrade from my 4070 to be worth it for me. Also, the 9070 XTs are absolutely massive cards, which puts me off.
 
Must be an old monitor, as almost all monitors these days that support G-Sync also support FreeSync.
You'd be surprised, but there are still monitors being released with G-Sync only. Don't they contain some Nvidia proprietary tech?

Mine is old now, I guess. Acer Predator IPS 1440p 144 Hz G-Sync; I don't think you can buy it any more. Got it in 2017 or maybe '18.
 
I have both. Multiple Nvidia cards (4090 & 5090) and an AMD 7900 xtx that I purchased for an HTPC Linux gaming build.

I ditched the Bazzite-based Linux gaming build late 2024 because it was more trouble than it was worth, and now the 7900 xtx sits in my (mostly) non-gaming office PC.

Honestly, performance-wise, I saw very little difference between the 7900 xtx and 4090 (minus a few fps), but once I ditched Bazzite and switched back to Windows on the HTPC there really wasn't a reason to run AMD anymore.

I will say that for all its faults, I STRONGLY prefer the Nvidia app/control panel to AMD Adrenalin.
I bought a 9070 XT for the sole purpose of running a Bazzite HTPC in my living room, and it worked great... except for the lack of HDMI 2.1 and FSR4 not being ready until the SDK is released, so ultimately I decided to switch to Windows 11 Enterprise LTSC (the debloated version for businesses).

So now my living room PC uses Windows 11 LTSC. In hindsight, had I known about the HDMI 2.1 issue and that I was going to stick with Windows, I would likely have bought a 4070 Ti Super months ago when they were affordable.

Nevertheless, I am happy with my 9070 XT and am glad I did not give Nvidia one damn penny this gen. Blackwell is just a lazy and less stable rehash of the 4000 series.
 
You'd be surprised, but there are still monitors being released with G-Sync only. Don't they contain some Nvidia proprietary tech?

Mine is old now, I guess. Acer Predator IPS 1440p 144 Hz G-Sync; I don't think you can buy it any more. Got it in 2017 or maybe '18.
G-Sync no longer requires proprietary tech; that requirement was dropped back in 2019, when Nvidia started certifying FreeSync monitors as G-Sync Compatible. However, your monitor being from 2017, it likely doesn't have FreeSync support, so you are most likely correct.

However, if you get the inkling to upgrade, monitors are one area where you can get some insanely good ones for under $250 that support everything. MSI has a few 180 Hz 1440p IPS panels for around $200.

Pretty much any gaming monitor released since 2020 supports both technologies.
 
4090. But I also use it for work. And I hate that Nvidia's got me by the balls (I need CUDA).
Believe it or not, I actually appreciate what Nvidia has done. They made me hold on to my 4090, as the 5090 is not remotely attractive to me at all. Sure, it offers better performance, but between the stability issues, increased power draw, bad connectors, missing ROPs, bad drivers, and just the complete and utter disdain for gamers, I am no longer obsessed with running games at over 200 fps. 120-180 fps is just fine.

I knew the 5000 series would be a shit show, as it was so obvious that nvidia was focused on AI and gamers would be left for sloppy seconds. However, even I was shocked at how badly it went and how few people recognized this. I shook my head at all those who didn't upgrade last gen as they wanted to wait for the 5000 series.

Unless the AI bubble collapses (who knows? maybe tariffs could end up being the thing that facilitates the bubble collapsing) the situation with the 5000 series will not improve.

AMD hit a home run with the 9000 series, but it wasn't quite the grand slam it needed to be. Availability sucks, and now prices are going up.
 
I bought a 9070 XT for the sole purpose of running a Bazzite HTPC in my living room, and it worked great... except for the lack of HDMI 2.1 and FSR4 not being ready until the SDK is released, so ultimately I decided to switch to Windows 11 Enterprise LTSC (the debloated version for businesses).
As far as I know, the HDMI issue is "solved" by using a DisplayPort to HDMI adapter. That's what I did, and things seemed to work well.
 
As far as I know, the HDMI issue is "solved" by using a DisplayPort to HDMI adapter. That's what I did, and things seemed to work well.
Yes and no. Cable Matters had one that worked with a custom firmware. I got full 4K/120 with VRR, 10-bit HDR, 4:4:4. It was perfect... until it wasn't.

If I cold booted the system, it would max out at 4K/60 with no option in Gamescope or on the desktop to get 120 Hz. The only way I could get the 120 Hz option to show up was to boot the system with the HDMI cable hooked up, go into desktop mode, disable the HDMI display, and then plug in the DP2HDMI adapter. Then it would usually work: I could go into Gamescope and game with all the bells and whistles, VRR and everything. Until I booted the system from shutdown, that is. To be fair, I never tried putting the system to sleep.

When using HDMI normally, I could get 4K/120 HDR, but I was limited to 8-bit 4:2:0. Believe it or not, I gamed that way for a long while. I could not tell that I was running low-bandwidth color until it was pointed out, and then I couldn't unsee all the color banding and everything else. A perfect example of ignorance is bliss.
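For reference, here's the rough napkin math on why plain HDMI tops out at 8-bit 4:2:0 in that setup: with no HDMI 2.1 FRL support in the open AMD Linux driver, the link runs at HDMI 2.0 TMDS speeds (roughly 14.4 Gbps of usable data), and 4K/120 at 10-bit 4:4:4 simply doesn't fit. This little sketch only counts active pixels and ignores blanking overhead, so treat the numbers as ballpark:

```python
# Ballpark video payload at 4K/120 vs. the usable HDMI 2.0 data rate.
# Active pixels only; real CTA timings add blanking overhead on top.

WIDTH, HEIGHT, REFRESH = 3840, 2160, 120
HDMI_2_0_USABLE_GBPS = 14.4  # 18 Gbps TMDS link minus 8b/10b encoding overhead

def payload_gbps(bits_per_pixel: float) -> float:
    """Raw video payload in Gbit/s for a given average bits-per-pixel."""
    return WIDTH * HEIGHT * REFRESH * bits_per_pixel / 1e9

for label, bpp in [("10-bit 4:4:4", 30), ("8-bit 4:4:4", 24), ("8-bit 4:2:0", 12)]:
    needed = payload_gbps(bpp)
    verdict = "fits" if needed <= HDMI_2_0_USABLE_GBPS else "does NOT fit"
    print(f"{label}: ~{needed:.1f} Gbps -> {verdict} in ~{HDMI_2_0_USABLE_GBPS} Gbps")
```

Only the 8-bit 4:2:0 mode squeezes under the HDMI 2.0 ceiling at 4K/120, which lines up with what I was seeing.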

But I got tired of doing that and nobody in any of the Linux forums could help me find a solution. I was hoping to find some sort of script I could run, but never got any responses and I wasn't going to force myself to have to constantly switch out the cable. That totally defeated the purpose of having a Home Theater couch system.
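Something like this rough sketch is what I had in mind, for what it's worth. It's purely hypothetical: it assumes a KDE Plasma desktop session where kscreen-doctor is available, and the HDMI-A-1 / DP-1 output names are placeholders you'd have to check against `kscreen-doctor -o` on the actual box:

```python
#!/usr/bin/env python3
"""Hypothetical boot-time helper: disable the onboard HDMI output and force
the DP2HDMI adapter to 4K/120, roughly automating the manual dance above.
Assumes a KDE Plasma session with kscreen-doctor on the PATH; the output
names below are placeholders."""

import subprocess

HDMI_OUTPUT = "HDMI-A-1"  # onboard HDMI port (placeholder name)
DP_OUTPUT = "DP-1"        # DP2HDMI adapter (placeholder name)

def kscreen(*args: str) -> None:
    # Print and run a kscreen-doctor command, stopping if it fails.
    print("+ kscreen-doctor", *args)
    subprocess.run(["kscreen-doctor", *args], check=True)

if __name__ == "__main__":
    # Turn the native HDMI output off so the adapter becomes the only display...
    kscreen(f"output.{HDMI_OUTPUT}.disable")
    # ...then enable the adapter and ask it for 4K/120.
    kscreen(f"output.{DP_OUTPUT}.enable", f"output.{DP_OUTPUT}.mode.3840x2160@120")
```

No idea whether that would have survived a cold boot either, which was the whole problem, but it's the shape of thing I was hoping somebody had already written.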

So I just use Windows and boot into Big Picture Mode, and while it's far from perfect, it gets the job done and accomplishes almost everything I wanted from Bazzite/SteamOS.
 
I switched to AMD when the open-source driver for Linux started getting really good. Then I found that there are (were) so often really good deals on them. I've had three so far, and each one was a great price. So between being on Linux and being a little cheap, I haven't thought about switching back for a long time.
 
I really like how AMD moved things up with FSR4. If it wasn't for multi frame gen and how they are only targeting the mid-range, I would probably have gone with AMD this gen.

But... they have nothing even as powerful as the 5080, so... nope, Nvidia for me.

Also, I am annoyed with the 16 gigs offered by both Nvidia and AMD. Some games at 4K with path tracing already exceed that number (I think Indiana Jones is one of them?), and AMD had 24 gigs last time. What the fuck are they going back to 16 for? Or lol @ the 5080 with 16 gigs too.

If I had a 4090 I would have kept it. But nope, like an idiot I had to sell it. Luckily enough I got a 5090 FE at MSRP, so... it's not so bad of an upgrade.
 