
NVIDIA RTX 4090 is 300% Faster than AMD’s RX 7900 XTX in Cyberpunk 2077: Phantom Liberty Overdrive Mode, 500% Faster with Frame Gen

GHG

Member
It's not like this is the standard performance delta between the 7900 XTX and 4090. It's literally one game. One game that has always performed uncharacteristically well on nVidia hardware and poorly on AMD hardware.


For the other 99% of games the 7900 XTX holds up just fine and actually provides a much better cost/performance ratio compared to the 4090.

I literally said "the games they want to play".

If someone has $1000+ to spend and they want to play Cyberpunk with the full suite of features enabled (with that amount of money being spent, why wouldn't they?), then what are they supposed to do? Cut off their nose to spite their face and purchase an AMD GPU for the sake of "muh competition"?

And no, it's not just one game when you start factoring in games that feature ray tracing and DLSS, which happens to be a lot of them these days, in case you haven't been paying attention.

So tired of people defending substandard products from huge businesses who can and should be doing better.

But this game is designed with Nvidia hardware in mind, so Cyberpunk should run best on Nvidia GPUs. I'm not sure how well path tracing would run on AMD's GPUs if AMD had devised an equivalent tech, but I can't say for certain AMD GPUs are bad.

AMD are about 2 generations behind when it comes to RT performance; they can't do path tracing well because their technology is so far behind and isn't up to scratch. That's nobody's fault but their own.

What do you want devs and Nvidia to do, not push/develop new technologies and sit around waiting for poor little AMD to catch up?
 

analog_future

Resident Crybaby
I literally said "the games they want to play".

If someone has $1000+ to spend and they want to play Cyberpunk with the full suite of features enabled (with that amount of money being spent, why wouldn't they?), then what are they supposed to do? Cut off their nose to spite their face and purchase an AMD GPU for the sake of "muh competition"?


Frankly if someone is spending $1600+ on a GPU so they can play one game at max settings, that someone is an idiot.
 

shamoomoo

Banned
I literally said "the games they want to play".

If someone has $1000+ to spend and they want to play Cyberpunk with the full suite of features enabled (with that amount of money being spent, why wouldn't they?), then what are they supposed to do? Cut off their nose to spite their face and purchase an AMD GPU for the sake of "muh competition"?

And no, it's not just one game when you start factoring in games that feature ray tracing and DLSS, which happens to be a lot of them these days, in case you haven't been paying attention.

So tired of people defending substandard products from huge businesses who can and should be doing better.



AMD are about 2 generations behind when it comes to RT performance; they can't do path tracing well because their technology is so far behind and isn't up to scratch. That's nobody's fault but their own.

What do you want devs and Nvidia to do, not push/develop new technologies and sit around waiting for poor little AMD to catch up?
The RX 7900 XTX/XT are getting 49 and 44 fps at 1080p in an Nvidia-implemented version of path tracing. While AMD's GPUs aren't as performant, I don't know what you mean by them being 2 generations behind Nvidia.
 

GHG

Member
That's fine, but then don't get upset just because consumers pick the better value proposition when given the opportunity.

Price != Value

"Value" is subjective, and the majority of GPU purchasers aren't seeing value where you seem to think there is.

And for the record, I'm not the one in this thread complaining about people buying Nvidia cards, "walled gardens", or the current performance delta in flagship RT titles.

The RX 7900 XTX/XT are getting 49 and 44 fps at 1080p in an Nvidia-implemented version of path tracing. While AMD's GPUs aren't as performant, I don't know what you mean by them being 2 generations behind Nvidia.

[Chart: path tracing performance at 1920x1080]



With path tracing enabled, the 7900 XTX is below the 2080 Ti. That's 2 generations, last time I checked.
 

Bojji

Member
It's not like this is the standard performance delta between the 7900 XTX and 4090. It's literally one game. One game that has always performed uncharacteristically well on nVidia hardware and poorly on AMD hardware.


For the other 99% of games the 7900 XTX holds up just fine and actually provides a much better cost/performance ratio compared to the 4090.

All games with PT perform like this; this type of RT workload just kills AMD GPUs.
 

poppabk

Cheeks Spread for Digital Only Future
How "smart" do you have to be to interpret simple benchmark numbers, really simple technological facts, as "celebration"?
Facts are neutral. The issue is you....
This would be true, except the RTX 3080 was getting twice the path tracing performance of the 6800 XT in Quake 2, but now the 7900 XTX is performing at 50% of the 3080. Kinda suggests that maybe the optimization favors Nvidia here, in what seems like the de facto Nvidia RT showcase.
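Spelling that arithmetic out (mixing ratios from two different games, so treat it as a rough illustration only, not a benchmark):

# rough sanity check of the two ratios above (illustrative numbers only)
q2_3080_vs_6800xt = 2.0    # claim: RTX 3080 ~2x the 6800 XT in Quake 2 RTX
cp_7900xtx_vs_3080 = 0.5   # claim: 7900 XTX ~50% of the 3080 in CP2077 PT

# if both held at once, the implied 7900 XTX vs 6800 XT ratio in PT would be:
print(q2_3080_vs_6800xt * cp_7900xtx_vs_3080)  # 1.0 -> no apparent RDNA2 -> RDNA3 gain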
 

analog_future

Resident Crybaby
All games with PT perform like this; this type of RT workload just kills AMD GPUs.

All games? There are like three retail games with path tracing: the nVidia-partnered Cyberpunk, the nVidia-developed Portal with RTX, and the nVidia-developed Quake II RTX.


Hardly a balanced baseline. And even if it is reflective of how AMD is going to perform with PT moving forward, it's such a tiny, tiny, minuscule subset of games that by the time a significant number of games start pushing it, we'll be two generations of hardware ahead of now.
 

twilo99

Member
I wonder what MSFT and Sony are going to do for their next gen consoles. How are they going to get ray tracing without Nvidia? Or any of the AI/DLSS upscaling techniques and frame gen they're using?

The harsh reality is that there is absolutely nothing they can do because they are relying on AMD for that.

There's lots of wishful thinking around about how Sony will use some ultra-secret mega sauce and fix the situation with the "Pro" version of the PS5, not to mention the 6... but it's not going to happen. Some of the devs might be able to remedy the situation, but what you see here with the 4090 is not coming to consoles for at least 10-15 years.
 

Bojji

Member
All games? There are like three retail games with path tracing: the nVidia-partnered Cyberpunk, the nVidia-developed Portal with RTX, and the nVidia-developed Quake II RTX.


Hardly a balanced baseline. And even if it is reflective of how AMD is going to perform with PT moving forward, it's such a tiny, tiny, minuscule subset of games that by the time a significant number of games start pushing it, we'll be two generations of hardware ahead of now.

Alan Wake 2 is around the corner.

AMD doesn't touch PT because they have abysmal performance with it; otherwise there would be a lot of AMD sponsorships.

Q2 and Portal were tech demos, but CP 2077 is a massive AAA game that is fully playable with the most advanced lighting ever used in a video game on Nvidia hardware (at least on a few top-tier GPUs). That's a huge win for NV; after AW2, other games will have it too.

Edit: I had an RX 6800 when the PT patch dropped for CP, and I was getting 8 FPS with it... That was the last time I bought an AMD GPU.
 

GHG

Member
AMD is good for consoles because console gamers have no choice but to praise the "power" of AMD (PS5/Xbox).

The main reason AMD is "good" for consoles is that AMD have expertise in the area of APUs in ways that Nvidia don't.

That's why Nvidia were looking to purchase Arm, in order to nullify the advantage AMD have in that area (not specifically for consoles though).
 

JohnnyFootball

GerAlt-Right. Ciriously.
Even with a 1.5x RT improvement from RDNA 3 to 4, and then another 1.5x improvement from RDNA 4 to 5, they would just catch up to the 4090 in path tracing.

The 2080 Ti outperforms the 7900 XTX, 2 gens behind, a card that is way gimped in transistors, bandwidth, cache and clocks compared to the RDNA 3 flagship.

And this is starting with the baseline that Cyberpunk 2077 rasterization performs really well on AMD.

They have to throw the hybrid RT pipeline in the trash can. Intel's 2nd iteration will be very dangerous for AMD if they don't step it up.
I agree. Intel had very little performance penalty for ray tracing on their first GPU gen, and XeSS is arguably better than FSR.

AMD really needs to step up and hire some of these software engineers. Intel has REALLY impressed me in regards to XeSS and ray tracing, especially considering these are only their first-gen GPUs.
 

Buggy Loop

Member
Man, do you think RT will stay the same? Since RDNA4 will get a Traversal Engine, the last piece AMD needed, don't be surprised if the RX 8700 XT is gonna be around 4070/Ti level in PT.

Well, I hope it won't stay the same. Like I said, they need to throw the pipeline in the trash or revamp it majorly. It needs bigger than the 1.5x jumps we saw from RDNA 2 → 3.
1.5x incremental jumps aren't that easy generation to generation, especially as they drop the high end on RDNA 4.

Keep regurgitating that 1.5x number, nothing makes it true.

PC gamers have to be some of the dumbest people in existence, actively celebrating one company dominating the GPU industry while proactively building a walled garden.

[GIF: Tobey Maguire gonna cry]


Those are the benchmarks. 1.5x incremental is no slouch. To go beyond that... they have to revamp the pipeline in a major way, just like I said: they have to throw the patented pipeline in the trash.

What's the expected jump for RDNA 4? Especially since there are no high-end cards. Give a number.
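Quick back-of-envelope on what those jumps buy (the 3-4x gap is my own rough assumption for the current 4090 vs 7900 XTX path tracing delta, not a measured number):

# how many compounding 1.5x RT jumps close an assumed 3-4x path tracing gap?
import math

per_gen_uplift = 1.5                  # assumed RT uplift per RDNA generation
for gap in (3.0, 4.0):                # assumed 4090 / 7900 XTX PT ratio
    gens = math.log(gap) / math.log(per_gen_uplift)
    print(f"{gap:.0f}x gap -> {gens:.1f} generations of 1.5x jumps")
# roughly 2.7 to 3.4 generations, and that's only to catch a 4090 that never moves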

AMD won't release any high-end cards with RDNA4

That too I didn't factor in. Personally it doesn't matter too much that there are no high-end cards; like I said in the thread when this was rumoured, it makes the most sense business-wise. But having to catch up technologically while Nvidia will probably go full speed ahead in RT & ML is not reassuring.

And what do you propose consumers do exactly?

Purchase substandard products that don't run the games they want to play well because...?

I'm genuinely interested to hear the answer to this.

They propose pity buying

I bought ATI/AMD cards for 20 years, all the way back to the 2D cards. I gave them more than a fair share of my cash throughout the gens, more than most people on this forum actually. I never even bought an Intel CPU in my life.

I used to think it's cool to support underdogs; now I don't give a fuck. I ain't pity buying to make sure a company stays afloat. These are the same people who wouldn't hesitate to jump on Intel's grave if it came to it, pure hypocrisy. It's a fucking cult.
 

shamoomoo

Banned
Price != Value

"Value" is subjective, and the majority of GPU purchasers aren't seeing value where you seem to think there is.

And for the record, I'm not the one in this thread complaining about people buying Nvidia cards, "walled gardens", or the current performance delta in flagship RT titles.



[Chart: path tracing performance at 1920x1080]



With path tracing enabled, the 7900 XTX is below the 2080 Ti. That's 2 generations, last time I checked.
I confused one of the RT results for the PT result, but I still argue this is an Nvidia thing. Not saying AMD's sponsorship would've produced outstanding performance.
 

SolidQ

Member
It needs bigger than the 1.5x jumps we saw from RDNA 2 → 3.
The top chip was 270 CUs, moving to RDNA5. So you can calculate almost 3x in CUs + a Traversal Engine + other improvements. Maybe RDNA5 will have even more CUs and other improvements for RT.
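For what that "almost 3x" works out to (the 270 CU figure is the rumour quoted above; 96 CUs is the actual RX 7900 XTX / Navi 31 count):

# rumoured top RDNA config vs. the current Navi 31 flagship, CU count only
rumoured_top_cus = 270   # figure quoted above, treat as a rumour
navi31_cus = 96          # RX 7900 XTX compute units
print(rumoured_top_cus / navi31_cus)  # ~2.8x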

When RDNA2 released, AMD said they would focus on RT (it's not RDNA3, because that design was finished around the same time RDNA2 released), and Scott confirmed it after Gamescom. We just need to wait for RDNA4 and see what they do.
 

RobRSG

Member
All games? There are like three retail games with path tracing: the nVidia-partnered Cyberpunk, the nVidia-developed Portal with RTX, and the nVidia-developed Quake II RTX.


Hardly a balanced baseline. And even if it is reflective of how AMD is going to perform with PT moving forward, it's such a tiny, tiny, minuscule subset of games that by the time a significant number of games start pushing it, we'll be two generations of hardware ahead of now.
In the end, the RT games that are AMD-sponsored are generally basic in what they implement, while using insane amounts of VRAM.

You will not see an AMD-sponsored path tracing game, and that means AMD is picking its battles accordingly.
 

FireFly

Member
All games with PT perform like this; this type of RT workload just kills AMD GPUs.
The 7900 XTX does significantly better in Minecraft DXR, Quake 2 RTX and Portal RTX.

 

JohnnyFootball

GerAlt-Right. Ciriously.
I used to think it's cool to support underdogs; now I don't give a fuck. I ain't pity buying to make sure a company stays afloat. These are the same people who wouldn't hesitate to jump on Intel's grave if it came to it, pure hypocrisy. It's a fucking cult.
Fair enough. I have a 4090 because I wanted the best. I certainly want AMD to be relevant. However, if ray tracing doesn't really matter... and I still argue that it doesn't alter the image all that much, as I'd bet very few people could tell the difference between a ray traced image and a rasterized image in a blind test... the 7900 XTX is an outstanding card and worth a consideration. It outperforms the 4080 in many like-for-like non-ray-tracing benchmarks despite being $200 cheaper. That does matter.

However, it cannot be ignored that DLSS is reaching a point where it makes resolution differences virtually unnoticeable, and that like-for-like performance comparisons don't tell the real story. A 4070 with DLSS and FG will pseudo-outperform a 7900 XTX in many cases. Does it make sense to go with Nvidia in that case? Difficult to say.

Speaking of underdogs, if Intel continues to improve RT and XeSS they could end up being the real underdog to support.
 

Buggy Loop

Member
I confused one of the RT results for the PT result, but I still argue this is an Nvidia thing. Not saying AMD's sponsorship would've produced outstanding performance.

These have been micro-benched by knowledgeable people, and there's nothing in the function calls (DXR is agnostic...) or the behavior that just pops up as being fuckery from Nvidia.


They actually comment that RDNA 3 made good progress over RDNA 2 and that hardware utilization is comparable between Nvidia and AMD. So it comes down to "dedicated" RT cores, which in the hybrid AMD pipeline are fighting the scheduler for graphics workload (and we can't even add ML compute load here since AMD is absent for AI upscalers/frame gen; thankfully for them, I guess, since it would choke further).

Nvidia's "ASIC" like RT cores just don't choke (as much), at the disadvantage that at lower RT effects they don't scale too well. AMD needs dedicated hardware to manage BVH structure.

Fair enough. I have a 4090 because I wanted the best. I certainly want AMD to be relevant. However, if ray tracing doesn't really matter... and I still argue that it doesn't alter the image all that much, as I'd bet very few people could tell the difference between a ray traced image and a rasterized image in a blind test... the 7900 XTX is an outstanding card and worth a consideration. It outperforms the 4080 in many like-for-like non-ray-tracing benchmarks despite being $200 cheaper. That does matter.

However, it cannot be ignored that DLSS is reaching a point where it makes resolution differences virtually unnoticeable, and that like-for-like performance comparisons don't tell the real story. A 4070 with DLSS and FG will pseudo-outperform a 7900 XTX in many cases. Does it make sense to go with Nvidia in that case? Difficult to say.

Speaking of underdogs, if Intel continues to improve RT and XeSS they could end up being the real underdog to support.

I don't think anyone is saying to buy Nvidia because of a single game feature (path tracing). But we can note that they are late to the tech race for the absolute bonkers high-end graphical features. I recommended the 6700 / 6700 XT in almost all the budget-friendly PC builds I made in the past 6 months, because it just doesn't make any goddamn sense to recommend a 4090 to enjoy a single game. I'm on an already overpaid (covid-time) 3080 Ti and I'm just peeking at Overdrive, not starting a playthrough of Cyberpunk yet because of that feature.

I do believe that DLSS & frame gen & RT performance do add value ultimately. They offer better resale value; you can score a buyer that DOES care. But even without those features, it still performs quite well natively, so it's not all smoke and mirrors.

If Intel does step up their game on the 2nd iteration, I wouldn't even hesitate to buy a mid-range one. On the first iteration their silicon area is kind of underutilized, so hopefully they improve.

AMD drops a bombshell of a GPU and catches up to Nvidia in features? I would buy one. I buy what I feel adds the most value. Sadly for AMD, I do care about RT. Some might not. I do.
 

Bojji

Member
The 7900 XTX does significantly better in Minecraft DXR, Quake 2 RTX and Portal RTX.


Most of those games are also much simpler; the only complex thing to render is the lighting. CP is different. But yeah, the 4090 is still more than 2x faster in most cases.
 

Pop

Member
I mean to be fair one is $1600 and the other around $1000

And there are more games than just Cyberpunk
 

proandrad

Member
Wondering if there is a point in upgrading my 3090 to a 4090, or should I just wait for the 5090?
If you are buying it at MSRP, the longer you wait the less value you are getting. I only buy high-end PC parts within a month of release, but I also don't upgrade yearly.
 

SF Kosmo

Al Jazeera Special Reporter
nVidia has been heavily investing in future paradigm shifts with AI and RT for three gens now, and AMD has just been banking on the status quo, with patchwork solutions to tick boxes in upscaling and light RT.

That worked for AMD up to a point, but Cyberpunk 2.0 is the culmination of years of iteration and research, hardware and software, designed to make realtime path-tracing a viable reality for current gen games. Everything about that OD mode leverages nVidia hardware to do things that AMD cards simply don't do.

And while this is just a first step, I do think this is where computer graphics are going. Soon, most of the pixels you see on screen will be generated by AI, and the rest will be rendered with rays and not raster methods. I just hope AMD can get their stuff together in time for the next console gen.
 

Buggy Loop

Member
nVidia has been heavily investing in future paradigm shifts with AI and RT for three gens now, and AMD has just been banking on the status quo, with patchwork solutions to tick boxes in upscaling and light RT.

That worked for AMD up to a point, but Cyberpunk 2.0 is the culmination of years of iteration and research, hardware and software, designed to make realtime path-tracing a viable reality for current gen games. Everything about that OD mode leverages nVidia hardware to do things that AMD cards simply don't do.

And while this is just a first step, I do think this is where computer graphics are going. Soon, most of the pixels you see on screen will be generated by AI, and the rest will be rendered with rays and not raster methods. I just hope AMD can get their stuff together in time for the next console gen.

Probably the next thing we'll see is Nvidia's NRC, the neural radiance cache. It supposedly replaces ReSTIR (fundamentally what we see in CP2077 Overdrive) with an even faster and less noisy image via AI. They have so much R&D in this tech, and they collaborate with multiple universities on the subject.



They're also working on wave optics, sorry to share this guy's insufferable speech pattern but nobody else covers this.



Not to mention all the work being done to accelerate physics with AI. Like you say, everything will be ML soon. Pushing traditional rendering won't be able to keep up.
 

00_Zer0

Member
Looking at Cyberpunk, as well as some recent and upcoming Unreal 5 engine games, it's quite obvious that judging game performance solely on native resolution/rasterization is on the way out. Developers are going to start developing single player games expecting gamers to turn on things like DLSS, FSR 2, and frame generation. Multiplayer games are probably the only exception.

Thankfully, I don't think it's too late for AMD to turn things around. Having AI carry the burden of visual fidelity tricks, instead of throwing more transistors and brute-force power at the problem, is going to be the future of GPU development.

AMD is going to need to invest more into RT and AI next gen if they plan on gaining more GPU market share. As much as I hate Nvidia for price gouging customers, even I can see the value of owning a card in the 4000 series.

I think this is the first gen of cards that makes a compelling case that AI is needed to run the latest and greatest games to their fullest potential.
 

Crayon

Member
I'll be able to afford this performance in 6-8 years lol. It's nice to have this game as a testbed. The cards that will do it are here for over $1000, but we need more games where ray tracing is worthwhile. If most games had RT like this that actually pays off, I'd consider ratcheting up my GPU budget.
 