
NVIDIA RTX 4090 is 300% Faster than AMD’s RX 7900 XTX in Cyberpunk 2077: Phantom Liberty Overdrive Mode, 500% Faster with Frame Gen

The main reason why AMD is "good" for consoles is that AMD has expertise in the area of APUs in a way that Nvidia doesn't.

That's why Nvidia was looking to purchase Arm, to nullify the advantage AMD has in that area (not specifically for consoles, though).
Nah, everybody does "APUs" these days. Hell, Intel was doing APUs before AMD, they just didn't brand them; the Switch has an Nvidia "APU", for example. Every semiconductor company is doing it. What makes AMD special is that they're the cheapest company to work with. Nvidia is the best graphics company around, but they're also the most expensive.
 

SABRE220

Member
Based on the absolutely pathetic R&D efforts from AMD and on Intel's tech breakthroughs with their first dedicated GPU architecture... it's not that crazy a thought at all. Essentially, Intel has surpassed AMD's RT and ML tech on its first attempt, while AMD will have to swallow the poison and throw away the hybrid approach it has carried for two generations to make any real ground.
 
According to Todd, I should be upgrading so I can play Starfield at over 60 fps in a city.
What's funny about this is that Starfield is still using 2006-era tech like cubemapping in that ancient engine. No one knows what Creation Engine is doing that makes it run so badly even on ridiculously powerful modern computers. At least with Cyberpunk you can clearly see on screen what is crushing your computer.
 

Silver Wattle

Gold Member
It's not consumers' fault that AMD has fallen so far behind. They've been making GPUs for a very long time and they have proven they can't keep up.

And what does walled garden mean to you? Because PC is literally the only option in gaming if you don’t want to participate in one.
Who said anything about it being the consumers' fault they are behind on RT+ML?
A walled garden is when you can't use their features without using their product; DLSS is a walled garden. Before you jump to conclusions, I'm not saying they can't do it, but many people here can't even see the wall being constructed around them and actively promote its construction and expansion.
And what do you propose consumers do exactly?

Purchase substandard products that don't run the games they want to play well because...?

I'm genuinely interested to hear the answer to this.
What answer are you hoping for? Re-read my comment, but take off your fanboy goggles, mate; I never said people should buy AMD for Cyberpunk PT.
What's the expected jump for RDNA 4? Especially since there are no high-end cards. Give a number.
They propose pity buying
It's a fucking cult.
What number are you looking for? Because there are NO numbers for RDNA 4 that are credible, just like your 1.5x numbers have no merit, but unlike you I don't throw out baseless speculation and pretend it's a near certainty.
Who proposes "pity" buying? Because I never have; I just support competition when it makes sense to do so as a consumer.
Cult? Look in the mirror. The Nvidia stans in here (you being up there next to Leonidas) are the most rabid in this whole thread, which is a thread about shitting on AMD specifically for having poor PT (which they do).

You guys are just proving my point.
 

SlimySnake

Flashless at the Golden Globes
While this is not a good look for AMD, this only affects like 1% of games. Cyberpunk, Metro and Control are literally the only games in the last five years that have utilized ray tracing like this.

I'll speak from my own experience: I had a 2080 that I got for Cyberpunk. Turns out it's not enough to run Cyberpunk at a decent resolution with RT on, so I waited to get a 3080. I eventually got it almost two years later, only to find out that I would have to set ray tracing to medium and DLSS to performance mode to get anywhere close to 60 fps. Now path tracing is pretty much a mess even with Ray Reconstruction.

Meanwhile, this year alone, the following games have come out and I've had to disable ray tracing for one reason or another. RE4 would crash if I used high-res textures and ray tracing. Hogwarts would also run out of VRAM if I used RT at 4K DLSS Quality, so I stuck with ultra settings and 4K instead. Star Wars is a disaster, but it's a disaster on all GPUs. TLOU had no ray tracing and no Nvidia advantage.

In fact, since Cyberpunk came out, the 6950 XT, which was going for the same price as my 3080 when I got it (it's cheaper now), has beaten the 3080 in every single game. Go look at the games that have come out in 2021, 2022 and 2023 and you'd be surprised at how few have ray tracing, let alone ray tracing you can use on these VRAM-starved cards.

And yet no one really talks about the actual user experience. I wouldn't be surprised if there are fewer than 500k 4090s in circulation. AMD should definitely improve their RT performance, but once Cyberpunk path tracing is done, you can look at Alan Wake 2 and then forget about another path-traced game until the next Metro comes out in three years.
 

Dream-Knife

Banned
A walled garden is when you can't use their features without using their product; DLSS is a walled garden. Before you jump to conclusions, I'm not saying they can't do it, but many people here can't even see the wall being constructed around them and actively promote its construction and expansion.
It's not a walled garden. Nvidia invented something that uses hardware they invented to push tech forward.
 

Dice

Pokémon Parentage Conspiracy Theorist
Maybe FSR3 will help...


Maybe?
FSR doesn't process ray tracing. Dedicated hardware is always going to win over software. That is what Nvidia has, which makes them superior at RT (even more so when they enhance it with software), which will make devs develop specifically for them when they want good RT. It's something Nvidia has done before, so we know it's true. It's kind of unbelievable that AMD seems to have had no plan whatsoever for RT. Like, why wouldn't people want 100x better lighting?

Meanwhile, this year alone, the following games have come out and I've had to disable ray tracing for one reason or another. RE4 would crash if I used high-res textures and ray tracing. Hogwarts would also run out of VRAM if I used RT at 4K DLSS Quality, so I stuck with ultra settings and 4K instead. Star Wars is a disaster, but it's a disaster on all GPUs. TLOU had no ray tracing and no Nvidia advantage.
Not enough VRAM is one issue Nvidia has, for sure. DLSS is great but almost mandatory for them because AMD is much better at traditional rasterization.
 

hlm666

Member
Looks like Nvidia managed a fatality; AMD's own goal with Starfield probably helped. Someone else can make a dedicated thread, I don't care enough.

 
Looks like Nvidia managed a fatality; AMD's own goal with Starfield probably helped. Someone else can make a dedicated thread, I don't care enough.

He was probably the guy who came up with the idea of paying developers to exclude DLSS. That hasn't exactly gone well for AMD; it's made them look bad and a lot less plucky-underdog-like than they would have liked.

Still, Starfield is a game that runs on a 2006-era engine and uses 2006-era technology; it was the perfect promotional game for AMD cards, now that I think about it.
 

hlm666

Member
He was probably the guy who came up with the idea of paying developers to exclude DLSS. That hasn't exactly gone well for AMD; it's made them look bad and a lot less plucky-underdog-like than they would have liked.

Still, Starfield is a game that runs on a 2006-era engine and uses 2006-era technology; it was the perfect promotional game for AMD cards, now that I think about it.
It's not like any of their other sponsored games have fared much better lately. It seems like another Koduri moment: make someone walk the plank and take the blame with them, with promises that next time will be better.
 

Buggy Loop

Member
What number are you looking for? Because there are NO numbers for RDNA 4 that are credible, just like your 1.5x numbers have no merit, but unlike you I don't throw out baseless speculation and pretend it's a near certainty.

The previous gen-to-gen jump was 1.5x. It's simple extrapolation that keeping the status quo isn't enough; they have to do better, as I originally said.
It's not the 1.5x number that's a certainty; what's certain is that AMD has to do better.

Who proposes "pity" buying? Because I never have; I just support competition when it makes sense to do so as a consumer.

Well, anyone that supports a company "dominating" the GPU industry, as per your word, is among the dumbest people in existence.
So... we're left with what? Support competition... because?

Underdog? To not let Nvidia dominate?

That's pity buying.

AMD would naturally gain market share if they were not playing catch-up.

Cult? Look in the mirror. The Nvidia stans in here (you being up there next to Leonidas) are the most rabid in this whole thread, which is a thread about shitting on AMD specifically for having poor PT (which they do).

You guys are just proving my point.

Oh yeah, you got me! You're Sherlock Holmes? Truly, I can't critique Nvidia.

Oh wait, I can critique, everyone, because I'm not in a fucking cult.

AMD doesn't even need Nvidia stans to do damage; their own "fans" *cough, cult, cough* who can't critique them, and the YouTubers speculating all the bullshit hype about the next Nvidia killer, already do all the damage themselves. I got BASHED and DRAGGED on this fucking forum's ground by many AMD fans in previous years just for suggesting they take the rumours with a grain of salt, because the rumours didn't make any fucking sense in a world bound by physics. Funny to see those same people gurgle the hype without a thought. The RDNA 3 rumours were comical. I spent 20 years in the most hardcore ATI/AMD forums; I can spot the pattern a mile away. It's the same as the Sonic cycle for Sega fans, actually. Could use the same motto: "gotta go fast!"
 

XesqueVara

Member
People having hopes for Intel is lol. They are still stuck with the trash IP that is Gen 12; a 400mm² chip only keeping up with a 200mm² one is bad.
 

Monserrat

Banned
CDPR has been working for Nvidia for years now, no surprise. They even made the remaster of The Witcher 3 a CPU hog just in time for Nvidia's new feature that benefits the most from CPU-heavy bottlenecks (frame interpolation). But I don't expect AMD to lift a finger anyway; they're a duopoly, they've split the gaming graphics chip market between them, and they do whatever they can to keep prices high.
 

GHG

Member
What answer are you hoping for? Re-read my comment, but take off your fanboy goggles, mate; I never said people should buy AMD for Cyberpunk PT.

You guys are just proving my point.

I've re-read your comment and all I'm seeing is waffle about a mythical "walled garden" and how people are "dumb" for supporting Nvidia.

People support them because they currently offer the best products and solutions. Get over it.
 

Dr.D00p

Member
The only people this affects are those who've already made the decision that RT is of no importance to them or have accepted vastly inferior RT performance as part of the deal for going AMD.

...so it's not really a story, IMO.
 

Silver Wattle

Gold Member
I've re-read your comment and all I'm seeing is waffle about a mythical "walled garden" and how people are "dumb" for supporting Nvidia.

People support them because they currently offer the best products and solutions. Get over it.
Appears you need glasses too, because my post was about the people actively celebrating Nvidia dominating (this thread is full of them). Where did you read me targeting people who support Nvidia for offering a better product? I'll wait.
 

GHG

Member
Appears you need glasses too, because my post was about the people actively celebrating Nvidia dominating (this thread is full of them). Where did you read me targeting people who support Nvidia for offering a better product? I'll wait.

And who exactly is celebrating Nvidia dominating?

People are commenting on the current situation in the GPU market, where if you haven't noticed, AMD have fallen behind in areas that many gamers now deem to be key (RT and AI upscaling technologies).

It's like you want people to pity AMD. What for exactly? It's a business, not a charity.
 

JohnnyFootball

GerAlt-Right. Ciriously.
AMD is a joke. I so hope the next generation of consoles switches to Nvidia. They won’t though…
You don't want this.

You do realize that nvidia would effectively control the entire gaming market?

Nobody with a functioning brain should want that.
 

winjer

Gold Member
The hybrid architecture that RDNA2 uses is probably due to consoles.
Most people forget that consoles are very price sensitive and one of the things companies like Sony and MS have to take into consideration is die size and yields.
Unlike a GPU on the PC, an SoC of a console needs to have much more than the GPU die. It also has to have the CPU cores, caches, IO die, memory controller for the CPU and GPU, and a ton of paths to shuffle data around.
For anyone that remembers the presentations for these consoles and RDNA2 on PC, one of the main points was that having the RT accelerators on the TMUs saved a lot of space.
Just for comparison, the RT cores on a 3080, that released at the same time as RDNA2, use around 20% of the GPU die space. And another 10-15% die space for the tensor cores.
So now imagine the PS5 and Series S/X with 35% fewer Compute Units. We would have a PS5 with 6-7 TFLOPs, the Series X with 7-8 TFLOPs, and the Series S with 2.6 TFLOPs.
It would be nice to have a $500 console with the hardware of a $5000 PC. But that is not how the real world works.
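A rough back-of-the-envelope sketch of that trade-off, using only the figures above (the ~35% combined RT + tensor area is the 3080 estimate quoted, and I'm assuming the area given up would come roughly one-to-one out of Compute Units):

```python
# Back-of-the-envelope: if ~35% of the GPU area had gone to dedicated RT and
# tensor hardware instead of Compute Units, shader throughput would shrink
# roughly in proportion. Baseline TFLOPs are the consoles' official figures.

AREA_FOR_RT_AND_TENSOR = 0.35  # ~20% RT cores + ~10-15% tensor cores (3080 estimate above)

consoles_tflops = {
    "PS5": 10.28,
    "Series X": 12.15,
    "Series S": 4.0,
}

for name, tflops in consoles_tflops.items():
    reduced = tflops * (1 - AREA_FOR_RT_AND_TENSOR)
    print(f"{name}: {tflops:.2f} TFLOPs -> ~{reduced:.1f} TFLOPs")
# PS5 -> ~6.7, Series X -> ~7.9, Series S -> ~2.6 -- the numbers quoted above.
```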
 

Zuzu

Member
You don't want this.

You do realize that nvidia would effectively control the entire gaming market?

Nobody with a functioning brain should want that.

Hmm, yes, I hadn't thought of it from that angle. Nvidia would pretty much have total domination of gaming if that were the case and that wouldn't be good.

I do hope AMD are able to significantly improve their ray-tracing for the next consoles. Hopefully Sony & Microsoft can help them with that.
 

winjer

Gold Member
Hmm, yes, I hadn't thought of it from that angle. Nvidia would pretty much have total domination of gaming if that were the case and that wouldn't be good.

I do hope AMD are able to significantly improve their ray-tracing for the next consoles. Hopefully Sony & Microsoft can help them with that.

There is also the matter that MS and Sony have each already had one console with Nvidia GPUs. And it went so badly that neither ever wanted to deal with Nvidia again.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
People having hopes for Intel is lol. They are still stuck with the trash IP that is Gen 12; a 400mm² chip only keeping up with a 200mm² one is bad.

They are a generation behind with Alchemist, which you could say was a proof of concept.
The A770 keeps up with the 3060Ti and 6700XT......all of them are approx 400mm².
The A770 spanks the 6700XT in RT and upscaling tech.
Intel is approx 2 years behind Nvidia......AMD is 5 or more in terms of tech and die usage.
 

winjer

Gold Member
They are a generation behind with Alchemist, which you could say was a proof of concept.
The A770 keeps up with the 3060Ti and 6700XT......all of them are approx 400mm².

Not really. The 6700XT uses 335mm², while the A770 uses 406mm². That is a big difference.
And consider that the 6700XT uses N7, while the A770 uses N6, which has a 15% area reduction over N7.
And the 6700XT and 3060Ti are on average 15% faster in rasterization.

[Chart: average FPS at 2560×1440]
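A quick sketch of that comparison in one place (my arithmetic, using only the numbers in this post; the 15% N7→N6 density figure is the estimate above):

```python
# Normalize the A770's N6 die back to an N7-equivalent footprint so it can be
# compared apples-to-apples with the 6700XT. All inputs are the figures quoted above.

die_6700xt_mm2 = 335       # RDNA2, N7
die_a770_mm2 = 406         # Alchemist, N6
n6_area_reduction = 0.15   # claimed N6 area saving over N7
raster_deficit = 0.15      # 6700XT/3060Ti quoted as ~15% faster in raster

a770_n7_equiv = die_a770_mm2 / (1 - n6_area_reduction)
print(f"A770 N7-equivalent die: ~{a770_n7_equiv:.0f} mm² "
      f"({a770_n7_equiv / die_6700xt_mm2 - 1:.0%} larger than the 6700XT's 335 mm²)")
print(f"...while being ~{raster_deficit:.0%} slower in average rasterization")
```

So on a like-for-like node, Alchemist spends roughly 40% more silicon for less raster performance, which is the point about die usage.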
 
[Charts: Cyberpunk 2077 path tracing performance at 1920×1080 and 3840×2160]


RDNA 2 → RDNA 3 = 1.5~1.6x
Turing → Ampere = 2.1~2.4x
Ampere → Ada = 1.8~1.9x (not including frame gen )

7900XTX is 14.5 fps at 1080p
14.5*1.5 = 21.8 fps
21.8 fps*1.5 = 32.7 fps

You're right, I was wrong: it doesn't even catch up to the 4090. They're royally fucked.
Might as well just bin off my 3090.
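The same extrapolation as a tiny sketch (the 14.5 fps and 1.5x figures are the ones quoted above; the 4090 number is just the thread title's "300% faster" claim taken at face value, i.e. roughly 4x, not a benchmark result):

```python
# Apply a ~1.5x gen-on-gen RT uplift to the 7900 XTX's quoted PT result and
# compare against a 4090 estimate derived from the headline "300% faster" claim.

xtx_pt_1080p = 14.5                       # 7900 XTX, Cyberpunk PT @ 1080p (quoted above)
gen_uplift = 1.5                          # assumed per-generation scaling
rtx_4090_estimate = xtx_pt_1080p * 4.0    # "300% faster" => ~4x (headline claim)

fps = xtx_pt_1080p
for gen in ("RDNA 4", "RDNA 5"):
    fps *= gen_uplift
    print(f"{gen}: ~{fps:.1f} fps vs ~{rtx_4090_estimate:.0f} fps for the 4090")
# Prints ~21.8 fps and ~32.6 fps (the post rounds to 32.7) -- still short of today's 4090.
```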
 

FireFly

Member
They are a generation behind with Alchemist, which you could say was a proof of concept.
The A770 keeps up with the 3060Ti and 6700XT......all of them are approx 400mm².
The A770 spanks the 6700XT in RT and upscaling tech.
Intel is approx 2 years behind Nvidia......AMD is 5 or more in terms of tech and die usage.
Why not compare with the 7600? Both chips are on 6nm and the A770 is only 1% faster at 1440p according to the Techpowerup benches.
 

Loxus

Member
[Charts: Cyberpunk 2077 path tracing performance at 1920×1080 and 3840×2160]


RDNA 2 → RDNA 3 = 1.5~1.6x
Turing → Ampere = 2.1~2.4x
Ampere → Ada = 1.8~1.9x (not including frame gen )

7900XTX is 14.5 fps at 1080p
14.5*1.5 = 21.8 fps
21.8 fps*1.5 = 32.7 fps

You're right, I was wrong: it doesn't even catch up to the 4090. They're royally fucked.
With the inclusion of the Traversal Engine, I don't think it'll be just 1.5x anymore.
It will most likely be in RDNA 4 too.

 

winjer

Gold Member
With the inclusion of the Traversal Engine, I don't think it'll be just 1.5x anymore.
It will most likely be in RDNA 4 too.



That BVH hardware, plus the tensor cores in CDNA. All that's missing is dedicated RT cores.
Maybe AMD can narrow the gap with RDNA4.

And hopefully, the new lead of the Radeon group will do things differently.
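For context on what that Traversal Engine would actually offload: on RDNA 2/3 the BVH traversal loop runs as shader code, with only the ray/box and ray/triangle intersection tests handled by hardware in the TMUs. A minimal, simplified sketch of that loop (the Node/Ray types are placeholders for illustration, not AMD's actual implementation):

```python
# Simplified stack-based BVH traversal -- the loop a dedicated "Traversal Engine"
# would move into fixed-function hardware. On RDNA 2/3 this loop runs in shader
# code; only the intersection tests are hardware-accelerated. Node/Ray types are
# assumed placeholders.

def trace_ray(ray, bvh_root):
    closest_hit = None
    stack = [bvh_root]
    while stack:                            # traversal loop (shader code today)
        node = stack.pop()
        if not node.aabb.intersects(ray):   # ray/box test (HW-accelerated on RDNA 2/3)
            continue
        if node.is_leaf:
            for tri in node.triangles:      # ray/triangle test (HW-accelerated)
                hit = tri.intersect(ray)
                if hit is not None and (closest_hit is None or hit.t < closest_hit.t):
                    closest_hit = hit
        else:
            stack.extend(node.children)     # keep walking the tree, managing the stack
    return closest_hit
```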
 

shamoomoo

Banned
And who exactly is celebrating Nvidia dominating?

People are commenting on the current situation in the GPU market, where if you haven't noticed, AMD have fallen behind in areas that many gamers now deem to be key (RT and AI upscaling technologies).

It's like you want people to pity AMD. What for exactly? It's a business, not a charity.
You mean what Nvidia dictated as the highlighted tech wasn't a big thing before Nvidia's Turing architecture. Also, there's nothing stopping MSAA or SSAA from making a comeback; those types of anti-aliasing only fell out of favor for less expensive types of AA.
 

winjer

Gold Member
You mean what Nvidia dictated as the highlighted tech wasn't a big thing before Nvidia's Turing architecture. Also, there's nothing stopping MSAA or SSAA from making a comeback; those types of anti-aliasing only fell out of favor for less expensive types of AA.

MSAA went away, mostly because of deferred rendering engines.
SSAA has always been stupidly expensive on the hardware.
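To put numbers on "stupidly expensive": supersampling shades an N×N grid of samples per pixel, so the cost grows with the square of the factor. A quick illustrative sketch (the 4K resolution and factors are just example numbers):

```python
# Why SSAA is so expensive: an NxN supersample factor multiplies the number of
# shaded samples per frame by N*N. Example numbers only.

width, height = 3840, 2160          # 4K target resolution
base_pixels = width * height        # ~8.3M pixels at native 4K

for factor in (2, 4):               # 2x2 and 4x4 supersampling
    shaded = base_pixels * factor * factor
    print(f"{factor}x{factor} SSAA: {shaded / 1e6:.0f}M shaded samples "
          f"({factor * factor}x native)")
# 2x2 -> ~33M samples, 4x4 -> ~133M samples per frame.
```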
 

GHG

Member
You mean what Nvidia dictated as the highlighted tech wasn't a big thing before Nvidia's Turing architecture. Also, there's nothing stopping MSAA or SSAA from making a comeback; those types of anti-aliasing only fell out of favor for less expensive types of AA.

So you're suggesting Nvidia introducing (and subsequently building on) RT and DLSS technologies is a bad thing because they developed and introduced them?

If people didn't see value in those technologies then they wouldn't use them, and they certainly wouldn't pay through the nose for Nvidia cards in order to take advantage of them. Wanting to begrudge Nvidia for developing technologies that are beneficial to graphics pipelines and overall image quality makes no sense. Would you rather they cease to innovate?

You answered your own question regarding MSAA/SSAA. There's a reason they are no longer relied upon so heavily.
 

shamoomoo

Banned
The hybrid architecture that RDNA2 uses is probably due to consoles.
Most people forget that consoles are very price sensitive and one of the things companies like Sony and MS have to take into consideration is die size and yields.
Unlike a GPU on the PC, an SoC of a console needs to have much more than the GPU die. It also has to have the CPU cores, caches, IO die, memory controller for the CPU and GPU, and a ton of paths to shuffle data around.
For anyone that remembers the presentations for these consoles and RDNA2 on PC, one of the main points was that having the RT accelerators on the TMUs saved a lot of space.
Just for comparison, the RT cores on a 3080, that released at the same time as RDNA2, use around 20% of the GPU die space. And another 10-15% die space for the tensor cores.
So now imagine the PS5 and Series S/X with 35% fewer Compute Units. We would have a PS5 with 6-7 TFLOPs, the Series X with 7-8 TFLOPs, and the Series S with 2.6 TFLOPs.
It would be nice to have a $500 console with the hardware of a $5000 PC. But that is not how the real world works.
Maybe, but the road to PS4 suggested it was possible to have an RT unit on or around the PS4 APU. Now, that doesn't mean the cost would've stayed as cheap as possible.
 

winjer

Gold Member
Maybe, but the road to PS4 suggested it was possible to have an RT unit on or around the PS4 APU. Now, that doesn't mean the cost would've stayed as cheap as possible.

Where did you see RT units during the road to PS4 presentation?
 

shamoomoo

Banned
So you're suggesting Nvidia introducing (and subsequently building on) RT and DLSS technologies is a bad thing because they developed and introduced them?

If people didn't see value in those technologies then they wouldn't use them, and they certainly wouldn't pay through the nose for Nvidia cards in order to take advantage of them. Wanting to begrudge Nvidia for developing technologies that are beneficial to graphics pipelines and overall image quality makes no sense. Would you rather they cease to innovate?

You answered your own question regarding MSAA/SSAA. There's a reason they are no longer relied upon so heavily.
No one was asking for RT or AI anything, outside of smart NPCs. Also, I said older methods of AA could make a comeback, as current GPUs are many times faster than the GPUs that only had MSAA or similar types of AA as the method for prestige IQ.
 

winjer

Gold Member
Mark Cerny talked about it, but devs at that time didn't want a GPU that could do RT.


There is the matter of "if", as there was no GPU at the time that could do RT.
But even if there were, it would have been very bad. Even Turing, which released in 2018, had limited RT capabilities. Now imagine what a GPU in 2012 would have been capable of, regarding RT.
 