
AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine, Vastly Different Than RDNA 3

Game averages do not matter, we need true Scotsman RT games to judge.

I hope TPU will roll out "RT performance in True RT Games" charts.


This narrative is even older.
Reality has changed, but the ball of scheisse just keeps rolling.

Who gives a shit in games where you can't tell if the shit is off or on?
 

llien

Member
AMD sponsored games with some low quality rt effects are not really the best indicator of RT power of the GPUs.
As opposed to the NV sponsored games that look like it's 2005 without RT enabled, say, Control.
Which also happens (happened?) to use different code paths for different GPU manufacturers.

Who gives a shit in games where you can't tell if the shit is off or on?
"What % of gamers enables the RT gimmick" would be an interesting statistic to see.
 

Gaiff

SBI’s Resident Gaslighter
[chart: 7900 XTX, average across all RT games]

4090

[chart: 4090, average across all RT games]


Ah, yes, that beautiful average chart where Shadow of the Tomb Raider gets tested twice, once at Medium where it has the same performance, and the other time at Ultra where it loses 44% performance on the 7900 XTX. Why HU decided to test a 2018 game at Medium settings on a 7900 XTX is beyond me...actually, it isn't. Those guys have been on a campaign to discredit ray tracing for years.

What to say of games such as the Dead Space remake, where there's only shitty RT AO that makes absolutely no difference to the visuals?

Once again, looking at averages is a nice way to completely miss the point and get a bad surprise when a game with full RT runs 40% faster on the 4080 compared to the 7900 XTX.

As opposed to the NV sponsored games that look like it's 2005 without RT enabled, say, Control.
Which also happens (happened?) to use different code path for different GPU manufacturers.
Control still looks great without ray tracing.
 
As opposed to the NV sponsored games that look like it's 2005 without RT enabled, say, Control.
Which also happens (happened?) to use different code path for different GPU manufacturers.


"What % of gamers enables the RT gimmick" would be an interesting statistic to see.

Plenty for those who have the cards to run it, and the effects are very good. Basically the bottom of the chart that guy just posted. lol
 

Bojji

Member
Game averages do not matter, we need true Scotsman RT games to judge.

I hope TPU will roll out "RT performance in True RT Games" charts.


This narrative is even older.
Reality has changed, but the ball of scheisse just keeps rolling.

4080 on par with 7900XTX in RT - or is it?

[charts: 4080 vs 7900 XTX, RT benchmarks]


Sadly, this is vs the 4090. Without RT, the 4090 is ~23% faster than the 7900 XTX. With RT in some games?

[charts: 4090 vs 7900 XTX in RT-heavy games]


It can be 2x faster or even more.
 

llien

Member
Plenty for those who have the cards to run and the effects of it are very good. Basically the bottom of the chart that guy just posted. lol
What % of the games counts as the "bottom of that chart", please?

(Can I cherry pick a handful of games to make a point?)
No, I'm afraid not. And it is remarkable that you've picked "A Plague Tale: Requiem" as an example of a good RT implementation:



If I got you right, you've rolled out quite an accusation towards AMD, and I want to clarify that first, if I may.

So RT in games doesn't do what NV presentation said it would, because games are AMD sponsored?
 

Gaiff

SBI’s Resident Gaslighter
4080 on par with 7900XTX in RT - or is it?
Everyone knows it isn't, but this dude does nothing but shill for AMD and piss and moan about NVIDIA.

Even in his very own TechPowerUp link, the 4080 still won by 17-22% despite being around 5% slower on average, so RT allows the 4080 to claw back over 20% of performance. That's not equal. Furthermore, I have no idea why they post average charts when we KNOW which games have good ray tracing implementations that are worth turning on and which don't. Hint: RE4, Far Cry 6, and Dead Space suck. Count those that make a big difference, such as Alan Wake 2, Black Myth Wukong, Control, Rift Apart, Cyberpunk, and a bunch of others, and suddenly the 7900 XTX doesn't look so hot. If the argument is that most games have shitty RT implementations, then that's correct. That doesn't make the 4080 and 7900 XTX equal in RT, though.
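As a back-of-the-envelope check of the numbers above: a sketch using the post's round figures (4080 ~5% slower at raster, ~20% faster with RT on), plus an assumed 50% RT penalty for the 7900 XTX. All values are illustrative, not new benchmarks.

```python
# Illustrative only: round figures from the post (~5% raster deficit,
# ~20% RT lead for the 4080) plus an ASSUMED 50% RT hit for the XTX.
raster_4080, raster_xtx = 95.0, 100.0   # normalised raster FPS
rt_xtx = raster_xtx * 0.50              # assumed 50% RT penalty for the XTX
rt_4080 = rt_xtx * 1.20                 # 4080 leads by 20% with RT on

hit_4080 = 1 - rt_4080 / raster_4080    # RT penalty implied for the 4080
swing = (rt_4080 / rt_xtx) / (raster_4080 / raster_xtx) - 1

print(f"implied 4080 RT hit: {hit_4080:.1%}")        # 36.8% vs the XTX's 50%
print(f"relative swing, RT vs raster: {swing:.1%}")  # ~26.3%
```

The exact penalties depend on the assumed 50% figure, but the ~26% relative swing between the raster and RT standings follows directly from the two ratios quoted in the post.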
 

Gaiff

SBI’s Resident Gaslighter
Also hilarious that people initially called Frame Generation and Ray Tracing gimmicks, yet that's where Mark Cerny is putting his trust. Sadly, NVIDIA was once again ahead of the curve and everybody is playing catch up.
 

Bojji

Member
What % of the games counts as "bottom of that chart" please.


No, I'm afraid, not. And it is remarkable that you've picked up "A Plague Tale Requiem" as an example of a good RT implementation:



If I got you right you've rolled out quite an accusation towards AMD and I want to clarify that first, if I may.

So RT in games doesn't do what NV presentation said it would, because games are AMD sponsored?


I used the word "heavy" and not "good" on purpose. AMD-sponsored games tend to have light RT (on the GPU) implementations; with those games you can build an image in your head that AMD has good RT performance. But then a game with heavy RT drops and completely destroys this card. Nvidia sponsored or unoptimized, it doesn't matter for customers, does it?
 

Wolzard

Member
Ah, yes, that beautiful average chart where Shadow of the Tomb Raider gets tested twice, once at Medium where it has the same performance, and the other time at Ultra where it loses 44% performance on the 7900 XTX. Why HU decided to test a 2018 game at Medium setting on a 7900 XTX is beyond me...actually, it isn't. Those guys have been on a campaign to discredit ray tracing for years.

What to say of games such as Dead Space remake where there's only shitty RT AO that makes absolutely no difference in regards to visuals?

Once again, looking at averages is a nice way to completely miss the point and get a bad surprise when a game with full RT runs 40% faster on the 4080 compared to the 7900 XTX.


Control still looks great without ray tracing.

You should have read the source to know why. They did a mega-test with RT and observed where it impacts performance and how. They tested basically every game that has RT.

In Shadow of the Tomb Raider, Medium is the lowest RT setting you can choose. The visual difference up to Ultra is not very big, so raising the quality may have little visual impact while still costing performance.


 

llien

Member
Well considering RT is somewhat new
Six years, with the 4th gen of cards around the corner, is not that new in my humble opinion.

AMD sponsored games
Your comment was made in the context of about 40 (!!!) games reviewed.
How many of those are AMD sponsored, pretty please?

Why it matters: for average RT figures NOT to matter (your argument), there would need to be a lot of AMD-sponsored games on the list.
 

Wolzard

Member
Also hilarious that peopled initially called Frame Generation and Ray Tracing gimmicks, yet that's where Mark Cerny is putting his trust in. Sadly, NVIDIA was once again ahead of the curve and everybody is playing catch up.

It's a gimmick. Very few games have convincing effects, and the performance impact is still high. It's worth remembering that this has been pushed on players since 2018, even though the hardware wasn't ready yet.

Obviously Cerny is putting his faith in this, because he is already thinking about the PS6, where RT performance will be increased and become an industry standard.

We are in a transitional period, so obviously it is not 100% worth it. It's like the transition from 2D to 3D: the PS1/Saturn games were very rudimentary, and games with satisfying 3D only arrived from the PS2 generation onwards.
 
6 years, with 4th gen of cards around the corner is not that new in my humble opinion.
The current gen of consoles has been out 4 years. The heavy RT games are coming out at an increased clip. There are still another 4 years to go, and 2 years of cross-gen after that. Those 7xxx series cards will be left in the dust.

Edit: quote fail
 

Bojji

Member
6 years, with 4th gen of cards around the corner is not that new in my humble opinion.


Your comment was made in the context of about 40 (!!!) games reviewed.
How many of those are AMD sponsored pretty please?

Why it matters: for average RT figures to NOT matter (your argument) there should be a lot of AMD sponsored games in the list.

If they're not AMD sponsored, then they're aimed at console RT power; no difference.

Anyway:

[chart]
 

Gaiff

SBI’s Resident Gaslighter
You should have read the source to know why. They did a mega test with RT and observed where it impacts performance and how. They tested basically all games that have RT.
I know the source and watched the entire video a month ago. It's HU with the usual bullshit when it comes to RT. Same with their video where Tim was doing his damnedest to show the RT problems with Indiana Jones and I couldn't even see them until he zoomed in by 300%. Ironically, Steve always recommends NVIDIA if you care about RT, yet you guys are trying to suggest that AMD is just as good.
It's a gimmick. Very few games have convincing effects and the impact on performance is still high. It's worth remembering that this has been pushed to players since 2018, even though the hardware wasn't ready yet.
It isn't. When implemented properly, it's transformative. The problem is that too many mid or bad devs just flip a switch, such as with Dead Space, and call it a day. You end up with mediocre or bad results. The objective shouldn't be to dismiss what RT can do because of bad implementations, but to embrace what it can do with good implementations. If it were a gimmick, there wouldn't be a single game where it elevates the graphics to a higher level.
 

Wolzard

Member
I know the source and watched the entire video a month ago. It's HU with the usual bullshit when it comes to RT. Same with their video when Tim was doing his damndest to show the RT problems with Indiana Jones and I couldn't even see them until he zoomed in by 300%.

It isn't. When implemented properly, it's transformative. The problem is too many mid or bad devs just flip a switch such as with Dead Space and call it a day. You end up with mediocre or bad results. The objective shouldn't be to dismiss what RT can do because of bad implementations, but embrace what it can do with good implementations. If it were a gimmick, it wouldn't have a single game where it elevates the graphics to a higher level.

You seem biased against HU. I don't see anything wrong with them; they are more realistic than other youtubers who live on hype.
I personally think that you are the one who is too excited and thinks this is super revolutionary. Ray tracing is interesting, but it has been known for around 50 years.

Running it in real-time in games is still at a transitional point and not worth it in most cases. It's not simply a matter of implementation; the hardware is not yet capable enough. It's getting there, but it should become standard in the next generation of consoles, which is the market that determines the popularity of these technologies.
 

Superbean

Member
Honestly, I have a 9800X3D and a 4090, and I game at 4K on OLED. Ray tracing is only worth turning on in a couple of games, and even then the performance hit is hardly worth it. I'd have to say it's 99% bad implementation by devs. CP2077 looks a bit better with Psycho on, same for AW2, but the performance hit just isn't worth the marginal visual "upgrades".
 

MikeM

Member
The APUs are interesting too:


[chart: Radeon 8050S/8060S APU benchmarks]


Wow, that's really surprising. You could do some legit gaming on that thing.
 

Gaiff

SBI’s Resident Gaslighter
You seem biased against HU, I don't see anything wrong with them, they are more realistic than other youtubers who live on hype.
Nah, I like HU, but their takes on RT are garbage.
I personally think that you are the one who is too excited and thinks this is super revolutionary. Ray tracing is something interesting, it has been known for around 50 years.
There's a fine line between revolutionary and a gimmick. It's a great advancement, and in some games it elevates the visuals to a new level, yet here you are trying to dismiss it as a mere gimmick because devs don't have a clue how to implement it properly.
which is the market that determines the popularity of these technologies.
Nonsense. RT, DLSS, and Frame Generation have taken off on PC and consoles are playing catch up. This isn't the 90s. PC has been at the forefront of the most important technological advancements in games in the past few years. VRR, AI upscaling, frame generation, ray tracing, input latency reduction technologies such as Reflex, ray reconstruction, etc.
 


FireFly

Member
You seem biased against HU, I don't see anything wrong with them, they are more realistic than other youtubers who live on hype.
I personally think that you are the one who is too excited and thinks this is super revolutionary. Ray tracing is something interesting, it has been known for around 50 years.

Running in real-time games is still at a transitional point and not worth it in most cases. It's not simply a matter of implementation, the hardware is not yet capable enough. It's getting there, but it should become standard in the next generation of consoles, which is the market that determines the popularity of these technologies.
The biggest elements that contribute towards the "video game look" are a lack of shadows on smaller objects in a scene, light leakage leading to areas looking unnaturally bright, and a lack of reflections on objects, or reflections missing elements that are out of camera. All of that can be solved by ray tracing, and you can run at 60 FPS even on mid-range GPUs.

So yes, if game scenes looking realistic is a gimmick, then ray tracing is a gimmick.
 

Wolzard

Member
Nah, I like HU, but their takes on RT are garbage.

There's a fine line between revolutionary and a gimmick. It's a great advancement and in some games, it elevates the visuals to a new level, yet here you are trying to dismiss as a mere gimmick because devs don't have a clue how to implement it properly.

Nonsense. RT, DLSS, and Frame Generation have taken off on PC and consoles are playing catch up. This isn't the 90s. PC has been at the forefront of the most important technological advancements in games in the past few years. VRR, AI upscaling, frame generation, ray tracing, input latency reduction technologies such as Reflex, ray reconstruction, etc.

You have difficulty interpreting the text. I'll say it again: yes, it's revolutionary in a few games, but hundreds of games are released every year. Because it only makes a difference in 1% of these titles, it is considered a gimmick. And even where it's worth it, a lot of hardware is required to obtain good results.

I haven't discounted the importance of the effect; I'm just saying it's in a transitional period. In the next generation of consoles, the effect will be present in many more games. For now, developers are testing, adding support through patches, and migrating their tools.

PC has always advanced faster than consoles, but consoles are what determine when a technology becomes standard.

Bump mapping was there on GeForce 3, but became standard in the PS360 era.
Tessellation came out around 2011, but only became standard now with the PS5.
With RT it will be the same thing: we have support now, but it will only become common on the PS6.
 

Wolzard

Member
The biggest elements that contribute towards the "video game look" are a lack of shadows on smaller objects in a scene, light leakage leading to areas looking unnaturally bright, and a lack of reflections on objects, or reflections missing elements that are out of camera. All of that can be solved by ray tracing, and you can run at 60 FPS even on mid-range GPUs.

So yes, if game scenes looking realistic is a gimmick, then ray tracing is a gimmick.

Look, RT doesn't solve 100% of this. This is more about good artists than technology.
 

Zathalus

Member
Are you quoting me from other threads? Cmon man..all my posts quoted are related to the thread topic (trolling or not)

But what can I say ...
Your posts I quoted were from a thread about Steam player statistics. First post is trolling about player rigs and then you go on about “Sonybros” winning. Clearly very little to do with the thread at hand and straight up trolling. Bringing Mark Cerny up in a thread about RDNA4 and RT is way more on topic than that.
 

Gaiff

SBI’s Resident Gaslighter
You have difficulty interpreting the text. I'll say it again, yes, it's revolutionary in a few games, but hundreds of games are released every year. Therefore, because it makes a difference in 1% of these titles, it is considered a gimmick. And even where it's worth it, a lot of hardware is required to obtain good results.
That's not what a gimmick is at all. Choose your words better.
PC has always advanced faster than consoles, but consoles are what determined when a technology becomes standard.
Ray tracing, AI upscaling, and frame generation are already standard on PC. Consoles don't determine this. Once again, this isn't the 90s. PC has a larger market share than any individual console and has supplanted Xbox as the second platform where AAA games sell the most, sometimes even outdoing PlayStation. We're no longer in the era where PC adopts technologies 4 years early and no game uses them because consoles can't. The reason we're not seeing more advanced ray tracing is simply that it's, for the most part, far too expensive even on PC. How can you look at games with cutting-edge visual features such as BMW, Cyberpunk 2077, Alan Wake 2, or even the RT in Indiana Jones and go, "Yep, consoles dictated this"?
 

FireFly

Member
Look, RT doesn't solve 100% of this. This is more about good artists than technology.
The point is not that RT fixes everything, but that it can be used to address the biggest technical contributors to games looking artificial. Do you think that artists can "fix" reflections not showing objects that are off camera?
 

Wolzard

Member
That's not what a gimmick is at all. Choose your words better.

Ray tracing, AI upscaling, and frame generation are already standard on PC. Consoles don't determine this. Once again, this isn't the 90s. PC has a larger market share than any individual console and has supplanted Xbox as the second platform where AAA games sell the most and can sometimes outdo PlayStation. We're no longer in the era where PC adopts technologies 4 years early and no game uses them because consoles can't. The reason we're not seeing more advanced ray tracing is simply because it's for the most part far too cost-prohibitive even on PC. How can you look at games with cutting-edge visual features such as BMW, Cyberpunk 2077, Alan Wake 2, or even the RT in Indiana Jones and go, "Yep, consoles dictated this"?

Your definition of "standard" is seriously in need of revision.

PCs are a large market, but in revenue they are behind consoles. There's nothing you can do; look at the games: they have a better version on the PS5, even though the PC has more technology. And again, new technology is restricted to a few games.

Anyway, here is a thread about RDNA 4 GPUs, you are getting too far off topic.
 

Gaiff

SBI’s Resident Gaslighter
Your definition of "standard" is seriously in need of revision.
Most AAA games ship with ray tracing, DLSS, frame generation, or frequently all three. On consoles, you often get only one or none of these. If this isn't standard, then yeah, I need to revise my definition.
PCs are a large market, but in revenue they are behind consoles. There's nothing you can do, look at the games, they have a better version on the PS5, even though the PC has more technology. And again, new technology is restricted to a few games.
And you make the mistake of treating consoles as a single entity. They aren't. Switch, Xbox, and PS5 all have different software and hardware configurations. There's no single platform called "console", so why do you lump them all together as if the Switch and Series S didn't exist and weren't far below the SX and PS5? Hell, there are often complaints about the Series X being an afterthought because the PS5 sells better and games therefore get more attention on it.
Anyway, here is a thread about RDNA 4 GPUs, you are getting too far off topic.
What do you mean, "you"? I brought up Mark Cerny's presentation, which is directly related to RDNA4. You went on a tangent about consoles driving technology standards and then accused me of going off-topic.
 

Wolzard

Member
The point is not that RT fixes everything, but that it can be used to address the biggest technical contributors to games looking artificial. Do you think that artists can "fix" reflections not showing objects that are off camera?

Why do off-camera objects need to be fixed if you're not going to see them precisely anyway? I think most discussions about graphics happen over static images, when in games you are moving at 60+ frames per second and facing enemies; you won't notice an irregular reflection.
It's like when people complained about paper trees in racing games.

RT's biggest gain is time. Before RT, each scene had to be carefully analyzed for artists to implement lighting, shadows, and reflections, and that takes time. With RT, this task can be massively automated, but it runs the risk of creating an ugly scene, as it was not thought out artistically.

The biggest mistake, in my opinion, of many players is to crave super realistic graphics, when this is not always the most beautiful thing.
 

Wolzard

Member
Most AAA games ship with ray tracing, DLSS, frame generation, or frequently all three. On consoles, you often get only one or none of these. If this isn't standard, then yeah, I need to revise my definition.

And you make the mistake of treating consoles as a single entity. They aren't. Switch, Xbox, and PS5 all have different software and hardware configurations. There's no single platform called "console", so why do you lump them all together as if the Switch and Series S didn't exist and weren't far below the SX and PS5? Hell, there often are complaints about the Series X being an afterthought because the PS5 sells better and therefore, games get more attention on it.

What do you mean, "you"? I brought up Mark Cerny's presentation which is directly related to RDNA4. You went on a tangent about consoles driving technology standards and then accuse me of going off-topic.

AAA games are not all the games on the market. What are they, about 10 games a year, compared to the thousands that come out? Obviously this is not the standard.

PS5 and Xbox are literally the same hardware. The Switch is older hardware, despite adopting a standard architecture that the industry knows (the Switch supports many things that the PS4 supported).

You did bring information about RDNA 4, but then you focused on ray tracing, diverting a little from the main topic, which is the new Radeon GPUs.
 

Gaiff

SBI’s Resident Gaslighter
AAA are not all games on the market. What are they, about 10 games a year, compared to the thousands that come out? Obviously this is not standard.
What do you think we're talking about? When we get excited about all these GPUs and new consoles, it's to play new AAA games, not indies or 2D metroidvanias. AAA games are implicitly always the point of the discussion.
PS5 and Xbox are literally the same hardware.
They aren't.
Switch is older hardware, despite adopting a standard architecture that the industry knows (Switch supports many things that a PS4 supported).
And a totally different system. Different APIs, different OS, different software libraries, and more.
You really brought information about RDNA 4, but then you focused on ray tracing, diverting a little from the main topic, which is about the new Radeon GPUs.
Did you read the thread title?

AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine


The thread is about the ray tracing in RDNA4. That you think talking about ray tracing is going off-topic tells me you might be confused as to what thread you're posting in.
 

FireFly

Member
Why do off-camera objects need to be fixed if you're not going to see them so precisely? I think that most discussions involving graphics always occur when analyzing static images, when in games you are moving at at least 60 frames per second and facing enemies, you won't notice an irregular reflection.
It's like when people complained about paper trees in racing games.

RT's biggest gain is time. Before RT, each scene needed to be well analyzed for artists to implement lighting, shadows and reflections. And that takes time. With RT, this task can be massively automated, but it runs the risk of creating an ugly scene, as it was not thought out in an artistic way.

The biggest mistake, in my opinion, of many players is to crave super realistic graphics, when this is not always the most beautiful thing.
The point is not that off-camera objects are lit imprecisely. The point is that with the SSR reflections used in games, they don't exist! To give a concrete example, when you are climbing up a skyscraper as Spider-Man, there simply won't be a reflection if all you have are SSR reflections. Of course, developers fall back to cube maps in these situations, but cube maps generally give a very imprecise approximation of what is being reflected and, at least in their static form, don't reflect moving objects.
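The SSR limitation described above can be sketched in a few lines: a screen-space lookup can only return what the camera already rasterised, so any reflected point that lands outside the frame forces a fallback. This is a toy 1D illustration; the function names, the "screen" contents, and the cube-map fallback string are all made up, not from any real engine.

```python
# Toy sketch of why screen-space reflections (SSR) miss off-camera
# geometry: we can only sample what was already rendered to the screen.

def ssr_lookup(screen_buffer, reflected_x):
    """Return the reflected colour if the reflected point is on screen,
    otherwise None (SSR has no data for it)."""
    if 0 <= reflected_x < len(screen_buffer):
        return screen_buffer[reflected_x]
    return None  # reflected point was never rasterised

def reflect(screen_buffer, cubemap_fallback, x, offset):
    """Reflect the pixel at x by `offset`; fall back to a static
    cube-map colour when SSR fails, as engines typically do."""
    hit = ssr_lookup(screen_buffer, x + offset)
    return hit if hit is not None else cubemap_fallback

screen = ["sky", "building", "road", "car"]
# Reflection landing on-screen: SSR works.
print(reflect(screen, "blurry_cubemap", 1, 2))   # car
# Reflection pointing behind the camera: SSR misses, fallback used.
print(reflect(screen, "blurry_cubemap", 0, -3))  # blurry_cubemap
```

The fallback path is exactly the imprecise cube-map case described above: you get *something*, but not the actual off-screen scene.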
 

Gaiff

SBI’s Resident Gaslighter
He has a point, it wasn’t Cerny or console gamers dunking on frame-gen… it was PC gamers who were mad that Nvidia’s frame gen technology was exclusive to the 40 Series.
He doesn’t have a point. I said ray tracing and frame generation, not just ray tracing. He then got his panties in a bunch because I brought up Cerny and the future of PlayStation as if the PS5 didn’t use RDNA4 RT and the future of PlayStation didn’t include frame generation and ray tracing as centerpieces.

PlayStation is very much on-topic since it is currently the only device with RDNA4 ray tracing.
 

hinch7

Member
Performance estimates from Allthewatts are disappointing. Under a 7900 XT (raster?) at high-end prices.

AMD never misses an opportunity to drop the ball when it comes to GPU launches.

I guess they might catch up in some capacity in RT and AI upscaling *shrugs*
 