
AMD Radeon RX 9070 expected to start at $479, available as soon as late January (Update: Radeon 9000 series goes on sale in late March)

Outlier

Member
What makes you think an AMD gpu scaled to 600W and $2000 price point couldn't match the 5090?

AMD isn't behind at all when it comes to hardware; only blind fanboys believe that Nvidia is far ahead.
I'm talking about them as competitors.
Not hardware performance.
 
3GB chips too. Call it a 9075XTX or something and tack a few hundo on.

I would buy that coming from an XTX for better RT and full FSR4, even though it would only be slightly faster overall, I think. Working on RDNA5 is probably a higher priority than spending money refreshing these, though.

Maybe if things go well they will see an opportunity though, who knows.
There is no RDNA5. It's UDNA, a new architecture.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I am cautiously excited about the 9070XT. While my main gaming will be done on my 4090 (I have decided for the moment not to try to upgrade to the 5090), I do have a 7800X3D/7800XT in my living room SteamOS/Bazzite PC connected to my home theater, and I would consider replacing the 7800XT with a 9070XT if the ray tracing and FSR4 improvements end up being legitimately good. I know it won't be a super upgrade in raw performance, but if it can make ray tracing a viable option and provide quality upscaling, it could feel like a super upgrade without breaking the bank.

Of course, it would be true to form for AMD to overpromise and underdeliver.

However, with the 5070 being priced where it is, AMD likely won't be able to charge over $500.
 

Crayon

Member
According to MLID, that alleged benchmark slide had wattages on it. The XT was 304W and the non-XT was 220W.

...

If AMD wants to change naming conventions, let us stop having to say "non-XT". It's ridiculous. Call it a "DX" like a Corolla or a Civic or something. :lollipop_neutral:
 

manfestival

Member
You can find benches for every PT game.



Isn't wukong tested in raster here and cyberpunk with hybrid RT (the one it launched with) and not PT?
That wasn't the point. I am aware it exists in more than one title, but the point is that in every conversation it only really gets brought up for Cyberpunk. This thread and every other thread have been evidence of that, regardless of whether even 200 titles existed with it.
 

FingerBang

Member
The cards are apparently at retailers and still no announcement from AMD. Something is very wrong here.
They don't know how to market their cards after Nvidia claimed the 5070 to be as fast as a 4090.

They want the 5070 and 5070ti to be out and show their real performance or they're going to look like idiots.

Even a 9070XT that hypothetically beats a 4080 Super at $500 will still look like shit compared to a $549 card "as fast as a 4090".
 
They don't know how to market their cards after Nvidia claimed the 5070 to be as fast as a 4090.

They want the 5070 and 5070ti to be out and show their real performance or they're going to look like idiots.

Even a 9070XT that hypothetically beats a 4080 Super at $500 will still look like shit compared to a $549 card "as fast as a 4090".
This doesn't make a lot of sense, only the 5090 and 5080 are launching in January anyways

The 5070 and 5070 Ti aren't even scheduled until February and no firm launch date has been set. If Nvidia felt like it they could delay those to March or even further. How long does AMD plan to wait for Nvidia LOL
 

Bojji

Member
According to MLID, that alleged benchmark slide had wattages on it. The XT was 304W and the non-XT was 220W.

...

If AMD wants to change naming conventions, let us stop having to say "non-XT". It's ridiculous. Call it a "DX" like a Corolla or a Civic or something. :lollipop_neutral:

The XT pushes clocks to the limit; they clearly wanted to make it faster than the 7900XT.

There are no consumer GPUs with clocks this high, and the non-XT version, despite using the same chip, is like ~500MHz slower? This is weird as fuck; usually smaller chips have higher clocks. I can make my 4070 Ti Super run ~3GHz, but it will hit the power limit in many games.
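Why pushing clocks slams into the power limit so fast can be sketched with the classic CMOS dynamic-power approximation, P ≈ C·f·V²: voltage has to rise along with frequency, so power grows much faster than the clock. The operating points below are purely illustrative numbers, not measurements of any card:

```python
def dynamic_power(freq_ghz, voltage, capacitance=1.0):
    # Classic CMOS dynamic-power approximation: P ≈ C * f * V^2.
    return capacitance * freq_ghz * voltage ** 2

# Illustrative operating points: voltage has to climb to sustain higher clocks.
base = dynamic_power(2.5, 1.00)  # stock-ish clocks
oc = dynamic_power(3.0, 1.15)    # pushed clocks at a higher voltage
print(f"~{oc / base:.2f}x power for a {3.0 / 2.5:.2f}x clock bump")
```

With these toy numbers, a 20% clock bump costs nearly 60% more power, which is roughly the shape of the problem the post describes.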

That wasn't the point. I am aware it exists in more than one title, but the point is that in every conversation it only really gets brought up for Cyberpunk. This thread and every other thread have been evidence of that, regardless of whether even 200 titles existed with it.

It has one of the biggest differences between non-RT, hybrid RT, and path tracing. It's more than 4 years old and it's still used by DF and Nvidia to show off stuff. The game will be a benchmark for quite some time.

This doesn't make a lot of sense, only the 5090 and 5080 are launching in January anyways

The 5070 and 5070 Ti aren't even scheduled until February and no firm launch date has been set. If Nvidia felt like it they could delay those to March or even further. How long does AMD plan to wait for Nvidia LOL

Yep, GPUs are already in some stores. LOL.
 
AMD is really in a bind. Intel is coming for the low end, Nvidia is keeping its lead in the high end. GPUs are at retailers and AMD is asking itself, "how can we possibly make money on these?"
 
I'm on hopium that this trades blows with a 4080S in raster once final drivers are released. There's no way they'll even be close to a 4080S in RT, let alone a 5070, and I can live with that, the compensation being the 4GB of extra VRAM.

Penguin Pray GIF by Pudgy Penguins
 

Silver Wattle

Gold Member
I just don't even know how AMD's marketing department still have jobs:
 

Bojji

Member
I just don't even know how AMD's marketing department still have jobs:

Soon random people will have cards in their hands, lol.
 

SonGoku

Member
Even current gen doesn't lose "by a decent margin in RT". One needs to pick that specific green-sponsored thing to make a lame point. E.g. the 7800XT is between the 4060Ti and the 4070 (it's the same picture across all resolutions):

relative-performance-rt-2560-1440.png


And let me dig up the pricing for those at the moment. Germany:

4060Ti - 410 Euro
4070 (12GB, lol) - 550 Euro

7800XT - 490 Euro

So the 7800XT wins vs both even at RT perf/$, and even more so at raster.

relative-performance-2560-1440.png

PS
Is it claimed in all seriousness that in Jan 2025 we have AAA "fully path traced" games? Like, that BMW thing, for f*cking real? All that right after Quake RT (which resorts to a gazillion tricks). Amazing.
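The quoted perf-per-euro argument reduces to simple arithmetic. The relative RT performance indices below are placeholders standing in for the chart values (which only exist as images above), not actual benchmark numbers:

```python
# Hypothetical relative RT performance indices (placeholders for the chart
# values, which are not reproduced here as text).
rt_perf = {"4060 Ti": 82, "4070": 110, "7800 XT": 100}
price_eur = {"4060 Ti": 410, "4070": 550, "7800 XT": 490}  # prices quoted above

perf_per_euro = {gpu: rt_perf[gpu] / price_eur[gpu] for gpu in rt_perf}
for gpu, value in sorted(perf_per_euro.items(), key=lambda kv: -kv[1]):
    print(f"{gpu}: {value:.4f} perf points per euro")
```

With these placeholder indices the 7800 XT edges out both Nvidia cards on RT perf per euro; plug in the real chart values to check the claim properly.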
What you are doing is childish; it's pointless to muddle the results with light use of RT. The point is to use games that make heavy use of RT, preferably path tracing, to see how good or not a given card is at RT. Because at the end of the day, it's in the games that make heavy use of RT that you want your card to be able to hold its own.

Same logic behind CPU benchmarks at 1080p to create scenarios that push CPU the hardest.
 

Gaiff

SBI’s Resident Gaslighter
What you are doing is childish; it's pointless to muddle the results with light use of RT. The point is to use games that make heavy use of RT, preferably path tracing, to see how good or not a given card is at RT. Because at the end of the day, it's in the games that make heavy use of RT that you want your card to be able to hold its own.

Same logic behind CPU benchmarks at 1080p to create scenarios that push CPU the hardest.
He knows what he’s doing. Once this release blows over, he’ll crawl back into his hole and wait for another major event to go to war with NVIDIA.
 

CrustyBritches

Gold Member
I think AMD is waiting for real world benchmarks of the 5070 to be published. Nvidia did the equivalent of using lossless scaling at x3 or x4 frame generation and compared it to a 4090.


If that’s the case, AMD should do a snarky Wendy’s style tweet, saying that’s the plan.
 

Gaiff

SBI’s Resident Gaslighter
I think AMD is waiting for real world benchmarks of the 5070 to be published. Nvidia did the equivalent of using lossless scaling at x3 or x4 frame generation and compared it to a 4090.


If that’s the case, AMD should do a snarky Wendy’s style tweet, saying that’s the plan.
Except real world benchmarks aren’t coming anytime soon. We’re looking at late February, possibly March, at the earliest. Those cards are already in stores. Surely, AMD won’t ask retailers to wait around 1 month before selling their supplies.
 

CrustyBritches

Gold Member
Except real world benchmarks aren’t coming anytime soon. We’re looking at late February, possibly March, at the earliest. Those cards are already in stores. Surely, AMD won’t ask retailers to wait around 1 month before selling their supplies.
I forgot they were only doing the 5080/5090. 5080 might serve the same purpose. Just spitballing.🤷‍♂️
 

FingerBang

Member
This doesn't make a lot of sense, only the 5090 and 5080 are launching in January anyways

The 5070 and 5070 Ti aren't even scheduled until February and no firm launch date has been set. If Nvidia felt like it they could delay those to March or even further. How long does AMD plan to wait for Nvidia LOL
I'm not saying they will wait for Nvidia. I probably should have said, "They wish Nvidia would release the cards before they introduce their own," but the rest stays the same.

Nvidia likely put them in a corner with their announcement.
 

llien

Banned
This is childish what you are doing, its pointless to muddle the results with light use of RT.
It is "childish" to use results by reputable sites.

It is "grown up" to make up ephemeral list of cherry picked games "which matter".

Are you for real?

Taylor Swift Hair Flip GIF by MOODMAN



6 years into "hardwahr RT", where are we:



Oh, look at that, a whopping 30% of "RT games" are in the "clearly improves visuals with RT gimmick on" category:

pfio2bG.png
 

llien

Banned
Same logic behind CPU benchmarks at 1080p to create scenarios that push CPU the hardest.
Artificial bottlenecking at 1080p and below was dumb as bread, as proven in practice by newer games running better on the first Ryzens.

And back to the RT topic: 6 years into 'hardwahr RT' we have 10 games in which it indisputably significantly improves visuals.

So much effort and money (including consumers') burned so that there is a handful of games in which it is worth it... if you have paid enough $$$ for your GPU... and are OK with the FPS dip.

Can we use these 10 games as "RT games in which RT matters" list, or do you have a better one?
 

llien

Banned
hybrid RT and Path Tracing
The "really RT" games (I can call them "path traced" if it makes you happier):
  • Quake
  • Serious Sam: The First Encounter
  • Doom
  • Doom 2
  • Half-Life (#2 is incoming)
What is common about them? Primitive graphics/geometry complexity. But even Quake does a lot of... "tricks" to achieve acceptable framerate.

What makes you think that a game as modern as CP2077, let alone that BMW thing (which brings even a 4090 to a crawl even without the RT gimmick), can enter the same room?
 
It is "childish" to use results by reputable sites.

It is "grown up" to make up ephemeral list of cherry picked games "which matter".

Are you for real?

Taylor Swift Hair Flip GIF by MOODMAN



6 years into "hardwahr RT", where are we:



Oh, look at that, whopping 30% of "RT games" are "clearly improves visuals with RT gimmick on" category:

pfio2bG.png

Only a disingenuous person could call RT a gimmick. Raster effects (such as SSR, cube maps, cascaded shadow maps, pre-baked lightmaps and GI) mimic what RT does, and therefore can be considered the gimmick. Ray tracing is the exact opposite of a gimmick, and that's why it's so expensive.

As of now there are over 100 games with RT support, while this Hardware Unboxed comparison only shows a limited number of games from that list.

I agree about the Dead Space remake, RT is unnoticeable in that game, and I'm sure there are a couple more games like that. However, in the vast majority of games I have played, RT has made at least a noticeable difference, and a generational difference when it was fully utilised.

RE Village definitely looks better with ray tracing. Water reflections no longer fade out as you move the camera and look sharper, and RT GI sometimes makes a big difference too (especially in darker areas lit only by indirect lighting), as shown in the Digital Foundry comparison video. These are welcome improvements considering how well this game runs; I get 130-160fps at 4K native on my PC.

4.jpg


In the RE4 remake RT was downgraded a little compared to RE Village and even the RE3 remake (RT reflections in RE3R were pretty much everywhere), however the RT water (on the lake) in RE4R still looks a lot better than raster while only costing 1fps on my PC, so it was absolutely worth using.

Callisto Protocol has much better shadow quality thanks to RT, and very noticeable RT reflections. I saw the difference almost everywhere without looking for it, so it was quite noticeable I would say.

RT reflections in Callisto Protocol have reduced resolution, but it's better to have reflections at reduced quality than no reflections at all.

4-K-FSRQ-Raster-2.jpg


4-K-FSRQ-RT-2.jpg


RT in Shadow of the Tomb Raider doesn't make a huge difference in all locations, but the difference is still there and I wouldn't want to play this game without RT. I hate cascaded shadow maps, and RT also increases the number of rendered shadows. In some locations (especially the caves) the difference was huge.

Hardware Unboxed labelled RT in Metro Exodus (Standard Edition) as "different image, but unclear if better"
😂😂. Good joke. The lighting without RT looks flat in this game and character models don't even match the scene. It seems the Hardware Unboxed experts are blind, and you are blind as well for blindly repeating their false opinions.

Metro-Exodus-2024-12-08-08-44-41-667.jpg


Metro-Exodus-2024-12-08-08-44-29-030.jpg


Metro-Exodus-2024-12-08-08-42-31-250.jpg


Metro-Exodus-2024-12-08-08-42-18-238.jpg


Metro-Exodus-2024-12-08-07-37-16-409.jpg


Metro-Exodus-2024-12-08-07-37-45-541.jpg


I even prefer the original Metro Exodus to the EE, because the developers destroyed the atmosphere and the look of the game. The Enhanced Edition has raised blacks and blown-out highlights.

As for The Witcher 3 next gen, this game should be in the category "transforms visuals significantly". The lighting without RT looks flat, cascaded shadow maps draw in literally in front of Geralt, and screen space reflections fade out as you move around the water.

1a.jpg


1b.jpg


2a.jpg


2b.jpg


5a.jpg


5b.jpg


3a.jpg


3b.jpg


Black Myth Wukong uses RT regardless of the settings ("Lumen" is software RT), but with PT the lighting looks a lot better even at medium PT settings, while only costing 3% performance on my PC.

1440p-DLSSQ-FG-Cinematic-medium-RT.jpg


1440p-DLSSQ-FG-cinematic-lumen.jpg


RT requires more hardware resources, but it's also very scalable. On my PC the RT performance hit is usually around 20-30%, and up to 70% with PT, but DLSS + FG can improve performance by up to 5x, so personally I don't even care about the RT performance impact. I always turn RT on. This would not be the case if I had bought the RX7900XTX. Even with medium PT, games like Black Myth Wukong wouldn't run at 120fps. What's more, the RX7900XTX doesn't even support RR, so I would get way more noise in games like Alan Wake 2. So yes, I can understand why AMD fans such as yourself aren't happy about RT performance, but the RTX 40 series (and soon RTX 50) offers very good RT performance.
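As a back-of-envelope illustration of how those multipliers interact (the 70% PT hit and the 2x upscaler/FG factors below are round numbers chosen for the example, not measurements from any specific game):

```python
def effective_fps(base_fps, rt_cost, upscale_gain, fg_factor):
    """Apply a fractional RT/PT cost, then an upscaler speedup, then
    multiply delivered frames by frame generation."""
    return base_fps * (1 - rt_cost) * upscale_gain * fg_factor

# 100fps native raster, 70% path-tracing hit, 2x from upscaling, 2x from FG:
print(round(effective_fps(100.0, rt_cost=0.70, upscale_gain=2.0, fg_factor=2.0), 1))  # 120.0
```

Even a 70% hit lands back above the native raster framerate once the upscaler and FG multipliers stack, which is the argument being made (setting aside latency and image-quality caveats).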

31fps on the RTX3080Ti, and 81fps on the RTX4090. You tried to say that the RX7900XTX doesn't have much worse RT performance than Ada Lovelace GPUs, but this AMD card can't even match the RTX3080Ti in this game. The gap in RT performance is absolutely huge, and only the new RDNA4 9070XT could change that (we'll see).

Screenshot-20250119-135323-You-Tube-2.jpg
 

llien

Banned
Only a disingenuous person could call RT a gimmick.
Take a deep breath: no, not really.

As of now there are over 100 games with RT support and this hardware unboxed comparison only shows limited number of games from that list.
In other words, "there are 2.5 times more RT games, HUB is hiding them to skew the reality".
Yeah, ok.

Hardware Unboxed labelled Meter Exodus (Standard Edition) as different image, but unclear if better 😂😂. Good joke.
HUB has also demonstrated it in the video, with tongue-in-cheek "guess which side is RT", mind you.
But "HUB is lying", got it.

I even prefer original Metro Exodus compared to the EE
So, on the other hand, HUB was wrong to note that ME EE RT implementation.
They do everything to simply muddy the waters, hateful fucks!!!

(I like RT in W3)
OK. So does HUB, curiously.

(lumen is software RT)
It is, sort of.
And no, it doesn't count as "hardwahr RT", I'm so sorry about that.

RT vs RT talk
Please enlighten me about PT cores. Or are you seriously taking the "really fully ray traced" bait about BMW? Oh well. Not that it matters here anyhow.

We are trying to figure out the "really RT games" list. Mkay?

I got that you don't like the HUB findings. Would you mind improving their list/providing a better one?

+ FG can improve performance
Of course it can.

(I would not be able to do 5x FG inflation with 7900XTX and BMW)
Ok.
(With 7900XTX I would get way more noise in games like Alan Wake 2)
Than in some other NV card, got it.

radeon cards in this game are even slower than 3080ti
Cards slower than 3080Ti suck, fine by me.
 

StereoVsn

Gold Member
AMD is being really weird with this. IF the card can come within striking distance of the 4080S and they can actually price it reasonably at around $500, they should launch ASAP and shout that in a marketing campaign.

Otherwise they may have to wait another 2 months, since who knows when the 5070/5070 Ti will release.

Or I guess they could be waiting till the end of the month for 5080 scores, but what is that going to give them?

Hmm… maybe the drivers are still a bit raw but the hardware is ready? The whole situation is a bit strange.
 

Bojji

Member
AMD is being really weird with this. IF the card can come within striking distance of the 4080S and they can actually price it reasonably at around $500, they should launch ASAP and shout that in a marketing campaign.

Otherwise they may have to wait another 2 months, since who knows when the 5070/5070 Ti will release.

Or I guess they could be waiting till the end of the month for 5080 scores, but what is that going to give them?

Hmm… maybe the drivers are still a bit raw but the hardware is ready? The whole situation is a bit strange.

If that recent leaked performance is real, they have a really good GPU on their hands. The question is how they will price it.

This card could get close to the 5070 Ti. The problem with AMD is they are not capable of using chances like this to gain market share, so I doubt the price rumored in the thread title...

On the other hand, most things produced in the last few years have been super low on stock (the 5xxx series is rumored to be like that as well), so it may not matter what price they use; they will still sell out the initial stock.
 

DenchDeckard

Moderated wildly
What thing?

The 9070 series was supposed to launch on Jan 23rd, one week ahead of the 5080/5090. Rumours were that, since Nvidia didn't share a date for the 5070, AMD didn't want their cards to be benchmarked or reviewed close to Nvidia's new flagships, as they would look much worse.

The AMD cards are targeting the 5070 more, so maybe they'll wait until those cards release to launch theirs at a cheaper price.

Stock is out in the wild, which is wild. I've never seen it, and I'm 10 years in the game lol
 
Take a deep breath: no, not really.

In other words "there are 2.5 times more RT games, HUB is hiding them to skew the reality".
Yeah, ok.

...

It is, sort of.
And no, it doesn't count as "hardwahr RT", I'm so sorry about that.
Here's the full list. Hardware Unboxed only tested a small percentage of RT games.



Hardware usually means speeding up (accelerating) calculations. RTX cards use RT cores to accelerate RT calculations, so it's called HW RT. Lumen doesn't use any specific hardware to accelerate ray tracing calculations; instead it traces against a more primitive representation of the scene in order to speed them up (but not by much; in Black Myth Wukong, medium PT is only 3% slower on my card).

"Lumen is a hybrid tracing pipeline that uses Software Ray Tracing. It traces against the depth buffer first, which we call Screen Traces, then it traces against the distance field and applies lighting to ray hits with the Surface Cache. Lumen takes any given scene and renders a very low-resolution model of it".

 

llien

Banned
Hardware usually means speeding up (acceleration) calculations
Hardware means "using hardware". But it doesn't matter.


I thought you were trying to help @SonGoku and myself identify the "RT games that matter".

You've identified that RT was good in Witcher 3, in line with HUB's findings.
You didn't agree with them pointing out ME EE.
And there was another game that you thought deserved more credit for its RT implementation.

Any other contributions you want to make toward compiling the "real RT games" list?

I'm afraid the list of 100+ games that you've shared would contain even more "not relevant" RT games.
 

Buggy Loop

Gold Member
I just don't even know how AMD's marketing department still have jobs:

Another
Marketing
Disaster
 

manfestival

Member
It has one of the biggest differences between Non RT, hybrid RT and Path Tracing. It's more than 4 years old and it's still used by DF and Nvidia to show off stuff. Game will be a benchmark for quite some time.
You are just explaining the rationale and justification for this, but that is still missing the point, since it was never about "why" it is happening. It just seems like you are responding for the sake of responding, if anything. Not trying to insult you, but it seems silly when the point was made and others understood it, yet you respond with other things.
 
Hardware means "using hardware". But it doesn't matter


I thought you were trying to help @SonGoku and myself identify the "RT games that matter".

You've identified RT was good in Witcher 3, in line with HUBs findings.
You didn't agree with them pointing out ME EE.
And there was another game that you think deserved more credit for RT implementation.

Any other contribution do you want to make for compiling "real RT games" list?

I'm afraid a list of 100+ games that you've shared would contain even more "not relevant" RT games.
Hardware has a broad meaning. All calculations require some type of hardware to execute. RT can even run on very old PCs, but you would have to wait many, many minutes (if not hours) to render a single frame. Nvidia revolutionised gaming by adding special hardware (RT cores) to its GPUs to accelerate RT calculations, and now people refer to that as HW RT. HW RT is simply RT acceleration.

I haven't played all the RT games on that PCGamingWiki list (251 as of now, though some games on the list haven't been released yet), but almost every RT game in my library offered better visuals thanks to RT, and that's enough for me to conclude that RT is worth it.

Some games like The Witcher 3, Cyberpunk, Minecraft, Half-Life 1 and Doom 1/2 look drastically different thanks to RT, but from my perspective even simple RT effects offer a noticeable improvement. I have studied the rules of light and can immediately tell the difference between crappy SSR and RT reflections, or between cascaded shadow maps and RT shadows.

I don't agree with Hardware Unboxed, and my screenshots show why. They suggest RT at low settings in Cyberpunk doesn't make a noticeable difference, but there are plenty of reflective surfaces in this game, and objects that cast shadows. RT low in Cyberpunk enables both RT reflections and RT shadows.

Raster

raster.jpg


RT low

RT-shadows-reflections.jpg


Raster

raster.jpg


Low RT

RT-reflections-shadows.jpg


You don't even have to look closely to see the differences. RT in Cyberpunk makes a huge difference regardless of the settings, and the performance cost is well justified.
 

phant0m

Member
If they could go ahead and get on with it that’d be great. Pretty keen on putting one of these in a Bazzite/SteamOS build

Though 7x00 series cards are starting to come down too
 

Wolzard

Member
Only a disingenuous person could call RT a gimmick. Raster effects (such as SSR, cube maps, cascaded shadow maps, pre-baked lightmaps and GI) mimic what RT does, and therefore can be considered the gimmick. Ray tracing is the exact opposite of a gimmick, and that's why it's so expensive.

...

You don't even have to look closely to see the differences. RT in Cyberpunk makes a huge difference regardless of the settings, and the performance cost is well justified.

I would say that these games actually have a very poor raster mode. There are many games with pre-baked lighting and conventional reflections that are very beautiful. The problem is the effort Digital Foundry puts into demonstrating defects that you normally don't notice during gameplay, although RT lighting also has its problems (such as noise and low resolution).

It's more a question, I would say, of good artists, who were more dedicated in the past. Nowadays everything is kind of outsourced and mass-produced with the cheapest labor possible. Even with RT, many games aren't that impressive.
 

FireFly

Member
I would say that these games actually have a very poor raster mode. There are many games with pre-baked lighting and conventional reflections that are very beautiful. The problem is the effort Digital Foundry puts into demonstrating defects that you normally don't notice during gameplay, although RT lighting also has its problems (such as noise and low resolution).

It's more a question, I would say, of good artists, who were more dedicated in the past. Nowadays everything is kind of outsourced and mass-produced with the cheapest labor possible. Even with RT, many games aren't that impressive.
Cyberpunk already has a full GI probe system + SSR + planar reflections for the mirrors. They could use planar reflections more, but they are expensive, as you have to draw the scene twice. The shadow issues are a function of shadow map resolution, which I think you would need something like VSM (from UE5) to solve, and that again is expensive on its own. So actually I think Cyberpunk is an example of how to get the most out of rasterised lighting.
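The "draw the scene twice" cost of planar reflections comes from re-rendering everything through a mirrored camera; the mirroring transform itself is trivial. A minimal sketch of that step (a hypothetical helper, not anything from Cyberpunk's renderer), assuming NumPy is available:

```python
import numpy as np

def reflection_matrix(plane_normal, plane_point):
    # Householder-style reflection about a plane: M = I - 2*n*n^T, plus a
    # translation term so the plane need not pass through the origin.
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    d = -np.dot(n, np.asarray(plane_point, dtype=float))
    m = np.eye(4)
    m[:3, :3] -= 2.0 * np.outer(n, n)
    m[:3, 3] = -2.0 * d * n
    return m

# Mirror a point across the floor plane y=0: (1, 2, 3) -> (1, -2, 3).
mirrored = reflection_matrix([0, 1, 0], [0, 0, 0]) @ np.array([1.0, 2.0, 3.0, 1.0])
print(mirrored[:3])
```

Applying this matrix to the camera gives the reflected view; the expense is that the whole scene then has to be rasterised again from that view, once per reflective plane, which is why planar reflections are used so sparingly.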
 

JohnnyFootball

GerAlt-Right. Ciriously.
The 5070's raw perf is barely above the 4070, a card slower than your 7800XT.
The only way it would constrain the 9070XT to $500 is if it is much slower than claimed.
Doesn't matter. Unless the 9070XT ends up regularly outperforming the 5070 Ti in raw performance in most benchmarks and is in the ballpark with FSR4 and ray tracing, they need to treat and price it like it is competing with the 5070. The name 9070 tells you exactly what they see it competing against.
 
Doesn't matter. Unless the 9070XT ends up regularly outperforming the 5070 Ti in raw performance in most benchmarks and is in the ballpark with FSR4 and ray tracing, they need to treat and price it like it is competing with the 5070. The name 9070 tells you exactly what they see it competing against.
Why?
Why can't the 9070 XT be positioned similarly to the 5070 Ti, if they perform similarly?
Not like there isn't a 9070 that can be positioned against the base 5070.
 

Bojji

Member
I hope it happens, but it needs to still be $500. That will shake up the market.

It won't matter IF they don't have enough GPUs manufactured to meet the demand.

Supply has been a huge problem for a few years now. We have one company that does everything, and most of its supply lines are reserved for enterprise stuff.
 

Buggy Loop

Gold Member
What you are doing is childish; it's pointless to muddle the results with light use of RT. The point is to use games that make heavy use of RT, preferably path tracing, to see how good or not a given card is at RT. Because at the end of the day, it's in the games that make heavy use of RT that you want your card to be able to hold its own.

Same logic behind CPU benchmarks at 1080p to create scenarios that push CPU the hardest.

Yup

Nobody gives a shit about seeing how the new cards improve RT performance in Resident Evil 4; it's already, what, a 2~5 fps drop

If AMD is going to convince me to buy RDNA 4, it's with path tracing. My 3080 Ti is already plenty for "light RT".
 