Is Ray-Tracing the biggest scam in gaming?

The whole ray tracing naysayers thing all comes down to current consoles not doing it well. On PC it's absolutely incredible and I don't want to play another game without RT or PT.


It reminds me of the 30fps "debate", where 30fps was perfectly fine... until consoles could actually perform at 60fps and suddenly it's a gamechanger and 30fps is unplayable. Or 4k, or HDR, or VRR, or all these other technologies that were talked about as a "scam" because console players didn't have the hardware to take advantage of them. But magically when consoles caught up, suddenly they're vital to the gaming experience.
 
But you also have games that aren't raytraced looking pretty grounded too.
Uncharted 4 on a PS4

boon-cotter-epilogue-bedroom-01.jpg


boon-cotter-epilogue-livingroom-01.jpg

I was blown away by this in 2016, the best looking part of the game.

Problem is, everything needs to remain static for this to work; they baked the lighting in this house very carefully. But try to break or move some elements, like poppabk said, and the illusion falls apart. That's why real-time GI is important.
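To see why baked lighting breaks the moment the scene changes, here's a toy sketch (illustrative Python, not engine code, with made-up geometry) of a "1-D lightmap" going stale after its occluder moves:

```python
# Toy 1-D illustration: a wall at x=5 casts a shadow on points between it
# and a light at x=0. Baked lighting samples visibility once at build time;
# a per-frame ray query re-evaluates it after the wall moves.

def lit(point_x, wall_x, light_x=0.0):
    """A point is lit if the wall does not sit between it and the light."""
    return not (min(light_x, point_x) < wall_x < max(light_x, point_x))

wall = 5.0
lightmap = {x: lit(x, wall) for x in range(10)}  # baked once at build time

wall = 2.0  # the player smashes the wall and it moves

# The baked result is now stale: point 3 sits behind the new wall position,
# but the lightmap still says it is lit.
print(lightmap[3])   # True  (baked, now wrong)
print(lit(3, wall))  # False (re-evaluated per frame, correct)
```

That stale `True` is exactly the "illusion falls apart" moment: the stored answer no longer matches the world.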
 
But you also have games that aren't raytraced looking pretty grounded too.
Uncharted 4 on a PS4
Nice throwback from the era of bullshots. I played Uncharted 4 recently on PC maxed out at 8k and it looked nothing like those shots.

In fact it looked very flat and dated in many areas.
 
Ray tracing is a scam if you can't afford a proper GPU to utilize it.

Ray tracing is incredible when done right, but yeah it's heavy. I do still tend to use it in everything I can though, the difference is that profound.
 
Ray tracing allows for dynamic environments, meaning you can destroy a window or an entire wall and there will be no lighting bugs, unlike with baked lighting.
This is the future if we want more credible worlds.
 
It's an in-between period right now. Things will improve, likely at some point.
I want to believe, but I also can't. If life has taught me anything, it's that when money is involved things NEVER get better. In fact they get worse. With devs getting fired left and right and working in IT riskier than ever, I can guarantee you most developers will do their utmost to delay their work. It's why companies are pushing hard for AI, and it's why we need strong control over its usage, or else it'll be fucking hell.

OR

We'll get amazing quality games every year with the use of AI, and PCs and consoles will become obsolete, as all you'll need is an AI device to run the game for you in AR. Right? Right???
 
The whole ray tracing naysayers thing all comes down to current consoles not doing it well. On PC it's absolutely incredible and I don't want to play another game without RT or PT.

They'll change their fucking tune with GTA 6

Like yamaci17 said, if AC Shadows had been very good and successful, I don't think we would even have this thread, because people would have a better example of good RTGI with good performance. But people skipped it because it's just another AC.

GTA 6 will simply wipe the floor with naysayers of RTGI.
 
ray tracing and "frame generation" have done absolutely nothing for competitive FPS games so from my perspective both technologies are worthless lol
 
I have a 5090 and I turn ray tracing off if I can.

The only time I do turn it on is in single-player games, with MFG to hit high frame rates.

But yes, I think the whole technology is garbage, and most of the time it doesn't even look that great.
 
The whole ray tracing naysayers thing all comes down to current consoles not doing it well. On PC it's absolutely incredible and I don't want to play another game without RT or PT.


It reminds me of the 30fps "debate", where 30fps was perfectly fine... until consoles could actually perform at 60fps and suddenly it's a gamechanger and 30fps is unplayable. Or 4k, or HDR, or VRR, or all these other technologies that were talked about as a "scam" because console players didn't have the hardware to take advantage of them. But magically when consoles caught up, suddenly they're vital to the gaming experience.
Well that is the crux of this whole thread. Nvidia wants RT to be mainstream, but for it to become mainstream it "needs" mainstream attachment. Until then those who cannot get the equipment to properly experience RT are bound to write it off.

It is like VR: do you invest in a technology that may or may not be adopted by more than 50% of the market segment? If you do, which tech do you buy to get the best experience? Not everyone can afford a Valve Index and may just opt for a Quest 3. What about those who buy 5090s while everyone else is only on a 4060/5060? The experience is proportional to the affordable equipment.

Which tech do you develop for? Do you strive for the least common denominator and bottom barrel it with the most basic functions to reach out to the broadest amount of customers, or do you maximize it appealing to the smallest demographic available at release?
 
ray tracing and "frame generation" have done absolutely nothing for competitive FPS games so from my perspective both technologies are worthless lol
They could, though. The ray-traced hit detection in Doom: The Dark Ages didn't seem to make a massive difference in SP, but in multiplayer it could really up the fidelity of shooting someone.
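For illustration only (I don't know id's actual implementation), RT hit detection amounts to intersecting the shot ray against real geometry rather than coarse hitboxes. A classic ray-sphere test, the simplest building block, looks like:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Classic ray-sphere test: solve |o + t*d - c|^2 = r^2 for t >= 0."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # shot misses entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest intersection distance
    return t if t >= 0 else None

# A shot fired straight down +x at a unit-radius target 10 units away:
print(ray_hits_sphere((0, 0, 0), (1, 0, 0), (10, 0, 0), 1.0))  # 9.0
```

A real engine would trace against the same BVH the renderer uses, which is what makes "your shot hits exactly what you see" possible.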
 
I remember finally turning on ray tracing when I got my 4090 and was severely disappointed. The only real exception I experienced was Cyberpunk. I still think the tech is lame. Recently I have been playing some Indiana Jones (not my kind of game, after a couple hours) and Doom: The Dark Ages (path traced and beaten) and I'm still not convinced. Granted, in those two titles I don't really have the ability to turn off the ray tracing, since it's baked in. The performance hit with path tracing sure is nasty for some visual improvement, but in the end I preferred to keep it off for the extra performance.
 
ray tracing and "frame generation" have done absolutely nothing for competitive FPS games so from my perspective both technologies are worthless lol

Arguably the biggest of them all, Fortnite, has Lumen and Nanite, and it's an actual game with dynamic destruction/construction, which benefits the most from ray tracing. You can disable it, but it looks like shit.

As for the rest of the online FPS shooters: none of them even implement it.

Why would competitive FPS games make anything tougher to run? They court potato-PC users to grow the userbase. There's no time of day in those games either; most of them are static, and if they look like shit it doesn't matter, because people run them at 720p to get >500 FPS.

This argument, that a technology is worthless because of such a niche use case, is completely insane.
 
Well that is the crux of this whole thread. Nvidia wants RT to be mainstream, but for it to become mainstream it "needs" mainstream attachment. Until then those who cannot get the equipment to properly experience RT are bound to write it off.

It is like VR: do you invest in a technology that may or may not be adopted by more than 50% of the market segment? If you do, which tech do you buy to get the best experience? Not everyone can afford a Valve Index and may just opt for a Quest 3. What about those who buy 5090s while everyone else is only on a 4060/5060? The experience is proportional to the affordable equipment.

Which tech do you develop for? Do you strive for the least common denominator and bottom barrel it with the most basic functions to reach out to the broadest amount of customers, or do you maximize it appealing to the smallest demographic available at release?

It's not comparable to VR, because VR is a very proprietary technology and a very specific buy-in. With RT it's not, and it's not just Nvidia. AMD has made huge strides in RT performance, and we will see that pay dividends with next-gen consoles. AMD's new GPUs on the PC side now provide acceptable RT performance, and it's already been hinted that improved RT performance is going to be a huge focus of next-gen console GPUs.

In 3-4 years when the PS6 drops, we'll finally see quality RT/PT on the console side of things.
 
Benefits of ray tracing:

- Easier to iterate levels and lighting for developers
- Higher graphical fidelity, especially in dynamic time of day or changing lighting conditions
- No baked lighting data, which can take up a lot of disk space

Negatives of ray tracing:

- It is harder to run.

Hardly a scam; just a leap forward in rendering technology. Also, your Doom example is flawed: the game has other graphical and scale upgrades as well.
So easier for devs to be even more lazy and profit more, harder for the consumer to enjoy it. Sounds shitty to me.

Real talk: developers used tricks and fakery for so long to give the illusion of better real-time rendering that they utterly eclipsed what the hardware was actually capable of. This created a gulf between what can be done in real time and what can only be done when you spend millions on elaborate tricks to fake it.
If the "elaborate tricks" made the game easier to run for consumers, how is it bad? Of course it's so much better for devs who can make any slop on UE5 and put RT and make it look "good", what about the consumer though?

Poor modern devs can't make functioning mirrors anymore, but sure as hell can charge 70 dollars for a game.

Also you are not even considering the fact that no game needs RT at all to look good. So what gulf are we talking about exactly?

And Eternal looks like a cartoon in comparison because it's last gen, Sherlock. Did you just discover that? If that's how RT looks and performs, then give me non-RT games asap.
If it does look better, it's because of general improvements, not RT, lmao. I'm glad it flopped on PC.
 
So easier for devs to be even more lazy and profit more, harder for the consumer to enjoy it. Sounds shitty to me.
Two benefits you directly ignored: less disk space consumed versus baked lighting, and higher visual fidelity. The only way regular rasterisation can match RT is with offline-baked lighting in a completely static environment. Faster game development is also something that can benefit consumers greatly.
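A rough back-of-envelope sketch of the disk-space point (every number here is made up and purely illustrative): baked lightmaps must be stored for each lighting condition, so the data multiplies, while real-time GI stores nothing.

```python
# Back-of-envelope: size of baked lighting data once conditions multiply.
# All counts are invented for illustration, not measurements from any game.

lightmap_res    = 4096 * 4096   # texels per lightmap atlas
bytes_per_texel = 8             # e.g. compressed HDR irradiance data
atlases         = 200           # atlases covering a large open world
tod_snapshots   = 8             # baked times of day
seasons         = 4

baked_bytes = lightmap_res * bytes_per_texel * atlases * tod_snapshots * seasons
print(f"{baked_bytes / 1024**4:.1f} TiB of baked lighting data")  # 0.8 TiB
```

Even with these modest counts the total lands in the same ballpark as the multi-terabyte figures thrown around in this thread; RTGI replaces all of it with per-frame computation.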
 
So easier for devs to be even more lazy and profit more, harder for the consumer to enjoy it. Sounds shitty to me.
I swear, some devs reading forums must laugh at how utterly moronic gamers are. It's more efficient and gives better results. How is that an easier way to be "more lazy"?

Next you're gonna tell us how vehicles make people more lazy because they walk less.
 
So easier for devs to be even more lazy and profit more, harder for the consumer to enjoy it.
You're not playing games to drool over shadows and reflections? You're the weird one. It's the crux of modern gaming, especially when Battaglia points out where to look using 1800% zoom. You are missing out hugely, and I don't even know what the point of playing games is in that case.
 
It's not relatable to VR because VR is a very propriety technology and a very specific buy-in. With RT it's not. It's not just Nvidia. AMD has made huge strides in their RT performance and we will only see that pay dividends with next-gen consoles. AMDs new GPUs on the PC side now provide acceptable RT performance and it's already been hinted that improved RT performance is going to be a huge focus on next-gen console GPUs.

In 3-4 years when the PS6 drops, we'll finally see quality RT/PT on the console side of things.
It is completely relatable.

VR is a motion tracking head mounted monitor and back in 2015 the market was desperately trying to get people to buy into the technology as the next evolution to gaming and augmented reality computing. It did not take.

Raytracing is a realtime lighting calculation technique that was long confined to offline rendering due to its heavy computing cost. In 2017 Nvidia introduced the world to realtime RT, which, 8 years on, still isn't available on over 50% of gaming machines. It isn't available to the casual mass market yet, and thus you get threads like this one. Also, let's not underplay the sheer number of shit games that implement it and run like a dog-based diarrhea turd on your lawn when it's turned on. That does nothing but harm adoption moving forward, as people get a bad taste in their mouths.

As for the other points, that just shows market adoption from a manufacturing point of view: there is no other technology to latch onto as a branding iron to convince people to buy in. Hell, it took AMD three generations of GPU architecture to stop trying to emulate RT computation and actually manufacture a GPU that would natively do it. So... six years for AMD to actually adopt RT?

The irony here is that Nvidia led the charge and actually got everyone to come around to acknowledging that RT is worth a proper investment, as it's not going away. Now we've got our first couple of games that require RT-capable hardware to run, and people are losing their minds because they never bought anything that natively supported RT in a manner that would provide a good experience.
 
It will have its place, but the cost vs. what you get in return is still way too high for a lot of PC users and hardware. I'd forget about it on console for sure... it's just a waste of time, quite frankly.
 
And that is why some games couldn't have real-time day/night cycles and instead shipped predetermined times of day for their maps. Assassin's Creed Unity comes to mind here: amazing baked lighting, and they made hard choices around the limitations.

Assassin's Creed Shadows would have taken two years just to bake lighting for a full time-of-day cycle plus all seasons across a massive map, and it would have needed 2 TB of data for lighting alone.


RTGI is absolutely the future; we'll drag your ass to RT whether you want it or not, because if we listened to tech-illiterate people's opinions on tech we still wouldn't have shaders, because in 2001 they didn't think it was worth upgrading, nor a reason to install DX9. Fucking Nvidia and Microsoft forcing me to upgrade 😩 wahhhh
Why would it take two years to bake lighting if the devs leverage one (or multiple) RTX cards? I fail to understand this part.

Let's say that using RTGI the GPU has enough power to render the scene in real time (so maybe at minimum ~16 ms per frame); how can it take two years to bake that lighting information into textures, or to store light information at a point in space in a light probe? Could the problem be the engine design instead? I remember that in UE5, using Lumen, lighting can be baked very quickly when you choose the option to use the RTX GPU.

I know the process has overhead, and with light probes the devs need to take care placing them to avoid increasing the size of the game too much. But given this example game:
  1. No dynamic lighting required.
  2. Not open world, hub-like design.
is the cost of using an RTX card to bake lighting that much higher?
 
What part makes me a boomer exactly? The fact that I want good graphics instead of shitty mobile-game graphics, or the fact that I'm against fake frames and the egregious lies Jensen told?
Get ready for more refined "fake frames". Like someone else mentioned, ray tracing is a heavy lift even for the latest GPUs. This isn't specific to Nvidia; every major player is pushing fake frames, fake resolution, etc.

Expecting exponential increase in raw hardware performance is what makes you a boomer.
 
It is completely relatable.

VR is a motion tracking head mounted monitor and back in 2015 the market was desperately trying to get people to buy into the technology as the next evolution to gaming and augmented reality computing. It did not take.

Raytracing is a realtime lighting calculation technique that was long confined to offline rendering due to its heavy computing cost. In 2017 Nvidia introduced the world to realtime RT, which, 8 years on, still isn't available on over 50% of gaming machines. It isn't available to the casual mass market yet, and thus you get threads like this one. Also, let's not underplay the sheer number of shit games that implement it and run like a dog-based diarrhea turd on your lawn when it's turned on. That does nothing but harm adoption moving forward, as people get a bad taste in their mouths.

As for the other points, that just shows market adoption from a manufacturing point of view: there is no other technology to latch onto as a branding iron to convince people to buy in. Hell, it took AMD three generations of GPU architecture to stop trying to emulate RT computation and actually manufacture a GPU that would natively do it. So... six years for AMD to actually adopt RT?

The irony here is that Nvidia led the charge and actually got everyone to come around to acknowledging that RT is worth a proper investment, as it's not going away. Now we've got our first couple of games that require RT-capable hardware to run, and people are losing their minds because they never bought anything that natively supported RT in a manner that would provide a good experience.

They aren't remotely relatable.

RT doesn't require specific proprietary hardware separate from the rest of your gaming setup. Again, all modern GPUs already support RT in one way or another.

RT isn't locked behind a specific brand or storefront the way headsets like the Meta Quest are.

RT doesn't require the customer to buy a specific version of a game, or to invest in exclusive games to take advantage of it.

RT doesn't require you to strap a bulky, uncomfortable device to your face to be able to appreciate it.

RT doesn't require you to separate yourself from the rest of the world to use it.

RT doesn't require you to set up a specific room or meet significant space requirements to use it.

I could go on endlessly about this, but the comparisons are extremely surface-level at best. They aren't remotely comparable.
 
But you also have games that aren't raytraced looking pretty grounded too.
Uncharted 4 on a PS4

boon-cotter-epilogue-bedroom-01.jpg


boon-cotter-epilogue-livingroom-01.jpg
Raster games like Uncharted 4 can look good but also ugly depending on the scene. RT is the only technology that can address this problem.

Samuel is standing in the shadows and yet he is well lit.

u4-2024-12-05-14-59-33-018.jpg


Grass looks flat without RT GI and AO.

tll-2024-12-05-17-24-58-676.jpg


People say the PC version is downgraded, so I'm also including a PS4 Pro screenshot.

Uncharted-4-Kres-z-odzieja-20241206014639.jpg


All these people who are talking crap about RT simply don't know much about technology and graphics. If you know where to look and what RT does, the difference is often staggering. I have played dozens of RT games and I could EASILY tell the difference, so I'm very happy with this technology. The framerate is very good on my RTX 4080 Super OC (60 TF), and I don't mind using DLSS in the most demanding RT games. Some RT games run at well over 120 fps even at 4K native (Doom Eternal, Resident Evil 2, 3, Village).

4.jpg


Currently, I'm playing Indiana Jones: the Great Circle and get around 170–230 fps with standard RT (4K DLSS Q+ FGx2).

DLSSQ-x2.jpg


4-K-DLSSQ-RT.jpg


4-K-DLSSQ-RT-2.jpg



RT is also very scalable. In some games RT doesn't even affect the framerate on my 4080S, for example RT reflections in cyberpunk.

RT reflections vs. ultra SSR (raster): the framerate is similar, yet RT reflections drastically improve graphical fidelity. On my PC there's simply no point in playing this game with raster.

raster.jpg


rt.jpg



RT-reflections.jpg


ultra-ssr.jpg


RT can also bring a new life into old games.

GTA5-Enhanced-2025-04-13-20-06-04-085.jpg


GTA5-Enhanced-2025-04-13-20-05-53-300.jpg


GTA5-Enhanced-2025-04-13-19-40-17-295.jpg


GTA5-Enhanced-2025-04-13-19-40-28-297.jpg


GTA5-Enhanced-2025-04-13-20-56-16-659.jpg


GTA5-Enhanced-2025-04-13-20-56-26-434.jpg


GTA5-Enhanced-2025-04-13-20-03-19-419.jpg


GTA5-Enhanced-2025-04-13-20-03-31-185.jpg


GTA5-Enhanced-2025-04-13-21-00-16-213.jpg


GTA5-Enhanced-2025-04-13-21-00-27-363.jpg


1a.jpg


1b.jpg


2a.jpg


2b.jpg


5a.jpg


5b.jpg


3a.jpg


3b.jpg


RT can be very expensive in some games and cut the framerate in half. However, if I'm getting well over 60 fps anyway, I'm not going to complain about that :).

PT games are, however, way more demanding than standard RT (nearly twice as demanding based on my tests), and there should be a separate category for these games. I'm not so sure current hardware is ready for PT (especially midrange GPUs), but high-end cards with DLSS can run PT games with acceptable results. Even my upper-midrange 4080S can run PT at a playable framerate (especially if I use DLSS SR with DLSS FGx2), so I can still play PT games and experience this new technology :). At 1440p I need to use DLSS Quality + FGx2 to get 120-170 fps in PT games. At 1800p the framerate is also very good (around 95-130 fps), but at 4K I need to use more aggressive DLSS settings; to be honest, though, even DLSS Performance (transformer model) looks 4K-like and surpasses image quality on consoles.
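As a toy model of the numbers above (illustrative figures only, not benchmarks, and it ignores upscaling/FG overhead): if PT cost scales roughly with rendered pixels, the combined effect of internal-resolution upscaling and frame generation on displayed framerate can be sketched as:

```python
# Rough frame-cost model: render cost scales with internal pixel count,
# frame generation multiplies displayed frames. Overheads are ignored.

def displayed_fps(native_ms_4k, upscale_ratio, fg_factor):
    """upscale_ratio = internal pixels / output pixels
    (DLSS Quality at 4K renders ~1440p internally, so ~0.44)."""
    render_ms = native_ms_4k * upscale_ratio
    return 1000.0 / render_ms * fg_factor

# Say a PT scene costs 50 ms per frame at native 4K (20 fps):
print(round(displayed_fps(50.0, 1.0, 1)))   # 20  native
print(round(displayed_fps(50.0, 0.44, 1)))  # 45  DLSS Quality
print(round(displayed_fps(50.0, 0.44, 2)))  # 91  DLSS Quality + FGx2
```

That 20 → ~90 jump is why DLSS SR + FGx2 turns "not really playable" path tracing into a high-refresh experience, at the cost of generated rather than rendered frames.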

Alan Wake 2 PT (1440p DLSS Q + FGx2)

Alan-Wake2-2025-03-14-02-49-22-872.jpg


Alan-Wake2-2025-03-14-02-45-41-095.jpg


Alan-Wake2-2025-03-14-12-30-30-435.jpg


Alan-Wake2-2025-03-14-02-58-24-706.jpg


Indiana Jones with PT (1440p DLSS Q + FGx2)

1440p-DLSSQ-PT.jpg


PT-3.jpg


PT-2.jpg
 
They aren't remotely relatable.

RT doesn't require specific proprietary hardware separate from the rest of your gaming setup. Again, all modern GPUs already support RT in one way or another.

RT isn't locked behind a specific brand or storefront the way headsets like the Meta Quest are.

RT doesn't require the customer to buy a specific version of a game, or to invest in exclusive games to take advantage of it.

RT doesn't require you to strap a bulky, uncomfortable device to your face to be able to appreciate it.

RT doesn't require you to separate yourself from the rest of the world to use it.

RT doesn't require you to set up a specific room or meet significant space requirements to use it.

I could go on endlessly about this, but the comparisons are extremely surface-level at best. They aren't remotely comparable.
If you want to nitpick things apart, we can do that with raytracing adoption as well. However, some of those points are moot, because not all VR headsets meet them; and taken at face value, those points are roadblocks to consumer adoption just as much as raytracing's own are.

I expect we will be another 2-3 years from broad market saturation, where every tier of gaming device has native raytracing hardware inside it. Until then, the market is saying it's not ready for broad adoption of raytracing. Not NPUs, not RT emulation/software, but dedicated hardware inside for running RT 4b/8b code in realtime.
 
Expecting exponential increase in raw hardware performance is what makes you a boomer.

It just makes me a realist. One way or another a big leap will happen in computing thanks to AI. AI will design better chips and AI will also help accelerate the development of quantum computing.

I'm not against AI for things like upscaling, but fake frames are questionable. One generated frame might be acceptable, but multi frame gen is horrible and leads to significant artifacting and lag (which they try to compensate for by adding even more AI trickery in the form of Reflex). Picture quality matters, and a picture full of bad AI artifacts all over the place is an awful way to play a game. But the fact that Jensen lied to everybody's face by saying the $549 5070 was as powerful as the 4090 takes the cake. That's why it's the biggest scam: Jensen wants to pass off these horrible fake frames as real ones.
 
RT can also bring a new life into old games.

That's true; ray-traced GI in GTA 5 is really cool and makes the game look much better. It's even incredible with the secondary bounce.

Basic ray-traced global illumination literally goes from Forbidden West-level flat lighting to something else entirely: beautiful ambient light hitting everything.

I had to replay The Witcher 3 for the fifth time or something just because of how transformative ray-traced GI was in that game.
 
It's good in principle, because it can facilitate dynamic worlds, but developers aren't using it to do that: it's always an "option" that can be turned on or off, so you can't design a game around it. In theory it would also save a ton of dev time, as the old method was time-consuming, but because it can be toggled it actually means MORE work, not less. So the potential is there, but it arrived before the hardware was sufficiently capable, and yeah, as some have mentioned, it's such an inefficient hog that we're likely to see AI lighting solutions before it's more broadly viable.
 
They'll change their fucking tune with GTA 6

Like yamaci17 said, if AC Shadows had been very good and successful, I don't think we would even have this thread, because people would have a better example of good RTGI with good performance. But people skipped it because it's just another AC.

GTA 6 will simply wipe the floor with naysayers of RTGI.
At 30 fps, I doubt it will convince the naysayers. We just need to put up with this for the rest of the generation. The next generation will surely shut them up. Hopefully.
 
Scam? Not sure about that, but it's something I've wanted since the PS1 days, so I'd say no. It's about damn time, if anything.

Marketing-wise? That could be a different story.
 
That's true; ray-traced GI in GTA 5 is really cool and makes the game look much better. It's even incredible with the secondary bounce.

Basic ray-traced global illumination literally goes from Forbidden West-level flat lighting to something else entirely: beautiful ambient light hitting everything.

I had to replay The Witcher 3 for the fifth time or something just because of how transformative ray-traced GI was in that game.
My GTA5 EE screenshots show the launch version, but I know that Rockstar has refined RT since then. As you said, they added a secondary bounce and improved the quality of the reflections. I would need to download this game again to take new screenshots, but I think even this old version clearly shows the difference.

As for The Witcher 3, this old game has better lighting than many new open-world games thanks to RT GI and AO. Since I started playing RT games, I find it much easier to notice flat lighting (i.e. the lack of indirect shadows and light bounce) in other games. For example, I immediately noticed the flat lighting in the remastered version of Horizon, even though that game was praised for its graphics.
 
But what happens if you bump into those curtains and move them? What if you draw them fully? When you walk over the clothes and they move, are they still grounded? What if you drop a box on the floor? How does it look if you return at dusk?
You can have great lighting without real-time ray tracing but you have to limit what the player can do or spend inordinate amounts of time planning for every conceivable option.
That is true, but most games don't have that kind of physics to begin with. Games that do can still look good too, like the TLOU remake. Look at that versus something like DMC, which tanks performance for some reflections or something.
Ray tracing or some variation is the future of gaming - whether you think it sucks right now is probably mainly based on the hardware you have.
My dude, I have a 5090. I don't think it sucks; I just think the difference isn't huge, especially when, even in games that have raytracing, those curtains don't move and those clothes are static. A lot of games use "raytracing" as a bullet point and not for something substantial.
 
Post some VIDEOS of examples. No one is impressed with some small little rooms in Uncharted where nothing happens.

This is what the lighting looks like in Uncharted 4:


Reminder that it's running on a PS4. The dynamic rocks still look grounded too.
And that looks much worse than this?



Not really.

Nice throwback from the era of bullshots. I played Uncharted 4 recently on PC maxed out at 8k and it looked nothing like those shots.

In fact it looked very flat and dated in many areas.
That is not a bullshot. It's from the game.

 
Raytracing is currently used for things it shouldn't be used for; the hardware isn't powerful enough to use it in the capacity some devs want.

There are games that use it well, like Doom, Spider-Man, or Ratchet... and games that use it as a crutch, like 90% of UE5 games.
 
I wonder the difference in budgets, not to mention what GSC went through during development.
Yes, I agree; raytracing helps get realistic results without the manual effort.

Raster games like Uncharted 4 can look good but also ugly depending on the scene. RT is the only technology that can address this problem.

Samuel is standing in the shadows and yet he is well lit.

u4-2024-12-05-14-59-33-018.jpg


Grass looks flat without RT GI and AO.

Uncharted-4-Kres-z-odzieja-20241206014639.jpg


tll-2024-12-05-17-24-58-676.jpg


All these people who are talking crap about RT simply don't know much about technology and graphics. I have played dozens of RT games and I could EASILY tell the difference, so I'm very happy with this technology. The framerate is very good on my RTX 4080 Super OC (60 TF), and I don't mind using DLSS in the most demanding RT games. Some RT games run at well over 120 fps, even at 4K native (Doom Eternal, Resident Evil 2, 3, Village).

4.jpg


Currently, I'm playing Indiana Jones: the Great Circle and get around 170–230 fps with standard RT (4K DLSS Q+ FGx2).

DLSSQ-x2.jpg


RT is also very scalable. In some games RT doesn't even affect the framerate on my 4080S, for example RT reflections in cyberpunk.

RT reflections vs. ultra SSR (raster): the framerate is similar, yet RT reflections drastically improve graphical fidelity. On my PC there's simply no point in playing this game with raster.

raster.jpg


rt.jpg



RT-reflections.jpg


ultra-ssr.jpg


RT can also bring a new life into old games.

In some games RT is very expensive and can halve the framerate, but if I'm getting well over 60 fps anyway, I'm not going to cry about that :).

PT games are, however, way more demanding than standard RT (twice as demanding based on my tests), and I'm not so sure current hardware is ready for PT. But DLSS makes even extremely demanding PT usable (especially DLSS SR combined with DLSS FGx2), so I can still play PT games at a playable framerate :). At 1440p I need to use DLSS Quality + FGx2 to get 120-170 fps in PT games. At 1800p the framerate is also very good (around 120 fps), but at 4K I need to use more aggressive DLSS settings; that's fine, since even DLSS Performance (transformer model) looks 4K-like and surpasses image quality on consoles.

Yeah there are times when the illusion falls apart with something noticeable when approximating with simpler more performant methods. It depends if getting rid of that is worth running at half the framerate all the time though.
 
Last edited:
I want to believe, but also I can't. If life taught me anything, it's that when money is involved things will NEVER get better. In fact they will get worse. With devs getting fired left and right and an all-time-high risk of working in IT, I can guarantee you most developers will do their utmost to delay their work. It's why companies are pushing hard for AI, and it's why we need strong control over its usage, or else it'll be fucking hell.

OR

We'll get amazing-quality games every year with the use of AI, and PCs and consoles will become obsolete, as all you'll need is an AI device to run the game for you in AR. Right? Right???
It's quite the opposite: when money is involved, things get better. People vote with their wallets. If money did not make things better, there would be no PS5 or XSX, and we'd all be playing sprite-based games built by a team of 2 in their time off, using MIDI files and 10+ years of development. Not knocking that, of course.
 
Why would it take 2 years to bake lighting if the devs leverage one (or multiple) RTX cards? I fail to understand this part.

Let's say that using RTGI the GPU has enough power to render the scene in real time (so at minimum about 16ms per frame); how can it take 2 years to bake that lighting information into the textures, or to store light information at a point in space in a light probe?
Using RTGI and baking lighting are 2 completely different workflows. The former is realtime and dynamic, which means only what is within the current scene needs to be calculated. The latter needs to be calculated and statically stored in lightmaps, not just for the current scene, but for every possible scene (time of day, weather, etc.). So basically every possible scene for every location of the entire game needs to be statically baked and stored so that it can be loaded at runtime without RT. For an open-world game with dynamic ToD, seasons and destructibility, this is practically impossible (or cost-prohibitive) to do.

To do this for an entire game, like Spiderman 2 did, devs use massive server farms, not just single RTX cards. The scope is several orders of magnitude larger. They couldn't attempt RTGI because Spidey's movement was too fast while swinging, and by the time RTGI accumulation could happen, the scene would have changed. So baked lighting still has its uses, but the more dynamic a game becomes, the less feasible it is.

If Spiderman 3 needs to support random, dynamic building-level destruction while fast-swinging, then they will have to come up with some hybrid solution, where destructible objects are not baked while the rest are. That can cause a lot of lighting issues though, so I doubt they will go down that path. Or RTGI, accelerated by AI, gets rid of the accumulation problem and they can stop baking altogether.
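To put that combinatorial point in numbers, here's a toy sketch. Every figure is made up purely for illustration (real games share and compress lightmap data far more cleverly), but the multiplication is the core of the argument:

```python
# Back-of-envelope sketch of why fully baking a dynamic open world
# explodes in cost: every combination of conditions needs its own
# statically stored lightmap set. All numbers are hypothetical.

def baked_variants(locations, times_of_day, weather_states, destruction_states):
    """Each combination of conditions per location needs its own
    baked lightmap set stored on disk."""
    return locations * times_of_day * weather_states * destruction_states

# Static hub-based game: fixed lighting, a handful of scenes.
static_game = baked_variants(locations=50, times_of_day=1,
                             weather_states=1, destruction_states=1)

# Open world with dynamic ToD, weather and some destructibility.
open_world = baked_variants(locations=5000, times_of_day=24,
                            weather_states=6, destruction_states=4)

print(static_game)  # 50 lightmap sets
print(open_world)   # 2,880,000 lightmap sets
```

RTGI sidesteps the whole product: it only ever computes the one combination currently on screen, which is why the two workflows aren't comparable in baking time.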

Could the problem be engine design instead? I remember that in UE5, with Lumen, the lighting can be baked very quickly when we choose the option to use the RTX GPU.
This issue applies to all engines. I'm not sure what you mean by baking lighting using Lumen; the whole point of Lumen is to not bake the lighting. Using RTX just means hardware-accelerated RT versus software RT. If you mean disabling Lumen and using an RTX card to bake lighting instead, then yes, the RT cores help accelerate light baking.

I know that the process has overhead, and with light probes the devs need to take care when placing them to avoid increasing the size of the game too much. But given this example game:
  1. No dynamic lighting required.
  2. Not open world; hub-like design.
Is the cost of using an RTX card to bake lighting that much higher?

It's faster than software baking, but there is no comparison to RTGI in terms of the time it takes. If the entire game is just one fixed scene, then they are comparable.
 
The whole ray tracing naysayers thing all comes down to current consoles not doing it well. On PC it's absolutely incredible and I don't want to play another game without RT or PT.


It reminds me of the 30fps "debate", where 30fps was perfectly fine... until consoles could actually perform at 60fps and suddenly it's a gamechanger and 30fps is unplayable. Or 4k, or HDR, or VRR, or all these other technologies that were talked about as a "scam" because console players didn't have the hardware to take advantage of them. But magically when consoles caught up, suddenly they're vital to the gaming experience.

Pretty much. RT isn't great on consoles; games like Jedi Survivor showed us that. I would much rather have clean graphics than that mess. I think RT this gen was a necessary step, and hopefully next gen will be able to utilize it properly, but it really is premature to judge RT by what consoles have to offer at this point.
 
I see you changed that video fast to the cherry-picked stuff. I saw the video you posted before your edit; it didn't look that great in general gameplay.

I just changed it to a video that was longer and showed more areas. There is nothing to hide. You show off that house in Uncharted 4, in which nothing of substance happens: a small little area that those developers could bake to perfection. Other people in this thread show shots of Uncharted 4 where the real gameplay takes place, and it's severely dated. The lighting in it is a joke by today's new standard. Same for the Horizon and Death Stranding games. And DS2 is supposed to be the latest showcase of how the old way of lighting is still viable. What a crock.
 
Ray tracing will be essential to driving down dev costs and time as it eliminates the need for baked lighting.
I'm not an investor. Plus, we saw what happened when that modder got GI working in Crysis 3: it made the game look boring and drab. Lighting baked by really good artists will always look better than global illumination. Any money publishers would save from not having to hire those artists will just be spent on the absolute worst things you can imagine.
 
I just changed it to a video that was longer and showed more areas. There is nothing to hide. You show off that house in Uncharted 4, in which nothing of substance happens: a small little area that those developers could bake to perfection. Other people in this thread show shots of Uncharted 4 where the real gameplay takes place, and it's severely dated. The lighting in it is a joke by today's new standard. Same for the Horizon and Death Stranding games.
Yeah, right. For transparency, this is what you had posted before:



You look at that storm and wonder: does it look significantly better than this, running on a weak PS4?



I would say in places yes, but significantly, no.
 
Last edited: