
Is Raytracing a necessity?

Are raytracing features a necessity for games to have?

  • Yes

    Votes: 141 27.5%
  • No

    Votes: 350 68.4%
  • Cannot decide

    Votes: 17 3.3%
  • Others (please elaborate)

    Votes: 11 2.1%

  • Total voters
    512
If you're going to add RT to a 2004 game but fail to make it visually compete with current titles and end up with heavier system requirements, then the tech is pointless
 
Because most other efforts don't shoot for RTGI. The ones that have show massive improvements, and Cyberpunk is one of them. Without path/ray tracing, the lighting looks nowhere near as good.
Back before the time of advanced deferred raster GI (2005-ish), RT was a much simpler discussion, in which RT interchangeably meant RT/PT as appropriate. Even at just 1 ray cast and 1 ray bounce (with an ambient term), the benefits for real-time lighting were unquestioned, had the performance existed to use it even at 320x240 resolutions.
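
That "1 cast, 1 bounce, with an ambient" estimate would look roughly like the sketch below; a minimal sketch, assuming hypothetical `trace` and `sample_hemisphere` helpers and scalar radiance:

```python
AMBIENT = 0.05  # flat ambient term standing in for all further bounces

def shade(ray, scene, trace, sample_hemisphere):
    """Radiance estimate from 1 ray cast + 1 ray bounce, plus an ambient.

    `trace(ray, scene)` and `sample_hemisphere(normal)` are hypothetical
    helpers: `trace` returns (point, normal, albedo, emission) or None on
    a miss. Cosine weighting and multi-sampling are omitted to keep the
    sketch at exactly the budget described above.
    """
    hit = trace(ray, scene)                         # the 1 ray cast
    if hit is None:
        return AMBIENT                              # miss: sky/ambient
    point, normal, albedo, emission = hit

    bounce_dir = sample_hemisphere(normal)          # the 1 ray bounce
    bounce_hit = trace((point, bounce_dir), scene)
    indirect = bounce_hit[3] if bounce_hit else AMBIENT

    return emission + albedo * (indirect + AMBIENT)
```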

However, in the time since, rendering resolutions have risen sharply, and non-RT GI lighting has become so impressive that, RT/PT terminology aside, the value of RT to real-time rendering on inadequate hardware (and that includes xx80/xx90 cards) comes with extra criteria. Lighting density has to hold up relative to render resolution and geometry complexity, and lighting complexity (bounce counts) has to go beyond basic physically accurate shadows or reflections on non-planar surfaces. The result is many games that provide RT yet fail to convey that it is indeed RT, or to be easily appreciated by the average gamer without specific knowledge of GI lighting.

In Quake 2 RTX, the visuals benefit from the low geometric detail and the original 320x240 software-raster target resolution relative to a higher lighting density, and this is accompanied by higher lighting complexity: higher bounce counts give the dynamic lighting a volumetric quality way beyond the volume-less lighting of raster GI. The light at a per-pixel level correctly diverges, faintly, with the smallest camera motion, which gives every pixel a more tangible volumetric quality, as though each viewport pixel has its own shaft (or shafts) of light propagating through it. It recalls the lighting in lightmap games like Quake 3 under gangways, where the RT lightmaps captured this multi-sample complexity and could be appreciated as having volume when viewed at mid range, even though the lighting was static.

Which sort of highlights a problem, IMO, with PC/console games trying to add worthwhile RT: raster and raster-GI techniques have driven resolutions higher over time and set too high a baseline. The disconnect between hardware resources and the RT light density and complexity needed to give a physically based, volumetric RT quality at those resolutions makes 60fps completely unachievable, forcing the use of upscaling and MFG (the HL2 RTX demo illustrates this). Even then the result is poor RT density, or lighting-noise levels out of step with the native render resolution and geometry complexity, failing to deliver the meaningful, noise-free, volume-based dynamic lighting that would easily put raster-GI techniques, with their pristine, noise-free, high-resolution renders, to shame and in a lower tier.
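
To put rough numbers on that disconnect (illustrative sample and bounce counts, not measured figures):

```python
# Back-of-the-envelope ray budgets (illustrative numbers only).
def rays_per_second(width, height, samples_per_pixel, bounces, fps):
    # each sample traces 1 primary ray plus `bounces` secondary rays
    return width * height * samples_per_pixel * (1 + bounces) * fps

quake2_era = rays_per_second(320, 240, 1, 1, 60)    # ~9.2 million rays/s
native_4k  = rays_per_second(3840, 2160, 2, 3, 60)  # ~4.0 billion rays/s

print(f"{native_4k / quake2_era:.0f}x the ray budget")  # ~432x
```

Even that modest 4K setting needs over 400 times the ray throughput of the Quake-era target, which is exactly the gap upscaling and frame generation are papering over.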

Cyberpunk was the classic case: even on a 4080 Super, the lighting density and complexity were too low to render at 1440p60 while still looking like RT with a volumetric quality. Instead it looked closer to raster GI techniques, just with physically accurate shadowing and non-planar reflections as a feature, which barely make a difference IMO.
 
Doubtful, since I've never enabled it in a single game... in time it will be something we won't want to live without, but it's not quite where it needs to be yet.
 
Now that we're moving into 2026, can you say that RT, or even PT, has really been a necessity or has impacted your gaming experience and enjoyment over the last few years?
Or are IQ and frame rate more important?
 
I think that, in theory at least, raytracing can be great for indie developers who want to make more graphically ambitious games. Sure, it requires users to have better hardware, which limits potential sales, but the old rasterization pipeline has so many limitations that require a lot of work to deal with if you want to come anywhere near matching what the AAA game companies are capable of. Raytracing levels the playing field somewhat.

Now, I haven't seen many indie games take advantage of raytracing yet, but I think it'll happen. Assuming that the computer hardware industry doesn't completely pivot to supporting LLM trash, anyway.
 
Now that we're moving into 2026, can you say that RT, or even PT, has really been a necessity or has impacted your gaming experience and enjoyment over the last few years?
Or are IQ and frame rate more important?
RT/PT has certainly improved my gaming experience. On my GPU I can play with RT/PT enabled and still achieve very good image quality and framerate at 1440p. At 4K, I have to make bigger compromises in PT games (enabling DLSS Performance), but it's still worth it.

Alan Wake 2 with PT 120-170fps

[Alan Wake 2 path tracing screenshots]

Cyberpunk with PT 110-140fps.


[Cyberpunk path tracing screenshots: 1440p, DLSS Quality, frame generation]


RT in Cyberpunk is also highly scalable, so I can run this game even faster while maintaining superior visual quality compared to raster.

Path tracing - 131 fps

RT Ultra - 183 fps

RT Reflections - 233 fps
 
Absolutely a necessity for any modern game pushing a realistic or semi-realistic style beyond a certain level of fidelity. You can have all the polygons and asset density you want but past a certain point, if you don't have an RTGI solution for accurate lighting and to ground characters and objects in the environment, your game will simply look "wrong" to trained eyes. Even the best baked lighting solutions can't compare and are much more expensive both to develop and in terms of drive space. Someone's gonna @ me with Last of Us 2 or some last-gen shit saying it looks better than recent high-end RTGI titles. It doesn't.
 
Now that we're moving into 2026, can you say that RT, or even PT, has really been a necessity or has impacted your gaming experience and enjoyment over the last few years?
Or are IQ and frame rate more important?
There is no option besides some form of RT for dynamic, realistic (indirect) lighting, period. So if you want games to be dynamic and have the most realistic lighting possible, and all that that encompasses, it is a necessity.
Is it a necessity for linear, static games like e.g. a GoW, or anything with a more abstract art style? No.

So the answer is "it depends on what game you want to make" and, from the consumer side, "it depends on how high your standards are". For me personally, unrealistic lighting in realistic settings really... really distracts me after having played with max RT wherever possible for years. Glowing objects in shadowy areas just look a bit archaic to me now.
 
A good RT/PT implementation simply looks better. Sure, there's a big performance hit, obviously, but that hit is getting smaller as generations move on. Combine that with the quality of upscalers and ML denoising and it's certainly worth it for me.
 
Absolutely a necessity for any modern game pushing a realistic or semi-realistic style beyond a certain level of fidelity. You can have all the polygons and asset density you want but past a certain point, if you don't have an RTGI solution for accurate lighting and to ground characters and objects in the environment, your game will simply look "wrong" to trained eyes. Even the best baked lighting solutions can't compare and are much more expensive both to develop and in terms of drive space. Someone's gonna @ me with Last of Us 2 or some last-gen shit saying it looks better than recent high-end RTGI titles. It doesn't.
Also Last of Us 2 was developed by Naughty Dog, who are the foremost experts on PlayStation hardware and among the foremost experts on graphical rendering in general in the industry. And even though they're a big studio with much more funding than most, it must've taken a huge amount of work.

The real positive of raytracing isn't even so much for users, I'd say; it's that it has the potential to level the playing field on the developer side. With raytracing it takes waaay less work to make a game look nice, because that work is offloaded to the user's GPU.
 
Now that we're moving into 2026, can you say that RT, or even PT, has really been a necessity or has impacted your gaming experience and enjoyment over the last few years?
Or are IQ and frame rate more important?
I don't think it's a necessity, but it has improved and impacted my experience in games that use it and makes them objectively better than not having it. I'd rather have it as an option, especially when so many developers have lost the fine art of hand crafted lighting that looks really good without it.
 
Notice how people still have to reference Cyberpunk 2077 as the best use of it all these years later lol. No, it's still not a necessity for 99% of games.
Games like Cyberpunk and The Witcher 3 show the biggest differences, which is why people like to use them as good examples of what RT can do for a game's graphics. However, I can see a noticeable difference even when only a single RT effect is used, especially RT reflections, because nothing breaks immersion more than fading screen-space reflections. As I proved a few pages back, well-implemented RT reflections can be even cheaper than SSR.
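
The fading isn't a tuning choice, it's structural: an SSR ray march only has the on-screen depth and color buffers to sample, so any ray that leaves the screen (or points at something the camera can't see) has nothing to return. A schematic sketch, assuming hypothetical buffer objects with a `sample(uv)` method:

```python
def ssr_reflection(origin_uv, origin_depth, step_uv, step_depth,
                   depth_buffer, color_buffer, max_steps=64):
    """Screen-space reflection ray march (schematic, hypothetical buffers).

    The reflected ray is marched in (uv, depth) steps. It can only ever
    sample what the camera already sees, which is why SSR fades at screen
    edges and behind occluders -- the failure mode RT reflections avoid.
    """
    uv, depth = origin_uv, origin_depth
    for _ in range(max_steps):
        uv = (uv[0] + step_uv[0], uv[1] + step_uv[1])
        depth += step_depth
        if not (0.0 <= uv[0] <= 1.0 and 0.0 <= uv[1] <= 1.0):
            return None                     # ray left the screen: nothing to sample
        if depth >= depth_buffer.sample(uv):
            return color_buffer.sample(uv)  # "hit": reuse the on-screen color
    return None                             # miss: caller fades to a fallback
```

A hardware RT reflection replaces the whole march with one ray into actual scene geometry, which is part of why a well-implemented version can come out cheaper than a long, carefully tuned SSR march.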
 
Raytracing could be interesting if it were used as a game mechanic.

Clarity of the picture and the ability to understand what is happening and how the game functions will always triumph over graphical gimmicks.
 
Also Last of Us 2 was developed by Naughty Dog, who are the foremost experts on PlayStation hardware and among the foremost experts on graphical rendering in general in the industry. And even though they're a big studio with much more funding than most, it must've taken a huge amount of work.

The real positive of raytracing isn't even so much for users, I'd say; it's that it has the potential to level the playing field on the developer side. With raytracing it takes waaay less work to make a game look nice, because that work is offloaded to the user's GPU.
I forgot to list RDR2 next to TLOU2 and yeah, the same applies. They often get held up as examples of "games don't need to look better than this" while people tend to forget that ND and R* are the two most technically capable devs on the planet (and yet both games still show the limits of rasterized lighting).

Several things went wrong with ray tracing that have little or nothing to do with ray tracing itself:
  1. UE5 was designed for 30 fps on consoles, just like UE4 and UE3 were. Unfortunately, for numerous reasons, this was also the gen where console gamers demanded a return to 60 fps. Therefore we have a lot of UE5 titles looking and running like shit on consoles in their 60 fps modes, because they were never designed to do so: low-quality ray tracing, or sometimes cut completely. Sony making their big releases focus on 60 fps using largely last-gen tech really hasn't helped.
  2. Unlike last gen, these consoles were a lot closer to the average PC spec at time of release. PC gamers got used to maxing everything out in last-gen ports and still getting good image quality and performance. Gamers failed to appreciate this nuance this gen and set expectations accordingly, contributing to UE5 getting a worse rep on PC than it deserved.
  3. For whatever reason, people can't see the improvements of RT as easily. I constantly see people holding up certain last-gen games as the peak of visuals, with their shitty screen-space reflections and lack of GI causing characters and objects to glow like the CIA. RT reflections and GI are an enormous graphical leap when used correctly, but the majority of gamers, when assessing visuals, seem only attuned to environmental detail, not lighting quality. Hence the proliferation of ridiculous statements like "only Digital Foundry can tell the difference."
  4. It really doesn't help when high-profile games like Elden Ring and Space Marine 2 get abysmal RT updates that do fuck all except make the game run worse. That kind of thing just hurts the cause.
  5. Hardware got expensive as fuck and gaming discourse got more polarized and poisonous than ever, so RT became a proletariat vs. bourgeoisie issue instead of gamers collectively getting excited for the future of visuals like back in the Doom 3 or Crysis days.
It's all so tiresome, but thankfully, it's still the future of gaming visuals.
 
Please bestow upon us the list of all-time classics that have native ray tracing.

The big thing about ray tracing for me is that the games that use it are largely subpar. Not saying it's inherent to ray tracing, but why buy a $999 GPU when the games coming down the pipeline look like such shite? You might need to upgrade by the time the industry corrects its heading.
 
Judging by Minecraft/2077, yes… but I don't want the huge performance hit, so give me the best RT you can while maintaining 60, like Insomniac does.
 
Not necessary, and sometimes detrimental to the experience if the hardware can't handle it.

Having said that, it breathes new life into older games if you can run things at acceptable levels. For example, I played through Control (2019) recently after upgrading to a 5070 Ti, which made it playable at 4K/60fps with ray tracing. It looked beautiful.

Or games like Avatar: Frontiers of Pandora, where the ray tracing made all the vegetation blend into the world rather than standing out, making the world seem more convincing, which is the main (only?) draw of that game.
 
I think ray tracing is the easiest technique for environment designers to get realistic lighting (and shadows): you place a light source and the algorithms do the rest. With the other techniques, more work and tricks have to be done.
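
In code terms it really is that direct; a minimal sketch, where `occluded` is a hypothetical visibility query and lights are assumed to carry `position` and `intensity`:

```python
import math

def sub(a, b):  # component-wise vector subtraction
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):  # vector dot product
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def direct_light(point, normal, lights, occluded):
    """Sum direct lighting at a surface point via one shadow ray per light.

    `occluded(a, b)` is a hypothetical tracer call: True if any geometry
    blocks the segment from a to b. Shadowing, occlusion and distance
    falloff all emerge from this single test -- no baking, no hand-placed
    tricks for the designer to maintain.
    """
    total = 0.0
    for light in lights:
        if occluded(point, light.position):
            continue                         # in shadow: contributes nothing
        to_light = sub(light.position, point)
        dist = math.sqrt(dot(to_light, to_light))
        to_light = tuple(c / dist for c in to_light)
        n_dot_l = max(0.0, dot(normal, to_light))
        total += light.intensity * n_dot_l / (dist * dist)  # Lambert + falloff
    return total
```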
 
I can't deny it is really transformative in many cases and looks awesome,
and it should ease the job of lighting artists a bit.
But for some reason, except in Indiana Jones and Doom, it's never well optimized.
 
My first thought was no, definitely not. But after thinking about it, I'm not sure, as I don't think we have really seen its true value yet.

Necessary? Not yet, but we might get there. I'm in the not-sure camp.
 
Realtime RT is not necessary for games that can be fine with a static time of day, since devs can bake RT-quality lighting into textures.
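
Conceptually, the bake is just the same expensive ray-traced gather run offline, once per lightmap texel, and shipped as a texture; a sketch, with `gather_irradiance` and the texel fields as hypothetical stand-ins:

```python
def bake_lightmap(texels, gather_irradiance, samples=1024):
    """Offline lightmap bake: the gather realtime RTGI does every frame,
    run once per texel at build time (hypothetical helpers).

    This only works because nothing moves -- a dynamic time of day would
    invalidate every texel, which is why dynamic games need realtime RTGI.
    """
    lightmap = {}
    for texel in texels:
        # expensive at build time, free at runtime (just a texture fetch)
        lightmap[texel.id] = gather_irradiance(texel.position, texel.normal, samples)
    return lightmap
```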

But realtime RT (RTGI) is necessary for games that have a dynamic time of day, if you want to achieve a high level of quality for lighting, reflections, and shadows. The rasterized alternatives are much weaker, and we have known this since Metro Exodus.

SVOGI (voxel-based RTGI) is a pretty good compromise between the two. It looks very good and is much more performant than realtime RT, but I don't think any engine aside from CryEngine is using it.
 
In linear games? Nah. Baked lighting is enough.

In open world games? Absolutely

Cyberpunk and AC Shadows show how much better an open world game with dynamic time of day looks with RT. It's a necessity.

AC Unity still looks bonkers, but it had a fixed time of day with pre-baked lighting. AC Syndicate got a dynamic time of day after so many fans cried about the lack of one in Unity. The result? It looked like shit compared to Unity.

RDR2 was different: it had tons of baked lighting data tied to specific times of day. When the game changed from morning to afternoon, it loaded another set of baked lighting (you could sometimes tell when it happened by a sudden change in lighting out of nowhere). This is a crazy expensive and time-consuming technique (the lighting has to be reviewed over and over by artists after being baked), so few studios can pull it off with a game this big.
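
Conceptually, that technique is just keying whole baked lighting sets by in-game hour and hard-swapping between them as the clock advances; the hard swap is that "sudden change out of nowhere". A sketch with made-up bake hours:

```python
BAKE_HOURS = [6, 12, 18, 22]  # hours that get their own baked set (illustrative)

def active_bake(hour, baked_sets):
    """Pick the baked lighting set for the current in-game hour.

    `baked_sets` maps bake hour -> lighting data (hypothetical). A hard
    swap like this produces the visible lighting pop; blending adjacent
    sets would hide it, at the cost of keeping two sets resident.
    """
    candidates = [h for h in BAKE_HOURS if h <= hour]
    key = max(candidates) if candidates else BAKE_HOURS[-1]  # wrap past midnight
    return baked_sets[key]
```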

So, RT in open world games should be the future. Supposedly Rockstar is using RT in GTA VI, so hopefully other studios will follow.
 
If I hadn't gotten used to path tracing/high-level ray tracing, I would definitely have enjoyed games just as much without it. But after getting used to the 5090, it would be rough going back to playing modern games that have these features and not using them.
Just like going back to 1440p after 4K: a bit rough, even though it feels kind of silly to feel that way, somehow.

So I guess the answer for me is yes, if you've seen it. And I could have saved a solid bit of money by never seeing games with all this stuff enabled.
 
I've got an RTX 4090 and love ray tracing, but no... definitely not a "necessity."

Then again, path-traced games like Cyberpunk and Alan Wake do look truly incredible. I hope it eventually just becomes "how things are done" in terms of lighting when (if?) technology advances enough that it can be done at low cost.
 
Honestly, it's really hard to tell whether it's worth it or not. Most of the time it doesn't make much sense to me, since I won't notice much of a difference, if any, unless I do a side-by-side comparison where it's apparent. I think the only time I looked at it and thought "oh, this definitely looks better" was Cyberpunk, but only when turning on path tracing, or whatever it's called. Throwing on PT in Doom: The Dark Ages made the image softer and kind of annoying; some things looked a little better, but it wasn't the overall complement it is in other titles I've played. Now I keep everything on, since I have a 4090 and why not, but still: the cost to performance isn't really worth it most of the time unless you rock a 4090 or 5090.
 