Is Ray-Tracing the biggest scam in gaming?

It's not a scam, but the way it's been overhyped does verge on the deceptive.

As I noted from the beginning, getting the full advantage requires a paradigm shift from the ground up. Unfortunately, that ground-up approach would effectively cut off anyone without the appropriate hardware to leverage the tech effectively.

The result being that, as no one can really afford to take that hit to their addressable market, what's left are bolted-on optional implementations and half-step alternatives that offer some of the same benefits at a more reasonable power cost. Both of these options have drawbacks for both developers and consumers.

This was all entirely predictable, but the enthusiasts didn't want to hear it, and the masses weren't ready for the added hardware requirements for a less-than-transformative benefit.
 
You think this is the pinnacle of game lighting? I really hope not.
[Uncharted 4 screenshot]
Oh crap man, not a cherry-picked scene! I'm defeated! I wish Uncharted 4 looked as good as Avowed and Immortals of Aveum. Hopefully Naughty Dog quietly laid off the lighting artists and put that money to good use learning UE5.

[Uncharted 4 screenshots]
 
It's not a scam, but the way it's been overhyped does verge on the deceptive.

As I noted from the beginning, getting the full advantage requires a paradigm shift from the ground up. Unfortunately, that ground-up approach would effectively cut off anyone without the appropriate hardware to leverage the tech effectively.

The result being that, as no one can really afford to take that hit to their addressable market, what's left are bolted-on optional implementations and half-step alternatives that offer some of the same benefits at a more reasonable power cost. Both of these options have drawbacks for both developers and consumers.

This was all entirely predictable, but the enthusiasts didn't want to hear it, and the masses weren't ready for the added hardware requirements for a less-than-transformative benefit.
It's not "less than transformative." The problem is, if it's done very well, you won't even notice it anymore. That's the rub. The idea is to make lighting as natural as possible. It's not a flashy graphical trick meant to wow people. However, it can and does have the run-on effect of vastly improving overall image quality... at least in still shots. The technology has a long way to go before it's stable enough to hold up under constant motion, without costing far too much in terms of rendering speed.

Ray tracing is not a "scam." It's actually a ... well, a beautiful dream right now. But one that I favor greatly whenever I can get away with using it. Lighting is one of the most important aspects of any sort of visual expression. Simulating it in real-time with relatively accurate propagation? It's just plain lovely when it works, and trust me, your brain appreciates it probably more than your eyes do.
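If it helps to see what "simulating propagation" actually means in code, here's a toy sketch of the core loop: one camera ray, one hit test, one shadow ray toward the light. Plain Python, a single hard-coded sphere, every name invented for illustration; this is nobody's actual renderer.

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c            # a == 1 since direction is unit length
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None    # epsilon avoids self-intersection

def shade(point, normal, light_pos, scene):
    """Lambertian direct light plus a shadow ray: the core of the idea."""
    to_light = norm(sub(light_pos, point))
    for center, radius in scene:      # anything between us and the light?
        if hit_sphere(point, to_light, center, radius) is not None:
            return 0.0                # occluded: a physically correct shadow
    return max(0.0, dot(normal, to_light))

# One sphere, one light, one camera ray; print the resulting brightness.
scene = [((0.0, 0.0, -3.0), 1.0)]
origin, direction = (0.0, 0.0, 0.0), norm((0.0, 0.2, -1.0))
t = hit_sphere(origin, direction, scene[0][0], scene[0][1])
if t is not None:
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = norm(sub(point, scene[0][0]))
    print(shade(point, normal, (5.0, 5.0, 0.0), scene))  # ~0.54
```

Real renderers run that per pixel, with many bounces, millions of times per frame, which is exactly where the "costing far too much in terms of rendering speed" part comes from.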
 
I'm not an investor. Plus, we saw what happened when that modder got GI to work in Crysis 3. Made the game look boring and drab.
This is not how game lighting is done. Just like in real life, if a cinematographer doesn't care about lighting and just says "let the natural light do all the talking", most movies will look like shit. You still need to consider materials, time of day, artificial lights to compensate where natural lighting is inadequate, reflectors etc. to shoot a scene. Art needs to be created with light placement/availability in mind. Retrofitting RT with no consideration for the lighting in the actual level can yield uneven and unpredictable results.

Witcher 3 with RT can look jaw dropping or garbage depending on the scene, because it was all a retrofit and not part of the original design. Your observation with Crysis 3 may be correct, but the conclusion is not.
 
Dude, the lighting in these shots looks terrible lol
how will avowed recover from this :(
[Avowed screenshots]

i said it before, i'll say it again, avowed has decent use of lumen imo. it consistently had better lighting than the likes of spiderman 2 and forbidden west in my experience

then again this is hardware lumen and most consoles have run software lumen so far, but thankfully new unreal engine versions will sort that out and they will push hardware lumen even on base consoles
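for reference, on PC you can usually tell which Lumen path a UE5 game is using via two console variables Epic documents. whether a given shipped build exposes them varies, so treat this as a sketch:

```ini
[SystemSettings]
; 1 = use Lumen for dynamic global illumination
r.DynamicGlobalIlluminationMethod=1
; 1 = hardware (RT-core) Lumen tracing, 0 = the software tracing consoles mostly use
r.Lumen.HardwareRayTracing=1
```

(Engine.ini syntax shown; the same names work as console commands in builds that leave the console enabled.)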
 
how will avowed recover from this :(
[Avowed screenshots]

i said it before, i'll say it again, avowed has decent use of lumen imo

then again this is hardware lumen and most consoles have run software lumen so far, but thankfully new unreal engine versions will sort that out and they will push hardware lumen even on base consoles

Avowed looks amazing on PC. Here are some of my PC shots:

[Avowed PC screenshots]

Again, the issue isn't RT, it's that these consoles are way underpowered to showcase what RT can really do.
 
It is completely comparable.

VR is a motion-tracked, head-mounted monitor, and back in 2015 the market was desperately trying to get people to buy into the technology as the next evolution of gaming and augmented-reality computing. It did not take.

Ray tracing is a lighting calculation technique that, through the early 2000s, was confined to pre-rendered images due to its heavy computing cost.
In 2018, Nvidia introduced the world to real-time RT, which seven years on still isn't available to over 50% of gaming machines. It isn't available to the casual mass market yet, and thus you get threads like this one. Also, let's not underplay the sheer number of games that implement it badly, where it runs like a dog's diarrhea turd on your lawn when turned on. That does nothing but harm adoption going forward, as people get a bad taste in their mouths.

As for the other points, that just shows manufacturers accepting that there is no other technology to latch onto as a branding iron to convince people to buy in. Hell, it took AMD three generations of GPU architecture to stop trying to emulate RT computation and actually manufacture a GPU that does it natively. So... six years for AMD to actually adopt RT?

The contradiction here is that Nvidia led the charge and actually got everyone to turn the corner on acknowledging that RT is worth a proper investment, as it's not going away. Now we've got our first couple of games that require RT-capable hardware to run, and people are losing their minds because they never bought anything that natively supports RT in a manner that would provide a good experience.
It's not comparable to VR. Stop this complete nonsense.
 
Eh, that IS ray tracing - baked ray tracing.
I suppose, but if you think like that then everything is some type of "baked ray tracing". Nobody was talking about ray tracing on a PS4, or in that game, because it was a baked approximation of what we consider ray tracing today.
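To make the distinction concrete: a bake runs essentially the same visibility math a real-time tracer does, just once at build time, with the answers frozen into a texture. A toy sketch in Python (all names invented for illustration, not any engine's actual pipeline):

```python
from dataclasses import dataclass

@dataclass
class Texel:
    pos: tuple      # world-space position this lightmap sample represents
    normal: tuple   # surface normal at that point

def lambert(normal, pos, light_pos):
    d = tuple(l - p for l, p in zip(light_pos, pos))
    dist = sum(x * x for x in d) ** 0.5
    return max(0.0, sum(n * x for n, x in zip(normal, d)) / dist)

def bake_lightmap(texels, light_pos, visible):
    """Offline bake: run the same shadow-ray test a real-time tracer would,
    but only once, at build time, freezing the answers into a texture."""
    return [lambert(t.normal, t.pos, light_pos) if visible(t.pos, light_pos)
            else 0.0
            for t in texels]

texels = [Texel((0, 0, 0), (0, 1, 0)), Texel((1, 0, 0), (0, 1, 0))]
always_visible = lambda p, l: True   # stand-in for a real ray-traced visibility test
print(bake_lightmap(texels, (0, 2, 0), always_visible))  # [1.0, ~0.89]
```

Real-time RT runs that inner test every frame instead, so lights and geometry can move; the baked numbers can never react to anything.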
 
I suppose, but if you think like that then everything is some type of "baked ray tracing". Nobody was talking about ray tracing on a PS4, or in that game, because it was a baked approximation of what we consider ray tracing today.

That's the whole point of ray tracing though. To create actual real-time dynamic lighting that looks as good as (or better than) that Uncharted 4 setpiece, everywhere.


That Uncharted 4 room looks so good because it's small, contained, static, and a team of lighting artists worked for months to make it look just right. RT/PT will allow that level of lighting to exist everywhere in every game with exponentially less dev time/work, AND it's actually dynamic and reacts in real-time.
 
Until consoles can run path tracing, the technology will live mostly on PC through devs who actually give a damn to support it. When consoles can run path tracing, then it becomes the norm and tools and development workflows can be safely designed around it for AAA.
i think ps6/xsx2 will be strong enough for path tracing
 
It is still early days, algorithms are improving, and there is a lot of room for AI to help, but if you want CGI-quality graphics there is no other way.
 
Oh crap man, not a cherry-picked scene! I'm defeated! I wish Uncharted 4 looked as good as Avowed and Immortals of Aveum. Hopefully Naughty Dog quietly laid off the lighting artists and put that money to good use learning UE5.

[Uncharted 4 screenshots]
That's the point though. The same artists created these scenes and the one with the really shitty lighting. Although even these scenes have issues around the dynamic elements.
 
That's the whole point of ray tracing though. To create actual real-time dynamic lighting that looks as good as (or better than) that Uncharted 4 setpiece, everywhere.


That Uncharted 4 room looks so good because it's small, contained, static, and a team of lighting artists worked for months to make it look just right. RT/PT will allow that level of lighting to exist everywhere in every game with exponentially less dev time/work, AND it's actually dynamic and reacts in real-time.
Yeah, I get that. My point of contention is that a lot of games that implement ray tracing are often static anyway and do so just to put less effort into making things look good. Often the art of "how can I approximate this scene to get good results and high performance" is lost to "just enable raytracing", with barely a difference and tanking performance, or they do it for the bullet point in marketing.

Even the reflections were very well done, better than some games today with RT enabled, but I understand that all this takes effort in optimising the specific scene and is not as easy as just hitting "enable raytracing".


[Uncharted 4 mirror screenshot]


I mean look at this "raytraced" mirror in Alan Wake in comparison:

[Alan Wake mirror screenshot]


Or, even worse, this screen-space reflection one:



This is on newer hardware vs a PS4.
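The reason the SSR version falls apart is structural, not laziness: screen-space reflections can only reuse pixels that were already rendered on screen. A toy 1D version of the ray march shows the failure mode (Python, everything invented for illustration):

```python
# Toy 1D "screen": SSR can only fetch pixels that already exist in the frame.
SCREEN_W = 8
color_buffer = [f"px{i}" for i in range(SCREEN_W)]  # the already-shaded frame
depth_buffer = [5.0] * SCREEN_W                     # a flat wall at depth 5

def ssr(x, step, ray_depth=4.0, depth_step=0.5, max_steps=16):
    """March the reflected ray across the screen until it dips behind the
    depth buffer (a hit) or exits the frame (no reflection possible)."""
    for _ in range(max_steps):
        x += step
        ray_depth += depth_step
        if not 0 <= x < SCREEN_W:
            return None                # off screen: the data simply never existed
        if ray_depth >= depth_buffer[x]:
            return color_buffer[x]     # hit: reuse the on-screen pixel
    return None

print(ssr(3, +1))  # 'px5' -- the wall is on screen, so the reflection works
print(ssr(1, -1))  # None  -- the ray walks off the left edge, reflection vanishes
```

A mirror facing the camera needs to show what's behind the camera, which was never rendered at all, so the ray either exits the frame or finds nothing: hence the smearing and cut-off reflections.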
 
This is not how game lighting is done. Just like in real life, if a cinematographer doesn't care about lighting and just says "let the natural light do all the talking", most movies will look like shit. You still need to consider materials, time of day, artificial lights to compensate where natural lighting is inadequate, reflectors etc. to shoot a scene. Art needs to be created with light placement/availability in mind. Retrofitting RT with no consideration for the lighting in the actual level can yield uneven and unpredictable results.

Witcher 3 with RT can look jaw dropping or garbage depending on the scene, because it was all a retrofit and not part of the original design. Your observation with Crysis 3 may be correct, but the conclusion is not.
We both think that proper lighting requires a proper artist. So the argument of "shorter development cycles" thanks to not having to spend a lot of manpower on lighting is just wrong. Not to mention we still haven't seen a single game that has benefited in this way from using RT. Dev cycles are longer than ever and budgets are higher than ever in the RT era. Not to mention we're rendering at even lower resolutions to accommodate it.

My stance is: since you can't clean out the lighting department, and it's not cutting dev budgets or timelines, what benefit does the gamer get? Spending no less than double on their video cards, and getting worse performance on average than in any era since before the 8800GT. Even in RT darlings like Cyberpunk, the difference just isn't that much in motion. It is the perfect catnip for graphics hobbyists, since the gambit of "this is exactly how the sun IRL shades things" makes for never-ending screenshot comparison fodder. It just backfires way more often than it actually helps.

Look at Doom: The Dark Ages. The game does not fare well in direct comparisons to its predecessor and takes no less than double the processing power to produce the same performance. Plus, you just locked out all of Eastern Europe, Russia and the entire southern hemisphere from being able to play it (at scale). Touching back on Cyberpunk: that game came out five years ago and it's STILL the best example of the technology? Where are the classics that were inarguably aided by ray tracing? Where's the OG Splinter Cell and Doom 3 marking a clear 'BC/AD' line in the technological sand? Where's the "Half-Life 2" that instantly upon release makes everything before it look dated? No, instead we get wounded games limping to consumers under a great optimization burden: STALKER 2, The Dark Ages, Alan Wake 2, etc.

I'm aware of all the theoretical benefits, and there's no denying their efficacy on paper. I just don't see enough examples making it to the market with those benefits intact to make it worth the baggage. How many games can you list that would be clearly worse by not using RT? In my personal situation, I'd have to spend a thousand dollars to run games the way I do now, with RT enabled being the only difference. I can count on one hand the games that I can't rasterize to 1080p60 High or better with RT off. The returns have never been so diminished.

None of this even touches the ripple effects it will cause: publishers thinking they can do away with that lighting team and that in-house engine because of those theoretical benefits. I don't want to post any spoilers, but I'm not very optimistic that dumping the Creation Engine, RED Engine or the Halo engine will have the upside the optimists say it will. Did Death Stranding 2 suffer?

That's the point though. The same artists created these scenes and the one with the really shitty lighting. Although even these scenes have issues around the dynamic elements.
That's a huge cost to make drive-by scenes like this marginally better. I'd like to see that money used elsewhere in the dev budget.

That Uncharted 4 room looks so good because it's small, contained, static, and a team of lighting artists worked for months to make it look just right. RT/PT will allow that level of lighting to exist everywhere in every game with exponentially less dev time/work, AND it's actually dynamic and reacts in real-time.
When will these tantalizing fruits ripen and make it to my desk? It's already been half a decade, and we haven't even seen a glimmer of this.
 
I remember when people got upset by GPU-demanding shaders: "why not use a regular old texture?"
RT and FG have their detractors now, but it's going to become normal, eventually.

The PS4 generation spoiled PC gamers. Because those consoles used weak hardware, any mediocre PC could run their games well. But the history of PCs has never been like this. Previously, some games would work on one piece of hardware but not on another, sometimes simply due to a missing feature. I remember, for example, the requirement for cards with Shader Model 3.0, or compatibility with a certain version of DirectX.

Regarding performance, the first Doom didn't run well on the CPUs available at its release. Only the following year, with the release of the Pentium, was it possible to play it decently, and at full speed only two years later. Several other games were like that: Quake, Half-Life, Doom 3, Unreal Tournament, etc. The infamous Crysis was one of the last major titles that demanded a lot from PCs.



Then came the stagnation in the PS4 era.

What I mentioned above was a time of great transformations: the emergence of 3D, the emergence of GPUs, and all the embedded technology. Now we're once again in a new transformation, shifting from the raster paradigm to ray tracing. Just as this required specialized hardware years ago, the same is true now. When low-end hardware is running these games well, we'll have reached stagnation again. Many games still use hybrid solutions, and this costs artists time dedicated to one approach or the other. TLOU2, for example, is beautiful, but what it does is waste man-hours replicating what could have been done with ray tracing in less time. The game took eight years to produce, cost at least $200 million, and suffered severe crunch, to the point of causing many developers to leave the company. And the game likely made a loss, as it sold much more slowly than the previous game.
 
Yeah I get that. My point of contention is that a lot of games that implement raytracing are often static anyway and do so just to have less effort put in to make things look good. Often the art of "how can I approximate this scene to get good results and high performance" is lost to "just enable raytracing" with barely a difference and tanking performance, or they do it for the bulletpoint in marketing.
Like all new features, this is simply a transitory phase, where you get a mixed bag of good versus lazy implementations. It will pass. Once raytracing becomes the norm in the next generation or two and the actual assets are all made with RT/PT in mind, there will be no means to even compare. Turning off ray tracing in debug mode would just make the screen go black as there will be no lightmaps to fall back on. It's no different than comparing realtime rendering with CGI, which is silly. We will eventually mock baked lighting like we mock pre-rendered CGI in games. It's all natural progress.
 
It's just overhyped, not a scam.
The 7th and 8th console generations had the largest gap with the PCs of their time;
when the Core 2 came out, it shattered the PS3's and 360's CPUs like a joke.
 
Like all new features, this is simply a transitory phase, where you get a mixed bag of good versus lazy implementations. It will pass. Once raytracing becomes the norm in the next generation or two and the actual assets are all made with RT/PT in mind, there will be no means to even compare. Turning off ray tracing in debug mode would just make the screen go black as there will be no lightmaps to fall back on. It's no different than comparing realtime rendering with CGI, which is silly. We will eventually mock baked lighting like we mock pre-rendered CGI in games. It's all natural progress.
There will come a time when RT will be the norm and we'll have decent performance with everything enabled and won't know any better, I agree. We mocked CGI? For not being real, sure, but not for looking bad. I don't think we will mock baked lighting like that. Those games will still be classics. They will still look pretty good:

 
It's just overhyped, not a scam.
The 7th and 8th console generations had the largest gap with the PCs of their time;
when the Core 2 came out, it shattered the PS3's and 360's CPUs like a joke.
I guarantee you guys, all the usual suspects are gonna love RT once it's mandatory, by the PS6 launch or, at latest, the end of the PS5/PS6 cross-gen period, so, dunno, 2031. By that time all stationary consoles (not counting the Switch 2) and even entry-level modern PCs will be powerful enough to do ray tracing at a basic level at least (obviously Cyberpunk 2077 path tracing will be much more demanding even then), so everyone will just accept it as a baseline, the same way 16x anisotropic filtering, anti-aliasing and SSDs are now :)
 
So the problem is not with the tech, it's with companies like Sony and Nintendo being extremely foolish because they don't sell three-thousand-dollar consoles with severe supply issues?
 
So the problem is not with the tech, it's with companies like Sony and Nintendo being extremely foolish because they don't sell three-thousand-dollar consoles with severe supply issues?
Just give it a few more years, bro. The PS6 is gonna have a big jump in RT and AI upscaling vs the base PS5, and the Switch 3 (or whatever name it's gonna have) will be pretty decent at RT too as long as it's Nvidia-based. So hell, come next gen we will not only get at least basic-level RT on stationary consoles, we'll get it very often even on handhelds.
 
When will these tantalizing fruits ripen and make it to my desk? It's already been half a decade, and we haven't even seen a glimmer of this.
You see it all the time.
Naughty Dog were the absolute best (maybe still are; if they ever make a new game, we'll see) at making these cinematic-looking shots with movie-level lighting. There is a reason their games have been shown time and time again in this thread.
Now you have stuff like Avowed or RoboCop with as good or better lighting.
 
You see it all the time.
Naughty Dog were the absolute best (maybe still are; if they ever make a new game, we'll see) at making these cinematic-looking shots with movie-level lighting. There is a reason their games have been shown time and time again in this thread.
Now you have stuff like Avowed or RoboCop with as good or better lighting.
Avowed had a notably tumultuous development cycle and an even worse reception. You have to cherry-pick it with the same scrutiny to find the good shots that people apply to find bad shots in UC4. Now pull up a YouTube video of each, click a random point in the timeline, and see which one looks better in the majority of cases. For a real challenge, do it with Avowed and Death Stranding 2.

Again, on console? No. On PC? We've been bearing fruit for years now.
Mind sharing that list of games?
 
I still haven't gotten my money back from half of the DLCs promised.

MTX is really the biggest offender when it comes to scamming people.
 
I'm not an investor. Plus, we saw what happened when that modder got GI to work in Crysis 3. Made the game look boring and drab. Lighting baked by really good artists will always look better than global illumination. Any money publishers would save from not having to hire these artists will just be spent on the absolute worst things you can imagine.
In a screenshot, maybe. But in motion, real-time lighting is a huge step above any baked solution, used properly of course. I laughed madly when I read people saying AC Shadows has worse lighting than AC Mirage's baked solution. No man, baked always looks like that compared to any real-time solution; you just can't see it in AC Mirage because the baked mode is all you have. This is the sad but cruel reality. Sure, ray tracing and path tracing are hugely expensive and cost us other precious graphical features, but you can't beat the mathematical precision.
 
To me it's more like you guys got too obsessed with RT and overhyped the crap out of it.

You have no one but yourselves to blame.
 
We both think that proper lighting requires a proper artist. So the argument of "shorter development cycles" thanks to not having to spend a lot of manpower on lighting is just wrong.
I didn't make that claim, though. Dev cycles are fucked for so many reasons that aren't even related to graphics. RT will address a very tiny portion of that, but it's not a fix for dev cycles in general. It doesn't make them worse either.

Not to mention we still haven't seen a single game that has benefited in this way from using RT.
And we won't for another generation at least. Path Tracing needs to become the norm before games are designed around it. But unlike proprietary tech like Nvidia PhysX, adoption is simply a matter of adequate performance and universal availability. All GPU manufacturers are aligned on that vision, so it's just a matter of time.

Dev cycles are longer than ever and budgets are higher than ever in the RT era.
Sure. But it has nothing to do with RT. Non-RT games take just as long. Bringing this up in the context of RT is a strawman argument.

Not to mention we're rendering at even lower resolutions to accommodate it.
This is true. And unfortunate. But it's literally the first generation where this has even been attempted. We are on the same page about devs using RT at the cost of reducing overall quality and performance; the tradeoffs are not being weighed correctly. But I expect it to not even be a tradeoff by the time the PS7 is out. Raster capabilities will hit a ceiling; they already have. Path tracing advancements and other AI-driven acceleration will make up the difference. It will then come down purely to what an artist can imagine, just like in the film industry, but without the constraints of infrastructure and server farms. At least till something insane like 8K or VR becomes the norm, if ever.
I'm aware of all the theoretical benefits, and there's no denying their efficacy on paper. I just don't see enough examples making it to the market with those benefits intact to make it worth the baggage. How many games can you list that would be clearly worse by not using RT?
There isn't a single game built from the ground up to exclusively run on path tracing. It won't happen for another 5-10 years. Meanwhile we are starting to see intermediate measures, like Indiana Jones, where the game would lack even basic lighting if RT were turned off entirely. We can expect this trend to grow throughout the PS6 era. And there are plenty of games that have great RT implementations, especially on PC. Do you really think there are none? Or are you limiting the assessment to current-gen consoles, which have pretty weak RT hardware?


None of this even touches the ripple effects it will cause: publishers thinking they can do away with that lighting team and that in-house engine because of those theoretical benefits.
This is a much more complicated and expansive topic than just ray tracing, which is just one feature of an engine. Not sure where I stand on it, to be honest. It would be hard to tell devs what to do when they are the ones bearing the costs of maintaining in-house engines. I'm confident some will continue evolving their own; UE will never be one-size-fits-all. But some level of "survival of the fittest" is expected across the industry.
 
Ignoring all conversation I can simply say that I'm not interested enough in pretty lighting and reflections to care about RT. Unless they tangibly improve gameplay they are mere distractions and as such are optional.
 
The PS4 generation spoiled PC gamers. Because those consoles used weak hardware, any mediocre PC could run their games well. But the history of PCs has never been like this. Previously, some games would work on one piece of hardware but not on another, sometimes simply due to a missing feature. I remember, for example, the requirement for cards with Shader Model 3.0, or compatibility with a certain version of DirectX.

Regarding performance, the first Doom didn't run well on the CPUs available at its release. Only the following year, with the release of the Pentium, was it possible to play it decently, and at full speed only two years later. Several other games were like that: Quake, Half-Life, Doom 3, Unreal Tournament, etc. The infamous Crysis was one of the last major titles that demanded a lot from PCs.



Then came the stagnation in the PS4 era.

What I mentioned above was a time of great transformations: the emergence of 3D, the emergence of GPUs, and all the embedded technology. Now we're once again in a new transformation, shifting from the raster paradigm to ray tracing. Just as this required specialized hardware years ago, the same is true now. When low-end hardware is running these games well, we'll have reached stagnation again. Many games still use hybrid solutions, and this costs artists time dedicated to one approach or the other. TLOU2, for example, is beautiful, but what it does is waste man-hours replicating what could have been done with ray tracing in less time. The game took eight years to produce, cost at least $200 million, and suffered severe crunch, to the point of causing many developers to leave the company. And the game likely made a loss, as it sold much more slowly than the previous game.
No one in 2019 expected the GTX 1080 to run games at 1440p ultra 60 fps, because a mere 2060 surpassed it.

Now even the 5060 is leagues behind a 3080, so when a 3080 user cannot get 1440p ultra 60 fps, they get angry. They look at charts and see that their GPU is still way faster than the likes of the 4060 and 5060. When the same thing happened to GTX 1080 users, they had to silently accept the fall of their GPU. They'd look at benchmarks and see that a basic GPU like the 2060 matched their once-mighty card, so they wouldn't even dare to complain.
 
Yes, it is. Apart from reflections, sadly, it is not worth it. Instead of RT I would take LOD and high-res textures, which have more impact IMO, unless it's a night setting with lots of light sources and puddles of water, where RT can make a big difference.
 
I have zero interest in it - I would rather have the overhead for fps.
TBH I find even HDR to be overrated. Granted, I've probably never seen a correctly calibrated HDR setup, but the fact that it seems so hard to set up properly straight away decreases its importance to me.
One day, hopefully, we will have all these 'boosters' properly working - ray tracing, HDR and even VRR - but I find it complete bollox that even after all this time, there are still issues behind each one.
 
There will come a time when RT will be the norm and we'll have decent performance with everything enabled and won't know any better, I agree. We mocked CGI? For not being real, sure, but not for looking bad. I don't think we will mock baked lighting like that. Those games will still be classics. They will still look pretty good:


I meant mocking for not being real (as in real-time). I never argued that baked lighting looks bad per se. It just isn't scalable, flexible or dynamic. It has the same issues as pre-rendered CGI. Technological progress is all about reducing constraints, which can then allow devs to evolve game design. Now will they actually use the technology to evolve game design? That's a whole different question. But technological progress can't fix laziness or lack of creativity.

You will never see an open-world game on console that looks better than Spider-Man 2 and has large-scale, dynamic, city-level destruction until RTGI or PT becomes viable, because it becomes literally too costly to bake.

You can pull it off if you make everything flat and precomputed, like in Donkey Kong, but I don't think that's what most gamers would expect from Sony, MS or Rockstar.
 
Ignoring all conversation I can simply say that I'm not interested enough in pretty lighting and reflections to care about RT. Unless they tangibly improve gameplay they are mere distractions and as such are optional.
This. All of this is mostly about making graphics more "pretty"; it's not gonna change how you play the game.
 