
I'm a little sick of technological advancements that are barely noticeable.

There have been a number of technology advancements, particularly on the PlayStation side of things, that are heavily marketed as game-changers but turn out to be kind of nothing-burgers.

Ray-tracing is the biggest waste of time, energy, and resources of any graphical technology that I can remember. It's barely noticeable, it cuts your frame rate in half, and I believe it's mostly being perpetuated as a way to sell $1,500 GPUs. What initially sold everyone on the concept was Minecraft mods that looked absolutely incredible, but if we're being honest with ourselves, how many games with RTX have really lived up to that game-changing prospect?

PSSR works in some games, and in others it makes them look worse. No doubt this technology is the future of graphics rendering, but right now it's a complete mixed bag.

HDR is available in only some games, on some televisions, and half the time it's botched and doesn't work correctly. The calibration menus are unintuitive and difficult to parse even for a technically minded person. Most games don't use the system-level calibration, resulting in an uneven, inconsistent experience.

3D audio just doesn't work for me. I cannot hear height with any pair of headphones I've tried. The new personalized-profile calibration tool just flat-out sucks too. Sounds like stereo to me!

Even 4K is not a noticeable step up on the regular-sized television most people would have. You need a BIG TV, and it needs to be pretty close to you.



I'm not a technological laggard: I love the SSD technology and the DualSense controller, and OLED is a huge upgrade. But some of these other innovations have fallen quite short of expectations for me. Sometimes I do long for the days of wired controllers plugged into CRT displays... nothing comes close to how good games feel to play on an old-school setup like that.
 

The Cockatrice

how many games with RTX have really lived up to that game-changing prospect?

Very few indeed. Cyberpunk and, I guess, Alan Wake 2 are the only memorable games that utilize RT/PT incredibly well. Everyone else just uses standard RTAO (it's garbage) and reflections, which you'll rarely notice while playing. Even RTGI in some UE5 games is hardly something you want on compared to the software solution.


It's still too early for it. It's practically a beta test for the PS6.


It has been a mixed bag indeed.


Very few games utilized this feature well, so it's essentially useless atm.
 

iamvin22

Industry Verified
To me the problem is that devs/studios don't show or explain these advantages in their games. Also, a lot of them don't know how to implement or code it properly.

I'm pretty much 💯 with you on everything you said, but from talks I've had with several studios, COVID did the gaming sector no favors.
 

Mortisfacio

Member
Most games don't utilize it, but the only visual tech upgrade I've really been impressed by in recent years is path tracing. Cyberpunk with path tracing on vs. off is just so different to me and very impressive; it's just that, short of a 4090, good luck trying to use it.

The single largest "upgrade" I've felt in the last 15 years was moving from an HDD to an SSD. Everything else has been nice or forgettable, but the SSD was game-changing.
 

SHA

Member
There have been a number of technology advancements, particularly on the PlayStation side of things, that are heavily marketed as game-changers but turn out to be kind of nothing-burgers. [...]
It used to be a good Battlefield, Mirror's Edge, Mass Effect, or Need for Speed game, and the list goes on. The industry is reshaping itself for potential buyers, and that doesn't necessarily mean what you and I want. Some people put their money on this business, and numbers don't lie.
 

keefged4

Member
In regards to HDR and ray tracing, I firmly believe those who say they "don't do much" or are a "waste of time" haven't properly seen decent HDR on a proper HDR display, and haven't seen ray tracing at its full potential when implemented properly. The issue isn't the tech; it's the developers not implementing it properly most of the time.
 

winjer

Gold Member
Ray-tracing is useless in most games, as the performance loss is huge and the gains in image quality don't justify it.

I think PSSR's issues are mostly due to bad implementations. We have seen several games where devs put in the effort and the results were excellent.
But if a dev is doing the bare minimum, then it's probably going to suck.
 

violence

Member
Most of the time ray tracing makes things look different, not better, in modern games. For me it's been best with old games like Quake 2 RTX or Half-Life RTX, because I need something a little different for my 100th playthrough. Looking forward to more old games using it.

HDR is fucking amazing.
 

Gaiff

SBI’s Resident Gaslighter
Ray-tracing is useless in most games, as the performance loss is huge and the gains in image quality don't justify it.
Well, yeah, it's half-assed 90% of the time. It takes extremely beefy PC hardware to make good RT playable at a decent frame rate and resolution.

Even the 4090 isn't good at RT; it's just not terrible. However, I'm still glad they're trying. When truly capable RT hardware hits the scene, we should see a pretty significant visual leap. The difference between the PS5 and PS6 might be greater than the one between the PS4 and PS5. That's assuming the PS6 can run path tracing, though, which is far from guaranteed.
 

King Dazzar

Member
HDR for me is superb if you have the right TV. I've been in love with it since around 2017; it was a huge upgrade for me, and I usually don't buy any games that don't utilise it.

RT can look pointless at times. But then you see something like Cyberpunk on PC and it looks amazing. Just deploying it everywhere, though, especially on hardware that can't cope, at the heavy expense of resolution fidelity, is just crazy to me. I'm sure sometimes it's used just so they can say it's being used. Almost like it's there primarily for marketing.
 

HeWhoWalks

Gold Member
In regards to HDR and ray tracing, I firmly believe those who say they "don't do much" or are a "waste of time" haven't properly seen decent HDR on a proper HDR display, and haven't seen ray tracing at its full potential when implemented properly. The issue isn't the tech; it's the developers not implementing it properly most of the time.
A simple one-and-done answer to this thread's premise! It's also comedic how he mostly singles out Sony when the majority of new tech starts on PC. 😜

For me, the real game-changers are in fact HDR, PSSR, path AND ray tracing, VRR, as well as 3840x2160 (unfortunately, most monitors don't do true 4096-wide 4K, but that's a whole other topic).

It boils down to the humans in control of the tech, not the tech itself.
 

TGO

Hype Train conductor. Works harder than it steams.
I pretty much agree, with the exception of HDR & 4K.
Though not so much the HDR implementation; even SDR content looks better on an HDR screen, and I would never go back.
But I ain't getting upset over those features just because of the 60fps that everyone seems to be crying for.
Most settle for 40fps because it's over the threshold that gives that high-frame-rate look.
They can't actually tell the difference.
IQ and a stable frame rate are all I want, and if that costs RT or 60fps, then do it.
If you can do 60fps with it too, bonus.
 
Dude, what are you, pro-consumer? Give these poor multi-billion-dollar companies a break; they need to satisfy their shareholders. If Nvidia bravely didn't cut our framerates in half, we wouldn't be upgrading our GPUs all the time. Same for all the other "innovations": publishers, developers, platform holders, and hardware manufacturers all combining forces to extract the highest amount of money out of you possible. Please be excited. Consume.
 

HeWhoWalks

Gold Member
VRR is the only one that makes a real difference, the rest is fluff
Eh, VRR makes a major difference to gameplay, yes, but 4K provided a massive boost to image quality, HDR and OLED provide a massive boost to blacks, colors, & contrast, ray tracing provides a massive boost to visual realism (lighting, reflections, shadows), and upscalers provide a massive uptick in framerates.

Surely not fluff, no matter how you look at it.
 

Loomy

Banned
Ray-tracing is the biggest waste of time, energy, and resources of any graphical technology that I can remember. It's barely noticeable,
Put RT and HDR side by side with no RT and HDR and you'll see a difference. It is still expensive right now, though.

I've said this a few times in the past month or so. Anyone who wants to and is capable of creating a hyper realistic game is able to right now. There's no technical barrier to that. The advancements you're going to see now will be in areas that make achieving that easier and cheaper. And you probably won't notice huge jumps, but developers will notice their pipeline improving.
 

Kataploom

Gold Member
Most of the time ray tracing makes things look different, not better, in modern games. For me it's been best with old games like Quake 2 RTX or Half-Life RTX, because I need something a little different for my 100th playthrough. Looking forward to more old games using it.

HDR is fucking amazing.
And devs also change the initially intended look of a scene just to make it mirror-like reflective, for the sake of making the easily impressionable claim it's "better".

I mean, it is, but not in the way it's been used these years. We still don't have the hardware for it to be standard, and with Nintendo being so popular it makes even less sense for next gen as well, lol.

The problem with HDR is the tinkering. I can tinker for like 30 seconds with my PC games to get the graphics I want at 60 fps, but HDR is trickier. In my case I decided to ditch it completely, since the "creator's intent" look with the contrast enhancer set to off is waaaaaay too dim and dark compared even to SDR. Is it the norm to just leave the Contrast Enhancer ON? I'm on a QN90B TV, and I've basically given up.

I mean, there are lots of people saying "PC is not for them because of the tinkering," yet they like tinkering with HDR, which is way less intuitive? I can't understand it.

And for OP: yes, diminishing returns have already hit us. It's not only hardware but also production efficiency; to solve it we need more hardware, which is more expensive, for the same return. Even HDR is a symptom of diminishing returns, since you need to do way more as a user just to get a "right" picture that finally feels like a jump in quality (even devs have to, so there's a dependency there).
 
OP


Can't go back from HDR, 4K, and VRR.

RT is amazing, but hardware just isn't up to snuff yet.
 

twilo99

Member
Eh, VRR makes a major difference to gameplay, yes, but 4K provided a massive boost to image quality, HDR and OLED provide a massive boost to blacks, colors, & contrast, ray tracing provides a massive boost to visual realism (lighting, reflections, shadows), and upscalers provide a massive uptick in framerates.

Surely not fluff, no matter how you look at it.

4K is not a "technology"; we've been getting gradual increases in resolution since the dawn of video gaming.
 

HeWhoWalks

Gold Member
4K is not a "technology"; we've been getting gradual increases in resolution since the dawn of video gaming.
4K is indeed a technological advancement, and it provided a significant image-quality upgrade over everything that came before it (particularly on consoles, where there was no 1440p, which in and of itself provided a notable jump over Full HD).
 
The only ones that made me notice a good tech improvement were Ratchet and Clank: Rift Apart back in 2021, and Space Marine 2 this year. GTA VI obviously, but only if they can at least keep the quality of what they showed in its trailer.
 

Audiophile

Member
RT/PT is important and a key part of the future, but the issue is that it's a technology that needed to get its foot in the door first. Even Nvidia weren't going to throw an enormous chunk of silicon/logic at it out of the gate (~15% on the 20/30 series, I think).

Screen-space effects now stick out like a sore thumb, and they'll only become more apparent as every other element in the frame gets better. RT/PT largely needed/needs to happen, and someone had to take the leap, but we're in the transition period, and there's a certain threshold of capability we have to reach before it really impresses as an intrinsic part of the game engine.

Right now it's largely tacked on and working within limited capabilities. By the next decade it'll be vital in the majority of new titles; there'll be more hardware dedicated to it and more efficient hardware feature sets, in addition to far more effective implementations and trickery. It'll also streamline development once it encompasses multiple aspects, from GI to reflections, shadows, emissives, etc.

High-end PCs can do it reasonably well, but consoles, which largely drive the direction and decisions around most titles, are extremely weak at RT. This gen has been a case of tacking on one or two low-to-medium-quality RT effects. I expect adequate RT capabilities in the next generation, where it'll be a core part of the tech.

Real-time hardware-accelerated RT in games is only six years old and started in just a handful of titles. Most developers didn't start implementing it until a few years back, and there are still quite a few holding out. We're still incredibly early. I'll certainly be critical of many implementations, but the tech as a whole is a natural evolution and eventually won't be optional in many, many titles.

I fully expect most large-scale or heavily dynamic games going into development in the coming five years to be RT-only, with their lighting systems built entirely on it and no raster fallback.
 
HDR is a weird feature because it's not standard across media. Most people are watching video and looking at images in SDR. HDR seems like an extra step in-game, requires adjustments, and changes the range of levels too much. An HDR screenshot isn't going to be viewed properly by most people, so I just default to SDR and appreciate the ray tracing and GI through that.
 

Radical_3d

Member
HDR is a weird feature because it's not standard across media. Most people are watching video and looking at images in SDR. HDR seems like an extra step in-game, requires adjustments, and changes the range of levels too much. An HDR screenshot isn't going to be viewed properly by most people, so I just default to SDR and appreciate the ray tracing and GI through that.
So you have an HDR-capable display but choose to play in SDR so screenshots can be shared with correct gradation?
 

yogaflame

Member
RT is a bonus, but it's not a big loss if a game doesn't have the feature. I'd rather have better IQ and frame rate than waste resources on RT.
 

tmlDan

Member
Your optometrist might be able to assist you, and so would a good TV.

I think once we properly implement all these features you hate, you'll be blown away.
 

Crayon

Member
I'm right there with you on the ray tracing. I'm fascinated by it, but years later most games get barely anything out of it, and some get nothing at all. It's the future, sure. But boy has that future been slow in coming, and the hype is massively outsized.
 
I'm sick of technological advancements that are barely noticeable & that only truly exist in the first place to sell hardware...
 

twilo99

Member
4K is indeed a technological advancement, and it provided a significant image-quality upgrade over everything that came before it (particularly on consoles, where there was no 1440p, which in and of itself provided a notable jump over Full HD).

I don’t see increasing resolution as new technology but okay
 
Sometimes I wish I could travel to some alternate universe where a weird physical law prevented graphics from evolving past the PS2 level in anything but resolution, forcing developers to focus on other aspects to make their games stand out.
 