Cyberpunk 2077 With Path Tracing: The best-looking game available now?

RaidenJTr7

Member
When do you think consoles will get Path Tracing in games like Cyberpunk?

 
This scene was fine before. Now they added a nuclear bomb blast outside the room.

This kind of extreme exposure makes games look worse for me, not better.
 
Full path tracing at 4K 60fps, which apparently is the only resolution/framerate combo acceptable to gamers on this forum? Maybe PS10.
 
If all you care about is the technical aspect, sure, it's the most advanced game out there.

If you care about art direction, consistency, cohesion, and everything else, it's very subjective.

I don't find it's the most pleasing game to look at but it's definitely a cut above everything else from a technical perspective. At least in several aspects.
 
If all you care about is the technical aspect, sure, it's the most advanced game out there.

If you care about art direction, consistency, cohesion, and everything else, it's very subjective.

I don't find it's the most pleasing game to look at but it's definitely a cut above everything else from a technical perspective. At least in several aspects.

They have the most advanced game right now (3 years after release) and they are changing the engine to UE5, I don't get it...
 
They have the most advanced game right now (3 years after release) and they are changing the engine to UE5, I don't get it...
It doesn't seem like the Red Engine is all that well-equipped to deal with complex open-worlds. You can see the limitations in Cyberpunk and Witcher. I assume UE5 will be easier for them.
 
It doesn't seem like the Red Engine is all that well-equipped to deal with complex open-worlds. You can see the limitations in Cyberpunk and Witcher. I assume UE5 will be easier for them.

We will see how well UE5 performs. UE4 is absolute dog shit for open-world stuff (look at Jedi...), and they made 2 massive open-world games on RED Engine. I think they are shooting themselves in the foot.
 
Just ordered a 4090 Rig last night… Treated myself after finishing my Masters degree.

From 980 Ti > 4090

My body is absolutely F'n ready
That's quite the leap. Congrats. I also hope you have the CPU to take advantage of that beast.
OMFG that looks insane!!
Don't be fooled by almost professional-level screenshots. While the game does look good, it won't look as good as those screenshots moment-to-moment.
 
That's quite the leap. Congrats. I also hope you have the CPU to take advantage of that beast.

Don't be fooled by almost professional-level screenshots. While the game does look good, it won't look as good as those screenshots moment-to-moment.
This.
 
This scene was fine before. Now they added a nuclear bomb blast outside the room.

This kind of extreme exposure makes games look worse for me, not better.

I agree, but there are a lot of people who prefer that since it's supposed to represent realistic visuals.
 
Don't be fooled by almost professional-level screenshots. While the game does look good, it won't look as good as those screenshots moment-to-moment.
That's true, but it's by far the most consistent experience you can get today. Basically no shadow pop-in, no AO fadeout, no "everything outside of direct light suddenly looks flat or glows". If you've had that for a while in motion and then go back to a game with the usual raster fakery, or even the standard RTX GI implementation where you can break everything by simply changing the camera angle a bit, it's downright repulsive how inconsistent everything is.
 
Full PT lighting as standard? PS7 maybe.
Nope... by PS6 we will have that.

PS6 would be here in like, what? 2028... That's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.

The kicker though, is that if consoles in 2028 are coming with GPUs more powerful than a 4090 today... what would be the high-end PC GPU then?

Another important question is: even if the hardware to do it is there... should they?
 
Resolution?

I'm referring to your comment about lighting. It's supposedly more realistic with path tracing.

My issue is with the blinding, completely burned-out light from the outside daylight when you are in a slightly darker room.

This is an effect caused by cameras, not our eyes. Realistic lighting means you should be able to look outside the window during the daytime. But in most modern videogames, all you see is this blinding light.
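As an aside, the camera-vs-eye exposure difference comes down to how a renderer compresses HDR luminance to screen range. Here's a minimal sketch of an extended Reinhard-style tone mapping operator (a common choice in game renderers) versus a naive clamp; the luminance values and white point are made up for illustration:

```python
def reinhard_extended(lum, white=8.0):
    """Extended Reinhard tone mapping: compresses HDR luminance into
    [0, 1] so that `white` maps exactly to 1.0, preserving some
    separation between bright values instead of clipping them."""
    return lum * (1.0 + lum / (white * white)) / (1.0 + lum)

# A bright window (luminance 4.0, i.e. 4x over "paper white") seen
# from a dim room (luminance 0.05):
window_clamped = min(4.0, 1.0)          # naive clamp: blown out to pure white
window_mapped = reinhard_extended(4.0)  # 0.85: still bright, detail survives
room_mapped = reinhard_extended(0.05)   # dim values pass through almost linearly

print(window_clamped, window_mapped, round(room_mapped, 4))
```

With a plain clamp, every luminance above white clips to the same pure-white value, which is exactly the blown-out-window look; the tone-mapping curve keeps a 4x highlight distinguishable from an 8x one.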
 
Nope... by PS6 we will have that.

PS6 would be here in like, what? 2028... That's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.
We have an optimist here :messenger_grinning_sweat:
The 4090 is only barely able to do it with DLSS2 + DLSS3.
The hardware level where something like this could actually become standard is 5090 terrain or even beyond that, and I don't see that kind of power on a $300, <300 W SoC in 2027 at the latest, when the base tech for the next gen should be finalized.

Really hoping to eat crow on that one, but with how things are going...
 
Looks fantastic. Need to see a full blown gameplay video in 4K before I say yes though. OP didn't provide enough cold hard evidence
 
PS6 would be here in like, what? 2028... That's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.
Of course, it will be a low-end GPU, if not lower mid-range. 2028 is only 5 years from now, the 4090 will be around 6 years old by then. The flagship from 6 years ago is the 1080 Ti, which is still a fairly decent GPU.

The big question is the advancement we will make in terms of ray tracing. If we make enormous leaps, then yeah, the 4090 could be rendered obsolete by then but that's a big if.
 
We have an optimist here :messenger_grinning_sweat:
The 4090 is only barely able to do it with DLSS2 + DLSS3.
The hardware level where something like this could actually become standard is 5090 terrain or even beyond that, and I don't see that kind of power on a $300 SoC in 2027 at the latest, when the base tech for the next gen should be finalized...
It's not optimism... it's common sense and precedent.

This happens every single time. Every time there are new consoles, there is some new high-end GPU and some new rendering tech... and every single time, people talk and dismiss shit as if nothing would surpass it, or as if the very things being praised wouldn't soon become commonplace. As if it hasn't happened many, many times already.

We are in 2023... Do you really think that in 5 years a GPU like the 4090 would even be considered low-end? The low-end GPU then would be more powerful than it. And do you think that in 5 years, where we are at with FSR2 is where we will still be? That in 5 years AMD wouldn't have its own form of frame generation?

Come on...
 
It's not optimism... it's common sense and precedent.

This happens every single time. Every time there are new consoles, there is some new high-end GPU and some new rendering tech... and every single time, people talk and dismiss shit as if nothing would surpass it, or as if the very things being praised wouldn't soon become commonplace. As if it hasn't happened many, many times already.

We are in 2023... Do you really think that in 5 years a GPU like the 4090 would even be considered low-end? The low-end GPU then would be more powerful than it. And do you think that in 5 years, where we are at with FSR2 is where we will still be? That in 5 years AMD wouldn't have its own form of frame generation?

Come on...
What bad take is this? The 1080 Ti isn't even low-end and it's 6 years old. WTF are you talking about lol?

A low-end GPU is a 3050 and it gets trounced by a 1080 Ti.
 
For screenshots, maybe; in motion it's terrible, so I went back to Psycho. Tons of weird visual glitches and artifacts: in fast camera movement, or when objects are occluded, the lighting is just plain wrong and then slowly adjusts itself, which is really distracting.

It also seems path tracing wasn't taken into account when designing the game, and half the campaign takes place in areas that are pitch black.
 
It's amazing, yes. Puts it up there with the best-looking tech, yes. The outside of the city just looks incredible. Character models need to be much better, though.
 
Can 4080 with DLSS 3 handle this?
Playability-wise, yes.

If you're in it for screenshots and wandering around walking, even a 3070 is capable. I played at 1440p DLSS Balanced and got around 30 FPS in most places. Static shots / slow scenes are reconstructed to an almost native-1440p-like presentation (but movement is of course somewhere in between: not quite 835p, but not quite 1440p; also, path tracing overall breaks a lot in high-speed motion). All at 30 FPS of course, but enough to drive, walk and enjoy the path tracing. Would I play like that? I'm actually a madman and plan on doing so with the expansion.

For the topic: yes, it has the best visuals/lighting I've ever seen, and I've played quite a lot of video games.
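For reference on that 835p figure: DLSS quality modes use fixed per-axis render scales (the commonly cited factors are Quality ~66.7%, Balanced 58%, Performance 50%, Ultra Performance ~33.3%), so the internal resolution is easy to work out. A quick sketch:

```python
# Commonly cited per-axis DLSS render scale factors.
DLSS_SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(target_w, target_h, mode):
    """Internal resolution DLSS renders at before upscaling to target."""
    scale = DLSS_SCALES[mode]
    return round(target_w * scale), round(target_h * scale)

# 1440p output with DLSS Balanced renders internally at roughly 1485x835,
# which is where the "not quite 835p" figure comes from:
print(internal_resolution(2560, 1440, "balanced"))  # (1485, 835)
```

So "1440p DLSS Balanced" is really an 835p image being reconstructed, which is why static shots look near-native while fast motion drops back toward the internal resolution.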
 
What bad take is this? The 1080 Ti isn't even low-end and it's 6 years old. WTF are you talking about lol?

A low-end GPU is a 3050 and it gets trounced by a 1080 Ti.
Are you kidding me? And I am the one having a bad take?

The 1080 Ti is fine as long as you are not doing anything RT-based. What is it with PC peeps, so smart and stupid at the same time?

As far as raster goes, if Nvidia has shown anything, it's that the GPU tech budget can be better spent elsewhere. We can have GPUs in 2028 that, for all intents and purposes, have the same raster performance a 4070 has today but the RT performance a 4090 has instead, and that, at the time, would be considered entry-level GPUs.

And the current-gen consoles came in on par with an AMD 6700 GPU, or a 2070 Super or 2080... two years after those GPUs were considered high-end.

Honestly, I'm not going to get into this kind of mindless, pointless argument. How about we wait 5 years and see what's what then? I have said this before in another thread... When the next gen comes, just look at whatever was considered high-end two years prior... and that is what you will see in those consoles. All I am saying is that, come 2028, a 4090 equivalent wouldn't even be considered entry-level in whatever series of GPUs is released that year. The cheapest GPU you could get in that series... be it the 90xx series or whatever... would be more powerful than a 4090 is today.
 
It's not optimism... it's common sense and precedent.

This happens every single time. Every time there are new consoles, there is some new high-end GPU and some new rendering tech... and every single time, people talk and dismiss shit as if nothing would surpass it, or as if the very things being praised wouldn't soon become commonplace. As if it hasn't happened many, many times already.

We are in 2023... Do you really think that in 5 years a GPU like the 4090 would even be considered low-end? The low-end GPU then would be more powerful than it. And do you think that in 5 years, where we are at with FSR2 is where we will still be? That in 5 years AMD wouldn't have its own form of frame generation?

Come on...
The 2080 Ti is nearly 5 years old, far from "low-end", still costs north of 400 bucks, never was in the price region of a 4090, and all that was before NVIDIA decided to make crypto pricing the new normal...

what are you smoking?

And the current gen consoles came in being on par with a 6700 amd GPU, or a 2070 super or 2080... two years after that GPU was considered high-end.
lol....
You're comparing $600-700 higher-end cards with $1600-level Titan "enthusiast-level" cards, with the biggest hardware jump since the "Titan" class was introduced. Sorry, but that is beyond dumb...
You're acting as if the last 3 years never happened.
 
The 2080 Ti is nearly 5 years old, far from "low-end", still costs north of 400 bucks, never was in the price region of a 4090, and all that was before NVIDIA decided to make crypto pricing the new normal...

what are you smoking?


lol....
You're comparing $600-700 higher-end cards with $1600-level Titan cards, with the biggest performance jump since the "Titan" class was introduced. Sorry, but that is beyond dumb...
You're acting as if the last 3 years never happened.
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900 XTX is only 18% behind the 4090. Where it really suffers, though, is RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles would not have a GPU with 60TF+ raster performance, and RT performance on par with or better than what the 4090 currently has, and that AMD would not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.
 