
DF : Cyberpunk 2077 - How is path tracing possible

hyperbertha

Member
dont-do-drugs-kids-teddy-ray.gif


Soo many salty people lately in all CP2077 and RTX threads lol
I thought you'd turn combative and turn this into fanboy nonsense. Forbidden West has far better poly detail, texture detail and lighting model than default Cyberpunk. Though with Overdrive, Cyberpunk has overtaken it in the lighting department. You're delusional if you think FW isn't the best looking game out there yet.
 

Turk1993

GAFs #1 source for car graphic comparisons
I thought you'd turn combative and turn this into fanboy nonsense. Forbidden West has far better poly detail, texture detail and lighting model than default Cyberpunk. Though with Overdrive, Cyberpunk has overtaken it in the lighting department. You're delusional if you think FW isn't the best looking game out there yet.
Again you are making these stupid claims and expecting a proper conversation. How can I be a fanboy when I praised HFW constantly before and called it the best looking game on console and one of the best on all platforms? I played both of them and can see with my own eyes which one is better. And again, claiming that the lighting model in HFW is better than CP2077's even before the Overdrive update just shows how delusional you are. Lighting and shadows are the biggest weak points of HFW, the RT Psycho lighting shits on HFW's lighting, and the Overdrive mode is just not even comparable.
Ay4Vbu0.jpg

PLhsI6K.jpg

3Sshul7.jpg


Just look at this
 

Gaiff

SBI’s Resident Gaslighter
I'm saying I don't see paradigm shifts in graphics unless a new console generation comes up, though I'll grant Crysis as an exception. The reason is that devs largely develop with the lowest common denominator in mind, which means even if every PC gamer suddenly had an RTX 4090, devs would still need to cater to console specs. I didn't talk about physics.
Ray tracing that first hit PCs in 2018 isn't a paradigm shift? The biggest rendering technique since the transition to 3D doesn't matter?
Forbidden West and Demon's Souls look better than CP2077. It's not even close. Though the ray tracing is now better in Overdrive. Crysis 3? Really? The Order 1886 was regarded as the benchmark last gen.
The only thing The Order 1886 was regarded for is its failure.

I thought you'd turn combative and turn this into fanboy nonsense. Forbidden West has far better poly detail, texture detail and lighting model than default Cyberpunk. Though with Overdrive, Cyberpunk has overtaken it in the lighting department. You're delusional if you think FW isn't the best looking game out there yet.
Non-RTGI games need not apply when we discuss best lighting. It's not possible for them to match competent RTGI as seen in games like Metro Exodus or Cyberpunk. Cyberpunk RT's lighting model was already far better than HFW's. Overdrive just pushed it further ahead. It's delusional to claim that HFW was better in that department prior to the release of the Overdrive mode.
 

Pedro Motta

Gold Member
most people use that term to describe games that are mostly, or entirely, rendered through rays traced into the environment, not ones that literally simulate every single thing a photon does IRL.
which isn't even really possible, and you could push that goalpost back every time a new milestone is reached.

basically every pixel on your screen is the color it is due to a ray being traced against a polygon; that's IMO the only definition of path tracing that realistically makes sense without arbitrarily adding requirements for what needs to be simulated and what doesn't
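(For anyone who wants that "rays per pixel" idea in concrete form, here is a tiny toy sketch: a made-up grayscale scene with one diffuse sphere and a few random bounces per ray. Everything in it is invented for illustration; it has nothing to do with CDPR's actual renderer.)

```python
import math, random

def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def norm(a): return mul(a, 1.0 / math.sqrt(dot(a, a)))

SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0   # the whole "scene": one diffuse sphere

def hit_sphere(o, d):
    """Distance along the ray to the sphere, or None if it misses."""
    oc = sub(o, SPHERE_C)
    b, c = dot(oc, d), dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_hemisphere(n):
    """Random bounce direction on the hemisphere around normal n."""
    while True:
        v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(v, v) <= 1.0:
            v = norm(v)
            return v if dot(v, n) > 0.0 else mul(v, -1.0)

def trace(o, d, depth=0):
    """Follow one ray: miss -> 'sky' light, hit -> bounce randomly and keep gathering."""
    t = hit_sphere(o, d)
    if t is None or depth >= 3:
        return 0.5 + 0.5 * d[1]                # brighter sky when looking up
    p = add(o, mul(d, t))
    n = norm(sub(p, SPHERE_C))
    return 0.7 * trace(p, random_hemisphere(n), depth + 1)   # albedo 0.7, one random bounce

def render(w=32, h=32, spp=4):
    """Every pixel's value comes from rays traced into the scene, which is the point being argued above."""
    return [[sum(trace((0.0, 0.0, 0.0),
                       norm(((x + 0.5) / w - 0.5, 0.5 - (y + 0.5) / h, -1.0)))
                 for _ in range(spp)) / spp
             for x in range(w)] for y in range(h)]

img = render()
print(len(img), len(img[0]), img[16][16])   # 32 x 32 grayscale image, plus one sample pixel value
```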
Stay ignorant if you want. Not my problem.
 

magnumpy

Member
one thing I hope everyone can realize now, is that a "PlayStation 6" will definitely be necessary! :messenger_open_mouth:
 

Loxus

Member
I'm not sure I understand the argument. First, L2 vs RDNA 3's Infinity Cache is like comparing apples and oranges. The L2 is right next to the SMs; the Infinity Cache sits on the memory controllers to make up for the lower-bandwidth GDDR6. They really do not have the same role. When the SMs don't find the data locally, they go to the L2 cache, which again, as DF says, is SIMD bound. Going to another level outside the GCD is not a good idea. These things need low latency.



There's not a single AMD centric path tracing AAA game so, until then..



These are console solutions and they're relatively shit. Low res and low geometry. They're a world of difference from what's going on here with the Overdrive mode.




Did the average for 4K only, I'm lazy. 7900 XTX as baseline (100%).

So, games with RT only ALL APIs - 20 games
4090 168.7%
4080 123.7%

Dirt 5, Far Cry 6 and Forza Horizon 5 are in the 91-93% range for the 4080, while it goes crazy with Crysis Remastered at 236.3%, which I don't get; I thought they had a software RT solution there?

Rasterization only ALL APIs - 21 games
4090 127.1%
4080 93.7%

Rasterization only Vulkan - 5 games
4090 138.5%
4080 98.2%

Rasterization only DX12 2021-22 - 3 games
4090 126.7%
4080 92.4%

Rasterization only DX12 2018-20 - 7 games
4090 127.2%
4080 95.5%

Rasterization only DX11 - 6 games
4090 135.2%
4080 101.7%

If we go path tracing and remove the meaningless ray traced shadow games, it's not going to go down well in the comparison.
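(For clarity, the percentages above are this kind of baseline-relative averaging. A quick sketch of the arithmetic; the game names and fps numbers below are placeholders, not the actual dataset.)

```python
def relative_performance(fps_by_gpu, baseline="7900 XTX"):
    """Average each GPU's per-game fps relative to the baseline GPU (baseline = 100%)."""
    games = fps_by_gpu[baseline].keys()
    result = {}
    for gpu, scores in fps_by_gpu.items():
        ratios = [scores[g] / fps_by_gpu[baseline][g] for g in games]
        result[gpu] = 100.0 * sum(ratios) / len(ratios)   # simple mean; a geomean is also common
    return result

# Placeholder numbers purely to show the mechanics, not real benchmark data.
fps = {
    "7900 XTX": {"Game A": 60, "Game B": 40},
    "RTX 4090": {"Game A": 100, "Game B": 68},
    "RTX 4080": {"Game A": 75, "Game B": 49},
}
print(relative_performance(fps))   # -> {'7900 XTX': 100.0, 'RTX 4090': ~168.3, 'RTX 4080': ~123.8}
```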



It would be a meaningless difference.

tsmc-n4w.png


You're looking at too big a gap between the GCD+MCD total package size and the 4080's monolithic design for 6% density to make a difference. The whole point of chiplets is exactly that a lot of the silicon area on a GPU does not improve from these nodes.



B bu but your game is not simulating the Planck length! Fucking nanite, USELESS.



Have you seen the post I referred to earlier in the thread about it? They're both tracing rays, yes. There's a big difference between the offline render definition and the game definition. Even ray tracing solutions on PC heavily use rasterization to cover for the details missing from the ray tracing solution. By the book definition of ray tracing, you should be getting HARD CONTACT shadows. Any softening of them is a hack, or they add a random bounce to get a more accurate representation.

So ray tracing is not ray tracing

Emoji Think GIF


/s
First of all, I never said AMD's RT implementation is better. I said it has the potential to perform better if optimized for the hardware, and consoles are helping with that.

Microbenchmarking AMD’s RDNA 3 Graphics Architecture
RDNA 3 makes a massive improvement in LDS latency, thanks to a combination of architectural improvements and higher clock speeds. Nvidia enjoyed a slight local memory latency lead over AMD’s architectures, but RDNA 3 changes that. Low LDS latency could be very helpful when RDNA 3 is dealing with raytracing, because the LDS is used to store the BVH traversal stack.
VZ2j0pj.png


The reason AMD should have gone with 4nm is that it gives 11% better performance while using 22% less power vs 5nm, not density. Which is why Nvidia cards have better efficiency.
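On the BVH point from the quoted article: traversal is basically a loop that keeps pushing and popping node indices, so that stack gets hammered and wants the fastest local memory available (the LDS, per the quote). Here's a toy CPU-side sketch of that pattern, with an invented node layout and sphere leaves, nothing vendor-specific:

```python
import math

def ray_box_hit(o, d, lo, hi):
    """Slab test: does the ray (origin o, direction d) cross the axis-aligned box [lo, hi]?"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        if d[axis] == 0.0:
            if not (lo[axis] <= o[axis] <= hi[axis]):
                return False              # parallel to this slab and outside it
            continue
        t1 = (lo[axis] - o[axis]) / d[axis]
        t2 = (hi[axis] - o[axis]) / d[axis]
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def ray_sphere_t(o, d, center, radius):
    """Distance along the ray to the sphere, or None on a miss."""
    oc = [o[i] - center[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-6 else None

def traverse(o, d, nodes):
    """Iterative BVH traversal with an explicit stack, i.e. the stack the quoted article says
    RDNA 3 keeps in the LDS. Every push/pop below is a local-memory access."""
    stack = [0]                           # start at the root node index
    best_t, best_prim = float("inf"), None
    while stack:
        node = nodes[stack.pop()]
        if not ray_box_hit(o, d, node["lo"], node["hi"]):
            continue                      # whole subtree culled in one test
        if "spheres" in node:             # leaf: intersect the primitives it holds
            for center, radius in node["spheres"]:
                t = ray_sphere_t(o, d, center, radius)
                if t is not None and t < best_t:
                    best_t, best_prim = t, (center, radius)
        else:                             # inner node: push both children
            stack.append(node["left"])
            stack.append(node["right"])
    return best_t, best_prim

nodes = [
    {"lo": (-2, -2, -10), "hi": (2, 2, -1), "left": 1, "right": 2},          # root
    {"lo": (-1, -1, -5), "hi": (1, 1, -3), "spheres": [((0, 0, -4), 1.0)]},  # near leaf
    {"lo": (-1, -1, -8), "hi": (1, 1, -6), "spheres": [((0, 0, -7), 1.0)]},  # far leaf
]
print(traverse((0, 0, 0), (0, 0, -1), nodes))   # -> (3.0, ((0, 0, -4), 1.0)), the nearer sphere
```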
 

Loxus

Member
From that thread:

AMD's L2 cache: 6MB
NVIDIA's L2 cache: 96MB (72MB on 4090)

DF's statements seem accurate to me.
You'll need to stop believing Digital Foundry knows shit about PC hardware.

Microbenchmarking AMD’s RDNA 3 Graphics Architecture
RDNA 3 makes a massive improvement in LDS latency, thanks to a combination of architectural improvements and higher clock speeds. Nvidia enjoyed a slight local memory latency lead over AMD’s architectures, but RDNA 3 changes that. Low LDS latency could be very helpful when RDNA 3 is dealing with raytracing, because the LDS is used to store the BVH traversal stack.
VZ2j0pj.png
 

Gaiff

SBI’s Resident Gaslighter
First of all, I never said AMD's RT implementation is better. I said it has the potential to perform better if optimized for the hardware, and consoles are helping with that.

Microbenchmarking AMD’s RDNA 3 Graphics Architecture
RDNA 3 makes a massive improvement in LDS latency, thanks to a combination of architectural improvements and higher clock speeds. Nvidia enjoyed a slight local memory latency lead over AMD’s architectures, but RDNA 3 changes that. Low LDS latency could be very helpful when RDNA 3 is dealing with raytracing, because the LDS is used to store the BVH traversal stack.
VZ2j0pj.png


The reason AMD should have gone with 4nm is that it gives 11% better performance while using 22% less power vs 5nm, not density. Which is why Nvidia cards have better efficiency.
Having shitty RT effects isn't "optimizing". It's deliberately using light RT workloads so AMD GPUs don't end up looking like shit. Notice that 2 of the 3 games you mentioned are AMD-sponsored. Hell, the RT reflections in RE4R, another AMD-sponsored game, are at times straight up worse than SSR, to the point DF even recommends turning RT reflections off. AMD-sponsored games just want that little RT badge, but their implementation is a joke, which is in no small part why people mock RT: they cannot tell the difference. Of course, if all your RT does is a bunch of shadows or crappy reflections at 1/4 resolution, it won't look like much. Competent ray tracing as seen in Cyberpunk, Control, Metro Exodus, or even Spider-Man Remastered on PC can be quite transformative. Sadly, the vast majority of people cannot run them properly, so it remains a fringe feature.
 

Loxus

Member
The RT in Far Cry and Village might as well not exist. It's shit-tier low res reflections. As for Spider-Man, it's only reflections (and actually competent, especially on Ultra) but the performance is so close because there doesn't tend to be that much of it all at once. If you do go out of your way to find places with a lot of reflections, AMD absolutely tanks.

y8CFVTf.png
I'm comparing the latest hardware from both AMD and Nvidia though.
Which shows the potential AMD RT has.
W7hIetB.jpg
 

Loxus

Member
Having shitty RT effects isn't "optimizing". It's deliberately using light RT workloads so AMD GPUs don't end up looking like shit. Notice that 2 of the 3 games you mentioned are AMD-sponsored. Hell, the RT reflections in RE4R, another AMD-sponsored game, are at times straight up worse than SSR, to the point DF even recommends turning RT reflections off. AMD-sponsored games just want that little RT badge, but their implementation is a joke, which is in no small part why people mock RT: they cannot tell the difference. Of course, if all your RT does is a bunch of shadows or crappy reflections at 1/4 resolution, it won't look like much. Competent ray tracing as seen in Cyberpunk, Control, Metro Exodus, or even Spider-Man Remastered on PC can be quite transformative. Sadly, the vast majority of people cannot run them properly, so it remains a fringe feature.
I like how you guys talk about AMD sponsored, but not Nvidia sponsored.
 

CGNoire

Member
That title belongs to MS Flight Simulator; it's technically more advanced and has far more advanced simulation in its clouds. Also, the clouds stay static in Horizon Forbidden West.

Yes, true, but we should also add that MSFS streams its cloud data from a server, which generates the cloud formation data for the game based on real-world weather radar data.
 

Gaiff

SBI’s Resident Gaslighter
I'm comparing the latest hardware from both AMD and Nvidia though.
Which shows the potential AMD RT has.
W7hIetB.jpg
It'll come down to the same thing. Notice how the 3080 is barely faster than the 6900 XT here, when in areas with a lot of RT it mops the floor with it. Hell, look at the 4090 outperforming the 4080 by a meager 20% when it usually beats it by 25%+. In heavy RT workloads, that can easily balloon to 30 or even 40%.
I like how you guys talk about AMD sponsored, but not Nvidia sponsored.
Because the RT of AMD-sponsored games is awful.

RT_1440p-color-p.webp

And that's presumably with a CPU bottleneck. Because at 4K, the difference between the 4090 and 4080 increases.

RT_2160p-color-p.webp


With heavy RT workloads, the 7900 XTX often finds itself in the territory of the 3090. Your examples have nothing to do with proper optimization. They're just bad and lightweight RT implementations. Crank up those shadows, reflections, GI, and what have you, and AMD GPUs suffer immensely.
 

Buggy Loop

Member
First of all, I never said AMD's RT implementation is better. I said it has the potential to perform better if optimized for the hardware, and consoles are helping with that.

Microbenchmarking AMD’s RDNA 3 Graphics Architecture
RDNA 3 makes a massive improvement in LDS latency, thanks to a combination of architectural improvements and higher clock speeds. Nvidia enjoyed a slight local memory latency lead over AMD’s architectures, but RDNA 3 changes that. Low LDS latency could be very helpful when RDNA 3 is dealing with raytracing, because the LDS is used to store the BVH traversal stack.
VZ2j0pj.png

The problem with all this is that you're benching a specific problem. AMD's hybrid RT modules in the pipeline are choking on the bigger tasks. Inline ray tracing is the BEST these cards can achieve, as the scheduling is not choking the pipeline, but that's a very specific use case where there are little to no dynamic shaders.


The reason AMD should have gone with 4nm is that it gives 11% better performance while using 22% less power vs 5nm, not density. Which is why Nvidia cards have better efficiency.

Like I said, when TSMC says 11% better performance, that figure comes from a very specific part of the tech that puts the node shrink to best use, and it's really not representative of the whole chip. Benching a transistor gate's performance and power draw is really not the same as benching a complex GPU. Like I said, AMD has GOOD reasons to go chiplet, and we'll definitely see all cards going there eventually, because cache and memory controllers see stagnation from shrinking nodes.

Nvidia having better efficiency correlates a LOT more with them staying on a monolithic die vs chiplets than with any 4N vs 5N TSMC process difference.
 

CGNoire

Member
Even ray tracing solutions on PC heavily use rasterization to cover for the details missing from the ray tracing solution. By the book definition of ray tracing, you should be getting HARD CONTACT shadows. Any softening of them is a hack, or they add a random bounce to get a more accurate representation.
Thank you for clarifying something that's been confusing me for the last 20 years now. Back then I was messing around with 3DS Max a lot, and when using ray traced shadows they always showed up with razor sharp edges, but nowadays you've got all these game devs talking about how their RT shadows allow for soft penumbra edges and I'm like, "excuse me"?
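That confusion is worth spelling out: one shadow ray to a point light gives exactly the razor-sharp edge 3DS Max used to give you, while the soft penumbras devs talk about come from averaging many shadow rays toward an area light. Here's a self-contained toy sketch; the sphere occluder, light size and positions are all made-up numbers:

```python
import math, random

OCCLUDER_C, OCCLUDER_R = (0.0, 1.0, 0.0), 0.5    # a sphere floating above the ground plane

def segment_blocked(p, q):
    """Shadow 'ray': is the segment from surface point p to light sample q blocked by the sphere?"""
    d = [q[i] - p[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    oc = [p[i] - OCCLUDER_C[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - OCCLUDER_R ** 2
    disc = b * b - c
    if disc < 0.0:
        return False
    t = -b - math.sqrt(disc)
    return 1e-6 < t < length              # hit strictly between the point and the light

def point_light_shadow(p, light):
    """Textbook ray-traced shadow: one ray to a point light -> fully lit or fully dark (hard edge)."""
    return 0.0 if segment_blocked(p, light) else 1.0

def area_light_shadow(p, light_center, light_radius, samples=64):
    """Soft shadow: average many shadow rays to random points on a disk light -> penumbra."""
    lit = 0
    for _ in range(samples):
        ang = random.uniform(0.0, 2.0 * math.pi)
        r = light_radius * math.sqrt(random.random())
        q = (light_center[0] + r * math.cos(ang), light_center[1], light_center[2] + r * math.sin(ang))
        if not segment_blocked(p, q):
            lit += 1
    return lit / samples                  # fraction of the light that is visible: 0..1

# Sweep a point across the ground under the sphere: the point-light column jumps 1 -> 0 -> 1,
# while the area-light column ramps smoothly through fractional values (the penumbra).
for x in [-1.0, -0.6, -0.3, 0.0, 0.3, 0.6, 1.0]:
    p = (x, 0.0, 0.0)
    print(x, point_light_shadow(p, (0.0, 4.0, 0.0)), round(area_light_shadow(p, (0.0, 4.0, 0.0), 1.5), 2))
```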
 

Buggy Loop

Member
Found probably the best video explanation of Monte Carlo vs ReSTIR. It sticks to the offline render papers and doesn't take into account Nvidia's own twist on ReSTIR, but it should give a good glimpse of the tech.




It's a smarter path tracing. For an optimization project at university, for a real customer's product, we went through dozens of stochastic algorithms to improve upon a slow Monte Carlo solution.

Here it's the same. Why waste computation time on rays that will not even converge?
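For what it's worth, here is the core trick in miniature: plain Monte Carlo picks one light blindly and usually wastes its ray on a negligible contributor, while resampled importance sampling (the building block under ReSTIR, before any spatial or temporal reuse) cheaply weighs a batch of candidates and only spends the real ray on the winner. Toy scene, no visibility term, all numbers invented purely to show the variance difference:

```python
import random

# Toy "scene": a shading point at the origin and a set of point lights, one very bright and
# close, the rest dim and far away. Numbers are made up purely for illustration.
LIGHTS = [{"pos": (10.0 * i + 1.0, 5.0, 0.0), "power": 1.0} for i in range(63)]
LIGHTS.append({"pos": (1.0, 1.0, 0.0), "power": 500.0})

def contribution(light):
    """Unshadowed contribution of a light at the shading point (inverse-square falloff only)."""
    d2 = sum(c * c for c in light["pos"])
    return light["power"] / d2

def naive_monte_carlo():
    """Pick one light uniformly and divide by its pdf: unbiased but very noisy here."""
    light = random.choice(LIGHTS)
    return contribution(light) * len(LIGHTS)       # f(x) / (1 / N)

def ris_one_sample(num_candidates=16):
    """Resampled importance sampling: draw cheap candidates, weight them by their estimated
    contribution, keep one. In a real renderer only that winner gets an actual shadow ray."""
    candidates = [random.choice(LIGHTS) for _ in range(num_candidates)]
    weights = [contribution(c) * len(LIGHTS) for c in candidates]      # p_hat / source pdf
    total = sum(weights)
    pick = random.choices(candidates, weights=weights, k=1)[0]
    # Unbiased estimator: f(pick) / p_hat(pick) * (total / M); with no visibility term,
    # f == p_hat, so it simplifies to the average of the candidate weights.
    return total / num_candidates

def mean_and_variance(estimator, n=20000):
    xs = [estimator() for _ in range(n)]
    mean = sum(xs) / n
    return mean, sum((x - mean) ** 2 for x in xs) / n

print("naive :", mean_and_variance(naive_monte_carlo))
print("RIS   :", mean_and_variance(ris_one_sample))   # same mean, much lower variance per traced ray
```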
 

Turk1993

GAFs #1 source for car graphic comparisons
Yes, true, but we should also add that MSFS streams its cloud data from a server, which generates the cloud formation data for the game based on real-world weather radar data.
It's still more advanced; even without internet it still has a better cloud system and simulation. There are thousands of pre-loaded cloud formations already in the game when you install it.


Also just look at this, watch the part at 0:45 where he goes into the clouds
 

Loxus

Member
It'll come down to the same thing. Notice how the 3080 is barely faster than the 6900 XT here, when in areas with a lot of RT it mops the floor with it. Hell, look at the 4090 outperforming the 4080 by a meager 20% when it usually beats it by 25%+. In heavy RT workloads, that can easily balloon to 30 or even 40%.

Because the RT of AMD-sponsored games is awful.

RT_1440p-color-p.webp

And that's presumably with a CPU bottleneck. Because at 4K, the difference between the 4090 and 4080 increases.

RT_2160p-color-p.webp


With heavy RT workloads, the 7900 XTX often finds itself in the territory of the 3090. Your examples have nothing to do with proper optimization. They're just bad and lightweight RT implementations. Crank up those shadows, reflections, GI, and what have you, and AMD GPUs suffer immensely.
The 7900 XTX competitor is the 4080.
In the case of Hogwarts Legacy with RT enabled at 4K, it's a 33fps average on the 7900 XTX vs 46fps on the 4080. How the hell are you considering a 13fps average difference bad and lightweight?

I swear you guys look at a graph without looking at the numbers.
 

Gaiff

SBI’s Resident Gaslighter
The 7900 XTX competitor is the 4080.
In the case of Hogwarts Legacy with RT enabled at 4K, it's a 33fps average on the 7900 XTX vs 46fps on the 4080. How the hell are you considering a 13fps average difference bad and lightweight?
Hogwarts Legacy has competent RT implementation, not just shitty reflections or shadows. Hence why the difference is so large. The 4080 is 39% faster whereas it's actually slower at 4K/non-RT.

Ultra_2160p-color-p.webp

I swear you guys look at a graph without looking at the numbers.
My point is that the games you presented were light RT workloads with bad implementation. With serious RT, the 7900 XTX finds itself matching the 3090, NVIDIA's last-gen flagship. Nothing to do with optimization as you suggested.
 

CGNoire

Member
Ray tracing that first hit PCs in 2018 isn't a paradigm shift? The biggest rendering technique since the transition to 3D doesn't matter?

The only thing The Order 1886 was regarded for is its failure.


Non-RTGI games need not apply when we discuss best lighting. It's not possible for them to match competent RTGI as seen in games like Metro Exodus or Cyberpunk. Cyberpunk RT's lighting model was already far better than HFW's. Overdrive just pushed it further ahead. It's delusional to claim that HFW was better in that department prior to the release of the Overdrive mode.
I do think it is fair to at least mention that dynamic objects in CP still looked pretty rough with RT Psycho on, which really only seemed to affect the environment to a great degree. I'm not sure HFW didn't have better looking dynamic object lighting. Maybe, I don't know, I just remember seeing CP with Psycho RT and thinking its dynamic objects looked straight-up flat a lot of the time as well.

For me, PT upgrades CP's dynamic objects a lot more noticeably than its environments, which says a lot since the environment upgrade is also large.
 

CGNoire

Member
It's still more advanced; even without internet it still has a better cloud system and simulation. There are thousands of pre-loaded cloud formations already in the game when you install it.


Also just look at this, watch the part at 0:45 where he goes into the clouds

Ok that was pretty BOSS.
 

Loxus

Member
Hogwarts Legacy has competent RT implementation, not just shitty reflections or shadows. Hence why the difference is so large. The 4080 is 39% faster whereas it's actually slower at 4K/non-RT.

Ultra_2160p-color-p.webp


My point is that the games you presented were light RT workloads with bad implementation. With serious RT, the 7900 XTX finds itself matching the 3090, NVIDIA's last-gen flagship. Nothing to do with optimization as you suggested.
What do you mean, "hence why the difference is so large"?

If we use Cyberpunk 2077 benchmarks:
At 4K RT Ultra, the 4080 is only 8-9 fps better than the 7900 XTX.
EvHpCe8.jpg

Nvidia GeForce RTX 4080 vs. AMD Radeon RX 7900 XTX: Which should you buy?
7qrFizN.jpg
 

nemiroff

Gold Member
Yes, true, but we should also add that MSFS streams its cloud data from a server, which generates the cloud formation data for the game based on real-world weather radar data.

Yes, but it's also worth adding that the real-world data is optional. You can fully customize the clouds yourself.

Edit: Ah, I was a bit late with my reply..
 

Buggy Loop

Member
It's still more advanced; even without internet it still has a better cloud system and simulation. There are thousands of pre-loaded cloud formations already in the game when you install it.


Also just look at this, watch the part at 0:45 where he goes into the clouds


Sweating James Mcavoy GIF


Asobo are wizards

I really need to dust off my HOTAS and TrackIR and jump into this game. Or VR it..
 

bender

What time is it?
He has never bashed devs... he says, I don't know what happened but the performance is poor. Also, half of the guy on steam could make a better video on what we want from PC ports that would not be as stupid as his video, which was more about making his workflow easier. And if DF bashing was that regular, their videos wouldn't be posted here all the time.

As far as Alex is concerned, this son of a bitch didn't shame the God of War devs when the AMD GPU performance was abysmal for months, because he had an exclusive interview, and I am not saying that because I had an AMD GPU; I was perfectly fine with my 3070 at 4K60. He also was so lazy that he didn't play Uncharted 4 properly, otherwise he wouldn't have missed the abysmal memory leak issue and wouldn't have been surprised by The Last of Us's bad performance. Arkham Knight had a bad texture streaming pool config, Uncharted LOT had it, and now TLOU has it. Do the math, Sherlock... Iron Galaxy sucks at it. He is crying about shader compilation now, but he was very forgiving about it when Control launched with shader compilation issues and a horrible texture streaming bug that persists to this date. Go watch his old videos. He basically promotes ray tracing over optimization; only now is he crying about it because even the flagship card is struggling too much with it.

half of the guy on steam is my favorite Youtube channel.
 

MikeM

Gold Member
Just tried path tracing on my 7900xt for science:

1080p native: 15fps, mainly high settings

1080p fsr at balanced: 36fps, same settings

For comparison with all vanilla RT on, lighting set to medium:

1080p native: 58fps
1080p w/ fsr quality: 79fps

1440p native: 35fps
1440p w/ fsr quality: 65fps

FOV set to 90. Overall, the game is very playable using vanilla RT on AMD hardware if you are good with a ~60fps framerate and using FSR.
 

Gaiff

SBI’s Resident Gaslighter
What do you mean, "hence why the difference is so large"?

If we use Cyberpunk 2077 benchmarks:
At 4K RT Ultra, the 4080 is only 8-9 fps better than the 7900 XTX.
EvHpCe8.jpg

Nvidia GeForce RTX 4080 vs. AMD Radeon RX 7900 XTX: Which should you buy?
7qrFizN.jpg
You mean 38% at 4K? Of course 9fps wouldn't seem like much if the baseline were 100, but when the baseline is 20, it's a lot.

cyberpunk-2077-rt-2560-1440.png


Here it's almost 50% in favor of the 4080 or are you going to be dishonest again and go "it's only 20 fps"?

The 7900 XTX is on par with a 3080 in this game and gets beaten by a 3090.
 

Loxus

Member
C'mon, man. Those are pretty important FPS. Huge difference playing at 40FPS vs 25FPS.
You mean 38% at 4K? Of course 9fps wouldn't seem like much if the baseline were 100, but when the baseline is 20, it's a lot.

cyberpunk-2077-rt-2560-1440.png


Here it's almost 50% in favor of the 4080 or are you going to be dishonest again and go "it's only 20 fps"?

The 7900 XTX is on par with a 3080 in this game and gets beaten by a 3090.
"Only"...... Someone urgently needs some math lections. We're talking 30-50%...
Why do you guys keep moving the goalposts?

The 4080 is the 7900 XTX's competitor, not the 4090. So benchmarks should be 7900 XTX vs 4080, not 4090.

Benchmarking at 4K ensures the GPU is the bottleneck. 20fps vs 29fps isn't bad on AMD's part.

Looking at the bigger picture, the 4080 is only 13-17% better than the 7900 XTX, which IMO isn't the huge difference you guys make it out to be.
8XpPWZo.jpg
 

sendit

Member
Immersion for this game's application of path tracing pretty much breaks when your actual character casts no reflection anywhere in the game world outside of the bathroom mirror.
 

Gaiff

SBI’s Resident Gaslighter
Why do you guys keep moving the goalposts?

The 4080 is the 7900 XTX's competitor, not the 4090. So benchmarks should be 7900 XTX vs 4080, not 4090.

Benchmarking at 4K ensures the GPU is the bottleneck. 20fps vs 29fps isn't bad on AMD's part.
Dude, are you a troll or do you not grasp what percentages are? 50% in Cyberpunk is the 4080, not the 4090. The 4090 trounces the 7900 XTX by almost 100% in that game. 20 vs 29fps is fucking awful because that's a 45% advantage for the 4080 when it's usually a bit slower in raster.
Looking at the bigger picture, the 4080 is only 13-17% better than the 7900 XTX, which IMO isn't the huge difference you guys make it out to be.
8XpPWZo.jpg
Once again you fail at basic mathematics. This aggregate includes games with lightweight RT workloads with garbage RT like Far Cry 6 or Village. Stick to games with strong RT such as Hogwarts Legacy, Control, Cyberpunk 2077, Dying Light 2, etc. The 7900 XTX routinely gets demolished by 20%+.
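(For the record, here's the percentage arithmetic being argued back and forth, using the fps figures quoted in the posts above:)

```python
fps_7900xtx, fps_4080 = 20, 29                  # the Cyberpunk RT figures being argued about
print((fps_4080 - fps_7900xtx) / fps_7900xtx)   # 0.45 -> a 9fps gap at a 20fps baseline is 45%
print((109 - 100) / 100)                        # 0.09 -> the same 9fps gap at 100fps is only 9%
```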
 

Rea

Member
Not sure it's the same SIMD that we're talking about. But for what it's worth, SIMD is Single Instruction, Multiple Data, not Simultaneous Instruction.
SIMD
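(A minimal illustration of the term, using NumPy arrays as a stand-in for SIMD lanes; GPU wavefronts are the same idea, just much wider:)

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([10.0, 20.0, 30.0, 40.0])

# Single Instruction, Multiple Data: one add is issued and every lane/element gets its own result.
print(a + b)                                     # [11. 22. 33. 44.]

# The scalar way issues the same instruction once per element instead.
print([x + y for x, y in zip(a.tolist(), b.tolist())])   # [11.0, 22.0, 33.0, 44.0]
```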
 

hyperbertha

Member
Again you are making these stupid claims and expecting a proper conversation. How can I be a fanboy when I praised HFW constantly before and called it the best looking game on console and one of the best on all platforms? I played both of them and can see with my own eyes which one is better. And again, claiming that the lighting model in HFW is better than CP2077's even before the Overdrive update just shows how delusional you are. Lighting and shadows are the biggest weak points of HFW, the RT Psycho lighting shits on HFW's lighting, and the Overdrive mode is just not even comparable.
Ay4Vbu0.jpg

PLhsI6K.jpg

3Sshul7.jpg


Just look at this

What exactly is this video supposed to prove? And are you actually claiming that vanilla Cyberpunk is a better looking game than FW? I already admitted Overdrive has better lighting.
 

Turk1993

GAFs #1 source for car graphic comparisons
You must be blind to not see the bad lighting and lighting errors.
The RT Psycho mode lighting is also miles better than the baked lighting in HFW.

hyperbertha

Member
You must be blind to not see the bad lighting and lighting errors.

Yes.

The RT Psycho mode lighting is also miles better than the baked lighting in HFW.
The video showed that the game doesn't load exterior lighting when you are too far inside the interior. That's a memory management issue, not a fault of the lighting itself.

And absolutely nobody considered Cyberpunk a graphical powerhouse in a next-gen capacity before ray tracing. The screenshots you put up don't hold a candle to Horizon. Want proof? https://www.neogaf.com/threads/which-released-game-is-the-king-of-graphics.1655552/

This thread clearly shows nobody considers Cyberpunk in the running without ray tracing. It was good looking for last gen, but it did not usher in next-gen graphics, unlike Demon's Souls and FW. It's a last-gen game while Horizon is a proper next-gen game. My original argument is that console generations are at the forefront of graphics, and that is proven by Horizon and The Order 1886.
 

Turk1993

GAFs #1 source for car graphic comparisons
The video showed that the game doesn't load exterior lighting when you are too far inside the interior. That's a memory management issue, not a fault of the lighting itself.
So, a bad lighting model with lighting errors because of baked lighting. Thanks.
And absolutely nobody considered Cyberpunk a graphical powerhouse in a next-gen capacity before ray tracing. The screenshots you put up don't hold a candle to Horizon. Want proof? https://www.neogaf.com/threads/which-released-game-is-the-king-of-graphics.1655552/

This thread clearly shows nobody considers Cyberpunk in the running without ray tracing.
The game launched with the full RT pack, do your research before you talk all that nonsense and bullshit.
It was good looking for last gen, but it did not usher in next-gen graphics, unlike Demon's Souls and FW. It's a last-gen game while Horizon is a proper next-gen game. My original argument is that console generations are at the forefront of graphics, and that is proven by Horizon and The Order 1886.
So Horizon is a proper next-gen game without any next-gen graphical features, but CP2077 with its path-traced lighting, RT reflections and RT shadows is a last-gen game, right? LMFAO.
clown-icegif-9.gif
 

yamaci17

Member
The video showed that the game doesn't load exterior lighting when you are too far inside the interior. That's a memory management issue, not a fault of the lighting itself.

And absolutely nobody considered Cyberpunk a graphical powerhouse in a next-gen capacity before ray tracing. The screenshots you put up don't hold a candle to Horizon. Want proof? https://www.neogaf.com/threads/which-released-game-is-the-king-of-graphics.1655552/

This thread clearly shows nobody considers Cyberpunk in the running without ray tracing. It was good looking for last gen, but it did not usher in next-gen graphics, unlike Demon's Souls and FW. It's a last-gen game while Horizon is a proper next-gen game. My original argument is that console generations are at the forefront of graphics, and that is proven by Horizon and The Order 1886.

There are more people with PS5s than people with GPUs capable of running Cyberpunk at 4K/DLSS Balanced + ray tracing (not necessarily path tracing).

The only reason most would pick FW over Cyberpunk is because:

- they played FW in the crisp native 4K 30 FPS mode
- they played Cyberpunk on PC at blurry dogshit 1440p

Because most wouldn't consciously choose 30-45 FPS on GPUs like the 3070, 3070 Ti or 3080 on PC to hit 4K/DLSS. Instead, most played Cyberpunk at 1440p to get high framerates (50+), and the game simply doesn't look too hot at 1440p.

Same goes for FW. In FW, performance mode looks dogshit too compared to resolution mode, nothing like what people are posting. But then again, it's a compromise of 30/40 FPS. The only problem is, one is a TPS and the other is an FPS, which makes playing at those framerates harder due to the nature of a first person game.

Cyberpunk + 4K DLSS Quality + ray tracing (no path tracing) is leaps and bounds better than HFW. The only reason someone would choose HFW over it is because they didn't see/play Cyberpunk at that config, or they simply prefer the art style of HFW more than Cyberpunk's.

As I said, the most problematic thing with Cyberpunk is that it needs 4K to shine, but the same holds true for HFW. Regardless, most games nowadays need 4K output/4K upscaling to shine; anything 1440p-based will produce problematic results. The TLOU remake at 1440p looks so blurry and abhorrent that it really doesn't feel like a PS5 exclusive game. At native 4K, however, the game brightens up and looks gorgeous.

I myself got around 40 fps at 4K DLSS Balanced + ray traced GI at High with a 3070. The game looked gorgeous. When I play at 1440p or 1440p DLSS Quality, the game loses all its sheen and crispness, and ray tracing cannot even save it at that point unless you stop completely and give the reconstruction more time to work with.

It really has to do with how people saw Cyberpunk more than actual preference. It's just that it's really a select few / rich people who get to see Cyberpunk at 4K with ray tracing. I was able to do it due to my tolerance for 40 FPS; not many will.
 

Buggy Loop

Member
A super console-centric & Sony-darling forum voted en masse for an exclusive game, while 90% of them hadn't experienced Cyberpunk 2077 RT on PC and had this as their reference on their consoles at launch:

Eo0EIsrVoAUB_O3-b88c.jpg


Clear proof that Horizon FW is better looking (?)..

christoph waltz nod GIF
 

damidu

Member
lol, not even gonna bother to figure out what this thread devolved into.

As for the original topic, I'll check it out if/when I ever get a DLSS 3 card. Looks dope.
 

hyperbertha

Member
So, a bad lighting model with lighting errors because of baked lighting. Thanks.

The game launched with the full RT pack, do your research before you talk all that nonsense and bullshit.

So Horizon is a proper next-gen game without any next-gen graphical features, but CP2077 with its path-traced lighting, RT reflections and RT shadows is a last-gen game, right? LMFAO.
clown-icegif-9.gif
Going to the effort of putting up gifs only proves you are far more triggered than me. It's also clear projection, proving you might be the real clown here. That thread I posted was about a comparison with Cyberpunk with Overdrive, not the 'ray tracing' it initially launched with. Horizon is a next-gen game because it's got a high-density polycount, well above Cyberpunk's, and a much better lighting model, compared to the last-gen game that is Cyberpunk. The original Cyberpunk on PC doesn't even hold a candle to FW, ray tracing or no.
I can post screenshots of Cyberpunk, with its 'next gen' ray tracing, looking like it's got absolutely no global illumination in many instances, and the NPCs looking like trash. FW, despite having no brute-force RT, still manages to have a lighting model that approximates realism much better than vanilla Cyberpunk.
 

Turk1993

GAFs #1 source for car graphic comparisons
Going to the effort of putting up gifs only proves you are far more triggered than me. It's also clear projection, proving you might be the real clown here. That thread I posted was about a comparison with Cyberpunk with Overdrive, not the 'ray tracing' it initially launched with. Horizon is a next-gen game because it's got a high-density polycount, well above Cyberpunk's, and a much better lighting model, compared to the last-gen game that is Cyberpunk. The original Cyberpunk on PC doesn't even hold a candle to FW, ray tracing or no.
I can post screenshots of Cyberpunk, with its 'next gen' ray tracing, looking like it's got absolutely no global illumination in many instances, and the NPCs looking like trash. FW, despite having no brute-force RT, still manages to have a lighting model that approximates realism much better than vanilla Cyberpunk.
4vm5tc.jpg
 
Lighting is realistically the only thing you have to trace.

Given that the final image in a game is almost always a bunch of textures that are lit by the ray-traced lighting, you could say that 99.9% of the pixels drawn while playing Cyberpunk's RT Overdrive mode are colored by the result of ray tracing and the denoising applied to it, which is, practically speaking, nearly the same as Quake 2.
Audio is what we really need.
 

Gaiff

SBI’s Resident Gaslighter
Going to the effort of putting up gifs only proves you are far more triggered than me. It's also clear projection, proving you might be the real clown here. That thread I posted was about a comparison with Cyberpunk with Overdrive, not the 'ray tracing' it initially launched with. Horizon is a next-gen game because it's got a high-density polycount, well above Cyberpunk's, and a much better lighting model, compared to the last-gen game that is Cyberpunk. The original Cyberpunk on PC doesn't even hold a candle to FW, ray tracing or no.
I can post screenshots of Cyberpunk, with its 'next gen' ray tracing, looking like it's got absolutely no global illumination in many instances, and the NPCs looking like trash. FW, despite having no brute-force RT, still manages to have a lighting model that approximates realism much better than vanilla Cyberpunk.
Sounds like a lot of feelings and very little substance.
 
Going to the effort of putting up gifs only proves you are far more triggered than me. It's also clear projection, proving you might be the real clown here. That thread I posted was about a comparison with Cyberpunk with Overdrive, not the 'ray tracing' it initially launched with. Horizon is a next-gen game because it's got a high-density polycount, well above Cyberpunk's, and a much better lighting model, compared to the last-gen game that is Cyberpunk. The original Cyberpunk on PC doesn't even hold a candle to FW, ray tracing or no.
I can post screenshots of Cyberpunk, with its 'next gen' ray tracing, looking like it's got absolutely no global illumination in many instances, and the NPCs looking like trash. FW, despite having no brute-force RT, still manages to have a lighting model that approximates realism much better than vanilla Cyberpunk.
We all just got a little dumber reading this.....
 

Goalus

Member
Going to the effort of putting up gifs only proves you are far more triggered than me. It's also clear projection, proving you might be the real clown here. That thread I posted was about a comparison with Cyberpunk with Overdrive, not the 'ray tracing' it initially launched with. Horizon is a next-gen game because it's got a high-density polycount, well above Cyberpunk's, and a much better lighting model, compared to the last-gen game that is Cyberpunk. The original Cyberpunk on PC doesn't even hold a candle to FW, ray tracing or no.
I can post screenshots of Cyberpunk, with its 'next gen' ray tracing, looking like it's got absolutely no global illumination in many instances, and the NPCs looking like trash. FW, despite having no brute-force RT, still manages to have a lighting model that approximates realism much better than vanilla Cyberpunk.
What drugs are you on?
 