
Digital Foundry The PS5 GPU in PC Form? Radeon RX 6700 In-Depth - Console Equivalent PC Performance?

yamaci17

Member
I said 4060 as an example; it could just as easily be a 4060 Ti 16GB, which I expect would be good for the rest of this gen at 1080p or better, even when paired with a 5-year-old CPU and PCIe 3.0. I just think it's silly calling them antiques; it isn't the early 2000s any more, where a five-year-old CPU quickly became obsolete. Ideally a newer CPU is better, but swapping a motherboard and CPU is a big hassle (and expense) compared to just the GPU, which takes 30 seconds to swap. The last PC I built was in 2020 and I don't intend to do it again until my 10th-gen i5 can no longer cut it. I don't foresee that happening for many years, and I will upgrade the GPU again before then.
I specifically call the 3600 and the like antique because they have the split-CCX structure. I don't call CPUs like the 8700K, 9400F or 10400F that came out in that same period antique. AMD quickly pivoted away from the split-CCX structure, at least for 6-8 core gaming CPUs, and kept using multiple CCDs only for 8+ core CPUs.

In my observations, the 3+3 core clusters cause very random stutters and weird 1% low behavior in some of the new games that truly stress CPUs.
 

Elysium44

Banned
I specifically call the 3600 and the like antique because they have the split-CCX structure. I don't call CPUs like the 8700K, 9400F or 10400F that came out in that same period antique. AMD quickly pivoted away from the split-CCX structure, at least for 6-8 core gaming CPUs, and kept using multiple CCDs only for 8+ core CPUs.

In my observations, the 3+3 core clusters cause very random stutters and weird 1% low behavior in some of the new games that truly stress CPUs.

I see, fair enough.
 

SlimySnake

Flashless at the Golden Globes
What are you talking about?

The scene you showed was in the 2160p mode and there was a 30% difference; you said "maybe it's the CPU". I showed you a scene from the same place but running at 2700p internal and it's still a 30% difference. It was running at over 100 FPS at the lower resolution, so it can't be CPU limited...

s6bCmmf.jpg
Y7534Ou.jpg


84hos3.gif
So you have two identical GPUs with the same CUs and TFLOPs, one with less bandwidth, and the one with less bandwidth gets you an extra 20 fps, and you think the game is GPU bound?


But neither of those games is anywhere near CPU limited? Hitman 3 does 100+ fps (80 fps minimums) while MHR does almost 200 fps (160 fps minimums), both with a 3600. Hitman can even do 70 fps with an 1800X.
All this shows is that the PS5 is being bottlenecked by the CPU here. Richard's own tests showed the PS5 performing like those Zen 1 CPUs, so Hitman performing like an 1800X on the PS5 makes perfect sense.

Again: you have what is essentially the same GPU running the game with roughly 47% more frames than the PS5, 55 vs 81 fps. Why? If it's not the CPU, then what is it?
 

Bojji

Member
So you have two identical GPUs with the same CUs and TFLOPs, one with less bandwidth, and the one with less bandwidth gets you an extra 20 fps, and you think the game is GPU bound?

There is no possibility that this is CPU bound when you can replicate the same performance delta at an obscenely high resolution - 2700p.

If the game were CPU bound, it would only show a difference like that at 2160p, not at 2700p.

The difference here can come from things like:

- an average/bad port of the game on PS5 (there have been many bad console ports that performed worse than the PC parts they match on paper)
- Infinity Cache making the difference in this title

But it can't come from a CPU bottleneck. I think many of you don't even know what a CPU limit looks like.
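(A minimal sketch of the reasoning being used here, with made-up numbers: the fps figures, the ~30% gap and the linear pixel-count scaling below are illustrative assumptions, not data from the video. The model treats delivered fps as the lower of a resolution-independent CPU cap and a GPU rate that falls with pixel count; a gap that survives the jump from 2160p to 2700p points away from the CPU cap.)

```python
# Toy model: delivered fps is capped by whichever of the CPU or GPU is slower.
# Every number here is an illustrative assumption, not a measurement from the video.

def pixels(width: int, height: int) -> int:
    return width * height

def estimated_fps(cpu_cap_fps: float, gpu_fps_at_2160p: float, width: int, height: int) -> float:
    """Assume GPU throughput scales inversely with pixel count relative to 2160p."""
    gpu_fps = gpu_fps_at_2160p * pixels(3840, 2160) / pixels(width, height)
    return min(cpu_cap_fps, gpu_fps)

# Two hypothetical GPUs, one ~30% faster, both sitting under the same CPU cap.
for width, height, label in [(3840, 2160, "2160p"), (4800, 2700, "2700p")]:
    slower = estimated_fps(cpu_cap_fps=120, gpu_fps_at_2160p=60, width=width, height=height)
    faster = estimated_fps(cpu_cap_fps=120, gpu_fps_at_2160p=78, width=width, height=height)
    print(f"{label}: {slower:.0f} vs {faster:.0f} fps -> gap {faster / slower - 1:.0%}")

# With both GPUs below the CPU cap, the ~30% gap persists at both resolutions
# (GPU bound). If the faster card were hitting the CPU cap at 2160p, the gap
# would shrink there and only reappear at the higher resolution.
```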
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
So you have two identical GPUs with the same CUs and TFLOPs, one with less bandwidth, and the one with less bandwidth gets you an extra 20 fps, and you think the game is GPU bound?
TFLOPs aren't the same.

The PS5 peaks at 10.28 TFLOPs since it's clocked at "up to" 2.23 GHz, and it also lacks Infinity Cache. I'd imagine the PS5's clocks are dropping here since the GPU is being taxed... but we have no way of knowing the exact PS5 GPU clock when it is being pushed like this...

The 6700 has a 2.45 GHz boost clock (11.29 TFLOPs) and has the Infinity Cache the PS5 lacks.
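(For anyone checking the arithmetic, a quick sketch using the usual peak-FP32 formula for RDNA2-class parts, CUs × 64 shaders × 2 FLOPs per clock × clock speed, with the CU counts and clocks quoted above:)

```python
# Peak FP32 = CUs * 64 shaders per CU * 2 FLOPs per shader per clock * clock (GHz)
def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"PS5     (36 CUs @ 2.23 GHz): {peak_tflops(36, 2.23):.2f} TFLOPs")  # ~10.28
print(f"RX 6700 (36 CUs @ 2.45 GHz): {peak_tflops(36, 2.45):.2f} TFLOPs")  # ~11.29
print(f"Clock/compute advantage: {2.45 / 2.23 - 1:.1%}")                   # ~9.9%
```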
 

SlimySnake

Flashless at the Golden Globes
TFLOPs aren't the same.

The PS5 peaks at 10.28 TFLOPs since it's clocked at "up to" 2.23 GHz, and it also lacks Infinity Cache. I'd imagine the PS5's clocks are dropping here since the GPU is being taxed... but we have no way of knowing the exact PS5 GPU clock when it is being pushed like this...

The 6700 has a 2.45 GHz boost clock (11.29 TFLOPs) and has the Infinity Cache the PS5 lacks.
OK, so that's 10% better. That would explain the 10% better results in Avatar's and Alan Wake's 60 fps modes.

The ones that are 40-50% better, though? Come on. If it were Infinity Cache magic or PS5 clocks dropping, then it would've impacted the 30 fps benchmarks, where the PS5 consistently leads the 6700.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
OK, so that's 10% better. That would explain the 10% better results in Avatar's and Alan Wake's 60 fps modes.

The ones that are 40-50% better, though? Come on. If it were Infinity Cache magic or PS5 clocks dropping, then it would've impacted the 30 fps benchmarks, where the PS5 consistently leads the 6700.
It could be a memory latency issue. Consoles use GDDR6 for system memory, which is bad in some situations; you can't do that on PC unless you use the 4800S (which no one uses, and which would also limit the PC GPU to 1/4 bandwidth, 4.0 x4).

Problem is, you can never get the perfect console vs. PC comparison, since there is no way to build a PC that is exactly the same as a console.

The point of the video is how PS5-tier GPUs perform on PC, without the limitations of the console.

It's useful info for someone buying a PC GPU and wanting to target at least PS5-level performance. It's good to see PS5-level GPUs down at $280-$299.

Richard did not attempt to build a PS5-spec'd PC, as you and some others were hoping for. You just can't do that. The closest you could get is the 4800S, which again would limit those GPUs to 1/4th bandwidth.
 

Panajev2001a

GAF's Pleasant Genius
It could be a memory latency issue. Consoles use GDDR6 for system memory, which is bad in some situations; you can't do that on PC unless you use the 4800S (which no one uses, and which would also limit the PC GPU to 1/4 bandwidth, 4.0 x4).

Problem is, you can never get the perfect console vs. PC comparison, since there is no way to build a PC that is exactly the same as a console.

The point of the video is how PS5-tier GPUs perform on PC, without the limitations of the console.

It's useful info for someone buying a PC GPU and wanting to target at least PS5-level performance. It's good to see PS5-level GPUs down at $280-$299.

Richard did not attempt to build a PS5-spec'd PC, as you and some others were hoping for. You just can't do that. The closest you could get is the 4800S, which again would limit those GPUs to 1/4th bandwidth.
Yeah, for the user that buys everything top of the line and then reserves $299 for the GPU for a gaming PC :D
 

winjer

Gold Member
There is no possibility that this is CPU bound when you can replicate the same performance delta at an obscenely high resolution - 2700p.

If the game were CPU bound, it would only show a difference like that at 2160p, not at 2700p.

The difference here can come from things like:

- an average/bad port of the game on PS5 (there have been many bad console ports that performed worse than the PC parts they match on paper)
- Infinity Cache making the difference in this title

But it can't come from a CPU bottleneck. I think many of you don't even know what a CPU limit looks like.

In that case, it is a CPU bottleneck.
That game is very basic graphically, so these GPUs are not being stressed enough.
The Infinity Cache helps with memory bandwidth, but remember that the 6700 only has 320 GB/s, while the PS5 has 448 GB/s.
On the other hand, the Zen 2 in the PS5 has several limitations: much less L3 cache, much higher latency due to the GDDR6 controller, and lower clocks.
A fairer comparison would be to use the 3600. DF has been touting the 3600 as the closest CPU to these consoles' CPUs for 3 years, and suddenly they decide to use a 13900K. This is idiotic.
Yes, in some games it makes no difference, as they are GPU bound. But in others it makes a huge difference.
It will also make a difference for the minimum fps, as the 13900K will be able to brute-force situations that a Zen 2 CPU just can't.
Although he would have to fix whatever problem his 3600 setup has with latency, as that will skew results.
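(For reference, those bandwidth figures follow from the memory configurations commonly quoted for the two parts; a minimal sketch assuming a 160-bit bus with 16 Gbps GDDR6 on the RX 6700 and a 256-bit bus with 14 Gbps GDDR6 on the PS5:)

```python
# GDDR6 bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps
def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"RX 6700 (160-bit @ 16 Gbps): {bandwidth_gb_per_s(160, 16):.0f} GB/s")  # 320
print(f"PS5     (256-bit @ 14 Gbps): {bandwidth_gb_per_s(256, 14):.0f} GB/s")  # 448
```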

And another thing: DF could very easily have downclocked the 6700 to make it closer to the PS5 GPU.
This is something Richard has done several times before with other comparisons.
I don't know why he didn't do the same now. It's like he is trying to pile on the errors in his reviews.
 

Bojji

Member
In that case, it is a CPU bottleneck.
That game is very basic graphically, so these GPUs are not being stressed enough.
The Infinity Cache helps with memory bandwidth, but remember that the 6700 only has 320 GB/s, while the PS5 has 448 GB/s.
On the other hand, the Zen 2 in the PS5 has several limitations: much less L3 cache, much higher latency due to the GDDR6 controller, and lower clocks.
A fairer comparison would be to use the 3600. DF has been touting the 3600 as the closest CPU to these consoles' CPUs for 3 years, and suddenly they decide to use a 13900K. This is idiotic.
Yes, in some games it makes no difference, as they are GPU bound. But in others it makes a huge difference.
It will also make a difference for the minimum fps, as the 13900K will be able to brute-force situations that a Zen 2 CPU just can't.
Although he would have to fix whatever problem his 3600 setup has with latency, as that will skew results.

And another thing: DF could very easily have downclocked the 6700 to make it closer to the PS5 GPU.
This is something Richard has done several times before with other comparisons.
I don't know why he didn't do the same now. It's like he is trying to pile on the errors in his reviews.

It can't be a CPU bottleneck when the resolution changes and the difference between the GPUs stays the same, end of story.

The only game I'm not 100% sure is GPU limited in this comparison is Hitman, but in the end it probably is:

Ic0xoTx.jpg
 

winjer

Gold Member
It can't be a CPU bottleneck when the resolution changes and the difference between the GPUs stays the same, end of story.

I admit I hadn't noticed they tested at both 2160p and 2700p.
You are right, this can't be a CPU issue.
On the other hand, the difference in pixel count between 2160p and 2700p is about 56%.
When changing resolution, both the PS5 and the 6700 are scaling at around 50%, though there is some variation in aligning each run.
It could just be a bad port on the PS5.
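(Putting numbers on that, assuming both test modes are 16:9, i.e. 3840×2160 and 4800×2700:)

```python
# Pixel counts of the two test resolutions, assuming 16:9 in both cases.
px_2160p = 3840 * 2160   # 8,294,400
px_2700p = 4800 * 2700   # 12,960,000

print(f"2700p vs 2160p: {px_2700p / px_2160p:.4f}x the pixels")  # 1.5625x, i.e. ~56% more
# In a purely GPU-bound scene the per-frame cost rises by that same ~56% on both
# cards, which is why the relative gap between the two GPUs survives the change.
```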
 

Bojji

Member
I admit I hadn't noticed they tested at both 2160p and 2700p.
You are right, this can't be a CPU issue.
On the other hand, the difference in pixel count between 2160p and 2700p is about 56%.
When changing resolution, both the PS5 and the 6700 are scaling at around 50%, though there is some variation in aligning each run.
It could just be a bad port on the PS5.

I think it's a bad console port or some extreme case of Infinity Cache actually making a big difference.

Sometimes people think console games are always optimized, but this isn't the case; there have been many games that performed worse than they should have in theory.
 

winjer

Gold Member
I think it's a bad console port or some extreme case of Infinity Cache actually making a big difference.

Sometimes people think console games are always optimized, but this isn't the case; there have been many games that performed worse than they should have in theory.

If it were a matter of the Infinity Cache, we would see greater losses on the PS5 when changing resolution.
But they both scale at around 50%, the 6700 slightly less, probably because it has 10% higher clocks.
It's probably just a bad port.
 

Gaiff

SBI’s Resident Gaslighter
MHR is not CPU limited and you have proof above your post ^
Then I stand corrected for MHR. I was certain it had similar fps at 2700p.

My problem is really only Hitman and I would have liked him to try a different CPU just to put his theories to the test.

MHR’s problem probably isn’t GPU either. There might be a bottleneck somewhere else but it likely isn’t the CPU.

I have no problem with most other tests.
 

Gaiff

SBI’s Resident Gaslighter
And another thing: DF could very easily have downclocked the 6700 to make it closer to the PS5 GPU.
This is something Richard has done several times before with other comparisons.
I don't know why he didn't do the same now. It's like he is trying to pile on the errors in his reviews.
Because the goal of the test is to compare the 6700 as-is to the PS5's GPU. The video is the 6700 vs the PS5. Why would he downclock the 6700 to the PS5's level? It would no longer be a 6700.

If he were trying to see how the PS5 GPU performs in a PC-like environment, then yes, downclocking the 6700 would have made sense, but this isn't the case. It's a straight comparison: higher clocks plus Infinity Cache vs. higher bandwidth.
 

winjer

Gold Member
Because the goal of the test is to compare the 6700 as-is to the PS5's GPU. The video is the 6700 vs the PS5. Why would he downclock the 6700 to the PS5's level? It would no longer be a 6700.

If he were trying to see how the PS5 GPU performs in a PC-like environment, then yes, downclocking the 6700 would have made sense, but this isn't the case. It's a straight comparison: higher clocks plus Infinity Cache vs. higher bandwidth.

To make it a closer, fairer comparison.
There are things we can't control, such as L3 cache and bandwidth.
But clock speeds are very easy to match.
 

Gaiff

SBI’s Resident Gaslighter
To make it a closer, fairer comparison.
There are things we can't control, such as L3 cache and bandwidth.
But clock speeds are very easy to match.
The point isn't to make a fair comparison, though, just like it wasn't in the 4080 vs PS5 bit. It's really just: this desktop GPU has specs close to the PS5, so how do they match up in games? That's why he used a 13900K to remove CPU bottlenecks on the 6700, unlike his other video where he used a 4800S desktop kit with a downclocked 6700. The goal in that older video was to try and reproduce console specs on PC. This one is different. He also throws in a few other mid-rangers for good measure to see how they compare.
 

winjer

Gold Member
The point isn't to make a fair comparison, though, just like it wasn't in the 4080 vs PS5 bit. It's really just: this desktop GPU has specs close to the PS5, so how do they match up in games? That's why he used a 13900K to remove CPU bottlenecks on the 6700, unlike his other video where he used a 4800S desktop kit with a downclocked 6700. The goal in that older video was to try and reproduce console specs on PC. This one is different.

The reason DF picked the 6700 for this test is that it's the closest PC GPU to the PS5's. It is not some random comparison.
It's a comparison based purely on specs. And clock speed is one such spec that should have been taken into consideration.
And remember that DF has adjusted clock speeds in previous comparisons between console and PC GPUs.
 

Gaiff

SBI’s Resident Gaslighter
The reason DF picked the 6700 for this test is that it's the closest PC GPU to the PS5's. It is not some random comparison.
Yes and Rich said as much.
It's a comparison based purely on specs. And clock speed is one such spec that should have been taken into consideration.
Because it's closest to the PS5...the goal isn't to reproduce PS5's specs on PC. It's to know how this $330 mid-ranger that's closest to the PS5 compares to it out of the box.
And remember that DF has adjusted clock speeds in previous comparisons between console and PC GPUs.
Yes, and they also used a 4800S and not a 13900K as I duly pointed out. Because the goal at that time was specifically to reproduce a console-like environment as closely as possible. This isn't the objective of this video. That's why he threw a few NVIDIA GPUs into the mix as well. In the other video, he tried to reproduce PS5's limitations on PC as much as he could. In this one, he doesn't because they have different goals.

You have to view this like any other GPU comparison that they did. 4080, 4070, 2070S, etc. He didn't downclock the 6700 because that wouldn't represent how it behaves in the real world.
 

winjer

Gold Member
Because it's closest to the PS5...the goal isn't to reproduce PS5's specs on PC. It's to know how this $330 mid-ranger that's closest to the PS5 compares to it out of the box.

Yes, and they also used a 4800S and not a 13900K as I duly pointed out. Because the goal at that time was specifically to reproduce a console-like environment as closely as possible. This isn't the objective of this video. That's why he threw a few NVIDIA GPUs into the mix as well. In the other video, he tried to reproduce PS5's limitations on PC as much as he could. In this one, he doesn't because they have different goals.

You have to view this like any other GPU comparison that they did. 4080, 4070, 2070S, etc. He didn't downclock the 6700 because that wouldn't represent how it behaves in the real world.

Once again we get into nonsense comparisons with DF, something they didn't use to do in the PS4 era.
Back then they would try to match either specs or price when doing these comparisons.
Now they don't match specs, clock speeds, or even price. So what is the point?
Seriously, that 13900K alone costs more than the PS5, not to mention the rest of the system.
And the comparisons with the 4070 and 4080 were downright idiotic.
This is just screwing around with no concrete goal.
 

yamaci17

Member
So explain to me what difference the 13900K makes in these scenes:

t4wcXhW.jpg
QAdy6gi.jpg
CjGe270.jpg
I8N6Dpn.jpg
d2dH9en.jpg
HQ16syO.jpg
Ii7337t.jpg
It's due to the CPU when it fits their narrative;
it's not due to the CPU when it doesn't fit their narrative.

:)

I'm quite disgusted by some takes in this thread that are being parroted the very same way they were in the 4070/PS5 comparison threads.

The ignore list is piling up. Not surprising I wasn't able to see this thread at all and had to find it through a Google search, since I blocked the OP (I was like, how the hell has a topic not been opened for this? Then I realized it was opened by a user I blocked). Not surprising that person would be the one to rush to open this thread to talk garbage about it with his cohort.
 

Gaiff

SBI’s Resident Gaslighter
It's due to the CPU when it fits their narrative;
it's not due to the CPU when it doesn't fit their narrative.

:)

I'm quite disgusted by some takes in this thread that are being parroted the very same way they were in the 4070/PS5 comparison threads.

The ignore list is piling up. Not surprising I wasn't able to see this thread at all and had to find it through a Google search, since I blocked the OP (I was like, how the hell has a topic not been opened for this? Then I realized it was opened by a user I blocked). Not surprising that person would be the one to rush to open this thread to talk garbage about it with his cohort.
You need help. Don't cry like a baby when you're wrong like last time and then smugly accuse me of opening a thread to "talk garbage" with my cohorts, whoever they may be.

What a pathetic manchild. Address someone directly if you’re going to shit-talk them.
 

Gaiff

SBI’s Resident Gaslighter
Once again we get into nonsense comparisons with DF, something they didn't use to do in the PS4 era.
Back then they would try to match either specs or price when doing these comparisons.
Now they don't match specs, clock speeds, or even price. So what is the point?
Seriously, that 13900K alone costs more than the PS5, not to mention the rest of the system.
And the comparisons with the 4070 and 4080 were downright idiotic.
This is just screwing around with no concrete goal.
Eh, I think it's fine. The 2070S is constantly used in comparisons to the consoles. Now, the 6700 might be the better candidate overall, and certainly the best from the AMD camp.

Still, DF tends to select the most popular GPUs and it doesn't seem like the 6700 (or any RDNA2 GPU, really) is all that popular.

I would have liked to see more tests with the 4800S desktop kit and the downclocked 6700 though.
 

SlimySnake

Flashless at the Golden Globes
It's due to the CPU when it fits their narrative;
it's not due to the CPU when it doesn't fit their narrative.

:)

I'm quite disgusted by some takes in this thread that are being parroted the very same way they were in the 4070/PS5 comparison threads.

The ignore list is piling up. Not surprising I wasn't able to see this thread at all and had to find it through a Google search, since I blocked the OP (I was like, how the hell has a topic not been opened for this? Then I realized it was opened by a user I blocked). Not surprising that person would be the one to rush to open this thread to talk garbage about it with his cohort.
Eh, this kind of metacommentary is not conducive to any good discussion. It's no different from accusing each other of console warring. I thought we were above this.

Besides, you of all people should know that those Zen 1 and Zen 2 Ryzens are complete trash. You play most of your PC games at 30-40 fps. You and I have been in dozens of these threads where I'm saying my 3080 is giving me a locked 60 fps while others with Zen 2 CPUs complain about unoptimized ports instead of simply realizing that the low-clocked Zen 2 CPUs were not a good investment in the long run.

Game after game after game, my i7-11700K has beaten its Zen 2 counterparts. The PS5 CPU is even worse and acts more like a Zen 1 CPU, similar to your 2700X and, according to Rich's own analysis, much worse than even that 2700X. I had people asking me how I was running Cyberpunk path tracing at 60 fps even at 720p internal resolutions.

This thread is ignoring years of data we have on console 60 fps modes struggling to hit 60 fps and requiring severe downgrades to their resolutions. Those are clear indicators of CPU bottlenecks.
 

Audiophile

Member
No they are not. I've been begging them to do this.

It's a nerdy discussion about graphics, console hardware vs equivalent PC specs. It's literally why Digital Foundry exists.

Now, if the New York Times was doing this analysis, I'd agree it would be pointless.
I like it. It's as close to an ideal test as we're gonna get to show the real-world, practical performance differences between similar software on a console with a fixed spec, a somewhat low-level API and the integration of an APU, versus the open, varying spec of a PC with a higher-level API and discrete parts.
 

SlimySnake

Flashless at the Golden Globes
Bad optimization? Console games can suffer from that as well.
So only the 60 fps modes of these games are poorly optimized? The 30 fps modes aren't? How does that work? Do they run the game on two different engines?

So explain to me what difference the 13900K makes in these scenes:

t4wcXhW.jpg
CjGe270.jpg
d2dH9en.jpg
Ii7337t.jpg
It doesn't in these scenes.

So explain to me what difference the 13900K makes in these scenes:

QAdy6gi.jpg
I8N6Dpn.jpg
HQ16syO.jpg
It does in the scenes above.

I just don't get why you guys choose to ignore three years' worth of poor 60 fps modes on consoles. Just last week, Helldivers was the perfect example of this: it runs at 1800p in the 30 fps mode and only 1080p in the 60 fps mode. That's 5.8 vs 2.1 million pixels. Why was the GPU not a bottleneck at the higher resolution and only a bottleneck at roughly a third of the resolution?

Why does Spider-Man 2 only run at 40 fps when you cut the resolution by more than half to 1440p? It should be at 60 fps with room to spare, but you unlock the framerate and it stays around 40 fps. They had to settle for 1080p in the 60 fps mode.

Guardians of the Galaxy, same thing: a rock-solid native 4K 30 fps mode, a trash 60 fps performance mode with WORSE settings. Starfield never got a 60 fps mode. Star Wars had to ditch RT altogether. Avatar, FF16 and Alan Wake 2 drop all the way down to 720p despite running their fidelity modes at around 1440p. So 3.7 million pixels vs 900K pixels, and that's AFTER massive downgrades to visual settings. Avatar actually goes up to 1800p, while AW2 stays around 1296p, which is around 2.8 million pixels.
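(A back-of-envelope sketch of the reasoning in this post, under the simplifying assumption that a purely GPU-bound game scales linearly with pixel count; the Helldivers mode resolutions are the ones quoted above, and real scaling is never this clean:)

```python
# If the GPU alone set the frame rate, going from 30 to 60 fps should only need
# roughly half the pixels. Shipping far below that budget hints that something
# else (CPU, memory, etc.) is also constraining the 60 fps mode.

def pixels_16_9(vertical: int) -> int:
    return (vertical * 16 // 9) * vertical

def gpu_only_60fps_pixel_budget(res_30fps_mode: int) -> float:
    """Pixel budget for 60 fps if performance scaled purely with pixel count."""
    return pixels_16_9(res_30fps_mode) / 2

# Helldivers figures quoted above: 1800p in the 30 fps mode vs 1080p in the 60 fps mode.
budget = gpu_only_60fps_pixel_budget(1800)   # ~2.9M pixels expected
shipped = pixels_16_9(1080)                  # ~2.1M pixels actually used
print(f"GPU-only 60 fps estimate: {budget / 1e6:.1f}M px, shipped: {shipped / 1e6:.1f}M px")
```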
 

SlimySnake

Flashless at the Golden Globes
CPUs have nothing to do with resolution.
They don't, but the GPU has to work overtime, which is why they have to downgrade the resolution further than they otherwise would.

If Uncharted 4 could run at 1080p 30 fps on a PS4 GPU, then it should have been able to run at 720p 60 fps, but the modders had to drop the resolution all the way down to 540p because the Jaguar CPUs still couldn't run the game at 720p 60 fps. That's a quarter of the 1080p resolution. We are seeing the same thing this gen, with all these native 4K 30 fps modes dropping resolution all the way down to 1080p because the CPUs can't run them at 1440p, forcing the GPU to work overtime, which means reducing the GPU load by cutting resolution.
 

SlimySnake

Flashless at the Golden Globes
Eh, I think it's fine. The 2070S is constantly used in comparisons to the consoles. Now, the 6700 might be the better candidate overall, and certainly the best from the AMD camp.

Still, DF tends to select the most popular GPUs and it doesn't seem like the 6700 (or any RDNA2 GPU, really) is all that popular.

I would have liked to see more tests with the 4800S desktop kit and the downclocked 6700 though.
He chose the 6700 precisely because its specs are so close to the PS5's. It has the same CU count. If his goal was to get as close to PS5-level performance as possible, he would've chosen the 10.6 TFLOPs 6600 XT instead of the 11.3 TFLOPs 6700, but he didn't, because the 6600 XT has 32 CUs.

The 6600 XT is also way more popular in comparison, so the main reason he chose this GPU was its CU count matching the PS5's. And then he threw all that away by choosing an insane CPU and deciding not to limit clocks. Well then, why choose this obscure GPU in the first place?
 

SlimySnake

Flashless at the Golden Globes
Here's me running Cyberpunk at 720p internal resolution using DLSS Performance at 1440p.

Notice the GPU usage at 87%, and yet the game is at 51 fps. Why? Clearly, the GPU isn't maxed out. You guys seem to think that CPUs can't be bottlenecked in the 50s and that CPU limits only come into play at 100+ fps. I can find you similar bottlenecks in Starfield and Star Wars. Hell, Avatar has a built-in CPU benchmark that shows exactly how much the CPU is taxed during combat encounters. I could run the other two benchmarks at a locked 4K 60 fps using DLSS Quality, but the CPU benchmark was consistently in the 50s. And this CPU shits on most Zen 2 and Zen 3 CPUs in its class. The PS5 CPU is way worse than those Zen 2 and Zen 3 CPUs and is holding back the GPU.

3z0MiNb.jpg
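(A rough illustration of how to read that kind of overlay; the sample values and the 95% utilisation threshold are arbitrary assumptions for the sketch, not data from the screenshot or from any particular monitoring tool:)

```python
# Heuristic: frames where GPU utilisation is well below 100% while the frame rate
# is also below target are usually limited by something other than the GPU
# (CPU, engine, streaming). Threshold and samples are illustrative only.

samples = [
    (51, 87),   # e.g. the Cyberpunk scene above: GPU not saturated, fps well under 60
    (60, 99),
    (58, 97),
    (47, 82),
]

def likely_cpu_or_engine_bound(fps: float, gpu_util_pct: float,
                               target_fps: float = 60.0, util_threshold: float = 95.0) -> bool:
    return fps < target_fps and gpu_util_pct < util_threshold

for fps, util in samples:
    verdict = "likely CPU/engine bound" if likely_cpu_or_engine_bound(fps, util) else "GPU bound or on target"
    print(f"{fps:>3} fps @ {util:>3}% GPU -> {verdict}")
```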
 

winjer

Gold Member
Eh, I think it's fine. The 2070S is constantly used in comparisons to the consoles. Now, the 6700 might be the better candidate overall, and certainly the best from the AMD camp.

Still, DF tends to select the most popular GPUs and it doesn't seem like the 6700 (or any RDNA2 GPU, really) is all that popular.

I would have liked to see more tests with the 4800S desktop kit and the downclocked 6700 though.

The 2070S kinda makes sense, as it is one of the closest GPUs from Nvidia to the PS5, at least in rasterization.
The 6700 is easily the closest GPU, but downclocking it to 2.2 GHz makes it more accurate.
Of course, there are many differences between a PC and consoles, but it's still possible to make a comparison with the right hardware setup.

To make a proper comparison between products there must be common ground. Usually it's a price point, though an argument can be made for other things, such as compute power.
What doesn't make any sense is a comparison between a $400 console and a $2000 or $3000 PC, much less when that PC uses hardware from a different company and from a whole different generation.
There isn't even a case for comparable performance, since a 4070 or 4080 with a 13900K will easily outclass a PS5 in every metric.
Such comparisons denote a lack of professionalism from Digital Foundry.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
He chose the 6700 precisely because its specs are so close to the PS5's. It has the same CU count. If his goal was to get as close to PS5-level performance as possible, he would've chosen the 10.6 TFLOPs 6600 XT, but he didn't, because the 6600 XT has 32 CUs. The 6600 XT is also way more popular in comparison, so the main reason he chose this GPU was its CU count matching the PS5's. And then he threw all that away by choosing an insane CPU and deciding not to limit clocks. Well then, why choose this obscure GPU in the first place?
Because it's an actual product you can buy, and it has the same CU count as the PS5. No one is buying a 6700 just to turn it down to match PS5 clocks, just like no one is importing a Chinese OEM kit to get high-latency GDDR6 system memory.

There is nothing wrong with the video; it serves its purpose of showing how PS5-tier GPUs perform on an unrestrained PC, the way typical PC GPU benchmarks are done...

He didn't make the video you wanted him to make, suck it.

Even if he had gone for a closer match on CPU, you'd still be complaining: "why'd he use a 3600, it has more cache", "why'd he use the 4800S, it has higher clocks" (ignoring the 4800S's 1/4 PCIe bandwidth).
 

SlimySnake

Flashless at the Golden Globes
Because it's an actual product you can buy, and it has the same CU count as the PS5. No one is buying a 6700 just to turn it down to match PS5 clocks, just like no one is importing a Chinese OEM kit to get high-latency GDDR6 system memory.
First of all, no one is buying a 6700. And secondly, like I said, the 6600 XT is a closer comparison.
 