
Help me, PCMR, you're my only hope!

Nickolaidas

Member
Okay, so I am thinking of upgrading my 4060Ti 8GB to a 4070 Ti Super 16GB (double the TFLOPs, double the VRAM).
Problem is I have an i7-9700K CPU, which causes a big bottleneck with that GPU (30% at 4K, 44% at 1440p) according to the Bottleneck Calculator website my big brother frequents. ( https://pc-builds.com/bottleneck-calculator/ )

Right now I can't change my CPU, because I'd also have to upgrade my Aorus Z390 motherboard (as well as its memory), which effectively takes my $1,000 upgrade to $2,200 (basically a new PC at that point).

So I am wondering how much of a bottleneck I'd get if I use, say, Quality DLSS in a game.

In such a case, by taking 1440p image quality and upscaling it to 4K (which is what Quality DLSS does, right?), is the bottleneck going to be 30% or 44%? Note that I never push the framerate higher than 60fps, due to my TV's limitations.

And extra question: is there a middle ground I can reach? A better CPU than my i7-9700K which is also compatible with my Z390 Aorus motherboard?


EDIT: Some people on the nvidia forums tell me that the bottleneck numbers the website gives me are bogus, and at 4K and 60fps (max), the game is almost entirely GPU bound and the CPU basically doesn't bottleneck the game. Is that true?
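
If I have the DLSS scale factors right, the internal render resolutions would work out roughly like this (please correct me if these numbers are off):

Code:
# My understanding of the DLSS render scales at a 4K output target.
# The scale factors are as I understand them - treat them as approximate.
target_w, target_h = 3840, 2160
scale = {
    "Quality": 2 / 3,            # ~2560x1440 internal
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}
for mode, s in scale.items():
    print(f"DLSS {mode}: renders ~{round(target_w * s)}x{round(target_h * s)}, outputs {target_w}x{target_h}")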
 

RaZoR No1

Member
IMO, first of all, the upgrade path you have planned is not really worth it.
Either wait for the next generation of Nvidia GPUs, or only do it if you find a really cheap way to go from the 4060 Ti to a 4070.
If you are not in a hurry, wait.
Currently the best gaming CPUs are the AMD X3D ones,
but for them you need a new RAM/motherboard/CPU combo.

The most important thing here is: what games are you playing, and at what settings and resolution?
4K is mostly GPU bound.
A 4070 will not run 4K at 60fps without any help from DLSS.
Try fiddling around and maybe play your games at 1440p. Most people can't really see a big difference from 4K, and it gives your GPU an easier time.

Have you tried OCing your CPU and maybe disabling Hyper-Threading?
You could get a few more FPS, and disabling HT could lower the temps (8 cores are enough).

Personally I would stick to the current setup and lower the settings a bit more.
 

Rivdoric

Member
To put it simply:

- Your CPU load will always be pretty much the same regardless of image quality. (Draw distance and other settings that increase the number of "smart" things like NPCs etc. are the only ones that meaningfully raise or lower it.)
- Your GPU load, on the other hand, depends on a lot more variables: resolution, rendering quality, and so on.

So using that logic: removing a "CPU bottleneck" means taking the max fps your CPU can push in a game and trying to balance your GPU load so it matches.
Your CPU is at 100% and can't push more than 50 fps in that game while your GPU is stuck at 30-40% load? Push resolution beyond your native one using DSR/DLDSR, or push the detail of the scene.

You will always encounter a "bottleneck" in the end.

Your 9700K is perfectly fine for 50-60 fps overall. It would only be a big bottleneck if you were targeting 100Hz or more.
Whether you should upgrade from the 4060 Ti to a 4070 Ti is another debate. New GPUs are going to be released soon, so I wouldn't buy now. Your 4060 Ti is fine, really.
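
If it helps, here is the same logic with made-up numbers (all the fps caps below are purely illustrative, for one hypothetical game):

Code:
# Whichever side caps out first sets your framerate.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """You always end up limited by the slower of the two."""
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_cap = 90.0    # max fps the CPU can feed (mostly resolution-independent)
gpu_caps = {      # max fps the GPU can render at each setting (made up)
    "1440p native": 120.0,
    "4K + DLSS Quality (1440p internal)": 75.0,
    "4K native": 48.0,
}
for setting, gpu_cap in gpu_caps.items():
    limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{setting}: ~{effective_fps(cpu_cap, gpu_cap):.0f} fps ({limiter} limited)")
# With a 60 fps cap on top, anything above 60 stops mattering at all.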
 
Yeah, that site has a 7800X3D bottlenecking a 4070, lol. OTOH, you'll get some bottlenecking at 1440p if you go above 60fps. A 9700K should give you 60fps in most games if that's your target, though.
 

winjer

Gold Member
Okay, so I am thinking of upgrading my 4060Ti 8GB to a 4070 Ti Super 16GB (double the TFLOPs, double the VRAM).
Problem is I have an i7-9700K CPU, which causes a big bottleneck with that GPU (30% at 4K, 44% at 1440p) according to the Bottleneck Calculator website my big brother frequents. ( https://pc-builds.com/bottleneck-calculator/ )

Right now I can't change my CPU, because I'd also have to upgrade my Aorus Z390 motherboard (as well as its memory), which effectively takes my $1,000 upgrade to $2,200 (basically a new PC at that point).

So I am wondering how much of a bottleneck I'd get if I use, say, Quality DLSS in a game.

In such a case, by taking 1440p image quality and upscaling it to 4K (which is what Quality DLSS does, right?), is the bottleneck going to be 30% or 44%? Note that I never push the framerate higher than 60fps, due to my TV's limitations.

And extra question: is there a middle ground I can reach? A better CPU than my i7-9700K which is also compatible with my Z390 Aorus motherboard?


EDIT: Some people on the nvidia forums tell me that the bottleneck numbers the website gives me are bogus, and at 4K and 60fps (max), the game is almost entirely GPU bound and the CPU basically doesn't bottleneck the game. Is that true?

First, don't put much trust in those calculators.
Bottlenecks will always depend on the game.

But yes, that 9700K will limit your performance in some scenarios.
Without changing the whole system, your best option is to buy a 9900K.
Compared to the 9700K, this has hyper-threading. So you will go from 8 cores and 8 threads, to 8 cores and 16 threads.
You will also get more L3 cache: 16MB vs 12MB.

Another thing you can do is tighten your memory timings, and maybe overclock your memory a bit.
This almost always brings performance up.
 

Soodanim

Member
  • Upgrading a GPU within the same generation seems like a poor move
  • Upgrading a GPU when a new generation is round the corner seems like a poor move
  • Upgrading the component that isn't holding back your performance seems like a poor move
If you want to find out whether you can upgrade the CPU on the same mobo, Google what your mobo supports, then check comparisons between your CPU and whatever the top supported CPU is.

Do your own tests to see what is actually holding back your performance. It's no trouble at all to put stats on screen while playing, and it will give you actual results, unlike some website that doesn't have your machine.
 

Nickolaidas

Member
  • Upgrading a GPU within the same generation seems like a poor move
  • Upgrading a GPU when a new generation is round the corner seems like a poor move
  • Upgrading the component that isn't holding back your performance seems like a poor move
If you want to find out whether you can upgrade the CPU on the same mobo, Google what your mobo supports, then check comparisons between your CPU and whatever the top supported CPU is.

Do your own tests to see what is actually holding back your performance. It's no trouble at all to put stats on screen while playing, and it will give you actual results, unlike some website that doesn't have your machine.
Lowering resolution improves performance.
Removing bells and whistles from the graphics improves performance.
Numerous games that need more VRAM cause stutter / frame drops after a while.
The 5070 seems a LOT worse than a 4070 Ti Super.
 

GHG

Gold Member
That bottleneck calculator won't be accurate. I put together a post recently that outlines how you can get a rough idea of how bottlenecked your system will be at a given resolution; use this methodology instead:

Even at 1440p it's pretty difficult to be CPU bottlenecked unless you have one of the top 3-4 GPUs. The quickest way to find out if you're bottlenecked is to find your GPU in techpowerup.com benchmarks for the game you want to play and make note of the FPS (using Spider-Man as an example):

[Chart: Spider-Man Remastered GPU benchmarks at 2560x1440]


Then you want to pair that GPU with a CPU that has an equal or better fps on the chart below:

[Chart: Spider-Man Remastered CPU scaling at 2560x1440]


So for the 4080, in that game at 1440p, you need a CPU that is capable of doing 190fps and above (so the 13700K as a minimum). You can also see how many frames you're giving up in that specific scenario with various lower-tier CPUs.

Not an exact science, but it should give you a rough idea. Just use the same methodology and cross-reference charts for the specific games you'd like to check at the resolution you game at, and it will show you what CPU you need in order to prevent your GPU from being bottlenecked at that resolution:
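
If you'd rather not eyeball the charts, here is the same cross-referencing in a few lines (the fps values below are placeholders; read the real ones off the charts for your game and resolution):

Code:
# Pick the lowest-tier CPU that can keep up with a given GPU in one game,
# using fps values taken from the GPU chart and the CPU scaling chart.
gpu_fps = 190.0        # placeholder: what your target GPU does in this game

cpu_fps = {            # placeholder: CPU scaling chart for the same game/resolution
    "13700K": 195.0,
    "7700X": 185.0,
    "12600K": 170.0,
    "9700K": 120.0,
}

good_enough = [cpu for cpu, fps in cpu_fps.items() if fps >= gpu_fps]
print("CPUs that won't hold this GPU back:", good_enough or "none on this chart")

for cpu, fps in cpu_fps.items():
    if fps < gpu_fps:
        print(f"{cpu}: leaving ~{gpu_fps - fps:.0f} fps on the table")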



The issue you might face is finding modern CPU reviews that still list your processor.

That said, I wouldn't recommend any outlay for what will essentially be a side-grade for your GPU. New Nvidia cards are arriving in a couple of months; just wait it out.
 

Knightime_X

Member
Just hold out for a 5070 Ti at that point.
Maybe even a 5080.
But I think a 5070 Ti/Super will be overkill for many, many games.
4K is mostly unnecessary when 1440p looks damn, DAMN near just as good.

Keep your gpu for now and use that money to get a new PC for the better cpu benefits.
1440p is the sweet spot in both performance and image quality.
DLSS improves that even further.
 

winjer

Gold Member
It's true that changing the GPU just months away from a new generation is not a good idea. So if you can hold on a bit longer, it will probably mean you can get a cheaper 4070 Ti. Or just go for a 5070 Ti.

Regardless, there is one huge advantage the 4070 Ti has over the 4060 Ti: PCIe lanes.
With a 9th-gen CPU and the Z390 chipset, you only get PCIe Gen3, which limits bandwidth between the CPU and GPU.
But it's much worse with a 4060 Ti, because that GPU only has 8 lanes of PCIe, while the 4070 Ti has 16 lanes.
8 lanes of PCIe Gen3 is really bad.
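
Roughly, the numbers work out like this (ballpark per-lane figures, ignoring protocol overhead beyond line encoding):

Code:
# Approximate one-way PCIe bandwidth in GB/s.
# Gen3 = 8 GT/s, Gen4 = 16 GT/s, both with 128b/130b encoding.
per_lane_gbs = {"Gen3": 0.985, "Gen4": 1.969}

for gen, per_lane in per_lane_gbs.items():
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

# The 4060 Ti is an x8 card, so on a Z390 board it runs at Gen3 x8 (~7.9 GB/s).
# The 4070 Ti Super is an x16 card, so it gets Gen3 x16 (~15.8 GB/s) on the same board.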
 

GHG

Gold Member
I also wanted to wait, but I was very disappointed at the specs of the 5070. It seems to be 25% weaker than the 4070 Ti Super.

The real world performance is what will matter, not the specs.

You have nothing to lose in waiting a couple of months, the prices for the current cards will drop further once the new ones launch.

What games specifically are you struggling with, and how have you identified that it is indeed a VRAM bottleneck that you are running into?
 
That CPU is dookie; start planning a proper CPU upgrade if you really want to game at that level. What resolutions are you looking to play at? What kind of games?

The best thing is to balance your system's overall performance; spending again on a GPU without seeing any benefit would be a proper waste of money.
 

Nickolaidas

Member
The real world performance is what will matter, not the specs.

You have nothing to lose in waiting a couple of months, the prices for the current cards will drop further once the new ones launch.

What games specifically are you struggling with, and how have you identified that it is indeed a VRAM bottleneck that you are running into?
Final Fantasy games; Capcom games, which gobble up my VRAM as I add graphics options; Mortal Kombat 1 needing to be played with DLSS on Performance in order to maintain 60fps; the Black Myth: Wukong benchmark pushing my DLSS to Ultra Performance just to get 40-45fps; etc.
 
I am curious which games will run below 60fps on this i7-9700K. I think some RT games might drop below 60fps (The Callisto Protocol with RT can sometimes drop below 60fps even on my 7800X3D), but overall that 9700K should still run the vast majority of games at 60fps, if not 200fps.

Here's a 9700K vs. 7800X3D comparison. Only Starfield dipped below 60fps (57fps), but that's still very playable on a VRR display IMO.




If I were in your shoes, I would buy the 4070 Ti Super or wait for the RTX 50 series. You would be able to run much higher resolutions (DLDSR/DSR) or RT settings and not worry about VRAM.
 

Nickolaidas

Member
That CPU is dookie; start planning a proper CPU upgrade if you really want to game at that level. What resolutions are you looking to play at? What kind of games?

The best thing is to balance your system's overall performance; spending again on a GPU without seeing any benefit would be a proper waste of money.
Yeah, I also fear that I need to buy a PC from the ground up. This PC was basically meant to go along with a 2080 Super.
 

Griffon

Member
The difference between those GPUs is minuscule. Unless you have too much money to burn, I say wait a while.

Your CPU is more than fine, it would only be a bottleneck if you had a 4090, and even then most games would still be perfectly fine.
 
Minuscule? The 4060 Ti has 8GB of VRAM and 22 TFLOPs. The 4070 Ti Super has 16GB of VRAM and 44 TFLOPs. It's basically twice the horsepower, unless I am reading the specs wrong.
TechPowerUp has it at 75% faster. Far from minuscule. Not to mention double the VRAM as well.

 
The difference between those GPUs is minuscule. Unless you have too much money to burn, I say wait a while.

Your CPU is more than fine, it would only be a bottleneck if you had a 4090, and even then most games would still be perfectly fine.
Not true. The difference is massive because the 4070 Ti Super has twice as much VRAM. If a game is VRAM limited, the fps will drop to single digits. Some games like Forspoken will also lower texture settings on 8GB GPUs to PS2-like levels. I had an 8GB VRAM GPU not so long ago, so I know.

In my experience, modern games use about 10-12GB of VRAM at 1440p. At 4K I sometimes see 14GB of VRAM usage. 8GB of VRAM is a huge bottleneck; modern games will stutter badly on such a GPU.
 

Griffon

Member
Minuscule? The 4060 Ti has 8GB of VRAM and 22 TFLOPs. The 4070 Ti Super has 16GB of VRAM and 44 TFLOPs. It's basically twice the horsepower, unless I am reading the specs wrong.

That is nonsense.

Not true. The difference is massive because the 4070 Ti Super has twice as much VRAM. If a game is VRAM limited, the fps will drop to single digits. Some games like Forspoken will also lower texture settings on 8GB GPUs to PS2-like levels.

In my experience, modern games use about 10-12GB of VRAM at 1440p. At 4K I sometimes see 14GB of VRAM usage. 8GB of VRAM is a huge bottleneck; modern games will stutter badly on such a GPU.

Seems my assumptions were wrong... I didn't realize the Ti Super version of the 4070 was that big of a jump.
 

Nickolaidas

Member
The problem is that I absolutely want to have 16GB of VRAM, which means I'd have to go with an RTX 5080 if I want to upgrade four months from now (Greece will probably receive the cards a little later than the US).

A 4070 Ti Super right now costs 900 euro in Greece. And uses 280W.

The RTX 5080 will use 400W and will probably cost around 1500 euro when it hits the Greek market. Plus, it's supposed to be around 54 TFLOPs, which is huge, but not THAT huge compared to the 4070 Ti Super's already impressive 44 TFLOPs.

So the way I see it, if I wait about four months, I'll pay roughly two-thirds more than what I would've paid for the 4070 Ti Super, get maybe a 20-25% increase on paper over it, and draw about 40% more power.

Honestly, I don't think it's worth it, at the moment.
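
Putting my own numbers side by side (the 5080 price, power and TFLOPs are rumours, so take them loosely):

Code:
# Back-of-the-envelope comparison using the figures above.
# The 5080 values are rumoured, not confirmed specs.
price_4070tis, price_5080 = 900, 1500     # euro, Greek retail (5080 estimated)
power_4070tis, power_5080 = 280, 400      # watts
tflops_4070tis, tflops_5080 = 44, 54      # FP32 TFLOPs (not comparable across generations)

print(f"Price:  +{(price_5080 / price_4070tis - 1) * 100:.0f}%")    # ~ +67%
print(f"Power:  +{(power_5080 / power_4070tis - 1) * 100:.0f}%")    # ~ +43%
print(f"TFLOPs: +{(tflops_5080 / tflops_4070tis - 1) * 100:.0f}%")  # ~ +23%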
 
Yeah, I also fear that I need to buy a PC from the ground up. This PC was basically meant to go along with a 2080 Super.
Maybe get the Lossless Scaling app off Steam; it should cover you for quite some time until you're ready to do the full upgrade. That GPU should be enough for you to get a good 2x FPS increase for the time being.
 

hinch7

Member
You might be better off waiting for the next generation of graphics cards at this point, if only for the price drops. They are all coming Q1 next year, if rumors are correct.

But your CPU will be a big limiting factor if you're looking at that tier of GPU, and I'd look towards upgrading that. There are deals on the 5700X3D, which can be had for well under $200 in some places ($150 on AliExpress) and would be a massive upgrade, plus a budget B550 motherboard that supports it. You can even reuse your RAM.
 

baphomet

Member
That GPU upgrade would be super expensive for what would be a small-ish gain.

Either wait for the 5000 series to release and upgrade to that, or wait for it to release and upgrade to a higher end 4000 series at a much lower price than the current one.
 

Nickolaidas

Member
That GPU upgrade would be super expensive for what would be a small-ish gain.

Either wait for the 5000 series to release and upgrade to that, or wait for it to release and upgrade to a higher end 4000 series at a much lower price than the current one.
It's literally twice as powerful. It isn't a small GPU upgrade at all.
 

winjer

Gold Member
The problem is that I absolutely want to have 16GB of VRAM, which means I'd have to go with an RTX 5080 if I want to upgrade four months from now (Greece will probably receive the cards a little later than the US).

A 4070 Ti Super right now costs 900 euro in Greece. And uses 280W.

The RTX 5080 will use 400W and will probably cost around 1500 euro when it hits the Greek market. Plus, it's supposed to be around 54 TFLOPs, which is huge, but not THAT huge compared to the 4070 Ti Super's already impressive 44 TFLOPs.

So the way I see it, if I wait about four months, I'll pay roughly two-thirds more than what I would've paid for the 4070 Ti Super, get maybe a 20-25% increase on paper over it, and draw about 40% more power.

Honestly, I don't think it's worth it, at the moment.

All of that is speculation and rumours. Also, you can't compare TFLOPs from one generation to the next.
And then there is RT, which should see a bigger performance boost than rasterization.
And we don't know exact prices, specs, clock speeds, features, etc.

Though it's also true that you can buy a 4070 Ti now and sell it later. By the time the 5070 series releases, the 4070 Ti will still have 2.5 years of warranty, and that always benefits resale price.
 

Dr.D00p

Member
But yes, that 9700K will limit your performance in some scenarios.
Without changing the whole system, your best option is to buy a 9900K.
Compared to the 9700K, this has hyper-threading. So you will go from 8 cores and 8 threads, to 8 cores and 16 threads.
You will also get more L3 cache: 16MB vs 12MB.

Going from a 9700K to a 9900K won't do that much; it will improve 1% lows rather than raise framerates to any significant degree.

I had a 9900K rig and would often turn off Hyper-Threading to test for the difference in games, and it was barely noticeable.

Most games are still optimised for no more than 8 threads in 2024.
 

T4keD0wN

Member
Just try playing your games at 720p or 540p to see at which point your GPU usage drops below 99%; that tells you the framerate at which you'd be CPU bound (see the quick sketch at the end of this post).
The RTX 5080 will use 400W and will probably cost around 1500 euro when it hits the Greek market. Plus, it's supposed to be around 54 TFLOPs, which is huge, but not THAT huge compared to the 4070 Ti Super's already impressive 44 TFLOPs.
Wait and see how much it costs first; the 5080 MSRP should be lower than the 4080 MSRP (which doesn't mean it'll retail for that much), and at worst the 4070 Ti should drop in value.
It will 100% consume less power for the same performance, so just undervolt it or lock the power and you're way better off with the 5080.
Stop using TFLOPs as a gaming performance metric.

You should most likely upgrade the CPU first, to at least a 7600X/9600X/5800X3D, or a 14600K if you care about frametimes. Get a new CPU architecture; core count doesn't matter much.
Anyway, I am playing on a 4080 and that card has aged really poorly and already feels weak; I'd recommend waiting for Blackwell.

A 4070 Ti for 900 is really a bad buy if you were to pair it with a 9700K. Upgrading the CPU is much more important, since your GPU is already recent and decent for most resolutions, and not every game will have frame generation to help avoid the CPU bottleneck.
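
And to be clear on how to read that low-res test (the numbers here are hypothetical measurements):

Code:
# fps measured at 540p/720p with GPU usage well below 99% approximates your CPU ceiling.
cpu_ceiling_fps = 95   # e.g. what you measure at 540p
fps_target = 60        # the OP's TV cap

if cpu_ceiling_fps >= fps_target:
    print(f"CPU ceiling ~{cpu_ceiling_fps} fps >= {fps_target} fps target: the CPU won't be the limiter.")
else:
    print(f"CPU ceiling ~{cpu_ceiling_fps} fps < {fps_target} fps target: you're CPU bound no matter the GPU or resolution.")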
 

winjer

Gold Member
Going from a 9700K to a 9900K won't do that much; it will improve 1% lows rather than raise framerates to any significant degree.

I had a 9900K rig and would often turn off Hyper-Threading to test for the difference in games, and it was barely noticeable.

Most games are still optimised for no more than 8 threads in 2024.

Games now use more threads. Not all, but a few already do.
The 9900K also has slightly higher clocks and more L3 cache. So that always helps.
But you are right, it won't help a lot.

If he had an AM4 system, the recommendation would be very easy: just get a 5800X3D, and he would almost double his performance in games.
But with an Intel motherboard, upgrades are always extremely limited.
 

Dr.D00p

Member
Games now use more threads. Not all, but a few already do.
The 9900K also has slightly higher clocks and more L3 cache. So that always helps.

Actually, I could run my 9900K at 5.2GHz @ 1.27V vcore with HT disabled in '9700K mode', compared to a maximum of 5GHz at 1.31V vcore in full 9900K HT mode.

Those higher clocks and (much) lower temps were often my preferred way of gaming, especially when using demanding emulators like RPCS3.
 

Cakeboxer

Member
Your lord and savior is here with some advice:

Sell your whole system around Christmas. Maybe you'll get like 600 euro for it, then build a completely new system with DDR5 after Christmas for ~2k.
 

Soodanim

Member
The problem is that I absolutely want to have 16GB of VRAM, which means I'd have to go with an RTX 5080 if I want to upgrade four months from now (Greece will probably receive the cards a little later than the US).

A 4070 Ti Super right now costs 900 euro in Greece. And uses 280W.

The RTX 5080 will use 400W and will probably cost around 1500 euro when it hits the Greek market. Plus, it's supposed to be around 54 TFLOPs, which is huge, but not THAT huge compared to the 4070 Ti Super's already impressive 44 TFLOPs.

So the way I see it, if I wait about four months, I'll pay roughly two-thirds more than what I would've paid for the 4070 Ti Super, get maybe a 20-25% increase on paper over it, and draw about 40% more power.

Honestly, I don't think it's worth it, at the moment.
You don't think the 50 series will cause a price drop in the 40 series? For me, that possibility would be worth living with lower settings and waiting.
 

lmimmfn

Member
I had a 6800K/X99 mobo/16GB and a 1080 Ti, playing at 3440x1440 with G-Sync, and bought a 4070 Ti. The upgrade was fantastic; Cyberpunk went from 45FPS to a fully playable 90FPS.

Six months later, to fully unlock the 4070 Ti's potential, I finally caved and got a 7800X3D.

So you could get the 4070 Ti and just upgrade the rest later; you'll still benefit.
 
Just try playing your games at 720p or 540p to see at which point your GPU usage drops below 99%; that tells you the framerate at which you'd be CPU bound.

Wait and see how much it costs first; the 5080 MSRP should be lower than the 4080 MSRP (which doesn't mean it'll retail for that much), and at worst the 4070 Ti should drop in value.
It will 100% consume less power for the same performance, so just undervolt it or lock the power and you're way better off with the 5080.
Stop using TFLOPs as a gaming performance metric.

You should most likely upgrade the CPU first, to at least a 7600X/9600X/5800X3D, or a 14600K if you care about frametimes. Get a new CPU architecture; core count doesn't matter much.
Anyway, I am playing on a 4080 and that card has aged really poorly and already feels weak; I'd recommend waiting for Blackwell.

A 4070 Ti for 900 is really a bad buy if you were to pair it with a 9700K. Upgrading the CPU is much more important, since your GPU is already recent and decent for most resolutions, and not every game will have frame generation to help avoid the CPU bottleneck.
Buying a 6-core CPU in 2024/2025 isn't a good idea, especially coming from an 8-core CPU. Modern games decompress a lot of data, and games can sometimes already benefit from more than 8 cores. In Space Marine 2, the minimum recommended CPU for the 4K texture pack is a 12-core.

Try to run the TLOU Remake on a 6-core CPU and you will see stuttering and bad 1% lows.
 

M1987

Member
Buying a 6-core CPU in 2024/2025 isn't a good idea, especially coming from an 8-core CPU. Modern games decompress a lot of data, and games can sometimes already benefit from more than 8 cores. In Space Marine 2, the minimum recommended CPU for the 4K texture pack is a 12-core.

Try to run the TLOU Remake on a 6-core CPU and you will see stuttering and bad 1% lows.
I played the TLOU remake on my 7600X and got no stuttering.
 

Nickolaidas

Member
I had a 6800K/X99 mobo/16GB and a 1080 Ti, playing at 3440x1440 with G-Sync, and bought a 4070 Ti. The upgrade was fantastic; Cyberpunk went from 45FPS to a fully playable 90FPS.

Six months later, to fully unlock the 4070 Ti's potential, I finally caved and got a 7800X3D.

So you could get the 4070 Ti and just upgrade the rest later; you'll still benefit.
This is how I see it as well. Get the 4070Ti Super now, and a new CPU & Motherboard & RAM next Christmas.
 
I played the TLOU remake on my 7600X and got no stuttering.
I have not tested the 7600X, but my 8-core 7800X3D was sometimes at 80-90% usage in this game, and that doesn't bode well for the 6-core 7600X.

A 6-core Ryzen 3600 dips below 60fps frequently:



An 8-core Ryzen 3700 runs The Last of Us at over 60fps the whole time:

 

Bojji

Gold Member
Lowering resolution improves performance.
Removing bells and whistles from the graphics improves performance.
Numerous games that need more VRAM cause stutter / frame drops after a while.
The 5070 seems a LOT worse than a 4070 Ti Super.

I upgraded to a 4070 Ti Super a month ago. It had a good price and the thing I needed most: 16GB of VRAM.

The 5070 will suck; regardless of performance, it will be limited by its VRAM. I was surprised how much some games gained from going from 12GB to 16GB.

Now, what about the 5070 Ti? It could be good, but we don't know if it will be the same price or more. Waiting is probably smart, but a 5070 Ti may not even exist in the first wave of GPUs (it wasn't in the first rumors) and the price could be... whatever.

CPU? The 9700K will CPU-bottleneck you in some games, but it's not a terrible CPU; it should still play most games fine. It's close in performance to a 5600X:

[Chart: relative gaming performance at 1920x1080]


I think you should sell your mobo/RAM/CPU at some point and get something on AM5, but I don't think you need to do it now.
 

T4keD0wN

Member
Buying a 6-core CPU in 2024/2025 isn't a good idea, especially coming from an 8-core CPU. Modern games decompress a lot of data, and games can sometimes already benefit from more than 8 cores. In Space Marine 2, the minimum recommended CPU for the 4K texture pack is a 12-core.

Try to run the TLOU Remake on a 6-core CPU and you will see stuttering and bad 1% lows.
That's not how it works at all. Games don't do everything asynchronously, so core count matters far less than it does for normal applications like Handbrake.
Most game engines won't fully utilize every thread; you can easily test this by disabling cores in your BIOS or using Process Lasso. Leaving 16 threads enabled will not give you double the performance of leaving 8 enabled; in fact you can even gain gaming performance by disabling SMT/HPET on most new CPUs.

Games scale less and less the more cores you have. You won't gain 50% by going from 4 cores to 6; you'll gain roughly half of that in the very best cases, and you can count on two hands the games that see improvements going from 6c/12t to 8c/16t.

IPC matters the most, since games are limited by whatever is on the critical path. You want the individual cores to be as powerful as possible rather than having as many as possible, because the remaining ones will just be waiting for other work to finish before they can continue.

To oversimplify: if your task is to paint a room and you have 3 tools for painting and 3 ladders, then having 40 workers waiting in line won't speed things up; you want 3 amazingly fast painters instead, each with one amazingly fast assistant. Similar principles apply to the critical path of a computing workload.

That's not to mention that total CPU power matters: a 4-core CPU can be twice as powerful as an 8-core CPU, which is why even a 7600X or 5600X will wipe the floor with a Threadripper 1950X in every single game.
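
The painter analogy is basically Amdahl's law. With a made-up figure for how much of a frame's CPU work actually scales across cores, the diminishing returns look like this:

Code:
# Amdahl's-law style illustration of diminishing returns from extra cores.
# The parallel fraction is a made-up number for a hypothetical game's frame loop.
def speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup over one core when only part of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

p = 0.70  # assume 70% of the frame's CPU work parallelizes
for cores in (4, 6, 8, 12, 16):
    print(f"{cores:2d} cores: {speedup(p, cores):.2f}x vs 1 core")

# With p = 0.70, going from 6 to 8 cores is only ~7-8% faster - per-core speed
# (IPC and clocks) moves the needle far more than piling on cores.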
 
That's not how it works at all. Games don't do everything asynchronously, so core count matters far less than it does for normal applications.
Most game engines won't fully utilize every thread; you can easily test this by disabling cores in your BIOS or using Process Lasso. Leaving 16 threads enabled will not give you double the performance of leaving 8 enabled.

IPC matters the most, since games are limited by whatever is on the critical path. You want the individual cores to be as powerful as possible rather than having as many as possible, because the remaining ones will just be waiting for other work to finish before they can continue.

To explain it simply: if you have 3 tools, then having 40 workers waiting in line won't speed things up; you want 3 amazingly fast ones instead, each with one amazingly fast assistant. It's quite similar for these workloads.

That's not to mention that total CPU power matters: a 4-core CPU can be twice as powerful as an 8-core CPU, which is why even a 5600X will wipe the floor with a Threadripper 1950X in every single game.
I heard similar talk in 2012. People thought HT made no difference, so they bought the 3570K instead of the 3770K. A few years later, the 3770K absolutely destroyed the 3570K thanks to HT. In some games like Shadow of the Tomb Raider, I saw drops to 35fps in the hubs while the 3770K stayed over 60fps.

DSOG always tests HT in their performance analysis articles, and people can see how this technology drastically improves framerates, provided the game is designed to use more threads.

CPUs are relatively cheap (compared to GPUs). I paid $2500 for my PC and the 7600X was only $100 cheaper. An 8-core CPU will certainly last me longer. I would have to be literally broke to consider getting a 6-core CPU, and at that point upgrading my PC would not be a priority at all.
 