
Leak - Intel 14900KS Limited Edition - 6.2GHz, +400W, +100°C

winjer

Gold Member

Starting with the specifications, the Intel Core i9-14900KS Desktop CPU will feature the same 24 cores and 32 threads in an 8 P-Core & 16 E-Core config. The chip will feature 36 MB of L3 cache and 32 MB of L2 cache. As for clocks, the base frequency will be set at 3.2 GHz while the boost frequency will be set at 6.2 GHz, making this CPU the fastest-clocked chip on the planet. This is 200 MHz faster than the current fastest Core i9-13900KS and Core i9-14900K CPUs.
But with higher frequencies also come higher power draw & temperatures. The Intel Core i9-14900KS features a base TDP (PL1) of 150W and can hit a peak power consumption of up to 410W with an average power rating of 330W. With such high TDPs, the temperatures of the chip reached up to 101 degrees (Celsius) and the maximum reported frequency was 5.9 GHz across all P-Cores.
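For anyone curious how figures like "5.9 GHz across all P-Cores at 101°C" get captured: it's just a matter of polling clocks and temperatures while something heavy runs. A rough Python sketch of that kind of logger, assuming Linux with psutil installed and a coretemp (Intel) or k10temp (AMD) sensor exposed; this is not from the leak itself:

```python
# Rough monitoring sketch (assumes Linux + psutil; sensor names vary by platform).
# Polls per-core clocks and the hottest CPU temperature once per second.
import time
import psutil

def watch(samples: int = 10, interval_s: float = 1.0) -> None:
    for _ in range(samples):
        freqs = psutil.cpu_freq(percpu=True) or []    # per-core current MHz
        temps = psutil.sensors_temperatures()          # hwmon readings (Linux only)
        sensors = temps.get("coretemp") or temps.get("k10temp") or []
        top_mhz = max((f.current for f in freqs), default=0.0)
        hottest = max((t.current for t in sensors), default=float("nan"))
        print(f"max core clock: {top_mhz:7.0f} MHz | hottest sensor: {hottest:5.1f} °C")
        time.sleep(interval_s)

if __name__ == "__main__":
    watch()
```

Run it alongside Cinebench or a long render and you can see for yourself how long an all-core boost actually holds.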

This Is Fine GIF
 

RoboFu

One of the green rats
My Mac Studio usually tops out at around 90 watts lol.

Who are these CPUs for? Are gamers their only customers now?
 

RickSanchez

Member
hit a peak power consumption of up to 410W with an average power rating of 330W. With such high TDPs, the temperatures of the chip reached up to 101 degrees (Celsius)

The fuck? Those are GPU numbers. My MSI 4090 rarely hits 400W (only in Cyberpunk at 4K max settings so far) and has never gone above 90°C.

Meanwhile, I've never seen my 7800X3D go above 100W or 70°C.
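If anyone wants to sanity-check their own CPU package draw before comparing it to GPU wattages like this, the RAPL energy counter is the usual source on Linux. A rough sketch, assuming the intel_rapl powercap driver is loaded (reading energy_uj usually needs root on recent kernels); this is the Intel path, and AMD boards expose similar counters through different drivers:

```python
# Rough sketch: estimate CPU package power from the RAPL energy counter.
# Assumes Linux with the intel_rapl powercap driver; usually needs root to read.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"  # package-0 domain

def read_energy_uj() -> int:
    with open(f"{RAPL}/energy_uj") as f:   # cumulative energy in microjoules
        return int(f.read())

def package_watts(window_s: float = 1.0) -> float:
    e0 = read_energy_uj()
    time.sleep(window_s)
    e1 = read_energy_uj()
    return (e1 - e0) / 1e6 / window_s      # microjoules -> joules -> watts

if __name__ == "__main__":
    print(f"CPU package power: {package_watts():.1f} W")
```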
 

Kadve

Member
So... who remembers when Intel boasted that NetBurst/Pentium 4 would reach 10 GHz by 2005?

Realistically speaking, we should be able to see NetBurst based processors reach somewhere between 8 – 10GHz in the next five years before the architecture is replaced yet again. Reaching 2GHz isn’t much of a milestone, however reaching 8 – 10GHz begins to make things much more exciting than they are today. Obviously this 8 – 10GHz clock range would be based on Intel’s 0.07-micron process that is forecasted to debut in 2005. These processors will run at less than 1 volt, 0.85v being the current estimate.

(Edit: granted, from a certain perspective it kinda made sense. They made this boast in the year 2000. In 1995 their fastest processor was clocked at 133MHz, which by 2000 had increased to 1.13GHz.)
 
The CPU market is a disaster. You either go Intel and need a thermonuclear reactor just to power your PC while gaming, or you go AMD and have massively increased idle power draw compared to Intel. My 7950X3D at idle draws almost 30W more than my Intel rig does when just browsing the web and watching streams. It's not 400W, but the amount of time my PC spends doing light work while sucking down a lot more power probably makes up for it. Not to mention this shit is super flaky: I've had 3 freaking burnt CPU memory controllers and had to RMA my motherboard and CPU twice already. This last pair I'm just gonna run into the ground, and when I swap them out I'm done with these CPUs. Problem is, where do I go? Because 400W house-fire CPUs aren't it. What a mess.
 
The CPU market is a disaster. You either go Intel and need a thermonuclear reactor just to power your PC while gaming, or you go AMD and have massively increased idle power draw compared to Intel.
Grab a 10th-12th gen i7 or i9. Normal power draw numbers and dirt cheap. If you run a decent GPU, the bottleneck should be in the single digits. (Mine is a 0.2% bottleneck using an i9-10900K and a 4070 Super.)

Current CPUs are apparently not optimised for power consumption. Or not optimised at all.
 
Not to mention this shit is super flaky: I've had 3 freaking burnt CPU memory controllers and had to RMA my motherboard and CPU twice already.

I have no clue how you are going through hardware like that. Had no issues with the three AMD CPUs I've upgraded through.
 

bbeach123

Member
You either go Intel and need a thermonuclear reactor just to power your PC while gaming, or you go AMD and have massively increased idle power draw compared to Intel.
If I were you I would lock the CPU power limit to 125W just for peace of mind.
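For reference, that kind of cap is normally set in the BIOS (PL1/PL2 limits) or through Intel XTU on Windows; on Linux the powercap sysfs exposes the same knobs. A minimal sketch of the Linux route, assuming the intel_rapl driver and root, and noting that the setting does not survive a reboot:

```python
# Minimal sketch: cap the CPU package power limits to 125 W via Linux powercap.
# Assumes the intel_rapl driver is loaded and the script runs as root.
# Not persistent across reboots; BIOS or Intel XTU is the usual way on Windows.
PKG = "/sys/class/powercap/intel-rapl:0"   # package-0 RAPL domain

def set_power_limit(constraint: int, watts: int) -> None:
    # constraint 0 = long-term limit (PL1), constraint 1 = short-term limit (PL2)
    path = f"{PKG}/constraint_{constraint}_power_limit_uw"
    with open(path, "w") as f:
        f.write(str(watts * 1_000_000))    # value is in microwatts

if __name__ == "__main__":
    set_power_limit(0, 125)   # PL1 = 125 W
    set_power_limit(1, 125)   # PL2 = 125 W
```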
 
So what? They're benchmarking numbers.

You'll never get near them just doing normal desktop stuff and gaming.
No, but if you're doing long rendering sessions you don't want a blast furnace in your room. They're heavy-stress numbers, and benchmarking is just one of many workloads that put that kind of strain on your CPU.
 

Dr.D00p

Member
No, but if you're doing long rendering sessions you don't want a blast furnace in your room. They're heavy-stress numbers, and benchmarking is just one of many workloads that put that kind of strain on your CPU.

Well, IMO, it goes without saying that if productivity is your PC's main use, you should be using AMD.
 
This is what happens when you really, really want to push past the efficiency curve just to get a % or two higher in some benchmarks. Crazy.

What kind of cooler do you even get to keep this thing running?
 

hinch7

Member
Lmao, that's probably more than my whole PC draws while gaming, or even at full load.
 

RickMasters

Member
Damn… that's literally double my X299 build's wattage. I have an ASUS Thor PSU that shows how much power it's drawing. Never seen it go past 220W, and I do some seriously CPU- and RAM-heavy audio work.
 