Intel 12900KS Review Thread: 5.5 GHz Boost. The World's Fastest Gaming CPU.

Leonidas

AMD's Dogma: ARyzen (No Intel inside)

Make no mistake: the Special Edition Core i9-12900KS is the fastest desktop [gaming] PC chip ever built.


the 12900KS is without any doubt the fastest gaming processor available.



Alder Lake is a beast! :lollipop_smiling_face_eyes:
 
[image: CPU temperature chart]


$750 + exotic cooling. Nah.
I'd rather spend the money on the GPU, RAM, SSD, etc… and less on the CPU and cooling (that is a warm CPU, and 5.5 GHz is the boost frequency off a 3.x GHz base clock…). Anyway, it has its fans… but I'd rather see a proper new architecture than super-exclusive binning and frequency boosts that drive the temperature sky high.
 
I saw the hardwaretimes review of this. It consumes 50 W more for a 3% performance increase, and it costs 750 USD. That's an insane price for just a CPU; PCMR people are getting rinsed.
 
And how much of a boost is the electricity bill gonna get? This is why I'm heading to Mac next, with their ARM architecture, for video/photo editing instead of this energy-thirsty old x86 crap.
 






Alder Lake is a beast! :lollipop_smiling_face_eyes:
Not much of a beast when it has a TDP of 5000 watts and runs hotter than boiling point.
 
Funnily enough, at that power consumption you could easily run a Threadripper/Epyc CPU.
Yes, I know, single-threaded performance isn't any higher, but you can do so much more with that power. In the end, the GPU normally becomes the limit first anyway. Who buys a CPU like this to game at 1080p ...
And by the time the GPU is no longer the limit, there will be much better CPUs out.
 
And how much of a boost is the electricity bill gonna get? This is why I'm heading to Mac next, with their ARM architecture, for video/photo editing instead of this energy-thirsty old x86 crap.
If you do that as your work every day, the time you save is worth far more than the extra electricity. Besides, it still uses less power than an air conditioner, for example.
 
This is just Intel doing a preemptive strike, pushing clocks as fast as it can, to be competitive in games against the 5800X3D.
 
And how much of a boost is the electricity bill gonna get? This is why I'm heading to Mac next, with their ARM architecture, for video/photo editing instead of this energy-thirsty old x86 crap.

ISA is almost irrelevant to power consumption and efficiency.
The advantage Apple currently has comes from TSMC's process node being more advanced than what Intel has.
Another thing to consider is that power usage doesn't scale linearly with clock speed. After a certain point, it's more of a geometric increase.
This is why power usage ends up so much higher for an increase of just a few hundred MHz and only a few percentage points of performance.
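Rough sketch of that scaling, assuming dynamic power follows roughly P = C * V^2 * f and that voltage has to climb with frequency to stay stable (the V/f curve below is made up for illustration, not measured 12900KS data):

```python
# Toy model of why power rises much faster than clock speed.
# Dynamic CPU power is roughly P = C * V^2 * f, and reaching higher
# frequencies needs higher voltage, so power grows super-linearly.
# The voltage/frequency curve and units are invented for illustration.

def dynamic_power(freq_ghz: float, capacitance: float = 1.0) -> float:
    voltage = 0.8 + 0.1 * (freq_ghz - 4.0)  # hypothetical V/f curve
    return capacitance * voltage ** 2 * freq_ghz

for f in (4.9, 5.2, 5.5):
    print(f"{f:.1f} GHz -> {dynamic_power(f):.2f} arbitrary power units")
```

In this toy model the last ~0.6 GHz costs about 28% more power for roughly 12% more clock, which is the kind of trade a chip like this is making.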
 
IMO, a 12900K is only necessary for gaming if you are aiming for 240 Hz at 1080p; even 1440p is still GPU-bound in most big-budget games.
 
Wheedoggie, that's an expensive way to add marshmallow toasting functionality to your rig.




Given time, I'm hoping we'll see more games adopt parallelism-focused methodologies like entity component systems so we can get more juice out of high core / thread count CPUs. There's only so far single-core can be pushed now that Moore's law is slowing down.
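For the curious, a minimal sketch of the ECS idea (all names made up, not any particular engine's API): component data lives in flat per-type stores and systems are plain loops over them, which is exactly the shape of work that chunks cleanly across many cores.

```python
# Minimal entity-component-system sketch. Components live in per-type
# stores keyed by entity id, and "systems" are plain loops over those
# stores, so the entity list can be split into chunks per worker thread.
# All names here are hypothetical, not from any real engine.

positions = {}   # entity id -> (x, y)
velocities = {}  # entity id -> (dx, dy)

def spawn(eid, pos, vel):
    positions[eid] = pos
    velocities[eid] = vel

def movement_system(dt):
    # Update every entity that has both a position and a velocity.
    for eid, (dx, dy) in velocities.items():
        if eid in positions:
            x, y = positions[eid]
            positions[eid] = (x + dx * dt, y + dy * dt)

spawn(1, (0.0, 0.0), (1.0, 0.0))
spawn(2, (5.0, 5.0), (0.0, -2.0))
movement_system(0.016)
print(positions)  # {1: (0.016, 0.0), 2: (5.0, 4.968)}
```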
 
Energy efficiency be damned! Who cares that electricity prices have skyrocketed, just turn up the wattage!
 
I don't really understand this product as a gaming CPU. Unless you're running something really modded, you wouldn't ever need this. If you have the money to buy this, you'd be playing above 1080p, at which point you'd be GPU-limited. It's 4.1% faster than a 12600K at 1440p.
 
Alder Lake is great, but that chip is a volcano. Nope.

Hell, the 12900K was already the best gaming chip, so this is superfluous.
 
Yeah, that's why the new Ryzen 7 5700X (65 W) is more interesting to me as a future upgrade, rather than the 5800X (105 W) or 5800X3D (105 W).
The 5800X3D is going to be a dog… No one should get it over a 5900X or 5950X, which, funnily enough, is on average more power efficient than the 5900X because of better silicon.

For me, I'm definitely getting the 5700X! It's easily the best Ryzen 8-core in terms of value and thermals-to-performance ratio.
 
How about instead of 80% as fast, I'll take 80% faster at the same TDP as older chips like the 8700K? This new process shit is a joke. No efficiency gains whatsoever.

If you think about it, it's the same with GPUs. They stayed in roughly the same power range as before but got more efficient, sometimes even using less power for more performance. Then came ray tracing, and now, if this trend continues, in 15 years they'll hook directly into your body to suck out your life force.

It's funny when you consider that DDR RAM gets more and more efficient with every generation, same with PCIe, while CPUs and GPUs have started brute-forcing performance. When you read news about GPUs, many outlets don't even speculate about the power draw anymore, just about how those beasts will be cooled. That alone tells a lot.
 
ISA is almost irrelevant to power consumption and efficiency.
The advantage Apple currently has comes from TSMC's process node being more advanced than what Intel has.
Another thing to consider is that power usage doesn't scale linearly with clock speed. After a certain point, it's more of a geometric increase.
This is why power usage ends up so much higher for an increase of just a few hundred MHz and only a few percentage points of performance.
At what cost? Apple's M1 chips are bigger than this Intel CPU plus an RTX 3080, all on one SoC, and they still use many more transistors. Such a chip can only be used in premium-priced products. It would not work for AMD/Intel.
 
At what cost? Apple's M1 chips are bigger than this Intel CPU plus an RTX 3080, all on one SoC, and they still use many more transistors. Such a chip can only be used in premium-priced products. It would not work for AMD/Intel.
It's also not as powerful, so that's higher cost for less performance.
 
But can it run Crysis locked above 60 fps?
I wonder when we'll get the first CPU that can do that, since until now it has been basically impossible.
 
And how much of a boost is the electricity bill gonna get? This is why I'm heading to Mac next, with their ARM architecture, for video/photo editing instead of this energy-thirsty old x86 crap.
Troll elsewhere. You think this chip is for your little photo editing? Hell, you might as well buy a Chromebook instead if you're that sensitive to energy prices.
 
But can it run Crysis locked above 60 fps?
I wonder when we'll get the first CPU that can do that, since until now it has been basically impossible.
I'm pretty sure it's already possible, aside from a few checkpoint-save hitches (which don't count) and maybe that flying level.

That flying level will finally be beaten by some future AM5 or Intel chip in the not-so-distant future. Then the new question will be what can sustain 120 fps 😁

But any modern chip basically rocks Crysis at 60 fps, afaik.
 
I'm pretty sure it's already possible, aside from a few checkpoint-save hitches (which don't count) and maybe that flying level.

That flying level will finally be beaten by some future AM5 or Intel chip in the not-so-distant future. Then the new question will be what can sustain 120 fps 😁

No, at least as of a year or so ago it was still impossible, even without counting hitches. There were still sustained drops below 60 in some missions, and yes, the flying level is one of them and the worst offender.
 
If changing and wiring up a motherboard weren't such a pain in the ass, I would upgrade my CPU more often. Now I just wait until mine is completely outdated.
 