
Intel admits Core Ultra 9 285K will be slower than i9-14900K in gaming

winjer

Gold Member

Chinese outlets are leaking Intel’s presentation of Core Ultra 200S series.



A comparison between Intel’s own Core i9-14900K and the competing AMD Ryzen 9 series has been leaked. The slides in this article are from Intel’s upcoming Core Ultra 200 reveal event, which is scheduled to take place in two days. It seems someone disregarded the embargo on these slides.
In any case, we now learn directly from Intel’s slides that the company acknowledges its 14900K is actually faster in gaming than the upcoming 285K: the 285K averages 261 FPS, while the 14900K achieves 264 FPS. The positive aspect is that the new architecture enables Intel to reduce power consumption to 447W (presumably for the entire system). A quick calculation of what these deltas amount to follows the list below.

  • Intel Core i9-14900K:
    • Average FPS: ~264 FPS
    • Power consumption: ~527 W
  • Intel Core Ultra 9 285K:
    • Average FPS: ~261 FPS
    • Power consumption: ~447 W
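For a sense of scale, here is a rough back-of-the-envelope calculation of what those leaked averages imply. The FPS and wattage figures come from the slides above; the snippet itself is only an illustration:

    # Leaked whole-system averages from Intel's slides (figures from the article above)
    fps_14900k, fps_285k = 264, 261
    watts_14900k, watts_285k = 527, 447

    fps_delta = (fps_14900k - fps_285k) / fps_14900k * 100          # ~1.1% lower average FPS
    power_delta = (watts_14900k - watts_285k) / watts_14900k * 100  # ~15% lower power draw

    print(f"285K vs 14900K: ~{fps_delta:.1f}% slower, ~{power_delta:.1f}% less power "
          f"({watts_14900k - watts_285k} W saved)")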
Another slide reveals that the 285K system can deliver comparable performance to Raptor Lake-R (Refresh) while consuming nearly 165W less power across numerous games.


Core Ultra 9 285K vs Raptor Lake-R, Source: @wnxod
Leakers have also shared slides comparing the AMD Ryzen 9 9950X and 7950X3D. The slides show that the 285K is expected to outperform the current fastest X3D part (with 3D V-Cache) in productivity benchmarks. However, it will be up to 21% slower in games like Cyberpunk 2077.


Core Ultra 9 285K vs Ryzen 9 7950X3D, Source: @wnxod
Another slide shows a comparison between the 285K and the 9950X. The 285K is reported to have an average performance advantage of 0.26% over the 9950X (assuming PAR means 0%). However, Intel has not yet shared any power comparison between the Ryzen 9 and the 285K, or at least it hasn’t been leaked yet.


Core Ultra 9 285K vs Ryzen 9 9950X, Source: @wnxod



The first leak from the upcoming Core Ultra 200 showcase.



In two days, Intel is set to announce its new desktop CPU series, Arrow Lake-S. The series brings not only a new name but also an entirely new architecture: the cores are upgraded to Lion Cove P-Cores and Skymont E-Cores, similar to those featured in the Core Ultra 200V, also known as Lunar Lake.

Intel is currently briefing the media on the new series, and one of the slides has been leaked. It appears to showcase the IPC gains for both types of cores. The right side focuses on the IPC gain for Lion Cove over Raptor Cove, indicating a 9% increase. The left side claims that Skymont has a 32% higher IPC compared to Gracemont. Skymont’s significant gain is also due to increased L2 cache bandwidth.


Intel Core Ultra 200 Arrow Lake IPC claims, Source: HXL/Intel
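As a rough reminder of what an IPC claim means in practice: single-thread throughput scales roughly with IPC × clock. The sketch below is mine, not from the slides, and the equal-clock assumption is purely illustrative:

    # Single-thread throughput ~ IPC x clock. The IPC gains are from the leaked slide;
    # the equal-clock assumption (clock_ratio = 1.0) is an illustration, not confirmed by Intel.
    lion_cove_ipc_gain = 1.09   # Lion Cove vs Raptor Cove (leaked slide)
    skymont_ipc_gain = 1.32     # Skymont vs Gracemont (leaked slide)
    clock_ratio = 1.00          # assumed equal clocks

    for name, ipc in (("P-core (Lion Cove)", lion_cove_ipc_gain),
                      ("E-core (Skymont)", skymont_ipc_gain)):
        uplift = (ipc * clock_ratio - 1) * 100
        print(f"{name}: ~{uplift:.0f}% single-thread uplift at equal clocks")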

Lion Cove includes 36MB of shared LLC (Last Level Cache) and uses 16.67 MHz clock-frequency steps, likely referring to the finer-grained clock control Lion Cove introduced compared to the previous 100 MHz bins. The leaked slides also mention an improved memory subsystem, with 3MB of L2 cache per core.

These are just some highlights from the new presentation that is already circulating online. More details are expected to leak soon, including final SKU names and pricing.

Update: A new slide shared by Wnxod shows the Core Ultra 9 285K compared to the Ryzen 9 9950X in four benchmarks. It claims an 8% geometric-mean increase over the last-gen series in single-threaded applications and a 4% advantage over the Ryzen 9 9950X.


Intel Core Ultra 200 Arrow Lake 1T claims, Source: @wnxod/Intel
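A quick note on the "geometric increase" wording: slide uplifts like that 8% figure are normally the geometric mean of per-benchmark ratios rather than a simple average. A minimal sketch, using made-up placeholder ratios rather than the leaked results:

    from math import prod

    # Hypothetical per-benchmark speedups (new / old). NOT the leaked numbers.
    ratios = [1.05, 1.12, 1.03, 1.12]

    geomean_uplift = prod(ratios) ** (1 / len(ratios)) - 1
    print(f"Geometric-mean uplift: {geomean_uplift * 100:.1f}%")  # ~7.9% for these placeholders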


But at least it uses less power and doesn't degrade and kill itself, like 13th and 14th gen. So that's something....
 

JohnnyFootball

GerAlt-Right. Ciriously.
Honestly this is a disappointment. I was one of the few who knew it was a possibility that Arrow Lake would actually perform worse than the 14th-gen series, but I expected the power consumption to decrease drastically.

Yes, there was a decrease, but it's still quite a bit higher than I was thinking Intel needed.

Remains to be seen how the more mainstream CPUs do.

Short version: if you have a 7800X3D, the 9800X3D is your best hope for an upgrade.

I am curious how much Intel CPUs will be helped by CAMM2 RAM.
 
Last edited:

DenchDeckard

Moderated wildly
I think that losing ~1% of performance in exchange for ~15% less power usage is a good trade-off. The power these new chips have been consuming is insane.

I have heard from people testing these chips that they do run considerably cooler and draw a lot less power. So there is that benefit.

With a 360mm rad my 14900KS hits 100 degrees C in Cinebench pretty easily. It's ridiculous.
 

nowhat

Gold Member
Remember the drama about Zen5 only improving performance by 5% in games...
...and how much was that due to Windows 11 sucking ass and later getting patched?

(I have no horse in this race, the last dedicated GPU I had was a 6600 GT, and yes, that was NVidia at the time, just here for the popcorn. Please continue.)
 

FireFly

Member
...and how much was that due to Windows 11 sucking ass and later getting patched?

(I have no horse in this race, the last dedicated GPU I had was a 6600 GT, and yes, that was NVidia at the time, just here for the popcorn. Please continue.)
Windows sucking lowered the performance of the 7700X and 9700X almost equally, so it didn't really change the competitive situation between them. The new patch just made AMD's positioning slightly better relative to Intel parts.
 
Last edited:

winjer

Gold Member
...and how much was that due to Windows 11 sucking ass and later getting patched?

(I have no horse in this race, the last dedicated GPU I had was a 6600 GT, and yes, that was NVidia at the time, just here for the popcorn. Please continue.)

After the patch, Zen5's performance advantage over Zen4 improved by about 1%.
 

winjer

Gold Member
I'm only really interested in single core IPC for use in emulation these days, TBH.

...and in that, these new CPUs look very, very tasty.

Of course we don't yet know how Arrow Lake will perform in emulation, but at this moment, Zen5 is still the best.


PS3 and Switch emulation benchmark charts
 

Dr.D00p

Member
I think a benchmark leak showed a 285K topping the Passmark single-core rankings table, beating AMD's fastest by 10% or so.
 

winjer

Gold Member

Celcius

°Temp. member
Disappointed that it’s not faster but the improved efficiency and thermals are a very welcome sight. Hopefully I’ll be able to air cool these under full load after all.
 
Last edited:

Chiggs

Gold Member
He was a king in the digital war,
NeoGAF forums, Intel to the core,
Never would admit to the silicon flaws,
13th-gen degradation? He’d ignore the cause.
"Voltage spikes? Nah, it’s running great!"
Leonidas stands tall, won’t debate,
Even as frames start to fall behind,
Brainwashed by Intel, he won't mind!

V-cache smokes him, but he doesn't agree,
“That's just AMD marketing trickery!”
14th-gen’s just playing the game,
Core Ultra’s a joke, just chasing fame,
Slower in gaming, it’s losing the fight,
But Leonidas says, “Intel’s alright!”

Leonidas, Leonidas, Leonidas!
(Leonidas, Leonidas, Leonidas!)
Leonidas, Leonidas, oh, oh, oh Leonidas,
Come and rock me, Leonidas!


"Just wait, Intel’s got tricks up its sleeve,”
But AMD’s 3D stacking makes him grieve,
Numerous reports showing Intel lost its way,
But Leonidas defends them every day!
Pat’s running round, asking for government cheese,
Begging senators to help him, saying "please oh please!"
Pat can't compete with AMD or Arm,
But Leonidas says, “What’s the harm?”

"Voltage spikes? Stop crying, you fool!"

14th-gen’s a dumpster fire, but to him it's "cool."
“Who needs stability when you’ve got the lore?"
"Intel’s for us fanboys, we don’t care about more!”


Leonidas, Leonidas, Leonidas!
(Leonidas, Leonidas, Leonidas!)
Leonidas, Leonidas, oh, oh, oh Leonidas,
Come and rock me, Leonidas!


"Silicon degradation? That’s just noise!"
He'll defend Intel, like they're one of his boys,
Core Ultra 9’s got nothing to prove,
Leonidas gets down to Team Blue's groove,

Leonidas, Leonidas, Leonidas!
(Leonidas, Leonidas, Leonidas!)
Leonidas, Leonidas, oh, oh, oh Leonidas,
Come and rock me, Leonidas!


Intel forever, no need to concede,
Even if Gelsinger’s down on his knees!
 
Last edited:

64bitmodels

Reverse groomer.
I'm at least glad Intel's trying to get better on their power consumption. It's actually a big deal for me as TBH it's their biggest problem besides the heat of their architecture.

Before this I was convinced that Nvidia/Intel builds by 2025 were gonna be using up 1500W of power on average.

Power consumption is genuinely important, I'm sorry. Gaming systems should not need to take up THAT much power. Consoles will always be better than PCs on that front due to the optimization, but it doesn't mean we should be fine with 600W GPUs and 500W CPUs. We need better!
 
Last edited:

Celcius

°Temp. member
Interesting to see both AMD and Intel focusing on improving efficiency this time around even if it means not delivering the performance uplift between generations that we're used to seeing. Things were starting to get out of hand so we were overdue for this imho.
 
... I'm about to defend Intel.. What am I even doing right now.. Sighs.. The entire framing of this article is off base.

This is a comparison of two different Intel CPU families, Ultra and Core. The comparison is of their Desktop grade K series unlocked card, vs a laptop grade AI enabled chip.

it is neat that it's able to reach similar performance specs at a lower power draw considering the difference here. In short, this is a comparison of a Desktop chip vs a Mobile chip. It is stupidly impressive to see the mobile chip getting performance like that.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
I'm at least glad Intel's trying to get better on their power consumption. It's actually a big deal for me as TBH it's their biggest problem besides the heat of their architecture.

Before this I was convinced that Nvidia/Intel builds by 2025 were gonna be using up 1500W of power on average.

Power consumption is genuinely important, I'm sorry. Gaming systems should not need to take up THAT much power. Consoles will always be better than PCs on that front due to the optimization, but it doesn't mean we should be fine with 600W GPUs and 500W CPUs. We need better!
It’s bullcrap to casually dismiss power as not being a big deal. For me, living in an apartment where my entire bedroom runs off one circuit, a few hundred watts matters quite a bit.
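To put that in concrete numbers (assuming a typical North American 15 A / 120 V bedroom circuit; the non-PC loads are just placeholder guesses):

    # Rough headroom check for a single household circuit (assumed 15 A @ 120 V).
    circuit_watts = 15 * 120                 # 1800 W total
    continuous_limit = circuit_watts * 0.8   # ~1440 W, typical continuous-load allowance

    system_watts = 527   # leaked whole-system gaming draw from the 14900K slide
    other_loads = 300    # assumed: monitor, lights, chargers, etc.

    headroom = continuous_limit - (system_watts + other_loads)
    print(f"Remaining headroom on the circuit: ~{headroom:.0f} W")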
 

Bry0

Member
... I'm about to defend Intel.. What am I even doing right now.. Sighs.. The entire framing of this article is off base.

This is a comparison of two different Intel CPU families, Ultra and Core. The comparison is of their Desktop grade K series unlocked card, vs a laptop grade AI enabled chip.

it is neat that it's able to reach similar performance specs at a lower power draw considering the difference here. In short, this is a comparison of a Desktop chip vs a Mobile chip. It is stupidly impressive to see the mobile chip getting performance like that.
I hate to break it to you my man, but the 285K is a desktop part. This is Arrow Lake/15th gen. They are rebranding desktop chips too; the Core name is dead.
 
Last edited:

64bitmodels

Reverse groomer.
It’s bullcrap to casually dismiss power as not being a big deal. For me, living in an apartment where my entire bedroom runs off one circuit, a few hundred watts matters quite a bit.
I mean, I always took it into account when thinking about PCs; it's just that on this forum nobody likes to mention it... at all. Unless it's for the sake of ripping on Intel, and even then people are more pissed about their heat than their power. But Nvidia and AMD's dedicated GPUs have the exact same power consumption issues and nobody will ever bat an eye; even when the new 12VHPWR connectors are frying GPUs and turning them into bricks, nobody gives a shit. Connectors that wouldn't be needed if Nvidia buckled the fuck up and optimized their GPUs' power consumption.
 

Chiggs

Gold Member
I hate to break it to you my man, but the 285K is a desktop part. This is Arrow Lake/15th gen. They are rebranding desktop chips too; the Core name is dead.

That's the greatest exchange in this thread...and really tells the tale.

In other news...
  • Intel fans two months ago: "Zen 5 is shit and won't take the gaming crown."
  • Intel fans today: "Gee whiz, just look at that power and efficiency of the 285k! Who cares if it's not the fastest?!"
Keep it up, cool guys. Nobody will catch on to your brilliant strategy.
 

64bitmodels

Reverse groomer.
That's the greatest exchange in this thread...and really tells the tale.

In other news...
  • Intel fans two months ago: "Zen 5 is shit and won't take the gaming crown."
  • Intel fans today: "Gee whiz, just look at that power and efficiency of the 285k! Who cares if it's not the fastest?!"
Keep it up, cool guys. Nobody will catch on to your brilliant strategy.
>"cool guys"
> Intel fanboys

Choose one
 