
Intel admits Core Ultra 9 285K will be slower than i9-14900K in gaming

Parity lol
 

Zathalus

Member
Those power and temperature reductions are very impressive. The chip seems to compete quite favourably with regular Zen 5 in most respects. With the boost Zen got from the recent Windows patches it will probably end up a bit slower overall, but the difference should be minimal.

Frankly I would say you can't go wrong with either Intel or Zen 5 for a gaming CPU… except that AMD is going to release X3D chips soon and that is going to be faster than regular Zen 5 and Intel by quite a bit.

Honestly any gamer skipping the X3D chips in favour of this (or regular Zen 5) is doing themselves a disservice.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
except that AMD is going to release X3D chips soon and that is going to be faster than regular Zen 5 and Intel by quite a bit.
I don't know about quite a bit, but I could see Zen 5 X3D being anywhere from 5-15% faster, which isn't exciting for me.

There is no game I've seen where the 7800X3D meaningfully improves over my current CPU. And my CPU trounces it in MT.

Honestly any gamer skipping the X3D chips in favour of this (or regular Zen 5) is doing themselves a disservice.
X3D doesn't make sense for the majority of gamers. Most gamers don't need more than a 7600X or 13600K.

Most gamers aren't playing games with a 4090 on a 540 Hz 1080p monitor.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Right, the gaming crown is only important when Intel holds it. Just like multithreading performance didn't matter during the Zen 2 era, but is now super important again.
Gaming crown only mattered back when AMD couldn't even hit a stable 60 in certain games.

Today, with high refresh rates and the top CPUs all within 5-10% of each other (X3D included), it's meaningless to me. I myself game at 120 Hz; why would I pay a premium for an extra 5-15% when my current CPU already reaches 120 Hz in 90% of games?
 

Superbean

Member
I love the wiggle words, slimy language and constant goalpost moves. I'm going to be a conspiracy theorist and say it's a brilliant troll written with ChatGPT. There's no way a human could be so utterly devoted to a company whilst simultaneously devoid of any intellectual weight. You have many fooled, my guy. Absolutely brilliant show.
 

Zathalus

Member
I don't know about quite a bit, but I could see Zen 5 X3D being anywhere from 5-15% faster, which isn't exciting for me.

There is no game I've seen where the 7800X3D meaningfully improves over my current CPU. And my CPU trounces it in MT.


X3D doesn't make sense for the majority of gamers. Most gamers don't need more than a 7600X or 13600K.

Most gamers aren't playing games with a 4090 on a 540 Hz 1080p monitor.
If you don't see the need to upgrade then don't, but if a gamer is looking for a new platform then Zen 5 X3D simply makes the most sense right now from a performance standpoint.

I suppose Intel might win on the price/performance front. With the power draw and thermals not being stupid anymore, Intel could be a good budget option.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
I suppose Intel might win on the price/performance front. With the power draw and thermals not being stupid anymore, Intel could be a good budget option.
And I've always been about price/performance, which is why I went with a 13600K and overclocked both it and the memory, giving me better-than-7800X3D MT with ~92% of the 7800X3D's gaming performance.

Put the two CPUs side by side in my setup and there wouldn't even be a performance difference between the two, since I game at 1440p with a mid-range GPU and only go up to 120 FPS with VRR.

WELCOME BACK LEONIDAS

[Sparta GIF]
Thanks :messenger_sun:

But I must bid you adieu, goodbye and goodnight.
 

Schnauzer

Member
I actually prefer Intel. I was originally getting faster performance in the games I play on the 14900K. The reason I switched to the 7950X3D was that the 14900K I got turned out to be defective within a month (which is now known to be a more mainstream problem) and Micro Center had the 7950X3D in stock at the time. I had a lot of performance issues with the 7950X3D. These problems were only recently fixed for the Ryzen 9s... One issue was resolved thanks to the JayzTwoCents video that discovered we needed to change the BIOS to prefer drivers. It's crazy that it took a whole generation to tell us we needed to make this change. That alone made me happy with my CPU's performance. Then we had the Windows update that improved my performance even further. A year later, I'm now a happy 7950X3D owner.

Now we have a disappointing CPU and GPU generation unfolding. It seems the Intel chip will perform much worse than my current CPU while drawing more watts. The AMD 9000-series generation has been a disappointment, and now the 9950X3D will not launch until next year. This has been the weirdest CPU generation I can remember. Has anyone else had a similar experience?
 

FireFly

Member
Gaming crown only mattered back when AMD couldn't even hit a stable 60 in certain games.

Today, with high refresh rates and the top CPUs all within 5-10% of each other (X3D included), it's meaningless to me. I myself game at 120 Hz; why would I pay a premium for an extra 5-15% when my current CPU already reaches 120 Hz in 90% of games?
During the Zen 2 era, games were made with netbook CPUs in mind and even a 3700X was enough to get over 100 FPS. Now we have games like Microsoft Flight Simulator, BG3, Dragon's Dogma 2 and Elden Ring, which can drop below 60 FPS on the fastest CPUs, and a bunch more that are stuck below 100 FPS. Of course, now that we actually need the extra performance it doesn't matter, but back when games were getting hundreds of FPS, the 10% or so that the 3700X gave up to the 9900K was terribly important.
 

marquimvfs

Member
Seems that Arrow Lake is (probably) the fastest gaming CPU architecture of 2024,
My prediction of Arrow Lake taking the 2024 gaming crown increases even further :messenger_sun:
Intel has to regress to not become the gaming CPU champion.
Will be a disaster if this comes out and trades blows with 1.5-2 year old CPUs, like Zen 5.
Well, there you go.

Disappointed that there was no gaming performance improvement, but gaming is only one aspect when it comes to CPUs.
Gaming crown only mattered back when AMD couldn't even hit a stable 60 in certain games.
You said that, but back when you believed in your predictions, gaming was all that mattered.
I just don't see how this is relevant to Gaming Discussions. Gaming had basically nothing to do with these things.
 

Celcius

°Temp. member
If the 285K is slower than the 14900K in gaming then it's going to be slower than Zen 4 X3D in gaming too. I'm extremely interested to see how sales go for Arrow Lake at launch...
 

Allandor

Member
You seem very ignorant when it comes to this subject... you taught me nothing, yet again.
Also, Intel has ignored its own official memory speed numbers in every test so far, while the competition always gets slower speeds because of official support limits. If I remember correctly, Raptor Lake numbers were also presented with faster-than-DDR5-6000 memory, while Ryzen on the same chart was measured at its official limits.
 

Allandor

Member
Maybe so, but it remains to be seen.

But it does seem Arrow Lake will benefit from faster RAM. DDR5-8000 is expected to boost Arrow Lake by another 5%.

At any rate, no matter which CPU I upgrade to, Arrow Lake or X3D, we're only talking at best a 5-13% increase over my overclocked 13600K. A very small number, and not something I care about. But I will end up upgrading anyway, because in addition to faster gaming I will also get much better MT and ST, and probably much better emulation as well, since Arrow Lake appears to have the highest ST of any desktop CPU in 2024.
Don't forget that this will also hurt efficiency a lot. Higher-clocked memory normally increases the energy consumption of the CPU as well, since the memory controller is inside it.

It will be interesting to see what the missing Hyper-Threading does in real tests.
 

Monad

Member
X3D doesn't make sense for the majority of gamers. Most gamers don't need more than a 7600X or 13600K.

Most gamers aren't playing games with a 4090 on a 540 Hz 1080p monitor.

This is true for those of us who play in 4K.

I also have a 13600K along with the 4090, and at no point have I experienced CPU bottlenecks when playing at that resolution.

In any case, this is only relevant to those who are in the competitive scene or play at 1080p (or maybe 1440p).
 

GHG

Member
This is true for those of us who play in 4K.

I also have a 13600K along with the 4090, and at no point have I experienced CPU bottlenecks when playing at that resolution.

In any case, this is only relevant to those who are in the competitive scene or play at 1080p (or maybe 1440p).

Even at 1440p it's pretty difficult to be CPU bottlenecked unless you have one of the top 3-4 GPUs. The quickest way to find out if you're bottlenecked is to find your GPU in the techpowerup.com benchmarks for the game you want to play and make note of the FPS (using Spider-Man as an example):

[Chart: Spider-Man Remastered GPU benchmarks, 2560x1440]


Then you want to pair that GPU with a CPU that achieves an equal or better FPS on the chart below:

[Chart: Spider-Man CPU benchmarks, 2560x1440]


So for the 4080, in that game at 1440p, you need a CPU that is capable of doing 190 FPS and above (so the 13700K as a minimum). You can also see how many frames you're giving up in that specific scenario with various lower-tier CPUs.

Not an exact science, but it should give you a rough idea. Just use the same methodology and cross-reference the charts for the specific games you'd like to check, at the resolution you game at, and it will show you what CPU you need in order to prevent your GPU from being bottlenecked at that resolution:
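To make the method concrete, here's a minimal sketch of that cross-referencing logic in Python. The FPS numbers are placeholders standing in for values you'd read off the GPU and CPU charts for your game and resolution, not real benchmark data:

```python
# Cross-reference sketch: a CPU avoids bottlenecking a GPU when the CPU
# chart shows it can deliver at least as many FPS as the GPU chart does.

GPU_AVG_FPS = 190  # e.g. what the GPU chart shows for a 4080 at 1440p

# Placeholder CPU-chart numbers for the same game/resolution.
CPU_AVG_FPS = {
    "13900K": 210,
    "13700K": 195,
    "13600K": 180,
    "7600X": 170,
}

for cpu, fps in sorted(CPU_AVG_FPS.items(), key=lambda kv: kv[1], reverse=True):
    if fps >= GPU_AVG_FPS:
        print(f"{cpu}: {fps} FPS -> won't hold the GPU back")
    else:
        print(f"{cpu}: {fps} FPS -> gives up ~{GPU_AVG_FPS - fps} FPS (CPU-bound)")
```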


 

Xyphie

Member
The official max memory speeds that Intel publishes are always 1DPC. For example, the 14900K has a max speed of 5600 MT/s for 1 DIMM per channel, but a speed of 5200 MT/s for 2 DIMMs per channel.
So for using dual-channel memory, the official value is 5200 MT/s, not 5600 MT/s.

?

If you populate both channels with 1 DIMM it's still just one DIMM per channel. The lower specs are only for when using 2R DIMMs or using 4 DIMMs in total.
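If it helps, here's a toy sketch of that distinction in Python, using the speeds quoted above purely as placeholders (check Intel's datasheet for the real matrix for your CPU and board):

```python
# Toy model of the supported-speed rule described above: the full rated
# speed only applies at one single-rank DIMM per channel; dual-rank DIMMs
# or four DIMMs total drop you to a lower bin.

def supported_ddr5_speed(total_dimms: int, dual_rank: bool) -> int:
    """Officially supported DDR5 speed in MT/s (placeholder numbers)."""
    dimms_per_channel = total_dimms / 2   # desktop boards have two channels
    if dimms_per_channel <= 1 and not dual_rank:
        return 5600                       # 1DPC, single-rank: full speed
    return 5200                           # 2R DIMMs or 2DPC: lower bin

print(supported_ddr5_speed(2, dual_rank=False))  # 5600 -- dual channel is still 1DPC
print(supported_ddr5_speed(4, dual_rank=False))  # 5200 -- four DIMMs = 2DPC
```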
 
IDK how reliable this is, but if real:



Perf is pretty much on par, but the CPU is running anywhere from 10 to 20 °C hotter than its 14th-gen counterpart. Without knowing specifics it's hard to judge, but... it doesn't look good.
 

GHG

Member
latency worse than Zen 1

Expected.

From a marketing perspective, making ultra-high-frequency RAM able to run with these new boards/chips sounds great. That is, until you realise you'll forever be banished to Gear 2 or even Gear 4. Until they sort out the memory controller so it's able to run high-frequency DDR5 in Gear 1, it will always be all show and no go.
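For anyone unfamiliar with the gears, here's a rough, illustrative sketch (my own example numbers, not from any chart in this thread): DDR5's transfer rate is twice the DRAM clock, and in Gear N the memory controller runs at the DRAM clock divided by N, which is where the latency penalty creeps in:

```python
# Illustrative gear-ratio arithmetic: DDR5 does two transfers per DRAM
# clock, and in Gear N the memory controller runs at DRAM clock / N.

def imc_clock_mhz(ddr5_mt_s: int, gear: int) -> float:
    dram_clock = ddr5_mt_s / 2   # MT/s -> DRAM clock in MHz
    return dram_clock / gear     # controller clock falls as the gear rises

for speed in (6000, 8000):
    for gear in (1, 2, 4):
        print(f"DDR5-{speed}, Gear {gear}: IMC at {imc_clock_mhz(speed, gear):.0f} MHz")
```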
 

winjer

Gold Member
105 ns is impossible. That is close to GDDR6 latency.
Probably an AIDA64 bug, or some memory OC to break bandwidth records, sacrificing latency.
 

winjer

Gold Member
Definitely OC'd RAM, look at those bandwidth numbers. (It's over 9000) MT/s

I doubt it's 9000 MT/s memory.
For example, 8000 MT/s gives a max theoretical value of 128 GB/s.
A properly tweaked memory kit will do 90-95% efficiency.
So my bet is that this was using an 8000-8200 MT/s memory kit.
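A quick sketch of that arithmetic: dual-channel DDR5 has a 2 x 64-bit = 16-byte bus, so peak bandwidth is simply the transfer rate times 16 bytes. The efficiency range is the 90-95% figure from the post above:

```python
# Peak DDR5 bandwidth arithmetic: MT/s x 16 bytes per transfer (dual
# channel, 2 x 64-bit) gives MB/s; divide by 1000 for GB/s.

def ddr5_peak_gbs(mt_per_s: int, bus_bytes: int = 16) -> float:
    return mt_per_s * bus_bytes / 1000

peak = ddr5_peak_gbs(8000)           # 128.0 GB/s, the figure quoted above
print(peak)
print(0.90 * peak, 0.95 * peak)      # ~115-122 GB/s at 90-95% efficiency
```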
 

Bojji

Member
Waiting for the GN title "Waste of Sand"

Is this the standard in-game bench from the menu?

It's GPU limited in most cases, both GPUs are on a similar level, but you have this FSR/DLSS auto bullshit left in. They should have used 720p native and that's it.

Shit test to be honest.
 

Bojji

Member
Yeah, you can stay until Zen 6, but I'm waiting for the 9800X3D reviews.

Those reviews were deleted from OC3D; seems they thought today was review day.

I love that power consumption is in "hahaha" territory again, ugh.

It's BAD for everyone that we have a weak Intel launch...

Where is Leonidas?
 

winjer

Gold Member
:pie_roffles: :pie_roffles: :pie_roffles: :pie_roffles: Meme Lake

I think Factorio is one of those games that really likes low system memory latency.
That the 285K is doing so badly might mean that its memory latency is indeed terrible.

Still, I want to see more reviews before drawing any conclusions.
 