mysteriousmage09
Less performance than an old Desktop CPU.
That's why devs are putting tons of tasks the CPU usually does onto the GPU.
BUT MY OLD COMPUTER FROM A THOUSAND YEARS AGO WAS AT 2GHZ AND THAT WAS A PENTIUM 4 WHAT GIVES
Yeah wtf, 1.6 GHz!? So the new consoles have half the power of the old consoles!?!? #yolo
So an average quad core desktop i5 from last year would outperform an 8-core Jaguar at 1.6 GHz by a factor of 3 or so. As expected.
I guess the best thing you can say about these CPUs is that they are "good enough".
Just a question: does Jaguar x 4 mean four cores, or is the total 8 because each Jaguar has 2 cores?

As always when looking at post-Bulldozer AMD designs, it's important to realize that AMD's "8 cores" consists of 8 INT and 4 FP units. Intel's 4-core designs have 4 INT and 4 FP units. Guess what games like to hammer on? That's right, the FP units. So really what you're saying is,

"So an average 4-FP i5 from last year would outperform a 4-FP Jaguar at 1.6 GHz by a factor of 3 or so. As expected."

and that would be about right, since Intel crushes AMD in FP grunt and has since the Core 2 days.
Jag is not Bulldozer based; it should have FP hardware per core. Jaguar doesn't have a shared FP unit like Bulldozer. Each core has its own INT and FP unit.
No offense, but some of you guys really don't have any clue about this stuff.
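For reference, a rough back-of-envelope on peak single-precision throughput for the two CPUs being compared. The per-cycle FLOP figures, core counts, and the ~3.4 GHz i5 clock are commonly cited estimates assumed here, not numbers from the article:

```python
# Back-of-envelope peak SP FLOPS (assumed figures, not measurements):
# - each Jaguar core: ~8 SP FLOPs/cycle (two 128-bit FP pipes)
# - each Sandy/Ivy Bridge i5 core: ~16 SP FLOPs/cycle with 256-bit AVX
def peak_gflops(cores, ghz, flops_per_cycle):
    """Theoretical peak = cores x clock x FLOPs retired per cycle."""
    return cores * ghz * flops_per_cycle

jaguar_8c = peak_gflops(8, 1.6, 8)    # ~102 GFLOPS
i5_4c = peak_gflops(4, 3.4, 16)       # ~218 GFLOPS

print(f"8-core Jaguar @ 1.6 GHz: {jaguar_8c:.1f} GFLOPS")
print(f"4-core i5 @ 3.4 GHz:     {i5_4c:.1f} GFLOPS")
print(f"Paper ratio: {i5_4c / jaguar_8c:.1f}x")  # ~2x on paper
```

On paper that's only about 2x, so the "factor of 3 or so" estimate presumably also leans on the i5's higher IPC and turbo clocks, not just raw FP width.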
How does the GPU compare to video cards from the last few years anyway?

I'm glad the CPUs are so weak. I won't have to replace my i5 2500K for a looooooooooong time.
It's silly to look only at the CPU and call it a day. You should consider the whole picture: what kind of tasks it's supposed to perform, and which tasks have been moved to dedicated hardware. For example audio, which is a total performance hog, has its own dedicated hardware in these machines. Compute assistance from the GPU has been getting bigger over the last couple of years (also on PC), so these machines have been designed with that in mind. The low power usage of the CPU frees up room for improvements in other areas like RAM and the GPU.
Cool... so it's half of the PS4/XO CPU, without GDDR5 or bus optimizations, at the same clock speed.
Audio in most games is anything but a big drain on system resources. The GPU is only good for massively parallel tasks; it is not a be-all and end-all fix for poor CPU performance.
Audio on the 360 could have a whole CPU core dedicated to it. How is that not a drain? 33% is pretty huge imo. GPU compute might not fix everything, but it does perform tasks the CPU would otherwise have to do, again freeing up the CPU for other things.
The difference will probably be quite a bit larger than this gen, unless Microsoft has some bullshit stranglehold on devs that prevents them from taking advantage of the PS4's extra power. Such a gap is pretty big.
GDDR5 is not going to help a CPU; they already get tons of bandwidth.
That was only one game and an exception to the rule.
the 360 cpu was shit though
You seem to know more about the subject than me. How much of the CPU's resources do you think are dedicated to audio on average?
So are these CPUs, so we are lucky they don't have to perform those tasks, right? Audio is pretty crappy on the 360 when you compare it to the PS3; at least we are spared devs having to make cuts there.
Killzone 3's CPU audio usage:
http://images.eurogamer.net/articles//a/1/2/9/7/6/2/3/KZ_Jobs.bmp.jpg
(likely the yellow bar on SPU5)
But that is Cell; it's a beast at decoding this stuff. I reckon it will be different on x86-based chips.
Well, the PS4 GPU is 1.84 TFLOPS and the new GTX 780 is 3.98 TFLOPS. Raw power-wise, this is the biggest gap between new consoles and PC hardware ever, I think.
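For anyone wondering where those numbers come from: peak throughput is quoted as shaders × clock × 2 FLOPs (a fused multiply-add counts as a multiply plus an add). A quick sketch, assuming the commonly quoted specs of 1152 shaders at 800 MHz for the PS4 GPU and 2304 CUDA cores at 863 MHz for the GTX 780:

```python
# Peak TFLOPS = shaders * clock * 2 FLOPs per cycle (one FMA = a multiply + an add).
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(f"PS4 GPU: {tflops(1152, 800):.2f} TFLOPS")  # ~1.84
print(f"GTX 780: {tflops(2304, 863):.2f} TFLOPS")  # ~3.98
```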
So the 2 GHz rumor looks unlikely at this point.
Ridiculous. Even with that increase, 8 cores @ 2 GHz would only be ~30W. That's peanuts compared to any desktop CPU or the last generation.
A 30W increase for home consoles is devastating.
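A rough sketch of how you get to roughly 30W, assuming the CPU cores account for about 10W of the quad-core A4-5000's 15W TDP (the rest going to the integrated GPU and uncore) and that power scales roughly linearly with core count and clock, ignoring any voltage bump:

```python
# Back-of-envelope CPU power scaling from a 4-core 1.5 GHz Jaguar APU to 8 cores @ 2.0 GHz.
a4_5000_cpu_share_w = 10.0   # assumed CPU-core share of the 15 W APU TDP, not a published figure
core_scale = 8 / 4           # 4 cores -> 8 cores
clock_scale = 2.0 / 1.5      # 1.5 GHz -> 2.0 GHz, assuming no voltage change

estimate_w = a4_5000_cpu_share_w * core_scale * clock_scale
print(f"Estimated 8-core Jaguar @ 2.0 GHz: ~{estimate_w:.0f} W")  # ~27 W
```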
The A4-5000 is a 4-core APU. The one in the PS4/XONE is 8-core and custom-designed to get more performance. Misleading article, since with time developers will have better tools for these APUs.