
Anandtech: Tech analysis -- PS4 vs Xbox One, PS4/Xbox One's CPU performance

Do you expect MSAA to be used in the coming generation? Really? 32MB is enough for a 1080p framebuffer without MSAA.
Sure. Maybe not in "AAA" games that need to blow performance on assets and effects to compete in bullshots, but in plenty of other games.

I own 5 PS3 games that use 4xMSAA.
 
Sure. Maybe not in "AAA" games that need to blow performance on assets and effects to compete in bullshots, but in plenty of other games.

I own 5 PS3 games that use 4xMSAA.

What I mean is, given the cost of MSAA I don't think it will be the prevalent method of dealing with aliasing. That is beside the point anyway, as Anand's post states that it is enough for a 1080p framebuffer. Including MSAA (much less 4xMSAA) as a measure of the validity of that post is wrong.
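
For anyone wondering where that 32MB figure lands, here is the back-of-the-envelope math. A minimal sketch, assuming one RGBA8 color target plus one 32-bit depth/stencil target (real engines use more render targets than this):

```python
# Rough 1080p framebuffer sizing against a 32MB ESRAM budget.
# Assumes one RGBA8 color buffer + one D24S8 depth/stencil buffer,
# both 4 bytes per pixel; MSAA multiplies both.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4

for msaa in (1, 2, 4):
    color = WIDTH * HEIGHT * BYTES_PER_PIXEL * msaa
    depth = WIDTH * HEIGHT * BYTES_PER_PIXEL * msaa
    print(f"{msaa}xMSAA: {(color + depth) / 2**20:.1f} MB")

# 1xMSAA: 15.8 MB  -> fits in 32MB with room to spare
# 2xMSAA: 31.6 MB  -> just barely squeezes in
# 4xMSAA: 63.3 MB  -> blows the budget outright
```

Which is exactly why the caveat is "without MSAA": a plain 1080p setup fits comfortably, but 4xMSAA alone roughly quadruples the target sizes.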
 
I would if I could. It depends on how much developers are allowed to depend on the exact performance characteristics of the GDDR5 incarnation of the PS4. I assume those would be rather hard to replicate with an entirely different memory type -- remember that MS actually had to dedicate hardware to essentially slow down the SoC version of 360. Even small changes are hard in the console space if they are not designed for from day 1.

Thanks for the answer, informative as always! :)
 
When you see guys in an MS "technical panel" talking about the power of the cloud the very day they present their hardware, you basically know what's up.
Not that we didn't know, but it was easy to believe that Microsoft had the time, the funds, and the little foresight needed to reassess.

Can you smell the next-gen wars in the air yet?

Ahh, how I missed thee!
What war? Do you see a war? All I see is an easy win.
UNLESS Sony does a Microsoft on consumer rights too.



[image: PS4 vs Xbox One .gif]
kudos!
 
Nothing was mentioned about 4K... if this console lasts 8 years, do we know if an upgrade can be done to support it? Didn't the 360 start at 720p and then get patched to support 1080p and HDMI and all that?
 
OK... maybe I should know the answer to this... but what is the deal with going to slower clock speeds? I know we have more cores now, but can't we have higher clock speeds and more cores? Are the lower clock speeds to increase yield? To save power?
 
Nothing was mentioned about 4K... if this console lasts 8 years, do we know if an upgrade can be done to support it? Didn't the 360 start at 720p and then get patched to support 1080p and HDMI and all that?

I think the Microsoft one will last a lot less than 8 years... IMO at least.
 
Can somebody explain to me how this cloud computing will help AI opponents in games? XBONE seems to have this tech and I find it fascinating. At first, I thought it was just jargon to mask some sort of DRM or whatever, but I'm intrigued at how the cloud can actually make gaming better.

I doubt it will be used much for AI because of the latency involved in online communication, but there are many uses for it in games. Just off the top of my head (a rough sketch of the common pattern follows the list), the cloud could be used for:

* "Live look-ins" in sports titles. Imagine you're playing Madden and in the huddle picking your next play during a game in Franchise mode. Up pops a window showing an important play from another game going on at the same time between two computer controlled opponents. All the physics, collisions, and AI for that replay were computed in the cloud while you were playing the last down or two, then the replay file was sent to your Xbox One to be displayed between plays.
* I love the Road to the Show mode in MLB The Show, but there are some games where I spend an inordinate amount of time looking at the "Simulating until your player's next important moment" screen then actually playing. A lot of that simulation could be offloaded to the cloud, which could work through every possible outcome of your current action until your next one so that it loads immediately once your finished with the current one.
* The Forge mode in Halo 4 includes generated lighting for maps. The generation of those lightmaps could be offloaded to the cloud.
* Calculating physics for real-time cutscenes, thus freeing more power in the console for rendering.
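
All four of those fit the same shape: fire off work that isn't latency-sensitive, keep the game loop running, and collect the result whenever it lands. A minimal sketch of that offload pattern, with the job name and timings made up for illustration:

```python
import queue
import threading
import time

# Hypothetical latency-tolerant cloud offload: submit a job,
# keep rendering, pick up the result whenever it arrives.

results = queue.Queue()

def simulate_in_cloud(job):
    time.sleep(2.0)                    # stand-in for round-trip + server compute
    results.put(f"replay for {job!r}")

threading.Thread(target=simulate_in_cloud,
                 args=("Franchise game #7",), daemon=True).start()

while True:
    time.sleep(1 / 60)                 # pretend this is one 60fps gameplay frame
    try:
        print("cloud returned:", results.get_nowait())
        break
    except queue.Empty:
        continue                       # nothing yet; keep playing
```

The key property is that nothing in the frame loop ever waits on the network, which is also why twitch-sensitive stuff like enemy AI is a poor fit.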

How did game OSes go from a few MB to several GB of RAM usage in one generation?

The Wii's entire OS ran in 64MB of RAM, and now the Wii U uses 1GB.

The 360 reserved 32MB for the OS, and now its successor is using 3GB.

It does not use 3GB. Full Windows 8 only needs 2GB for a 64-bit processor. A pared-down version of it is not going to require more. From the article in the OP:

I suspect ~1GB of system memory ends up being carved off for Windows.

I'm just wondering... what happens if you register a code but then break or lose your disc? Is having the disc inside the console required? Or is it like Steamworks, where you can trash your disc as long as you have the code?

As soon as you input the code, you don't need the disc anymore. The game is attached to your account permanently. Since the game is automatically installed to your hard drive, you no longer need to put the disc in to play. Should you ever get a new Xbox One, after logging into your account on the new XO, you can download that game from the Live servers without ever needing the disc.

Nothing was mentioned about 4K... if this console lasts 8 years, do we know if an upgrade can be done to support it? Didn't the 360 start at 720p and then get patched to support 1080p and HDMI and all that?

The Xbox One supports 4K.

Audio and video: 1080p and 4K both supported; 7.1 surround sound.

Source
 
It does not use 3GB. Full Windows 8 only needs 2GB for a 64-bit processor. A pared-down version of it is not going to require more.

Marc Whitten, Microsoft Chief Product Officer, says you're wrong.

If the Xbox One were using just Windows 8 as its OS you might have something, but you're forgetting that the Xbox One has 3 OSes.
 
Price please.
Absolutely. That's what it comes down to.

I think this future-gen hardware race is the same as the current gen...
Nintendo puts out a significantly underpowered console with a gimmick control system, except this time people are not falling for the gimmick hype.
Sony makes a console that is slightly more powerful than Microsoft's, and they'll have the same strategy of trying to hype the difference.

But it all comes down to price. The difference between the top 2 isn't going to justify a huge price difference, and Sony needs to avoid the PS3 pricing nightmare.

Next gen, but same old strategies.
 
Not really, no. But that's not the important point; GDDR5 uses quite a bit more power than DDR3.

However, I don't see an issue. It will still be much less total TDP than a launch PS3, and Sony had no problem cooling that.

And for good reason, as my original PS3 sounds like it's about to take off when I'm doing anything with it, lol. Hopefully both consoles will be much quieter this time.
 
It doesn't seem like the horsepower difference is enough to cause some dramatic difference in multiplatform games.
 
It doesn't seem like the horsepower difference is enough to cause some dramatic difference in multiplatform games.

The difference will probably be quite a bit larger than this gen, unless Microsoft has some bullshit stranglehold on devs that prevents them from taking advantage of the PS4's extra power. Such a gap is pretty big.
 
Yep, it looks like the PS4 is going to be a pretty large console. I wonder how they will handle the power supply: internal or external?
 
PS4 has 50% more GPU power.
If the Xbox One only has 5GB available for games and the PS4 has 7GB, that's 40% more RAM too.
How many CPU cores are being reserved on each machine?


And how about Kinect? 2Gb/s is a lot of data - is it known whether Kinect is doing processing onboard, or using the Xbox to do it again?
 
Specs didn't matter in any of the previous generations. Consoles will be judged on the games they have and don't have.

Yes, and unless MS starts moneyhatting, they are in serious trouble. How can they compete with all of Sony's first and second party studios?
 
With these piece-of-shit specs from both consoles, I'm more tempted than ever to jump into PC gaming. The gulf between PC and console has never been so wide, and the cost to build a beast of a PC never so low.

Damn those exclusive games.

And the architectures have never been so similar, so ports should be a small enough cost to be a no-brainer.
 
Didn't Gabe (and it was a pretty long and good piece) say during the DICE tech summit that cloud stuff related to gaming was actually very bad?
 
Marc Whitten, Microsoft Chief Product Officer, says you're wrong.

If the Xbox One were using just Windows 8 as its OS you might have something, but you're forgetting that the Xbox One has 3 OSes.

It's not 3 OSes in the traditional sense.

OK... maybe I should know the answer to this... but what is the deal with going to slower clock speeds? I know we have more cores now, but can't we have higher clock speeds and more cores? Are the lower clock speeds to increase yield? To save power?

Yes. That's why you have netbook CPUs rather than desktop-grade CPUs.
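
To put numbers on the power angle: a common first-order model is that dynamic power scales with frequency times voltage squared, and higher clocks generally demand higher voltage, so power climbs much faster than clock speed. A toy calculation, with voltages invented purely for illustration (not real Jaguar figures):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# The voltage values are hypothetical; the point is the shape
# of the curve, not the absolute numbers.

points = [(1.6, 0.90), (2.0, 1.05)]   # (GHz, volts) -- made up
base_f, base_v = points[0]

for f, v in points:
    rel = (f / base_f) * (v / base_v) ** 2
    print(f"{f} GHz @ {v} V -> {rel:.2f}x power")

# 1.6 GHz -> 1.00x
# 2.0 GHz -> 1.70x   (25% more clock for ~70% more power)
```

That curve matches the Kabini data quoted further down the thread: going from 1.6GHz to 2GHz costs a ~66% TDP increase for a 25% clock bump. Lower clocks also help yields, but power and heat in a small console box are the headline reason.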

What I mean is, given the cost of MSAA I don't think it will be the prevalent method of dealing with aliasing. That is beside the point anyway, as Anand's post states that it is enough for a 1080p framebuffer. Including MSAA (much less 4xMSAA) as a measure of the validity of that post is wrong.

Unfortunately I see post-AA being even more common this gen.

Yes, and unless MS starts moneyhatting, they are in serious trouble. How can they compete with all of Sony's first and second party studios?

The same way they competed this previous generation.
 
Marc Whitten, Microsoft Chief Product Officer, says you're wrong.

No he hasn't. Every source claiming he has links back to the same Game Informer interview where he never comments on the split.

If the Xbox One were using just Windows 8 as its OS you might have something, but you're forgetting that the Xbox One has 3 OSes.

It has 2 OSes and uses a custom hypervisor to switch between them. The "3 OS" talk was marketing fluff. The 2 OSes are "Windows" and "Xbox."

Didn't Gabe (and it was a pretty long and good piece) say during the DICE tech summit that cloud stuff related to gaming was actually very bad?

He was referring to streaming the entirety of the game from the cloud, like OnLive and Gaikai. Not offloading some tasks to the cloud.
 
It seems with everything I am reading that this 8GB of GDDR5 in the PS4 is a really big deal. If Microsoft was banking on the PS4 using 4GB of GDDR5, I would have loved to see the looks on their faces when Cerny said 8GB.

With Sony having their financial woes, I was expecting the new Xbox to be more powerful. I think it's a case of Sony knowing what they lost this gen; with their backs against the wall, they're coming out swinging this time. I hope they both do well, because they push each other so hard and we get amazing stuff.
 
Sure. Maybe not in "AAA" games that need to blow performance on assets and effects to compete in bullshots, but in plenty of other games.

I own 5 PS3 games that use 4xMSAA.

What are they?

And you will probably end up owning about the same # of PS4 games that do at the end of next gen.
 
No he hasn't. Every source claiming he has links back to the same Game Informer interview where he never comments on the split.

Yup, I can confirm. He declines to comment (as you'd expect; the OS reserves on 360/PS3 were never publicly revealed).

I think the split is 5-3, but I also think that's the PS4 split too. I think both can be reduced over time; I'd suspect they settle around 2GB in the end, but it could take months/years to get there.
 
So much generosity with language. Their focus is more "broad." Lol.

Price please.

What does it even mean, anyway? Isn't the whole point of hardware the following two things:

1. Make it price-conscious so consumers can jump in at a convenient price point
2. Make it easy to develop for, with a wide range of tools and the most possible power given your pricing strategy

Depending on which you're aiming for the most, your strategy will be different... but the PS4's philosophy is as broad as it gets, since it gives the most developers/indies the greatest number of potential tools and the most power. It's that simple, until we find out about price points.

If the Xbox One were $299.99 and the PS4 were $599.99, I'd say Microsoft is aiming at a much broader market... but that's as close as I'd get to this dude's point.
 
In the Gamasutra interview, the guy who was on stage at the Sony event was asked why they went with GDDR5. He said it was to eliminate the memory optimization step for the developer. He mentioned that going with the Microsoft method of DDR3 + ESRAM would have given them a similar performance result. I don't think the RAM difference will really have much of an effect. The GPU and how much RAM is available will.
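
For context on that tradeoff, here's the rough bandwidth arithmetic. The figures are the widely reported launch specs, so treat the exact numbers as my assumptions rather than anything from the interview:

```python
# Peak theoretical bandwidth = bus width (bytes) x transfer rate.
# Figures are the commonly reported launch specs -- assumptions
# for illustration, not official numbers.

def bandwidth_gb_s(bus_bits, mt_per_s):
    return (bus_bits / 8) * mt_per_s / 1000

ps4_gddr5 = bandwidth_gb_s(256, 5500)   # ~176 GB/s, one unified pool
xb1_ddr3  = bandwidth_gb_s(256, 2133)   # ~68 GB/s main RAM
xb1_esram = 102.0                       # reported ESRAM peak, 32MB only

print(f"PS4 GDDR5: {ps4_gddr5:.0f} GB/s")
print(f"XB1: {xb1_ddr3:.0f} GB/s DDR3 + {xb1_esram:.0f} GB/s ESRAM (32MB)")
```

The combined peaks land in the same ballpark, which is presumably the "similar performance result" he meant; the catch is that the fast ESRAM pool is only 32MB, so deciding what lives in it is exactly the optimization step Sony wanted to skip.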


Yep, it looks like the PS4 is going to be a pretty large console. I wonder how they will handle the power supply: internal or external?

External. Faster clock speeds and GDDR5 will create a lot more heat unless Sony plans to underclock the RAM. DDR3 gets warm; GDDR5 gets hot to handle. Hopefully Sony has a good yet very quiet cooling solution.

Sony has walked right into the high-cost console again, which they said they needed to avoid this time around. Oh well.
 
[image: Kabini die shot]


[image: benchmark chart]


In both our Xbox One and PS4 articles I referred to the SoCs as using two Jaguar compute units - now you can understand why. Both designs incorporate two quad-core Jaguar modules, each with their own shared 2MB L2 cache. Communication between the modules isn’t ideal, so we’ll likely see both consoles prefer that related tasks run on the same module.

Looking at Kabini, we have a good idea of the dynamic range for Jaguar on TSMC’s 28nm process: 1GHz - 2GHz. Right around 1.6GHz seems to be the sweet spot, as going to 2GHz requires a 66% increase in TDP.

The major change between AMD's Temash/Kabini Jaguar implementations and what's done in the consoles is really all of the unified memory addressing work and any coherency that's supported on the platforms. Memory buses are obviously very different as well, but the CPU cores themselves are pretty much identical to what we've outlined here.

Sources:
http://www.anandtech.com/show/6976/...wering-xbox-one-playstation-4-kabini-temash/5
http://www.anandtech.com/show/6974/amd-kabini-review/3

So the 2GHz rumor looks unlikely at this point.
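
The practical upshot of "communication between the modules isn't ideal" is scheduling: keep threads that share data on the same quad-core cluster so they hit the same 2MB L2. A minimal sketch of how you might express that on a Linux dev box, assuming cores 0-3 are one module and 4-7 the other (the console SDKs expose their own affinity APIs):

```python
import os
import threading

# Hypothetical core layout mirroring two Jaguar clusters:
# cores 0-3 share one 2MB L2, cores 4-7 share the other.
MODULE_0 = {0, 1, 2, 3}
MODULE_1 = {4, 5, 6, 7}

def worker(name, cores):
    # Pin the calling thread to one cluster so threads that share
    # data never pay the cross-module penalty (Linux-only call).
    os.sched_setaffinity(0, cores)
    print(f"{name} pinned to cores {sorted(cores)}")

# Related tasks (say, physics + collision) stay together on module 0;
# independent work (say, audio) lands on module 1.
threading.Thread(target=worker, args=("physics", MODULE_0)).start()
threading.Thread(target=worker, args=("audio", MODULE_1)).start()
```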
 
I also wanted to throw out an "lol" at the "hardware" people who threw around suggestions like "Oh, ARM can compete with that" or "Project Denver blah blah":

Anandtech said:
In its cost and power band, Jaguar is presently without competition. Intel’s current 32nm Saltwell Atom core is outdated, and nothing from ARM is quick enough. It’s no wonder that both Microsoft and Sony elected to use Jaguar as the base for their next-generation console SoCs, there simply isn’t a better option today. As Intel transitions to its 22nm Silvermont architecture however Jaguar will finally get some competition. For the next few months though, AMD will enjoy a position it hasn’t had in years: a CPU performance advantage.
 
Welp, the Jag CPU is pretty light indeed.

At least according to Intel's page, that i5 has Hyper-Threading for 4 threads, so that makes it a little less bad for the Jag.
 