EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

I thought the rumour was the PS4 used 2 cores for its OS, although I am sure all this can change even after release.

Yeah, I wouldn't get too wrapped up in either new console's OS footprint this early. Those requirements are fluid and very subject to being revised via updates; they were last gen, and there's even more room for that this time around.
 
Could you explain to me, first of all, why we are talking about the DMEs as if they were some special sauce that could leverage the Xbone's architecture?

The PS4 doesn't have a split memory system. It doesn't need to move stuff around besides the initial data load from the drive. I don't see how DMEs can leverage anything, since the PS4's architecture is the best case for a DME: the one where you don't need to move anything.

As for PRT, wasn't that feature added to AMD GPUs a long time ago, and don't all their GPUs now support it in hardware?

If you are talking about a different type of PRT, is this solution strictly better, or is it a wash? Or are we talking about some technique that offers no real improvement over standard PRT?

Also, how could having 3 display planes leverage anything? Isn't that just a feature to accommodate running apps in the background, plus having the game UI at native resolution regardless of the game's internal rendering resolution? Thanks to that, the internal resolution could be below 1080p, for example, and your UI would still look clear without low-res blur.

Also, why did you mention the UI as something the PS4 has to deal with (with its ROPs), as if it were something expensive for the hardware? Is UI rendering expensive for hardware?

If not you, can someone else explain the above to me?

Most of this conversation comes back to whether the move engines reorder data using a space fitting curve; that's a different form of tiling/swizzling.

PRTs are something else; both consoles support them, and both have to manage them on the CPU.
 
Is that even a good feature? If you can't get a game to run at 1080p, then what is the point? Anything below that is, frankly, a blurry/jaggy mess.

I believe Killzone Mercenary uses it too: one plane for the HUD and one for the weapon models.
If the stories pulled out of someone's ass are true, Halo 5 will also use it: the background will be 1920x720 and the HUD plus the weapon and main character at 1080p.
 
Most of this conversation comes back to whether the move engines reorder data using a space fitting curve; that's a different form of tiling/swizzling.

PRTs are something else; both consoles support them, and both have to manage them on the CPU.

Dude. If you don't send a PM to bishoptl and discuss your background knowledge soon, you're going to get permabanned. He is not kidding.
 
I think we're all getting our wires crossed with regard to tiling/PRT/swizzling and swizzling :)

Isn't the tiling/PRT/swizzling you're talking about mentioned in the wiki article below? I'm sure that excerpt is not talking about vector/tuple swizzling. You were talking about how it needs to be done on the CPU for the PS4, but it seems like all modern hardware accelerates that on the GPU.

Unless you know about the special types of tiling done in the Xbone's move engines. Which are a black box, btw.

From: http://en.wikipedia.org/wiki/Tiled_rendering

Tiled rendering is the process of subdividing (or tiling) a computer graphics image by a regular grid in image space to exploit local spatial coherence in the scene and/or to facilitate the use of limited hardware rendering resources later in the graphics pipeline.

Tiled rendering is sometimes known as a "sort middle" architecture.[1]

In a typical tiled renderer, geometry must first be transformed into screen space and assigned to screen-space tiles. This requires some storage for the lists of geometry for each tile. In early tiled systems, this was performed by the CPU, but all modern hardware contains hardware to accelerate this step.
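To make that binning step concrete, here's a rough sketch in C of assigning triangles to screen-space tiles. Purely illustrative: the tile size, screen dimensions, and data structures are all made up by me, not from any console documentation, and error handling is omitted for brevity.

    #include <stdlib.h>

    /* Illustrative numbers only: tile size and screen dimensions are arbitrary. */
    #define TILE_SIZE 16
    #define SCREEN_W  1280
    #define SCREEN_H  720
    #define TILES_X   (SCREEN_W / TILE_SIZE)   /* 80 */
    #define TILES_Y   (SCREEN_H / TILE_SIZE)   /* 45 */

    typedef struct { float x0, y0, x1, y1; int id; } Tri;  /* screen-space bbox */
    typedef struct { int *ids; int count, cap; } TileList; /* per-tile geometry */

    static TileList tiles[TILES_Y][TILES_X];

    static void tile_push(TileList *t, int id) {           /* grow-on-demand list */
        if (t->count == t->cap) {
            t->cap = t->cap ? t->cap * 2 : 8;
            t->ids = realloc(t->ids, t->cap * sizeof *t->ids);
        }
        t->ids[t->count++] = id;
    }

    /* Bin a triangle into every tile its bounding box overlaps, clamped
       to the screen. */
    void bin_triangle(const Tri *tri) {
        int tx0 = (int)tri->x0 / TILE_SIZE, ty0 = (int)tri->y0 / TILE_SIZE;
        int tx1 = (int)tri->x1 / TILE_SIZE, ty1 = (int)tri->y1 / TILE_SIZE;
        if (tx0 < 0) tx0 = 0;
        if (ty0 < 0) ty0 = 0;
        if (tx1 >= TILES_X) tx1 = TILES_X - 1;
        if (ty1 >= TILES_Y) ty1 = TILES_Y - 1;
        for (int ty = ty0; ty <= ty1; ty++)
            for (int tx = tx0; tx <= tx1; tx++)
                tile_push(&tiles[ty][tx], tri->id);
    }

A renderer then walks each tile's list in turn, which is what lets it work on one small tile's worth of pixels in fast memory at a time.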
 
Most of this conversation comes back to whether the move engines reorder data using a space fitting curve; that's a different form of tiling/swizzling.

PRTs are something else; both consoles support them, and both have to manage them on the CPU.

Then the question is: why is it important?

And how does this change anything in DME vs. no moving at all? Any take -> reorder -> place step on data will be slower than not moving the data at all. As far as I'm aware from earlier tech threads, the PS4 has unified memory with a coherent address space between the GPU and CPU. Once you load data from the drive there is no moving; the CPU or GPU just needs to access it.

If the PS4 has one more core for games relative to the Xbox One, then multiplatform game comparisons will go very badly for MS.

I don't think one CPU core more or fewer for games would change anything. Having 6 fewer CUs is a noticeable difference.
 
Isn't the tiling/PRT/swizzling you're talking about mentioned in the wiki article below? I'm sure that excerpt is not talking about vector/tuple swizzling. You were talking about how it needs to be done on the CPU for the PS4, but it seems like all modern hardware accelerates that on the GPU.

Unless you know about the special types of tiling done in the Xbone's move engines. Which are a black box, btw.

From: http://en.wikipedia.org/wiki/Tiled_rendering


Tiled rendering is used primarily on mobile phones. They have a small amount of esram, and draw calls are binned into "tiles".

The GPU goes through each tile in turn and renders it, copying the render buffer data from its cache to main memory.

Multi-core PowerVR GPUs can process multiple tiles at once.
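Following on from the binning sketch earlier in the thread, the per-tile pass could look roughly like this. shade_triangle_into_tile() is a made-up placeholder for the actual rasterisation/shading, and the constants and tiles[][] lists are reused from that earlier sketch:

    #include <stdint.h>
    #include <string.h>

    /* Placeholder for rasterising/shading one binned triangle into a tile. */
    extern void shade_triangle_into_tile(int tri_id, uint32_t *tile,
                                         int tx, int ty);

    static uint32_t tile_buffer[TILE_SIZE * TILE_SIZE]; /* fast on-chip stand-in */

    /* Render each tile entirely in fast memory, then write the finished
       pixels out to the framebuffer in main memory once. */
    void render_tiles(uint32_t *framebuffer) {          /* SCREEN_W x SCREEN_H */
        for (int ty = 0; ty < TILES_Y; ty++) {
            for (int tx = 0; tx < TILES_X; tx++) {
                memset(tile_buffer, 0, sizeof tile_buffer);  /* clear the tile */
                TileList *t = &tiles[ty][tx];
                for (int i = 0; i < t->count; i++)           /* shade its bin */
                    shade_triangle_into_tile(t->ids[i], tile_buffer, tx, ty);
                for (int row = 0; row < TILE_SIZE; row++)    /* resolve to DRAM */
                    memcpy(&framebuffer[(ty * TILE_SIZE + row) * SCREEN_W
                                        + tx * TILE_SIZE],
                           &tile_buffer[row * TILE_SIZE],
                           TILE_SIZE * sizeof(uint32_t));
            }
        }
    }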
 
Tiled rendering is used primarily on mobile phones. They have a small amount of esram, and draw calls are binned into "tiles".

The GPU goes through each tile in turn and renders it, copying the render buffer data from its cache to main memory.

Multi-core PowerVR GPUs can process multiple tiles at once.

Ah I see. So it's for using the esram effectively. Thanks.
 
Then the question is: why is it important?

And how does this change anything in DME vs. no moving at all? Any take -> reorder -> place step on data will be slower than not moving the data at all. As far as I'm aware from earlier tech threads, the PS4 has unified memory with a coherent address space between the GPU and CPU. Once you load data from the drive there is no moving; the CPU or GPU just needs to access it.

Accessing the data on the CPU could be very slow if it is "swizzled" and you need linear access.

GPUs read non-linear ("swizzled") data natively and faster. If you're doing GPGPU or rendering that the CPU needs access to, then you can swizzle/de-swizzle "for free".

You can also use it to copy data for the GPU "in-place".

No one said it'll make the Xbox One faster than a PS4 or anything like that, just that it's possible and beneficial to the Xbox.
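For anyone wondering what "swizzled" actually means here: the usual trick is to lay texels out along a space filling curve, most commonly Morton/Z-order, so that pixels close together in 2D end up close together in memory. A minimal sketch of the index math in C (purely illustrative; the actual layouts on either console aren't public):

    #include <stdint.h>

    /* Spread the low 16 bits of v out to the even bit positions. */
    static uint32_t part1by1(uint32_t v) {
        v &= 0x0000FFFF;
        v = (v | (v << 8)) & 0x00FF00FF;
        v = (v | (v << 4)) & 0x0F0F0F0F;
        v = (v | (v << 2)) & 0x33333333;
        v = (v | (v << 1)) & 0x55555555;
        return v;
    }

    /* Morton (Z-order) index: interleave the bits of x and y. */
    static uint32_t morton2d(uint32_t x, uint32_t y) {
        return part1by1(x) | (part1by1(y) << 1);
    }

    /* Copy a row-major image into a swizzled layout. Assumes a square,
       power-of-two image so the Morton indices are dense. In principle,
       this is the kind of reordering a copy/move engine could do on the fly. */
    void swizzle_copy(const uint8_t *linear, uint8_t *swizzled, uint32_t dim) {
        for (uint32_t y = 0; y < dim; y++)
            for (uint32_t x = 0; x < dim; x++)
                swizzled[morton2d(x, y)] = linear[y * dim + x];
    }

De-swizzling is the same loop with source and destination swapped, which is why doing it as part of a copy you had to do anyway is "free".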
 
Accessing the data on the CPU could be very slow if it is "swizzled" and you need linear access.

GPUs read non-linear ("swizzled") data natively and faster. If you're doing GPGPU or rendering that the CPU needs access to, then you can swizzle/de-swizzle "for free".

You can also use it to copy data for the GPU "in-place".

No one said it'll make the Xbox One faster than a PS4 or anything like that, just that it's possible and beneficial to the Xbox.

It also hasn't been mentioned in regard to the XBONE yet.
 
Tiled rendering is used primarily on mobile phones. They have a small amount of esram, and draw calls are binned into "tiles".

The GPU goes through each tile in turn and renders it, copying the render buffer data from its cache to main memory.

Multi-core PowerVR GPUs can process multiple tiles at once.
All this stuff is way above my head, but I have enjoyed reading your back and forth on this.
Are you referring to the Direct3D tiled resources here? MS had a session at Build 2013 mentioning they are coming to Windows 8.1 and Xbox One.
This is the best link I can find right now: http://www.youtube.com/watch?v=QB0VKmk5bmI

Edit: Direct link to Channel 9
 
Tiled rendering is used primarily on mobile phones. They have a small amount of esram, and draw calls are binned into "tiles".

The GPU goes through each tile in turn and renders it, copying the render buffer data from its cache to main memory.

Multi-core PowerVR GPUs can process multiple tiles at once.


Doesn't deferred rendering mess with that, though?
 
It also hasn't been mentioned in regard to the XBONE yet.

You just said you had a document which confirms the move engines do tiling/swizzling; leaking? ;)

Add that to the VGLeaks info and the Hot Chips slides with "swizzle copy" units, and I'm comfortable with it.
 
I believe Killzone Mercenary uses it too: one plane for the HUD and one for the weapon models.
If the stories pulled out of someone's ass are true, Halo 5 will also use it: the background will be 1920x720 and the HUD plus the weapon and main character at 1080p.
Hmm, that's interesting, though the disparity might look pretty jarring.

I know some current-gen games which render the UI and the game at different resolutions. Call of Duty does this, for starters: the HUD is 1080p, the game is 600p.
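In code, the split-resolution approach is pretty simple: render the 3D scene to a smaller off-screen target, upscale it, then draw the HUD directly into the native-resolution backbuffer. A hypothetical C sketch; every gfx_* name here is a placeholder I made up for illustration, not a real API:

    /* Hypothetical helper API: all gfx_* functions below are placeholders. */
    typedef struct RenderTarget RenderTarget;

    RenderTarget *gfx_create_target(int w, int h);
    void gfx_set_target(RenderTarget *rt);
    void gfx_draw_scene(void);                /* all 3D rendering */
    void gfx_draw_hud(void);                  /* 2D UI pass */
    void gfx_blit_scaled(RenderTarget *src, RenderTarget *dst); /* upscale */

    void render_frame(RenderTarget *backbuffer_1080p) {
        static RenderTarget *scene_600p;
        if (!scene_600p)
            scene_600p = gfx_create_target(1024, 600); /* sub-native target */

        gfx_set_target(scene_600p);            /* render the 3D scene at 600p */
        gfx_draw_scene();

        gfx_blit_scaled(scene_600p, backbuffer_1080p); /* upscale to 1080p */

        gfx_set_target(backbuffer_1080p);      /* draw the HUD at native res */
        gfx_draw_hud();
    }

With hardware display planes, the upscale-and-composite step moves into the display hardware instead of that blit, so the GPU never has to touch the scene pixels at 1080p at all.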
 
If I get banned, I'll take it on the chin.

I wound people up on purpose who aren't necessarily as technically minded as others in the thread.

No need to plead for me, but thanks :)

I take it you're not an insider then?
 
I don't work on either next gen console for a manufacturer or a dev.

I've no idea what the definition of an insider is.

Then do yourself a favour and send a PM to bishoptl to apologise and state it was a mistake before the ban hammer hits.
 
Lol, you're just adding things I didn't say. I only said Cerny used raycasting for audio because the other guy kept bringing up audio GPGPU.

Did I say the CPU wouldn't be used? Nope.

Did I say the Xbox One would have to do it the same way? Nope.

You must think you are speaking with brain-dead posters with no memory. From "the audio chip only decodes streams", to the GPU being used for audio purposes (hinting at a lack of audio processing in the audio chip) just because Cerny said it could (could, not has to), to throwing sand around and playing games with insider info.

Even now you say "Did I say the Xbox One would have to do it the same way? Nope". Listen, son, not even I said that you said that, so please bark up another tree. I see no bite, so I'm not interested.
 
If I get banned, I'll take it on the chin.

I wound people up on purpose who aren't necessarily as technically minded as others in the thread.

No need to plead for me, but thanks :)

No, you decided to degrade yourself to troll status just so you could end the conversation and win the argument, rather than legitimately engaging with ElTorro as to why he could possibly be wrong.
 
I don't work on either next gen console for a manufacturer or a dev.

I've no idea what the definition of an insider is.

Basically, if you are presenting your post as fact, implying that you have internal knowledge of the system from actually working on it, then sharing said information would qualify you as an insider, which you are not.
 
I find it hilarious how someone in this thread is lambasting other posters about technical accuracy and then continuously calls space-filling curves "space fitting curves" (which makes no sense at all).
 
You just said you had a document which confirms the move engines do tiling/swizzling; leaking? ;)

Add that to the VGLeaks info and the Hot Chips slides with "swizzle copy" units, and I'm comfortable with it.

And if you found PS4 documentation with the words "Hardware Texture Swizzling", what would you think?
 
What is going on in this thread? It has become stupidly large and hard to follow.

I see there were a few warriors slain in the war. Can anyone do a recap?
 
Always fun seeing how these sorts of threads start on one thing, die down, go into another sub-area, then die down and start up again :) Not saying it's a bad thing, just interesting how the subtopics change every 10 pages or so.
 
I don't work on either next gen console for a manufacturer or a dev.

I've no idea what the definition of an insider is.

An insider is most of the time a dev who has legit knowledge about the subject. For example, if you work on an Xbone dev kit, you are an insider; if you don't, you are not. If you say something like "I know", then you are pretending to be an insider.

Being an insider on other forums normally means one of two things:

a) you want to boost your credibility and are simply lying (that is how pastebin and other leaks work most of the time), or
b) you are a legit insider and you want to share stuff.

The difference between NeoGAF and other forums is that if you claim to be an insider, you need to send a PM to the mods with your credentials or proof. Without that, claiming you are an insider = ban.

Thanks to this we have legit discussions about many things, without anyone pretending to be someone they aren't and selling lies as #truthfact.

There are tons of devs here working on their next-gen games who don't want their identities known, and most of the time they do the stuff normal users do. Sometimes they want to share things with the rest of the community, and thanks to people like CBOAT we know a lot about what is going on without official messages from the corporations, like the recent Xbox One policies being complete shit, or specs, etc.
 
What is going on in this thread? It has become stupidly large and hard to follow.

I see there were a few warriors slain in the war. Can anyone do a recap?

Best recap:

iOkfYZoLz2ATi.gif


EDIT: Credit goes to ElTorro for this amazing gif, btw.
 
What is going on in this thread? It has become stupidly large and hard to follow. I see there were a few warriors slain in the war. Can anyone do a recap?

The last few pages have been discussions on whether the extra coprocessors MS added bring value to the system or not. The boco-something junior contended:

Audio chip vs. GPU for advanced audio
Something about 3 display buffers
And finally something about partially resident textures

His point being that the Xbone has a more interesting approach while the PS4 is more brute force, so while the PS4 is stronger, there might be something to what Penello said.

Most of it got shut down, or explained as already present, unnecessary given the PS4 has no esram, or just MS giving fancy names to components. The audio chip was acknowledged as an advantage, but not a massive one.

The PRT thing has been going on with a lot of words like swizzling and tiling which I don't understand. It's been back and forth; most of the stuff I guess got countered, except some space fitting curves he keeps bringing up.

Sorry if that's vague; I don't understand half these terms.
 
What is going on in this thread? It has become stupidly large and hard to follow.

I see there were a few warriors slain in the war. Can anyone do a recap?

The PS4 is still pretty widely expected to be 50% faster; some think it will show, some don't.

The rest is mostly white noise and some fantastic GIFs.
 
The PS4 is still pretty widely expected to be 50% faster; some think it will show, some don't.

The rest is mostly white noise and some fantastic GIFs.
I think it will be there - but I'm not expecting a huge jump, personally. Maybe a more solid frame rate, higher native res, etc. I'm not expecting people to say "holy shit!" - more like "yeah - a bit better" from multiplatform offerings. 1st party is where I believe the biggest jump ahead of the competition will be made.
 
No, you decided to degrade yourself to troll status just so you could end the conversation and win the argument, rather than legitimately engaging with ElTorro as to why he could possibly be wrong.

As far as the scheduling of CUs goes, there is definitely no need for the host application to define a fixed assignment of CUs to task categories, especially none that is static throughout the runtime of the application. One of the big bullet points in GCN is the hardware-based scheduling of concurrent graphics- and compute-related tasks.
 
I think it will be there - but I'm not expecting a huge jump, personally. Maybe a more solid frame rate, higher native res, etc. I'm not expecting people to say "holy shit!" - more like "yeah - a bit better" from multiplatform offerings. 1st party is where I believe the biggest jump ahead of the competition will be made.

That's good enough for me. If I'm going to buy a multiplatform title on a console, then I'd like my choice to be simple and easy knowing that the PS4 version is likely going to have the edge in just about all games.
 
I think it will be there - but I'm not expecting a huge jump, personally. Maybe a more solid frame rate, higher native res, etc. I'm not expecting people to say "holy shit!" - more like "yeah - a bit better" from multiplatform offerings. 1st party is where I believe the biggest jump ahead of the competition will be made.

I'm hoping that if it does show, it's as a minor resolution hit, or maybe the PS4 at 30+ FPS with the XB1 locked to 30 FPS.

Edited as I wrote PS3 instead of PS4
 
As far as the scheduling of CUs goes, there is definitely no need for the host application to define a fixed assignment of CUs to task categories, especially none that is static throughout the runtime of the application. One of the big bullet points in GCN is the hardware-based scheduling of concurrent graphics- and compute-related tasks.

Oh, sorry, I didn't mean to say that you are wrong. More like, him just saying "I really can't say any more than that" was a very pathetic, lazy response. Like, seriously, what's stopping you from saying more?
 
I'm hoping that if it does show, it's as a minor resolution hit, or maybe the PS3 at 30+ FPS with the XB1 locked to 30 FPS.

30+ is bad, though; I think locking at 30 is better. It does mean the PS4 wouldn't have drops below 30 like the Xbox might in this case, though. Maybe a few more special effects too.
 