EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

I have a question. From what I understand, even if the internal 3D rendering resolution of something like Killer Instinct is 720p, can't the UI layer be rendered at native 1080p, so that when the two are combined for final output we get razor-sharp native 1080p UI elements alongside the upscaled characters? Is that how devs are doing this for next-gen?

Because it would suck to have literally the entire picture upscaled from 720p. I think it would be easier to swallow these downgrades if at least the UI layers remained pixel-perfect 1080p from creation to final output.

2D overlays generally look fine when upscaled, since they are not as colorful as the game itself and the pixel repetition largely goes unnoticed. So even though what you are suggesting is technically possible, I don't know whether devs would bother.
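For what it's worth, compositing a native-resolution UI over an upscaled 3D frame is conceptually simple. A rough sketch in Python with Pillow, just to illustrate the idea (file names and sizes are made up; a real engine would do this on the GPU with render targets):

```python
# Illustrative only: upscale a 720p scene render, then composite a native 1080p
# UI layer on top so HUD/text pixels are never scaled. File names are hypothetical.
from PIL import Image

scene_720 = Image.open("scene_720p.png").convert("RGBA")   # 1280x720 3D render
ui_1080 = Image.open("ui_1080p.png").convert("RGBA")       # 1920x1080 UI with alpha

scene_1080 = scene_720.resize((1920, 1080), Image.BILINEAR)  # upscale only the scene
final = Image.alpha_composite(scene_1080, ui_1080)           # UI stays pixel-sharp
final.save("final_frame.png")
```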
 
Whining about specs in a thread about specs...

Asian-Facepalm-Of-Shame.gif



If you want to talk about games, how about going to one of the hundreds of games-related threads on the forum?
 
I'm not whining. I was just hoping that when I joined this site it would be a community that cares about games more than about stroking each other's dicks. Obviously I was dreaming. The specs have been out for quite some time now. The only reason this thread is still going is because people are more interested in having the better console than in talking about what's more important: games.

I was hoping to bring a little rationality to the thread, but apparently I was dreaming again. By all means, continue stroking each other's dicks. Don't let a little rationality get in your way.

This is what spec threads are for: determining which console is going to be more powerful, and to what degree.

This won't be the last one.
There are other threads to talk about games and how much we love to play them.
 
This is what spec threads are for: determining which console is going to be more powerful, and to what degree.

This won't be the last one.
There are other threads to talk about games and how much we love to play them.

Whining about specs in a thread about specs...

If you want to talk about games, how about going to one of the hundreds of games-related threads on the forum?

Guys, he has apologized and that's admirable. Let's get back on topic. :)
 
You're absolutely correct. There is more to it than just teraflop figures.

XBO: 1.18 TF GPU (12 CUs) for games
XBO: 768 Shaders
XBO: 48 Texture units
XBO: 16 ROPs
XBO: 2 ACEs / 16 queues

PS4: 1.84 TF GPU (18 CUs) for games (+56%)
PS4: 1152 Shaders (+50%)
PS4: 72 Texture units (+50%)
PS4: 32 ROPs (+100%)
PS4: 8 ACEs / 64 queues (+300%)

This is a decisive advantage for the PS4. I expect this difference to become readily apparent later down the line.

Fixed. But yeah. The power difference next gen compared to current gen is staggering.
 
Pretty sure XBO has 2 ACEs: meaning 400%

4 times equals a 300% difference.

2 ACEs = 100%
8 ACEs = 400%

Difference: 400% - 100% = 300%


Or you could put it that way:
PS4: 1.84 TF GPU (18 CUs) for games -> x156%
PS4: 1152 Shaders -> x150%
PS4: 72 Texture units -> x150%
PS4: 32 ROPs -> x200%
PS4: 8 ACEs / 64 queues -> x400%
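Both figures describe the same gap: "x400%" is the ratio, "+300%" is the increase over the baseline. A quick Python check with the unit counts quoted above:

```python
# Ratio vs. increase, using the unit counts from the spec lists above.
def compare(xbo, ps4):
    ratio = ps4 / xbo        # how many times bigger, e.g. 8 / 2 = 4.0 -> "x400%"
    increase = ratio - 1.0   # how much bigger than the baseline, 3.0  -> "+300%"
    return ratio, increase

for label, xbo, ps4 in [("ACEs", 2, 8), ("ROPs", 16, 32), ("Shaders", 768, 1152)]:
    ratio, increase = compare(xbo, ps4)
    print(f"{label}: x{ratio * 100:.0f}% of XBO, i.e. +{increase * 100:.0f}% more")
```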
 
I'm going to do something I might regret now.

The PS4 GPU has 40% more TFLOPS than the XB1 GPU, but the XB1 GPU has only 29% fewer TFLOPS than the PS4 GPU.

So the XB1 is only 29% weaker.
 
It's gonna be fun watching him try to downplay well-grounded graphical differences after nitpicking over 2-3 fps for the whole 7th generation.

He already started with Ryse at 900p.

"I came across the same bush in BF4 on both PS4 and Xbox One.. On this particular bush, the Xbox version seemed to have just slightly more detail. Xbox wins."
 
I'm going to do something I might regret now.

The PS4 GPU has 40% more TFLOPS than the XB1 GPU, but the XB1 GPU has only 29% fewer TFLOPS than the PS4 GPU.

So the XB1 is only 29% weaker.

The statement is the same, just phrased differently: the PS4 is roughly 1/2 more, or the XB1 is roughly 1/3 less.
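For reference, the 40%/29% pair corresponds to roughly 1.84 TF vs. 1.31 TF; "X% more" and "Y% less" are reciprocal views of the same ratio:

```python
# "More" and "less" percentages describe the same gap from opposite directions.
ps4_tf, xb1_tf = 1.84, 1.31

more = ps4_tf / xb1_tf - 1.0   # ~0.40 -> PS4 has ~40% more
less = 1.0 - xb1_tf / ps4_tf   # ~0.29 -> XB1 has ~29% less

print(f"PS4 is {more:.0%} more; XB1 is {less:.0%} less")
assert abs(less - more / (1.0 + more)) < 1e-12   # less = more / (1 + more)
```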
 
It's gonna be fun watching him try to downplay well-grounded graphical differences after nitpicking over 2-3 fps for the whole 7th generation.

He already started with Ryse at 900p.
We'll see what he does. He might legitimately think there are big diminishing returns when looking at resolution past 900p while simultaneously thinking 2-3 fps in a 30 fps game is significant.

That can just be preference, and I don't think there is anything wrong with that. What would be wrong is if he dismisses differences he highlighted as significant in the past and now says they are not significant. It has to be the same kind of thing, though, not just some perceived 10% difference in different areas.
 
We'll see what he does. He might legitimately think there are big diminishing returns when looking at resolution past 900p while simultaneously thinking 2-3 fps in a 30 fps game is significant.

That can just be preference, and I don't think there is anything wrong with that. What would be wrong is if he dismisses differences he highlighted as significant in the past and now says they are not significant. It has to be the same kind of thing, though, not just some perceived 10% difference in different areas.
Given the latest "interview" Leadbetter put out that was prefaced with a pre-emptive defense, I'm thinking he's going into full defense-mode.
 
Given the latest "interview" Leadbetter put out that was prefaced with a pre-emptive defense, I'm thinking he's going into full defense-mode.
I think he has made a lot of blunders lately; I just don't think that Leadbetter not considering resolution that important past a certain point is the same as highlighting a 10% variation in average frame rates.

I, for example, don't think rendering current-generation games at 4K and downsampling gives great visual benefits. I can conceive of someone thinking the same of lower resolutions, down to a certain point.

One can argue 640p vs. 720p, for example, is a bigger difference than 900p vs. 1080p, even though mathematically that is not the case.
 
Has it been discussed in this thread why 900p seems to be the new magic number for compromise?
It makes no sense, other than being less than 1080p and therefore using less fill rate/bandwidth.

1080p is the magic number. It's the biggest 16:9 resolution where
a) both axes are evenly divisible by 8
b) the number of pixels is just below a power of two (2^21 - 1920*1080 = 23552)

Basically, it's ideal for memory alignment and memory consumption. You can fit a 32-bit 1080p image almost exactly into an 8 MiB memory chip. You never have to worry about extra padding bytes after each line, because every line is already naturally aligned to a 32-byte boundary. You can tile and swizzle it without ever worrying about "partial" blocks at the edges of the image.

1080p is magic.

900p makes no goddamn sense.
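A quick sanity check of those claims, assuming 32-bit (4-byte) pixels and the 32-byte row alignment mentioned above:

```python
# Checking the alignment/size claims for a 32-bit framebuffer (an illustration,
# not how any particular console actually lays out its render targets).
def framebuffer_stats(width, height, bytes_per_pixel=4):
    pixels = width * height
    row_bytes = width * bytes_per_pixel
    return {
        "pixels": pixels,
        "bytes": pixels * bytes_per_pixel,
        "axes_divisible_by_8": width % 8 == 0 and height % 8 == 0,
        "rows_32B_aligned": row_bytes % 32 == 0,
        "slack_vs_8MiB": 8 * 1024 * 1024 - pixels * bytes_per_pixel,
    }

print("1080p:", framebuffer_stats(1920, 1080))  # 2,073,600 px, ~92 KiB short of 8 MiB
print(" 900p:", framebuffer_stats(1600, 900))   # 900 is not divisible by 8
```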
 
I think he has made a lot of blunders lately, I just don't think that him not thinking resolution is that important past a certain point is the same as highlighting a 10% variation in average frame rates.

I for example don't think 4K resolution and downsampling of current generation games gives great visual benefits. That's why I can conceive of someone thinking the same of resolutions that are lower than that until a certain point.

One can argue 640p vs. 720p for example is a bigger difference than 900p vs. 1080p, despite mathematically that not being the case.

It took me a while to understand the first 2 paragraphs. I think I finally did it! Could you simplify the sentences for others' benefit? And I sort of agree with you. To summarize, you are saying that the point of diminishing returns for resolution could be subjective. Can't argue with that. :)

With regard to the last paragraph, though the sentence is simple, I'm not sure what you are referring to. Can you clarify?

EDIT: Ahhh... Now I get it, you mean that the jump from 640p to 720p could have a more qualitative impact on a viewer than the jump from 900p to 1080p. I doubt it.
 
You're absolutely correct. There is more to it than just teraflop figures.

XBO: 1.18 TF GPU (12 CUs) for games
XBO: 768 Shaders
XBO: 48 Texture units
XBO: 16 ROPs
XBO: 2 ACEs / 16 queues

PS4: 1.84 TF GPU (18 CUs) for games (+56%)
PS4: 1152 Shaders (+50%)
PS4: 72 Texture units (+50%)
PS4: 32 ROPs (+100%)
PS4: 8 ACEs / 64 queues (+400%)

This is a decisive advantage for the PS4. I expect this difference to become readily apparent later down the line.
This comparison looks better than it is.

The PS4 GPU also sacrifices some processing power to the OS layer.
The TMUs/ROPs and everything else are tied to the clocks, so the same clock ratio applies across the board.

1.31 TF vs. 1.84 TF = +40% (shader/ALU throughput)
41 GTexels/s vs. 57.6 GTexels/s = +40% (texture/TMU throughput)
The pixel fill rate is 88% higher, not 100% (ROPs).

And a little fun fact at the end: the raster/triangle/geometry throughput of the front-end is 6.6% higher on the Xbox One GPU.


I don't expect much of a difference that you can immediately point out in most games.
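For reference, those ratios fall out of the unit counts above and the published GPU clocks (800 MHz for the PS4, 853 MHz for the Xbox One). A rough sketch, assuming 2 FLOPs per ALU per clock and the same per-clock front-end rate on both:

```python
# Recomputing the throughput comparison from unit counts and GPU clocks.
def gpu_throughput(clock_ghz, alus, tmus, rops):
    return {
        "TFLOPS": 2 * alus * clock_ghz / 1000,   # 2 FLOPs (FMA) per ALU per clock
        "GTexels/s": tmus * clock_ghz,           # texture fill rate
        "GPixels/s": rops * clock_ghz,           # pixel fill rate
    }

xbo = gpu_throughput(0.853, alus=768, tmus=48, rops=16)
ps4 = gpu_throughput(0.800, alus=1152, tmus=72, rops=32)

for key in ps4:
    print(f"{key}: XBO {xbo[key]:.1f} vs PS4 {ps4[key]:.1f} -> PS4 +{ps4[key] / xbo[key] - 1:.1%}")

# The front-end advantage is just the clock ratio if both do the same work per clock:
print(f"front-end: XBO +{0.853 / 0.800 - 1:.1%}")   # ~ +6.6%
```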
 
The PS4 GPU also sacrifices some processing power to the OS layer.

I challenge that. Why exactly should the PS4 reserve GPU time for the OS, when there is no feature needing GPU time, whereas on the XBO there are snapped Metro apps and Kinect?
 
It took me a while to understand the first 2 paragraphs. I think I finally did it! Could you simplify the sentences for others' benefit? And I sort of agree with you. To summarize, you are saying that the point of diminishing returns for resolution could be subjective. Can't argue with that. :)
I tried to fix it.

With regard to the last paragraph, though the sentence is simple, I'm not sure what you are referring to. Can you clarify?
Because the absolute number of pixels is lower, each pixel matters more. So the ~20% difference between 640p and 720p is more noticeable than the ~30% difference between 900p and 1080p.
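For reference, those percentages count the lower resolution's shortfall relative to the higher one, assuming 16:9 frames (so "640p" is taken as roughly 1138x640 here; real sub-HD games vary):

```python
# How much smaller the lower resolution is, in total pixels, assuming 16:9 frames.
def fewer_pixels(low_height, high_height):
    low = round(low_height * 16 / 9) * low_height
    high = round(high_height * 16 / 9) * high_height
    return 1 - low / high

print(f"640p vs 720p:  {fewer_pixels(640, 720):.0%} fewer pixels")    # ~21%
print(f"900p vs 1080p: {fewer_pixels(900, 1080):.0%} fewer pixels")   # ~31%
```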
 
This comparison looks better than it is.

The PS4 GPU also sacrifices some processing power to the OS layer.
The TMUs/ROPs and everything else are tied to the clocks, so the same clock ratio applies across the board.

1.31 TF vs. 1.84 TF = +40% (shader/ALU throughput)
41 GTexels/s vs. 57.6 GTexels/s = +40% (texture/TMU throughput)
The pixel fill rate is 88% higher, not 100% (ROPs).

And a little fun fact at the end: the raster/triangle/geometry throughput of the front-end is 6.6% higher on the Xbox One GPU.


I don't expect much of a difference that you can immediately point out in most games.

Wow, so true, I just can't tell the difference between 1080p and 720p, better save up £100 more for that Xbox One.
 
I'm invested in the Microsoft brand; the Xbox 360 was very good to me. I'm sticking with the Xbox One mostly because of the Xbox Live ecosystem and, to a lesser degree, Kinect and the controller.

But to be perfectly honest, all the tech talk has me both worried and kind of irritated. Looking at the facts, my system of preference is going to have a significant power disadvantage. If the power disadvantage were due to Microsoft working on improving Live and adding Kinect, then I wouldn't be so irritated, but I honestly feel like MS was surprised by the type of machine Sony could put out, so they lowballed all of us gamers expecting Sony to put out a similarly spec'd device.

None of this is going to sway my choice, due to the huge gap between Live and PSN (in my opinion). But I'd be lying if I said I wasn't worried about it.
 
This comparison looks better than it is.

The PS4 GPU also sacrifices some processing power to the OS layer.
The TMUs/ROPs and everything else are tied to the clocks, so the same clock ratio applies across the board.

1.31 TF vs. 1.84 TF = +40% (shader/ALU throughput)
41 GTexels/s vs. 57.6 GTexels/s = +40% (texture/TMU throughput)
The pixel fill rate is 88% higher, not 100% (ROPs).

And a little fun fact at the end: the raster/triangle/geometry throughput of the front-end is 6.6% higher on the Xbox One GPU.


I don't expect much of a difference that you can immediately point out in most games.

What ElTorro said. And isn't it 6.6% CPU, not GPU?
 
I challenge that. Why exactly should the PS4 reserve GPU time for the OS, when there is no feature needing GPU time, whereas on the XBO there are snapped Metro apps and Kinect?
Maybe we'll find out later what exactly Sony is processing, but they built a second graphics command processor into the GPU, just like MS.

High Priority Graphics (HP3D) ring and pipeline

- New for Liverpool
- Same as GFX pipeline except no compute capabilities
- For exclusive use by VShell

http://www.vgleaks.com/orbis-gpu-compute-queues-and-pipelines/
 
Because the absolute number of pixels is lower, each pixel matters more. So the ~20% difference between 640p and 720p is more noticeable than the ~30% difference between 900p and 1080p.

I will argue with that. Though you are theoretically correct, I don't think we are anywhere near diminishing returns at sub full HD. If one is able to notice the 20% difference at 720p, they would definitely notice the 30% difference at 1080p.

Having said that, personally, I wouldn't notice either! Anything less than 50% is too small a jump for my weak eyes. LOL...
 
This comparison looks better than it is.

The PS4 GPU also sacrifices some processing power to the OS layer.
The TMUs/ROPs and everything else are tied to the clocks, so the same clock ratio applies across the board.

1.31 TF vs. 1.84 TF = +40% (shader/ALU throughput)
41 GTexels/s vs. 57.6 GTexels/s = +40% (texture/TMU throughput)
The pixel fill rate is 88% higher, not 100% (ROPs).

And a little fun fact at the end: the raster/triangle/geometry throughput of the front-end is 6.6% higher on the Xbox One GPU.


I don't expect much of a difference that you can immediately point out in most games.

We've heard nothing about this being the case, and even if it is, the reservation will be substantially smaller than the XBO's since the PS4 isn't doing snapping.
 
Don't you get less aliasing at native 1080p resolution?

But I guess it also depends on the size of the screen?

You always have aliasing, regardless of screen size. What matters for diminishing the appearance of aliasing is PPI: pixels per inch. On a larger screen, a lower resolution will be more noticeable than on a smaller screen, even if both have the same native resolution. Viewing distance also comes into play: if you game on a monitor like I do, even with a lower PPI you can easily spot the difference between 1080p and 720p because you sit closer to the screen.

I would say that for most people gaming on larger TVs with a native 1080p screen, a native 1080p image will look strikingly better than the same game running at 720p and upscaled to 1080p. Jaggies will be less noticeable because a native 1080p image puts more pixel information on screen for the viewer.
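One way to put numbers on that: a pixel's visibility depends on its angular size, which combines PPI and viewing distance. A rough sketch with made-up screen sizes and distances (not figures from this thread):

```python
import math

# Angular size of a single pixel for a given display and viewing distance.
def ppi(diagonal_in, width_px, height_px):
    return math.hypot(width_px, height_px) / diagonal_in

def arcmin_per_pixel(pixels_per_inch, distance_in):
    return math.degrees(math.atan(1 / pixels_per_inch / distance_in)) * 60

# Example: a 720p image filling a 24" monitor at ~2 ft vs. a 50" TV at ~8 ft.
for label, diag_in, dist_in in [("24in monitor, 2 ft", 24, 24), ("50in TV, 8 ft", 50, 96)]:
    density = ppi(diag_in, 1280, 720)
    print(f"{label}: {density:.0f} PPI, {arcmin_per_pixel(density, dist_in):.2f} arcmin/pixel")
```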
 
None of this is going to sway my choice, due to the huge gap between Live and PSN (in my opinion). But I'd be lying if I said I wasn't worried about it.

There is no huge gap; where are you getting this from? The recent PS4 footage of the UI and performance clearly shows how fast it is, and there are tons of new features and overall improvements. We will also see a full breakdown of the PS4 UI/PSN from IGN within the next 7 days, so you will see more of what I'm talking about. These are completely different consoles; this isn't PS3 vs. 360.
 
We've heard nothing about this being the case, and even if it is, the reservation will be substantially smaller than the XBO's since the PS4 isn't doing snapping.
Here I could agree; I just didn't like the comparison.
As if Sony's SoC puts 100% of its raw power into games and MS's doesn't.
Or just counting the number of units and putting them into a comparison regardless of the clock rates.
 
We've heard nothing about this being the case, and even if it is, the reservation will be substantially smaller than the XBO's since the PS4 isn't doing snapping.

This. But until we know what it is, I would like to maintain 1.31 vs 1.84. It is only fair that comparisons are made on an even plane where the exact numbers are known for both platforms. We can always change them when more authoritative info gets out, right?

This argument is very similar to people comparing the Xbox's 5 GB available for games to the PS4 potentially having 4.5, 5.5, 6, 7 or even the full 8 (bless those misguided souls). Adding speculative numbers to the mix just muddies the water, and everyone starts assuming whatever works for them. I'd rather avoid that and stick to confirmed performance figures.
 
I will argue with that. Though you are theoretically correct, I don't think we are anywhere near diminishing returns at sub full HD. If one is able to notice the 20% difference at 720p, they would definitely notice the 30% difference at 1080p.
Why? There is a large gap in absolute pixel counts that would influence that, so it's not a given that everyone will be able to see the difference in both cases.

There is no huge gap; where are you getting this from? The recent PS4 footage of the UI and performance clearly shows how fast it is, and there are tons of new features and overall improvements. We will also see a full breakdown of the PS4 UI/PSN from IGN within the next 7 days, so you will see more of what I'm talking about. These are completely different consoles; this isn't PS3 vs. 360.
I would also be interested in this huge gap between XBL and PSN. (And specifically XBL and PSN, not 360 vs. PS3.)
 
You always have aliasing, regardless of screen size. What matters for diminishing the appearance of aliasing is PPI: pixels per inch. On a larger screen, a lower resolution will be more noticeable than on a smaller screen, even if both have the same native resolution. Viewing distance also comes into play: if you game on a monitor like I do, even with a lower PPI you can easily spot the difference between 1080p and 720p because you sit closer to the screen.

I would say that for most people gaming on larger TVs with a native 1080p screen, a native 1080p image will look strikingly better than the same game running at 720p and upscaled to 1080p. Jaggies will be less noticeable because a native 1080p image puts more pixel information on screen for the viewer.

Ah thanks, I just finished reading an article about it, interesting stuff.
 
Maybe we'll find out later what exactly Sony is processing, but they built a second graphics command processor into the GPU, just like MS.

That is no indication of an alleged fixed reservation of GPU time by the OS. The crucial technology that allows Microsoft to schedule GPU time reliably in slices is the virtualization provided by the hypervisor OS.

Apart from that, if there are no known OS features that would need GPU time while a game is running then, by Occam's razor, there probably are none.
 