Xbox One SDK & Hardware Leak Analysis: CPU, GPU, RAM & More - [Part One]

Wolfenstein maintains a solid 1080p vertical resolution but dynamically alters the horizontal resolution. It's not something I've ever noticed when playing it. It's a very clever solution.
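For anyone curious how a scaler like that can work, here's a minimal sketch. This is my own illustration (the budget, bounds, and function are invented, not id Software's actual code): pick the next frame's render width from how far the last frame's GPU time was from budget, clamp it, and upscale the result to 1920x1080.

```python
# A minimal sketch of dynamic horizontal-resolution scaling: vertical
# resolution stays fixed at 1080, only the render width changes.
TARGET_FRAME_MS = 16.7   # 60 fps budget (assumption for this example)
MIN_WIDTH, MAX_WIDTH = 960, 1920

def next_render_width(current_width, last_gpu_ms):
    # Scale the width by how far the last frame was from budget,
    # then clamp so image quality never falls below a floor.
    scale = TARGET_FRAME_MS / last_gpu_ms
    width = int(current_width * scale)
    return max(MIN_WIDTH, min(MAX_WIDTH, width))

# Example: the last frame took 20 ms, so the next one renders narrower
# and is upscaled back to 1920x1080 for display.
print(next_render_width(1920, 20.0))  # -> 1603
```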

The XB1 version of COD: Advanced Warfare used it too, I believe, for the single-player campaign.

*Sorry, just noticed somebody else has already said the same thing.*
 
The XB1 version of COD: Advanced Warfare used it too, I believe, for the single-player campaign.

I believe Halo 2 Anniversary also uses such a technique. Looks clean to me, though I can still see some shimmering.

Seems like a beneficial technique that provides a better-scaled image than, say, a fixed 900p.
 
Thing is, the PS4 won't always operate at 1080p either.
ESRAM is a limitation, but it's not as big a limitation as the GPU.
Both consoles are quite weak for 1080p if developers really focus on heavy rendering techniques.
That's a bit of an unfair statement. From what we know, the PS4 is in general a better fit for 1080p games compared to the Xbone. With more advanced tech, even a more powerful PC can struggle to hit the standard 1080p/60 fps.
 
Thing is, the PS4 won't always operate at 1080p either.
ESRAM is a limitation, but it's not as big a limitation as the GPU.
Both consoles are quite weak for 1080p if developers really focus on heavy rendering techniques.
Not really! For current gen, the PS4 GPU is a damn good match for 1080p on a console. You could say it was designed with 1080p in mind.
The Xbox One GPU, on the other hand... ehh.
 
That's a bit of an unfair statement. From what we know, the PS4 is in general a better fit for 1080p games compared to the Xbone. With more advanced tech, even a more powerful PC can struggle to hit the standard 1080p/60 fps.

Didn't AMD recommend 32 ROPs for 1080p? Seems someone didn't listen.
 
One nice feature that I use is that if a game crashes, it doesn't take down the whole system. Just hit the home button, kill the game and re-start it.

This is a feature of any OS that has processes and protected memory. You don't need two VMs running on a hypervisor to prevent a game bug from bringing your OS down.
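To illustrate the point, here's a tiny sketch under obvious assumptions (the names and the fake crash are mine, not the Xbox One's actual shell code): with ordinary process isolation, a crashing child process cannot take down its parent, which can detect the failure and relaunch it.

```python
# A minimal sketch of OS process isolation: the "game" child crashes,
# the "shell" parent survives and restarts it.
import subprocess
import sys

def launch_game():
    # Run a child Python process that immediately "crashes".
    return subprocess.run([sys.executable, "-c", "import os; os.abort()"])

result = launch_game()
if result.returncode != 0:
    # The parent is unaffected by the child's crash and can relaunch it.
    print(f"game died (exit code {result.returncode}), relaunching...")
    launch_game()
```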
 
That's a bit of an unfair statement. From what we know, the PS4 is in general a better fit for 1080p games compared to the Xbone. With more advanced tech, even a more powerful PC can struggle to hit the standard 1080p/60 fps.

Sure, even a much stronger PC than the PS4 can struggle with 1080p60, but it's not an overstatement. Battlefield 2-3 years from now will be running at 900p or something variable, and the same goes for any very tech-heavy game.
Especially with 20nm and Pascal-like cards coming relatively early in the current-gen console life cycle, and we know how big a boost they will be.
If you really believe that devs will maintain 1080p until the end of the PS4's life cycle, you're in denial.

Not really! For current gen, the PS4 GPU is a damn good match for 1080p on a console. You could say it was designed with 1080p in mind.
The Xbox One GPU, on the other hand... ehh.
It's a mid-range GPU from 2011; in 2016 we will get Pascal GPUs with stacked DRAM and a much smaller manufacturing process. Come on, be realistic here.
 
If you look at it from the point of view of specs, probably, but the catch is that the GPUs in current-gen consoles support DX 11.2, OpenGL 4.4 and beyond. And yes, the AMD HD 7870, which is equivalent to the PS4 GPU, is from 2012.
Do you even know what you're talking about? DX11.2 is a desktop graphics API for Windows; the same goes for OpenGL, only that one is multi-OS. Having support for a certain version of DX or OpenGL doesn't suddenly change the specs; it just means the hardware is compatible with it.

Fact is that the consoles (yes both) were equipped with a GPU that was never high-end, no matter how you look at it.

Also, a 7870 is stronger than the PS4's GPU.
 
If you look at it from the point of view of specs, probably, but the catch is that the GPUs in current-gen consoles support DX 11.2, OpenGL 4.4 and beyond. And yes, the AMD HD 7870, which is equivalent to the PS4 GPU, is from 2012.

Personally, I don't know what's going on, because GPUs with LOWER on-paper specs than the PS4's GPU are outperforming that console.


The 750 Ti, in particular, is besting that console's GPU performance in many games...
 
If you look at it from the point of view of specs, probably, but the catch is that the GPUs in current-gen consoles support DX 11.2, OpenGL 4.4 and beyond. And yes, the AMD HD 7870, which is equivalent to the PS4 GPU, is from 2012.

The 7870 has 20 CUs at 1000 MHz; the PS4 GPU has 18 CUs at 800 MHz. They are not equal.
DX 11.2 or OpenGL doesn't matter at all.

People need to understand that in a year and a half, mid-range GPUs will be at least 3x faster than the PS4 GPU, and those GPUs will still mostly target 1080p, because it will still be the main monitor resolution for most players. This means that games' High settings will aim at 1080p; the only option for consoles to catch up is to decrease settings and resolution.
And we haven't even started on the GPGPU features that will eat some of the consoles' shading performance to compensate for their crappy CPUs.
 
I've watched YouTube vids from this site and I think he uses a lot of bro science and doesn't know that much about tech stuff.

It's good that the dev state of the X1 is catching up to the PS4. The PS4 will probably have the edge the majority of the time, though thankfully it won't be as big as last gen, and it will be the smallest graphical difference in the last 20 years.
 
Having support for a certain version of DX or OpenGL doesn't suddenly change the specs; it just means the hardware is compatible with it.

Yes, I know. Support for a certain version of DX or OpenGL can't change the specs at all. I was thinking about the compatibility.

Also, a 7870 is stronger than the PS4's GPU.

OK, somewhere between a 7850 and 7870.
 
^^^ Just have to wait and see, but for the PS4, 1080p/30fps is the standard, and I think it will be the standard until the PS5. Later on in the console generation you're going to have the usual narrative that consoles are holding back PC games.

Edit: Also, I think KKRT00 is underestimating the benefits of a closed architecture a bit. Yes, PC will always be more powerful, but most of its "power" can't be utilized since PC is an open platform. You don't need a 5 TFLOP GPU for 1080p (just an example).
 
^^^ Just have to wait and see, but for the PS4, 1080p/30fps is the standard, and I think it will be the standard until the PS5. Later on in the console generation you're going to have the usual narrative that consoles are holding back PC games.

Unless cloud takes off.
 
Having support for a certain version of DX or OpenGL doesn't suddenly change the specs; it just means the hardware is compatible with it.

But the API support is usually used to describe the GPU feature set and age in a general way.

I've watched YouTube vids from this site and I think he uses a lot of bro science and doesn't know that much about tech stuff.

What is bro science?
 
I've watched YouTube vids from this site and I think he uses a lot of bro science and doesn't know that much about tech stuff.

It's good that the dev state of the X1 is catching up to the PS4. The PS4 will probably have the edge the majority of the time, though thankfully it won't be as big as last gen, and it will be the smallest graphical difference in the last 20 years.

But isn't the tech difference the biggest ever?
 
But isn't the tech difference the biggest ever?

Yup. Also, there seems to be an underlying assumption that MS is reducing its OS resources and improving its dev tools and Sony isn't, which isn't true. Sony doesn't have to worry about Snap and voice commands; they could easily release an entire 7th core and more RAM, but by my understanding they went conservative with reserving system resources so they won't be caught off guard like last gen with cross-game chat. Not to mention the PS4 has a much stronger GPGPU design.
 
That's a bit of an unfair statement. From what we know, the PS4 is in general a better fit for 1080p games compared to the Xbone. With more advanced tech, even a more powerful PC can struggle to hit the standard 1080p/60 fps.

And that is quite a blanket statement, as the ability to run a game at 1080p on either machine relies entirely on the content being rendered.
 
Wait, so it's 34 GB/s in either direction? Argh, I hate it when companies are confusing about whether it's read and write, or read or write.

Typically DRAM bandwidth is measured in either direction, not combined, right? So, say, dual-channel DDR3-1600 is 25.6 GB/s, and that's going both up and down; it's not a combined figure. What is the PS4's figure, combined or one direction?

Memory bandwidth is always quoted as a total. So the PS4 has 176 GB/s total (88 GB/s read + 88 GB/s write).

PC DDR3 RAM is the same; your stated 25.6 GB/s is the total (12.8 read + 12.8 write).


It's 68 GB/s total, either read or write or a combination of the two, and it depends on the client as well. The GPU can read and/or write at 68 GB/s; the CPU is limited by the L2 cache (20 GB/s, I think?).

No.
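For reference, here's the standard peak-bandwidth arithmetic behind the figures being thrown around in this exchange, as a quick sketch (the helper function is just for illustration). Peak = transfer rate x bus width x channels, and reads and writes draw from the same shared bus budget.

```python
# Worked numbers behind the bandwidth figures quoted above.
def peak_bandwidth_gbs(mt_per_s, bus_bits, channels=1):
    """Peak bandwidth in GB/s from transfers/s, bus width, and channels."""
    return mt_per_s * (bus_bits / 8) * channels / 1000

print(peak_bandwidth_gbs(1600, 64, 2))  # dual-channel DDR3-1600 -> 25.6
print(peak_bandwidth_gbs(2133, 256))    # XB1 DDR3-2133, 256-bit -> ~68.3
print(peak_bandwidth_gbs(5500, 256))    # PS4 GDDR5, 256-bit     -> 176.0
```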
 
What happened with the PS4's secondary ARM processor?

Also, do these consoles have anything similar to AMD Turbo Core?
 
But isn't the tech difference the biggest ever?

I don't think so. Last gen the PS3's GPU was, in raw specs, even beefier than the 360's, but the latter being on a newer architecture netted it many advantages in real-world performance.

Things like unified shaders meant the discrete vertex shaders of RSX became a bottleneck, and the older shader architecture also meant some more complex shaders with branches were sometimes an order of magnitude slower on PS3 (like motion blur).

And the 360 had more available memory (not only physically, especially early in the gen, but it also benefited from more advanced and smaller data formats), significantly more even.

And the split bus between GDDR3 and XDR made the RSX ROPs extremely bandwidth-bound, while on 360 the ROPs had literally all the bandwidth they could consume.

Sure, Cell was there to help, but at the expense of significant CPU time that could have been spent on other tasks, and there was also a limit on how much it could help, because of the added latency that could be induced when you offload work (especially post-processing effects) to it.

Think about this for a moment: in every single spec the PS3 was superior to the 360. Even on the GPU side it had more flops. And in terms of flops (at least in theory) Cell was in the same ballpark as RSX and Xenos, while the 360's processor was way behind. Half of its memory was the same as the 360's and the other half was faster.

And still it performed generally worse than the 360 in many games. Mostly due to a bad GPU choice with an old architecture (at least compared to what Xenos was).

The Xbone GPU might be slower than the PS4's, but it has none of these crippling issues in comparison. Some of the shortcomings (like the ROPs) might not even be a factor at all in the long run if async compute takes off.
 
But isn't the tech difference the biggest ever?

Biggest ever was PS2 vs Xbox; closest ever was 360/PS3; closest ever architecture-wise is PS4/Xbox One.

We've seen much bigger differences in the past, whether due to hardware differences or architecture differences: levels redesigned entirely to suit a platform. See Double Dragon on Sega Master System vs NES, or Splinter Cell on PS2 and GameCube vs Xbox. These things don't happen anymore; thanks to similar architectures, we're essentially getting the same games minus some resolution differences, 1080p vs 900p and such, but no redesigns that I'm aware of.
 
I've watched YouTube vids from this site and I think he uses a lot of bro science and doesn't know that much about tech stuff.

It's good that the dev state of the X1 is catching up to the PS4. The PS4 will probably have the edge the majority of the time, though thankfully it won't be as big as last gen, and it will be the smallest graphical difference in the last 20 years.

WTF does that even mean? I'm sure you can give us all the technical details with your wide knowledge of tech instead, and expand on or correct any misinformation presented...

Good to see Redgamingtech getting some exposure. Its explanations of tech details are usually good and easy to follow.
 
But the API support is usually used to describe the GPU feature set and age in a general way.



What is bro science?

It originates from bodybuilding forums. It's basically someone who doesn't know what they're talking about apart from stuff they've read online. For example, I'm pretty sure the guy doing the article isn't an engineer or a dev; he's just a fan with a basic understanding of computer hardware.
But with bodybuilding it's kind of the opposite: it's stuff people have learnt in the gym or through experience that contradicts science or has no science supporting it.
 
I don't think so. Last gen the PS3's GPU was, in raw specs, even beefier than the 360's, but the latter being on a newer architecture netted it many advantages in real-world performance.

Things like unified shaders meant the discrete vertex shaders of RSX became a bottleneck, and the older shader architecture also meant some more complex shaders with branches were sometimes an order of magnitude slower on PS3 (like motion blur).

And the 360 had more available memory (not only physically, especially early in the gen, but it also benefited from more advanced and smaller data formats), significantly more even.

And the split bus between GDDR3 and XDR made the RSX ROPs extremely bandwidth-bound, while on 360 the ROPs had literally all the bandwidth they could consume.

Sure, Cell was there to help, but at the expense of significant CPU time that could have been spent on other tasks, and there was also a limit on how much it could help, because of the added latency that could be induced when you offload work (especially post-processing effects) to it.

Think about this for a moment: in every single spec the PS3 was superior to the 360. Even on the GPU side it had more flops. And in terms of flops (at least in theory) Cell was in the same ballpark as RSX and Xenos, while the 360's processor was way behind. Half of its memory was the same as the 360's and the other half was faster.

And still it performed generally worse than the 360 in many games. Mostly due to a bad GPU choice with an old architecture (at least compared to what Xenos was).

The Xbone GPU might be slower than the PS4's, but it has none of these crippling issues in comparison. Some of the shortcomings (like the ROPs) might not even be a factor at all in the long run if async compute takes off.

That is just revisionist history. Cell was not an afterthought; Cell and the RSX were designed to work together. The intent was for devs, over time, to take tasks traditionally done by the GPU over to Cell (including vertex calculations), boosting IQ since the GPU could be pushed even harder as a result. The 360 was very close in architecture to a PC, especially in comparison to the PS3, and easier to program for; 3rd-party devs with deadlines didn't have the time or resources to get the most out of the PS3's DIFFERENT architecture. 1st- and 2nd-party devs did, and the exclusives outperformed 360 games.

Today the X1 and PS4 architectures are the same, but the PS4 is straight-up stronger and more customized with GPGPU in mind. The situation is not the same as last gen.

Edit: For example, MLAA running on Cell saved GPU time and boosted IQ in the process.
 
Biggest ever was PS2 vs Xbox; closest ever was 360/PS3; closest ever architecture-wise is PS4/Xbox One.

We've seen much bigger differences in the past, whether due to hardware differences or architecture differences: levels redesigned entirely to suit a platform. See Double Dragon on Sega Master System vs NES, or Splinter Cell on PS2 and GameCube vs Xbox. These things don't happen anymore; thanks to similar architectures, we're essentially getting the same games minus some resolution differences, 1080p vs 900p and such, but no redesigns that I'm aware of.
And again, no, the Xbox and PS2 are not from the same generation; the Xbox was almost 2 years younger. From what I know, the PS4 and Xbone came out at the same time, and one is definitely weaker than the PS4. I know that to a PC gamer those differences sound ridiculous, but again, on closed hardware those differences are not exactly minor.
The Xbone GPU might be slower than the PS4's, but it has none of these crippling issues in comparison. Some of the shortcomings (like the ROPs) might not even be a factor at all in the long run if async compute takes off.
You missed a lot of face-offs, it seems. And I'm not so sure, once developers start to use the 8 ACEs and hUMA on the PS4, how it will end for the Xbone with only 2 ACEs.
 
I'm pretty sure the guy doing the article isn't an engineer or a dev; he's just a fan with a basic understanding of computer hardware.

He has a pretty solid understanding. I'm fairly certain there are mistakes, but it is all fairly accurate.

The Xbone GPU might be slower than the PS4's, but it has none of these crippling issues in comparison. Some of the shortcomings (like the ROPs) might not even be a factor at all in the long run if async compute takes off.

Use the deficient compute shader count to do operations that compensate for the deficient ROP count? For a lot of tasks compute shaders might be better, but if you want to free up some compute resources, couldn't you use more ROP-dependent shaders?
 
Sure, even a much stronger PC than the PS4 can struggle with 1080p60, but it's not an overstatement. Battlefield 2-3 years from now will be running at 900p or something variable, and the same goes for any very tech-heavy game.
Especially with 20nm and Pascal-like cards coming relatively early in the current-gen console life cycle, and we know how big a boost they will be.
If you really believe that devs will maintain 1080p until the end of the PS4's life cycle, you're in denial.


It's a mid-range GPU from 2011; in 2016 we will get Pascal GPUs with stacked DRAM and a much smaller manufacturing process. Come on, be realistic here.
That's a totally different point, though; it just depends on the developers' priorities. From this point of view, the PS3/360 weren't powerful enough to reach even 720p with advanced tech. Saying the PS4 is like the Xbone when we talk about 1080p is absolutely unfair.
 
Sure, even a much stronger PC than the PS4 can struggle with 1080p60, but it's not an overstatement. Battlefield 2-3 years from now will be running at 900p or something variable, and the same goes for any very tech-heavy game.
Especially with 20nm and Pascal-like cards coming relatively early in the current-gen console life cycle, and we know how big a boost they will be.
If you really believe that devs will maintain 1080p until the end of the PS4's life cycle, you're in denial.


It's a mid-range GPU from 2011; in 2016 we will get Pascal GPUs with stacked DRAM and a much smaller manufacturing process. Come on, be realistic here.

Then you have people that want longer console cycles... :'(
 
And again, no, the Xbox and PS2 are not from the same generation; the Xbox was almost 2 years younger. From what I know, the PS4 and Xbone came out at the same time, and one is definitely weaker than the PS4. I know that to a PC gamer those differences sound ridiculous, but again, on closed hardware those differences are not exactly minor.

You missed a lot of face-offs, it seems. And I'm not so sure, once developers start to use the 8 ACEs and hUMA on the PS4, how it will end for the Xbone with only 2 ACEs.


The PS3 was a full year younger than the 360, the same distance as Xbox/GameCube (which launched in 2001) vs PS2 (which launched in 2000). I'm not understanding your point, unless your point is to try to dismiss the Xbox as being in the same generation as the PS2 in order to fit some agenda. But then again, you also have the GameCube, which on its own is a bigger difference than XB1 vs PS4, especially when you look at the GameCube's more modern architecture and ATI GPU, which was leagues ahead of Sony's Graphics Synthesizer. It was much easier to program for too, which resulted in clearly superior games on GameCube without the need for DF to point things out with a magnifying glass lol.

Don't get me wrong: the PS4 is indeed a much better machine for achieving 1080p at the level of fidelity we want. But at the same time, most of the difference in hardware is already spent getting to that bar, so you end up with basically the same game at 1080p vs 900p, with equal assets. In the past we saw differences in level design, or assets reduced considerably below what we would consider acceptable for that generation once we got used to the sharp and pretty textures in GameCube and Xbox games.
 
Biggest ever was PS2 vs Xbox; closest ever was 360/PS3; closest ever architecture-wise is PS4/Xbox One.

We've seen much bigger differences in the past, whether due to hardware differences or architecture differences: levels redesigned entirely to suit a platform. See Double Dragon on Sega Master System vs NES, or Splinter Cell on PS2 and GameCube vs Xbox. These things don't happen anymore; thanks to similar architectures, we're essentially getting the same games minus some resolution differences, 1080p vs 900p and such, but no redesigns that I'm aware of.
It depends on what you perceive as a difference. It's a very nuanced question.
Even if you look at raw performance, it's usually going to be an apples-to-oranges comparison, with the exception of this generation perhaps.
For older generations, and especially old 2D systems, games were developed differently, and the options for improving performance were much different back then; this often necessitated large changes to games to accommodate hardware differences. Systems were also generally weak, which meant that being able to draw, say, 2 more sprites was actually quite significant.

In modern games and modern systems, a lot of your performance goes to things that you can sacrifice quite easily, without major changes like level redesigns to make the game run, so it will become increasingly rare to see the kinds of things you saw on the NES, regardless of what gap exists between the systems. In fact, differences between games on different systems are often tied not to technology but to business.

The question itself is pretty insignificant anyway.
 
And again, no, the Xbox and PS2 are not from the same generation; the Xbox was almost 2 years younger. From what I know, the PS4 and Xbone came out at the same time, and one is definitely weaker than the PS4. I know that to a PC gamer those differences sound ridiculous, but again, on closed hardware those differences are not exactly minor.

They are the same generation. If your logic is that a generation's consoles must launch in the same year, you're going to have 10x the number of console generations that we have now.
 
I never understood the need for 3 OSes.

It is actually quite a smart decision, if you can keep the overhead of virtualization low.

The hypervisor abstracts the actual HW, presenting a virtual CPU and GPU the other two OSes can use. It is a pretty thin interface that you can also use to ensure the two OS partitions are securely separated and cannot read memory areas that do not belong to them. When done well, it can help make your console very resilient against hacking (mistakes with random numbers and keys aside, the PS3's hypervisor and the SPEs' isolation mode made it a really tough system to crack).

Keeping the Windows OS and the Game OS separate, but able to coexist and collaborate at the same time, is tricky, especially if you want a tightly integrated, uniform user experience, but it can really pay dividends after a while. You can upgrade a lot of what users perceive as the console OS and GUI components without interfering with the Game OS and possibly introducing bugs in it due to changes it does not benefit from. You can also profile and optimize each OS based on the different kinds of apps you run and their unique resource usage patterns.

Not to mention being able to open up more and more of the Windows OS, which shares a lot of its hard inner parts with the desktop and phone OSes, to third-party developers working with Universal Windows apps in the future.

There is potential, but it is not easy to get right and still requires lots of fine-tuning to get a consistently fast and smooth UI, which Sony has mostly got right with the PS4 (save for some WebGL-powered parts which sometimes take a bit to load up).
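As a toy illustration of the partition separation described above (the ranges and names are invented for the example; a real hypervisor enforces this through CPU page tables and the IOMMU, not a lookup table):

```python
# A toy model of hypervisor memory partitioning: each guest OS gets a
# fixed physical range, and any access outside it is refused.
PARTITIONS = {
    "game_os": (0x0000_0000, 0x4000_0000),  # hypothetical 1 GiB window
    "app_os":  (0x4000_0000, 0x6000_0000),  # hypothetical 512 MiB window
}

def check_access(guest: str, addr: int) -> int:
    """Allow an access only if it falls inside the guest's partition."""
    lo, hi = PARTITIONS[guest]
    if not lo <= addr < hi:
        raise PermissionError(f"{guest} may not touch {addr:#x}")
    return addr  # in a real system: translated and forwarded to hardware

check_access("game_os", 0x1000)   # fine: inside the game partition
# check_access("app_os", 0x1000)  # would raise PermissionError
```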
 
Probably everyone will disagree with me, but I think the difference you see on screen is the smallest it's been in the last 20 years.

Ray 'Charles' Maker.

As we've already seen this generation, some multiplatform games will be close on the two (main) systems, and some will have massive differences, and that's also how it's been for the past 20 years. You can't (well, apparently you can) make a blanket statement like that and not expect to be disagreed with, so you got that right.
 
There will be differences, but not as big as PS2 vs the original Xbox.

Didn't the Xbox One have a double graphics context? What happened with that?
 
Edit: Also, I think KKRT00 is underestimating the benefits of a closed architecture a bit. Yes, PC will always be more powerful, but most of its "power" can't be utilized since PC is an open platform. You don't need a 5 TFLOP GPU for 1080p (just an example).
What benefits? There were no benefits last gen, and there are no benefits in this gen's games either.
DX12 will further decrease CPU bottlenecks, making the CPU disparity even harder for consoles to catch up on.

-----
That's a totally different point, though; it just depends on the developers' priorities. From this point of view, the PS3/360 weren't powerful enough to reach even 720p with advanced tech. Saying the PS4 is like the Xbone when we talk about 1080p is absolutely unfair.

But developers' priorities will be pushing tech forward, not limiting themselves just to hit 1080p; it's a given. It has always been like that.

-----
Then you have people that want longer console cycles... :'(

I feel you totally :(
Hell, I read posts several months ago from people saying they didn't want the current gen to launch yet, like wtf?! :(
 
Probably everyone will disagree with me, but I think the difference you see on screen is the smallest it's been in the last 20 years.

Except for the fact that there have already been some of the biggest differences this gen. 720p vs 1080p, for example; that's already a larger resolution difference than in any previous gen.
 
Except for the fact that there have already been some of the biggest differences this gen. 720p vs 1080p, for example; that's already a larger resolution difference than in any previous gen.

Not visible. The visual difference between 720p and 1080p is smaller than between 720p and 640p, for example.
The difference between 900p and 1080p with good AA is actually pretty minimal in comparison to both earlier examples, on a good TV. It changes slightly on a monitor, though.
 
I remember in San Andreas on PS2 the characters not having fingers, but on Xbox they had them. But the traffic lights didn't reflect on the cars on Xbox, while they did on PS2. I showed my brother and he didn't care. I felt like a fucking scientist tho.
 
Not visible. The visual difference between 720p and 1080p is smaller than between 720p and 640p, for example.
The difference between 900p and 1080p with good AA is actually pretty minimal in comparison to both earlier examples, on a good TV. It changes slightly on a monitor, though.

Not visible? Bullshit. I've been playing the Halo 5 beta and it being 720p is extremely noticeable. The IQ is terrible because of it.
 
The PS3 was a full year younger than the 360, the same distance

Depends how you look at it. From the standpoint of launch, yes, the PS3 is younger. But the point is, the PS3's specs were finalized by then; only the launch date was delayed. After the delay, nothing changed in the PS3 until release.
 
Not visible? Bullshit. I've been playing the Halo 5 beta and it being 720p is extremely noticeable. The IQ is terrible because of it.

Read your post and my answer...
You said it's the biggest difference this gen, by which you meant the % decrease in resolution, and I negated that with 'not visible'. The visible difference is not as big as the % resolution difference.
And I even explained it later in the post...

Play any game where you can activate decent AA on your TV at 1080p, 900p, 720p, and 640p, and tell me whether the difference between 720p and 640p is bigger than between 1080p and 900p.
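To put rough numbers on that (the 1136-wide "640p" is my own 16:9 approximation, since it's not a standard mode): by raw pixel count, 900p actually drops a larger share of pixels relative to 1080p than 640p does relative to 720p, which is exactly why percent-of-pixels is a poor proxy for what you actually see.

```python
# Pixel counts behind the comparison above, assuming 16:9 frames.
resolutions = {
    "1080p": (1920, 1080),
    "900p":  (1600, 900),
    "720p":  (1280, 720),
    "640p":  (1136, 640),  # approximate 16:9 width
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(f"900p has {pixels['900p'] / pixels['1080p']:.0%} of 1080p's pixels")  # ~69%
print(f"640p has {pixels['640p'] / pixels['720p']:.0%} of 720p's pixels")    # ~79%
```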
 