TRUTHFACT: MS having eSRAM yield problems on Xbox One

XBox One has 100% more embedded RAM bandwidth than PS4 :) You really shouldn't isolate things like RAM bandwidth and ignore the architecture of the system itself.

Technically the XB1 has infinitely more embedded RAM bandwidth than the PS4, since the PS4 doesn't have any embedded RAM. That said though, the leaked eSRAM bandwidth is still less than the reported bandwidth that the PS4's GPU has to main memory.
 
So, PS4 is 50% more powerful than Xbone.
Expect multiplats to look better on PS4.
Expect Sony 1st party devs to put 3rd party and MS 1st party to shame.
 
XBox One has 100% more embedded RAM bandwidth than PS4 :) You really shouldn't isolate things like RAM bandwidth and ignore the architecture of the system itself.
Bandwidth is pretty similar if you take ESRAM into account.
And where did I say a difference wouldn't show? But the way it sounds, some are expecting a much bigger difference than what they'll get.



If they call that massive based on that, then I question their technological understanding.



Yes, but I'm on a roll right now so go away.
What are you talking about? 50% more raw GPU power is a generational leap.

GTX 580 -> GTX 680 = 40-50% faster
AMD 6970 -> AMD 7970 = 40-50% faster
 
XBox One has 100% more embedded RAM bandwidth than PS4 :) You really shouldn't isolate things like RAM bandwidth and ignore the architecture of the system itself.


The eSRAM is there as a patch to compensate for the slower DDR3 RAM, and its amount is much too small to make a meaningful difference when it comes to rendering. And the eSRAM is still much slower than the GDDR5 that's in the PS4.
 
Bandwidth is pretty similar if you take ESRAM into account.

The total sum of bandwidth is similar. But that by itself does not mean much. You can't move the same data over that combined bandwidth, since only 68GB/s of it is connected to main memory.

I am curious what the eSRAM will be used for. At this point, we don't know if it'll be some sort of managed cache (thinking about the "move engines") or a freely usable scratchpad. We also don't know if it has some sort of additional functionality that makes it more usable for storing framebuffers, like on the 360.
 
The total sum of bandwidth is similar. But that by itself does not mean much. You can't move the same data over that combined bandwidth, since only 68GB/s of it is connected to main memory.

I am curious what the eSRAM will be used for. At this point, we don't know if it'll be some sort of managed cache (thinking about the "move engines") or a freely usable scratchpad. We also don't know if it has some sort of additional functionality that makes it more usable for storing framebuffers, like on the 360.

I think it will be used as a scratchpad, keeping the framebuffer in DDR3 like Intel does on Haswell.
Personally I think the next-gen consoles are CPU bound rather than GPU bound.
 
No it's not. You can't just add the bandwidth of the eSRAM and DDR3 together. It doesn't work like that.

Are people still really not clear on this point? I saw these debates raging for months after the PS4 announcements, and evidently it's still something that needs to be explained.
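For anyone still unclear, here's a rough sketch of why the two numbers can't just be summed. It uses the approximate figures floated in this thread; the real memory system is obviously wired more subtly, so treat this as illustrative only:

```python
# Sketch: why eSRAM + DDR3 bandwidth can't simply be added.
# Both pools can be accessed in parallel, but each request is served by
# the pool the data actually lives in, so the aggregate rate is capped by
# whichever pool becomes the bottleneck for your traffic mix.
# Figures below are the rough numbers discussed in this thread.

DDR3_BW = 68.0    # GB/s to the 8 GB main memory pool
ESRAM_BW = 102.0  # GB/s to the 32 MB embedded pool

def aggregate_bw(esram_fraction):
    """Peak combined throughput when `esram_fraction` of traffic hits eSRAM."""
    if esram_fraction <= 0.0:
        return DDR3_BW
    if esram_fraction >= 1.0:
        return ESRAM_BW
    # The pool that saturates first for this mix limits total throughput.
    return min(ESRAM_BW / esram_fraction, DDR3_BW / (1.0 - esram_fraction))

for f in (0.0, 0.2, 0.6, 1.0):
    print(f"{f:.0%} of traffic in eSRAM -> ~{aggregate_bw(f):.0f} GB/s aggregate")

# Only when ~60% of all traffic lands inside that 32 MB does the total get
# near 68 + 102; stream everything from main memory and you're back at ~68.
```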
 
I said PC GPU.

In your opinion, console generation wise, would you say there is a generational leap between the Wii U HD4650 and the Xbone HD7790? When people kept saying the Wii U is "current gen" tech. To me that's the same difference between the PS4 and Xbone GPU.

I fail to see your logic behind "console generations".
It still stands:
7850: 18 CUs / 32 ROPs
7790: 12 CUs / 16 ROPs (worse than the PC 7790 with 14 CUs)

It's going to be a leap. A noticeable one too.
 
Should be clear enough how large the difference in power is when the games hit.

I'm not certain this will be the case. At least not with the first wave. Give it a year or two as devs start using the full potential of the boxes and then you'll start to see the disparity. The first wave of games won't have been built with the exact specs and advantages in mind.
 
In your opinion, console generation wise, would you say there is a generational leap between the Wii U HD4650 and the Xbone HD7790?

Close to it, yes. And definitely more so than what seems to be said here about PS4 vs Xbone.

I fail to see your logic behind "console generations".
It still stands:
7850: 18 CUs / 32 ROPs
7790: 12 CUs / 16 ROPs (worse than the PC 7790 with 14 CUs)

It's going to be a leap. A noticeable one too.

How can you fail to see the logic behind console generations when that's what it's always been defined by? Are we just going to act like everything before this coming gen never existed now to justify what you're trying to say?

You've got some spinning going on there with the "worse than the pc 7790 with 14CUs". You could say the same thing with PS4 and a 7870.

PS4 will definitely do things better than Xbone, but it seems like some are acting like PS4 is some kind of next-next gen console in comparison to Xbone.
 
Should be clear enough how large the difference in power is when the games hit.

I don't think we will see differences other than IQ and FPS. I'm quite positive we will see different IQ on day one if the specs haven't changed.

I will be quite surprised if the same game on one of the systems gets more enemies on screen or bigger levels than on the other system.
 
Close to it, yes. And definitely more so than what seems to be said here about PS4 vs Xbone.



How can you fail to see the logic behind console generations when that's what it's always been defined by? Are we just going to act like everything before this coming gen never existed now to justify what you're trying to say?

You've got some spinning going on there with the "worse than the pc 7790 with 14CUs". You could say the same thing with PS4 and a 7870.

PS4 will definitely do things better than Xbone, but it seems like some are acting like PS4 is some kind of next-next gen console in comparison to Xbone.
Well you could say they're both in the same generation console wise.

You could also say the PS4 is technologically a generation ahead of Xbox one in terms of raw power.

Neither of those statements would be wrong.
 
Well you could say they're both in the same generation console wise.

You could also say the PS4 is technologically a generation ahead of Xbox one in terms of raw power.

Neither of those statements would be wrong.

Second one would be wrong.

Using the BS multiplier system, console gaps are usually 10x. ~1.5x != 10x.
 
I don't think we will see differences other than IQ and FPS. I'm quite positive we will see different IQ on day one if the specs haven't changed.

I will be quite surprised if the same game on one of the systems gets more enemies on screen or bigger levels than on the other system.

I would not be surprised if the devs just went with 2xMSAA on PS4 and some better form of post-process AA on X1. Time will tell; I would not be surprised if it's not noticeable. VGLeaks did make it seem that the SHAPE audio block can take away quite some processing tasks.

On paper the PS3 was also leaps ahead of the 360, but in practice, even in the later years, the 360 would still perform better in the majority of multiplat titles. Could be mistaken, I haven't really gamed much on console in the last 3 years. But then the platform architectures were completely different and the PS3 was on the wrong end of the GPU revolution.
So yeah, I'm not really impressed with the next-gen showings; it's what I'm already used to on laptop and desktop.

Second one would be wrong.

Using the BS multiplier system, console gaps are usually 10x. ~1.5x != 10x.

That's also only based on GPU paper specs.
 
I would not be surprised if the devs just went with 2xMSAA on PS4 and some better form of post-process AA on X1. Time will tell; I would not be surprised if it's not noticeable. VGLeaks did make it seem that the SHAPE audio block can take away quite some processing tasks.

I don't think that this audio hardware is any different from the PS4's audio hardware (apart from echo cancellation for Kinect). Both decode and process audio streams (in the audio formats preferred by Microsoft/Sony).

I guess the Xbone's hardware merely sounds more important, because of the leaked documents that give it a name.
 
I would not be surprised if the devs just went with 2xMSAA on PS4 and some better form of post-process AA on X1. Time will tell; I would not be surprised if it's not noticeable. VGLeaks did make it seem that the SHAPE audio block can take away quite some processing tasks.

Yeah, I won't be surprised if this gen turns out the same as last gen. I just can't see PlayStation getting ahead in IQ; it would be the first time, and not easy for me to get used to.
 
And I've never said we won't. It's just not going to be as big as the way it sounds some are expecting.
Well, from what I've read of your posts on the subject, it appears to mainly be a perspective thing.

A lot of people do see a 50% difference in power as huge though, and it will have a noticeable effect on games; especially for first party games.
 
The eSRAM is there as a patch to compensate for the slower DDR3 RAM, and its amount is much too small to make a meaningful difference when it comes to rendering. And the eSRAM is still much slower than the GDDR5 that's in the PS4.

The eSRAM is there as a key part of the system's design, not just some tacked-on afterthought. As to the idea that it needs to be large to have a meaningful effect, that's totally untrue. Embedded memory is used to fulfill operations that require lots of bandwidth but not much space, such as render targets. By doing these operations in a small, fast buffer you not only remove that burden from main memory, but you also get the advantage of much lower latency access, something that's extremely important when it comes to these kinds of operations. Alternatively, things such as texture streaming aren't very latency dependent at all, which is why they're fine to keep off chip in main memory.

My point is simple here: if you compare PS4 and XBox One's main memory bandwidth alone, you're not going to see the real picture of each system's memory architecture. It's really a pointless comparison without taking into account what those pools of memory are required for.
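For a rough sense of scale (back-of-the-envelope arithmetic only, not from any official spec), 32 MB is about the size of a 1080p colour + depth pair with 2xMSAA, which is exactly the kind of small, high-traffic data the embedded pool is meant for:

```python
# Back-of-the-envelope render-target footprint vs a 32 MB embedded buffer.
# Purely illustrative; actual formats and layouts will differ.

def rt_bytes(width, height, bytes_per_pixel, samples=1):
    """Size of one render target in bytes."""
    return width * height * bytes_per_pixel * samples

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB

# 1080p: one 32-bit colour buffer plus a 32-bit depth/stencil buffer
no_msaa = rt_bytes(1920, 1080, 4) + rt_bytes(1920, 1080, 4)
msaa2x = rt_bytes(1920, 1080, 4, 2) + rt_bytes(1920, 1080, 4, 2)

print(f"1080p colour+depth:         {no_msaa / 2**20:.1f} MB")  # ~15.8 MB
print(f"1080p colour+depth, 2xMSAA: {msaa2x / 2**20:.1f} MB")   # ~31.6 MB
print(f"2xMSAA pair fits in 32 MB?  {msaa2x <= ESRAM_BYTES}")
```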
 
And I've never said we won't. It's just not going to be as big as the way it sounds some are expecting.

Exactly. I remember being completely skeptical about the leaked PS4/Xbone specs because they seemed so goddamn lopsided. But even then I knew we weren't going to see some next-next-generation difference between the two. Not even the Wii U-PS4 gap is that big.

Also, I think a better word for the performance gap than huge is "significant".
 
Well, from what I've read of your posts on the subject, it appears to mainly be a perspective thing.

A lot of people do see a 50% difference in power as huge though, and it will have a noticeable effect on games; especially for first party games.

+1
Had the tables been turned, it would be game over for PS4, I'm sure some users would say. Now it's not that big a leap, lol. I swear, just a few months ago we were hearing rumors of dual APUs and dual GPUs (7770s) having 50% more power than the PS4, how it's a huge leap of secret sauce, extra mayo, double cheese Big Mac power.
Fact is, the difference is much larger than 360 vs PS3.
 
Well, 100% more than 0 (due to non-existence) is still 0 - I hope MS engineers didn't bungle their inferior RAM implementation to the point of being completely useless!

Heh, yeah, it was sort of tongue in cheek, hence the smiley. Just trying to make the point that these comparisons can be pointless if you use a solitary point of comparison.

Not sure how it could be useless, by the way. You have a large pool of memory with something like 60GB+/s bandwidth for textures, geometry, game code, audio code and OS, all things you need decent bandwidth for but that aren't too latency dependent. Then you have a small pool of memory with something like 100GB/s and extremely low latency for render targets, something that's much more latency dependent. Or are you talking about the implementation of the embedded memory specifically? Like on XBox 360, where a lot of data had to be sent to main memory before being read back to embedded memory?

Overall it seems like a fairly efficient design. I get cold sweats personally at the idea of using 8GB of GDDR5 only to use 3GB of it for such a low bandwidth task as the OS :)
 
+1
Had the tables been turned, it would be game over for PS4, I'm sure some users would say. Now it's not that big a leap, lol. I swear, just a few months ago we were hearing rumors of dual APUs and dual GPUs (7770s) having 50% more power than the PS4, how it's a huge leap of secret sauce, extra mayo, double cheese Big Mac power.
Fact is, the difference is much larger than 360 vs PS3.
I'm in tears reading that part.
 
I'm in tears reading that part.

lol you get my point though. Every rumor ended with MS having deep pockets. Prepare for the end of Sony. It was apparently inevitable.
I can't understand why a big difference like there is now should be downplayed. If DICE announced tomorrow that BF4 is 720p at 60FPS on Xbone and 1080p at 60FPS on PS4, some would still downplay that. Yet we will see a similar thing happen with multiplats next gen.
 
Well, from what I've read of your posts on the subject, it appears to mainly be a perspective thing.

A lot of people do see a 50% difference in power as huge though, and it will have a noticeable effect on games; especially for first party games.

It's definitely a perspective thing. Like I've said it sounds like some are treating PS4 like it's another gen above Xbone. And for me it will primarily be shown in first party games.

Exactly. I remember being completely skeptical about the leaked PS4/Xbone specs because they seemed so goddamn lopsided. But even then I knew we weren't going to see some next-next-generation difference between the two. Not even the Wii U-PS4 gap is that big.

Also, I think a better word for the performance gap than huge is "significant".

I'm fine with that word as well.

I guess I'm just having a hard time understanding what kind of difference some expect to see with the specs we know of. I can't put words in their mouths, but it's like they're expecting 2x the visuals from ~50% more performance.

+1
Had the tables been turned, it would be game over for PS4, I'm sure some users would say. Now it's not that big a leap, lol. I swear, just a few months ago we were hearing rumors of dual APUs and dual GPUs (7770s) having 50% more power than the PS4, how it's a huge leap of secret sauce, extra mayo, double cheese Big Mac power.
Fact is, the difference is much larger than 360 vs PS3.

Thankfully your post doesn't apply to me, as I have always preferred PS over Xbox, though I'm not against Xbone. I would say the same thing regardless of the position. It still sounds like there are more expectations of the gap than we're likely to see.
 
Is the PS4 powerful enough that some games really will be 60fps on it and 30fps on Xbone? Because it's only 50% more powerful (theoretically), and I'm sure it would take a lot more effort to really get that 60fps, and on a multiplat would devs really spend the time doing so?
 
Is the PS4 powerful enough that some games really will be 60fps on it and 30fps on Xbone? Because it's only 50% more powerful (theoretically), and I'm sure it would take a lot more effort to really get that 60fps, and on a multiplat would devs really spend the time doing so?

There could be cases where the Xbone version is running at ~45 fps during development and needs to be locked down to 30 but the ps4 version is running at 60.

However, I think it's more likely that most games will run at the same frame rate on the two machines and the differences will be like going from med to high settings on a pc version of a game. How big of a difference that is will vary quite a bit from developer to developer.
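As a purely illustrative sketch (assuming a frame that is entirely GPU-bound and ignoring the CPU, which later posts rightly point out is the likelier bottleneck), this is how a ~50% gap can turn into a 30 vs 60 split once frame-rate caps are applied:

```python
# Illustrative only: assume a frame that is entirely GPU-bound, so frame
# time scales inversely with GPU throughput. Numbers are hypothetical.

def fps_with_speedup(base_fps, speedup):
    """Frame rate if the GPU is the only limiter and gets `speedup`x faster."""
    return base_fps * speedup

xbone_fps = 45                              # hypothetical unlocked rate during development
ps4_fps = fps_with_speedup(xbone_fps, 1.5)  # ~67 fps with ~50% more GPU

# With the usual 30/60 caps, the same content could ship locked at 30 on
# one box and 60 on the other, even though the raw gap is "only" ~50%.
print(f"Xbone: ~{xbone_fps} fps unlocked -> shipped at 30")
print(f"PS4:   ~{ps4_fps:.0f} fps unlocked -> shipped at 60")
```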
 
We also don't know if it has some sort of addition functionality that makes it more usable for storing framebuffers, like on the 360.

Pretty sure it doesn't, otherwise they'd be touting its internal bandwidth.
 
But the way it sounds, some are expecting a much bigger difference than what they'll get.

The way you're making it sound, the difference will be smaller than what we'll actually get. You're making it sound like 600 gigaflops of more performance is not a whole entire football field to play around with. That difference is ridiculous for two systems that are supposed to be similarly priced. You would expect the edge to be going towards the console that's more expensive. Somehow it's going towards the console with the lower price? Like, wtf.

That's not even mentioning the 176 GIGABYTES per second of bandwidth across EIGHT GIGS of RAM, as compared to only 68 or so GB/s of bandwidth with an additional 102 or so GB/s across only 32 MB of RAM... good luck optimizing and pigeon-holing your game code.

Again, that difference is pretty ridiculous between two systems that are supposed to be similarly priced and launching in the same generation. Again, you would expect the one that's more expensive to be more powerful. How is it not? Oh, that's right, because one of the systems, the more expensive one, is substituting those other 600 gigaflops of power for a fucking camera.
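For reference, the "~600 gigaflops" and "~50%" figures thrown around in this thread fall straight out of the leaked GPU configs (18 CUs vs 12 CUs, both rumored around 800 MHz); treat the numbers as approximations:

```python
# Where the "~50%" / "~600 gigaflops" figures come from, assuming the
# leaked GPU configs (18 CUs vs 12 CUs, both around 800 MHz). Approximate.

def gcn_gflops(cus, mhz, alus_per_cu=64, ops_per_clock=2):
    """Peak single-precision throughput of a GCN-style GPU, in GFLOPS."""
    return cus * alus_per_cu * ops_per_clock * mhz / 1000.0

ps4 = gcn_gflops(18, 800)    # ~1843 GFLOPS
xbone = gcn_gflops(12, 800)  # ~1229 GFLOPS

print(f"PS4:   {ps4:.0f} GFLOPS")
print(f"Xbone: {xbone:.0f} GFLOPS")
print(f"Gap:   {ps4 - xbone:.0f} GFLOPS ({ps4 / xbone - 1:.0%} more)")
```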

If they call that massive based on that, then I question their technological understanding.

Where are you to question Microsoft's technological understanding when they claim that their cloud processing will balance the difference in game performance?

Yes, but I'm on a roll right now so go away.

Can you get on a roll with some leaks instead? Or am I confusing you with someone that still has some?
 
Is the PS4 powerful enough that some games really will be 60fps on it and 30fps on Xbone? Because it's only 50% more powerful (theoretically), and I'm sure it would take a lot more effort to really get that 60fps, and on a multiplat would devs really spend the time doing so?

I think the FPS difference won't happen; most of the time the CPU will limit the possible FPS. So if a game on the Xbone is 30 FPS, it won't do much more on the PS4, because they both have almost the same CPU built in. Think of games with 64-player online multiplayer like BF4. Though I've heard DICE is running their game at a solid 60 FPS on next gen; it sounds too good to be true, since an octa-core Jaguar clocked at 1.6 GHz shouldn't be able to do that. But then again, these are consoles, not PCs, we're talking about.

But what could happen is that the PS4 might have nicer anti-aliasing, or a higher resolution (instead of 720p they might opt for sub-1080p, like GT5 on the PS3: 1280x1080). That could make a big difference!

What's also possible: sharper textures, since the PS4 only uses 1GB for the OS instead of the Xbone's 3GB. Plus more detail overall, higher polygon count, less pop-in, more special effects, etc.
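A quick pixel-count comparison shows why resolution is a natural place for that kind of gap to show up (illustrative arithmetic only):

```python
# Pixel counts for the resolutions mentioned above; a ~50% GPU gap is in
# the same ballpark as one resolution step. Illustrative arithmetic only.

resolutions = {
    "720p (1280x720)":       1280 * 720,
    "GT5-style (1280x1080)": 1280 * 1080,
    "1080p (1920x1080)":     1920 * 1080,
}

base = resolutions["720p (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x vs 720p)")

# 1280x1080 is ~1.5x the pixels of 720p; full 1080p is 2.25x.
```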
 
Is the PS4 powerful enough that some games really will be 60fps on it and 30fps on Xbone? Because it's only 50% more powerful (theoretically), and I'm sure it would take a lot more effort to really get that 60fps, and on a multiplat would devs really spend the time doing so?

I think the CPU is the determining factor there; people keep forgetting that these CPUs are really weak sauce. Those benchmarks are running an i5 or i7, most likely clocked at 4 GHz+; an 8-core Jaguar is not that.
 
50% more powerful than something that is already significantly capable in itself could be a big deal, especially later in the next generation. And what's very different now is that the most powerful console is not the one with the most twisted architecture.
 
I'd like to know this as well.

A Crytek dev on B3D said Ryse ran on X1 devkit silicon.
A lot of other X1 devs lashed out on Twitter at Phil Fish, saying their games ran on X1 devkit silicon.

LocoCycle was the game that crashed to Win7; go look up the game, and if that kills the X1, MS has bigger issues than this bad PR.
 
It has already been said many times that PS4 will have an advantage over XB1 in graphics. The specs tell the truth. This is not PS3 vs 360; this situation is completely different. Jack Tretton even commented, saying 3rd parties expect this difference too:

http://www.youtube.com/watch?v=1Mm3P4Ft-y4&t=6m20s

Now how much of a difference remains to be seen. But again, the numbers tell the truth, and the numbers say it's going to be pretty noticeable. Call it whatever you want to: "small", "not that big", "massive" - the difference will be there. Which is the point some are making.

For me, no, I'm not expecting an ILM-level graphics difference, but improvements you can see and appreciate.
 
A Crytek dev on B3D said Ryse ran on X1 devkit silicon.
A lot of other X1 devs lashed out on Twitter at Phil Fish, saying their games ran on X1 devkit silicon.

LocoCycle was the game that crashed to Win7; go look up the game, and if that kills the X1, MS has bigger issues than this bad PR.

LocoCycle is from a first party studio too, there's something telling about that.
 
LocoCycle is from a first party studio too, there's something telling about that.

LocoCycle had been in development from the beginning as a 360 game. It could just be something as simple as them not yet having finished porting it over to X1. They are a small team designed to make downloadable titles, not a giant first-party studio.
 
LocoCycle is from a first party studio too, there's something telling about that.

We still have Gamescom and TGS to go.
The games did not scream "fuck, look at this, this is Titan power, baby."
From press reactions, the games did not look better than PS4 games.
It's either devkits or PCs specced/restricted to X1 performance.

So unless you disagree and say X1 showed games that looked that much better than PS4 games, or come to the conclusion that the PS4 is stronger than a Titan or a GTX 680, I think we will come to the same conclusion that most games were likely run on devkits.

#teamDevkit
 