TRUTHFACT: MS having eSRAM yield problems on Xbox One

MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues. Why would you lower the clock for a smaller than predicted yield (less than 32 meg of good eSRAM)? Edit: prag16's post is a possibility, but I wouldn't want to buy an Xbox One with that chip, because hot spots tend to erode and expand. Running at a slower clock can just increase the time to failure.

Because not all chips run at the same speed. Yield isn't just a binary working/broken status. They may be just close enough to get working at a lower clock.

Intel discovered the opposite of this years back, when they rebranded some higher end Pentiums as Celerons.
 
MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues. Why would you lower the clock for a smaller than predicted yield (less than 32 meg of good eSRAM)? Edit: prag16's post is a possibility, but I wouldn't want to buy an Xbox One with that chip, because hot spots tend to erode and expand. Running at a slower clock can just increase the time to failure.

You lower the clock so you can run it at lower voltage, if it's not stable at a higher voltage. It doesn't have to be heat related.
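This poster has the relationship right: dynamic CMOS power scales roughly as P = C·V²·f, so a downclock that also permits a voltage drop reduces power (and therefore heat) more than linearly. A sketch with entirely made-up numbers, just to show the shape of the math:

```python
# Illustrative sketch (hypothetical numbers): dynamic CMOS power is
# approximately P = C * V^2 * f, so lowering frequency AND voltage
# together cuts power disproportionately.

def dynamic_power(capacitance, voltage, frequency):
    """Classic dynamic-power approximation for CMOS logic."""
    return capacitance * voltage**2 * frequency

# Hypothetical chip: 1.0 (arbitrary capacitance units), 1.10 V, 800 MHz.
base = dynamic_power(1.0, 1.10, 800e6)

# Downclock 800 -> 720 MHz (-10%) and drop the voltage 1.10 -> 1.05 V.
binned = dynamic_power(1.0, 1.05, 720e6)

print(f"power relative to stock: {binned / base:.2f}")  # ~0.82
```

So a 10% downclock paired with a small voltage drop can buy roughly an 18% power reduction in this toy model, which is why voltage, not temperature per se, is often the lever.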
 
The reason GAF as a whole is not pro-Microsoft, really ever, is because they are seen by most as bad for the industry. Nearly everything they have done has been an attempt to constrict the market further and further, and to make it less and less global in the process as well. Even their wonderful indie initiative has been completely thrown in the garbage, with past devs who made lots of money on the 360 now speaking out against Microsoft.

Why wouldn't you hate Microsoft at this point? This is the same company that makes you spend hundreds of dollars just to use freaking Word....WORD!!! And hundreds more on Operating System iterations. Why would anyone think their involvement in gaming would end in anything but tears?

Also sorry for the derail, but man what a jerk thing to say.
 
MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues. Why would you lower the clock for a smaller than predicted yield (less than 32 meg of good eSRAM)? Edit: prag16's post is a possibility, but I wouldn't want to buy an Xbox One with that chip, because hot spots tend to erode and expand. Running at a slower clock can just increase the time to failure.

The rumor comes from trying to increase yields by lowering clocks on current chips. Not sure if the bad chips are overheating much more easily at the originally set clocks or if there are other defects causing the chips to not meet previously stated clocks.
 
You lower the clock so you can run it at lower voltage, if it's not stable at a higher voltage. It doesn't have to be heat related.

It's most likely heat related, directly or indirectly. Heat can cause you to fail timing tolerances, which causes a downclock. The silicon is the silicon and the delays are pretty well understood (transistor models have 100+ variables these days). It's probably heat, one way or another.
 
Why wouldn't you hate Microsoft at this point? This is the same company that makes you spend hundreds of dollars just to use freaking Word....WORD!!! And hundreds more on Operating System iterations. Why would anyone think their involvement in gaming would end in anything but tears?
No offense or defense, but hating anything, especially a major corporation, is a waste of energy. And if you're spending hundreds of dollars for Office, you're doing something wrong.
 
Kind of. When the yield is lower, a lot of the bad chips will still "work" much of the time. Until you get the temperature up. Then they start failing. So in a manner of speaking they are "overheating", just at a lower temp than the "good" chips.

This isn't how this works. They test all chips before they leave the factory. They are specced to work at a particular temperature, and they are tested at this temperature. If they don't work at this temperature (which should be higher than they will see in real life), they don't get used. There's no overheating.

The trouble with the 360 was that they screwed up the cooling of the system because the heatsink for one of the chips was way too small and UNDER the DVD drive where it got no airflow.

From looking at the XBox One, there is no such issue.
 
To be fair, he then goes on to [clarify] a few lines after:


Dunno if people are deliberately missing that.

The defensiveness in this thread (including the majority of posters assuming it was personally directed at them, or that owning a device eliminates the possibility of their reacting irrationally) gives credence to what he's saying. Preemptively crying out for the failure of a platform is pretty odd no matter what company makes it. If the device is truly anti-consumer, then consumers won't be interested in it, and certainly not long term. I got bent out of shape when I saw people saying the PS3 was rubbish in its first year because it wasn't optimized and there weren't a lot of games, but I also trusted that if it was as good as I gave it credit for being, it would do fine long-term. That absolutely ended up being the case. I look at it the same way here: if the XB1 really does have a cruddy ownership proposition for consumers, after the initial rush, people will react to those problems in their purchases. I'm comfortable with that dynamic. It's clear many people here aren't, at least as far as the XB1 is concerned.

What I see here is a substantial number of voices in this community not trusting other consumers to make the purchase choices they want them to make (which is crazy in its own way), so they shout louder and try all the harder to proselytize against something instead of actually rallying for something. Again, it's not merely reacting to rumors, or considering rumors. It's enthusiasm for negative rumors, in a way that distorts the overall image of what's actually happening. That's not to say there aren't issues with the proposition of the XB1. There's a reason I have no plans to purchase one, but I'm not personally attacking anyone who dares point out the possibility that we need more info on it, or who dares not call it the "xbone," or dares point out that certain rumors like the down-clock aren't really solid yet. The fact that there's a thread with this much attention on it because yields of a component are lower than they want them to be again proves Brad's point has merit. Almost every mainstream tech product faces similar production inefficiencies. Look at this thread compared to the reaction in those cases. This is bile.

As I said before, when the audience reaches the point where they assume any positive statement by the company can't be trusted at all, things have reached a fever pitch. When that reaches the point where people are attacking the people carrying those messages on a personal level, things have gone too far. I'm not saying "assume PR is always good," but if you categorically won't accept info from the only primary source we have on the record for it, it's hard not to see that as baseless denial, because people don't want it to be true.
 
MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues. Why would you lower the clock for a smaller than predicted yield (less than 32 meg of good eSRAM)? Edit: prag16's post is a possibility, but I wouldn't want to buy an Xbox One with that chip, because hot spots tend to erode and expand. Running at a slower clock can just increase the time to failure.

Because a bad yield means that not all chips will hit some targeted performance like 1600 MHz or whatever. That, beside outright defects, is the same reason why AMD has 3-core variants of their CPUs, and the same reason why we can OC our CPUs.

There was no rumor about overheating. CBOAT didn't say that.
 
The reason GAF as a whole is not pro-Microsoft, really ever, is because they are seen by most as bad for the industry. Nearly everything they have done has been an attempt to constrict the market further and further, and to make it less and less global in the process as well. Even their wonderful indie initiative has been completely thrown in the garbage, with past devs who made lots of money on the 360 now speaking out against Microsoft.

Why wouldn't you hate Microsoft at this point? This is the same company that makes you spend hundreds of dollars just to use freaking Word....WORD!!! And hundreds more on Operating System iterations. Why would anyone think their involvement in gaming would end in anything but tears?

Also sorry for the derail, but man what a jerk thing to say.

Uhhhh...what? Yes, a very powerful application is something the creators want to get paid for, in some way. I suppose your outrage is because Google Docs is free, and you see it as a "good enough" competitor? Of course, then I hope you aren't also complaining about the perceived privacy issues of XO, seems Google Docs is about as anti-privacy as it comes. But if you're ok with that, then Office WebApps are free and more feature rich than gDocs. I don't see how you made any point against MS with this bit of rant.

OS iterations? XP->7 was no iteration. Other upgrades can be argued, but MS also doesn't charge for service packs, those yearly iterations that usually cost $20 on the OSX side. And I've happily bought, I think, the last two-three OSX "iterations".

Basically what I'm saying is, son, you seem to have an emotional response to the name Microsoft and aren't as clear in your thinking as you may think you are. Bad derail, but c'mon, blind hatred is as bad as blind loyalty.
 
This isn't how this works. They test all chips before they leave the factory. They are specced to work at a particular temperature, and they are tested at this temperature. If they don't work at this temperature (which should be higher than they will see in real life), they don't get used. There's no overheating.

The trouble with the 360 was that they screwed up the cooling of the system because the heatsink for one of the chips was way too small and UNDER the DVD drive where it got no airflow.

From looking at the XBox One, there is no such issue.

Additionally, MS used lead-free solder, which is more brittle and prone to cracking under thermal cycling, because they didn't want to pay the RoHS fines.
Over time, as the GFX chip would get hot and cool, that cycling would cause the solder joints to slowly crack, allowing the chip to bend off of the board and become disconnected. That's why later lazy quick revisions of the Xbox added the X-clamp over the chip to keep it from popping off.

And because of gravity, that's why it failed quicker when vertical. Not because of heatflow, but because the solder was falling in that direction.
 
Man, I love GB but I think both Jeff and Brad (the most vocal so far) have just really missed the point on this. Makes me feel a bit sad because before my ideals and theirs lined up so well.

"They were completely reasonable when they agreed with me, but now that we disagree, they're wrong and just don't get it."

(I'm not even saying I disagree with the concerns about the ownership dynamics/connectivity requirements with the XB1, but look at what you wrote and think about it: I think your reasoning may not be allowing for someone to simply come to a different conclusion from you. Reasonable people can have different opinions.)
 
As I said before, when the audience reaches the point where they assume any positive statement by the company can't be trusted at all, things have reached a fever pitch. When that reaches the point where people are attacking the people carrying those messages on a personal level, things have gone too far. I'm not saying "assume PR is always good," but if you categorically won't accept info from the only primary source we have on the record for it, it's hard not to see that as baseless denial, because people don't want it to be true.

Never mind, I thought you were talking about Nelson.
 
Uhhhh...what? Yes, a very powerful application is something the creators want to get paid for, in some way. I suppose your outrage is because Google Docs is free, and you see it as a "good enough" competitor? Of course, then I hope you aren't also complaining about the perceived privacy issues of XO, seems Google Docs is about as anti-privacy as it comes. But if you're ok with that, then Office WebApps are free and more feature rich than gDocs. I don't see how you made any point against MS with this bit of rant.

OS iterations? XP->7 was no iteration. Other upgrades can be argued, but MS also doesn't charge for service packs, those yearly iterations that usually cost $20 on the OSX side. And I've happily bought, I think, the last two-three OSX "iterations".

Basically what I'm saying is, son, you seem to have an emotional response to the name Microsoft and aren't as clear in your thinking as you may think you are. Bad derail, but c'mon, blind hatred is as bad as blind loyalty.

To be fair, he never mentioned Google Docs; maybe he's a fan of LibreOffice?
 
Because a bad yield means that not all chips will hit some targeted performance like 1600 MHz or whatever. That, beside outright defects, is the same reason why AMD has 3-core variants of their CPUs, and the same reason why we can OC our CPUs.

There was no rumor about overheating. CBOAT didn't say that.
I guess that's possible if it's low-power eSRAM. Normally eSRAM is much faster than DRAM, and 800 MHz would not cause an overheating issue.

http://web.eecs.umich.edu/~prabal/teaching/eecs373-f10/readings/sram-technology.pdf said:
An SRAM (Static Random Access Memory) is designed to fill two needs: to provide a direct interface with the CPU at speeds not attainable by DRAMs and to replace DRAMs in systems that require very low power consumption. In the first role, the SRAM serves as cache memory, interfacing between DRAMs and the CPU.

The second driving force for SRAM technology is low power applications. In this case, SRAMs are used in most portable equipment because the DRAM refresh current is several orders of magnitude more than the low-power SRAM standby current. For low-power SRAMs, access time is comparable to a standard DRAM.
 
MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues.

No. Let me guess, you are familiar with how overclocking works. This is not Overclocking 101.

A chip leaving a factory is specced to work at a particular frequency. This means that when the clock fires, a signal leaving one particular unit of memory within the chip has to get to another point on the chip and through a ton of logic before the clock fires again. If it does not, you turn down the frequency and test again. If it works, you just determined the clock frequency that your chip works at.

If the XBox One's APU is really having these rumored issues, the trouble is that the frequency that the chip is tested to work at is lower than they would like. There are ways to fix this. Some take longer than others.
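The binning procedure this post describes can be sketched as a simple loop: test at the target clock, and if the critical path doesn't settle within one clock period, step the frequency down and retest. All delays and step sizes below are made-up illustration values:

```python
# Toy sketch of speed binning. The delay and step values are hypothetical.

def passes_timing(critical_path_delay_ns, freq_mhz):
    """The chip works if one clock period covers the critical path delay."""
    period_ns = 1000.0 / freq_mhz
    return period_ns >= critical_path_delay_ns

def bin_chip(critical_path_delay_ns, target_mhz, step_mhz=50):
    """Step the clock down from the target until the chip passes."""
    freq = target_mhz
    while freq > 0 and not passes_timing(critical_path_delay_ns, freq):
        freq -= step_mhz
    return freq

# A "good" die meets the 800 MHz target; a slower die bins lower.
print(bin_chip(1.20, 800))   # 800 (a 1.25 ns period covers a 1.20 ns path)
print(bin_chip(1.40, 800))   # 700 (first bin with a period >= 1.40 ns)
```

The "ways to fix this" mentioned above correspond to either shrinking the critical path delay (a respin, which takes a long time) or accepting the lower bin (a downclock, which is immediate).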
 
Additionally, MS used lead-free solder, which is more brittle and prone to cracking under thermal cycling, because they didn't want to pay the RoHS fines.
Over time, as the GFX chip would get hot and cool, that cycling would cause the solder joints to slowly crack, allowing the chip to bend off of the board and become disconnected. That's why later lazy quick revisions of the Xbox added the X-clamp over the chip to keep it from popping off.

And because of gravity, that's why it failed quicker when vertical. Not because of heatflow, but because the solder was falling in that direction.

It is also why my original 360 failed after I moved it when it was warm. Since then I always made sure that the console was cool before I moved it.
 
Brad also posted this:



(source)

People who are against Xbone's horrid DRM policies are just Sony youngsters who don't know any better!

He has a point..

many of us grew up gaming well before Sony and MS, and have seen lots of crap come and go from all manufacturers..

I am amazed how short people's memories seem to be, and amazed how irrational and illogical 80% of the arguments made against MS are..

Not that I think MS deserves defending; I'm just not emotionally invested for some reason, and it just feels like déjà vu whenever any paradigm shift occurs and the backlash follows, only to look back years later and see how ridiculous people were being and how little the fears came to fruition..

It seems age and experience do count for something..
 
I guess that's possible if it's low power eSRAM. Normally eSRAM is much faster than DRAM and 800Mhz would not cause an overheat issue.

But lowering the clock for the eSRAM would mean effective bandwidth would fall.

What would be the point of eSRAM with ~80GB/s bandwidth? That's only a little more than the 68 for DDR3.
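A quick back-of-the-envelope check of these figures. The bus-width assumptions (a 1024-bit-wide eSRAM interface, a 256-bit DDR3 bus at 2133 MT/s) are my own readings of the rumors circulating at the time, not confirmed specs:

```python
# Peak bandwidth = bytes moved per transfer * transfers per second.
# Interface widths below are assumptions, not confirmed hardware specs.

def bandwidth_gbps(bytes_per_transfer, transfers_per_sec):
    return bytes_per_transfer * transfers_per_sec / 1e9

esram_800 = bandwidth_gbps(128, 800e6)   # 1024-bit bus at 800 MHz -> 102.4 GB/s
esram_625 = bandwidth_gbps(128, 625e6)   # same bus after a big downclock -> 80.0 GB/s
ddr3      = bandwidth_gbps(32, 2133e6)   # 256-bit DDR3 at 2133 MT/s -> ~68.3 GB/s

print(esram_800, esram_625, ddr3)
```

Under those assumptions the post's arithmetic holds: an ~80 GB/s eSRAM would imply a very large downclock, and would leave it only marginally ahead of the main DDR3 pool on raw bandwidth.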
 
He has a point..

many of us grew up gaming well before Sony and MS, and have seen lots of crap come and go from all manufacturers..

I am amazed how short people's memories seem to be, and amazed how irrational and illogical 80% of the arguments made against MS are..

Not that I think MS deserves defending; I'm just not emotionally invested for some reason, and it just feels like déjà vu whenever any paradigm shift occurs and the backlash follows, only to look back years later and see how ridiculous people were being and how little the fears came to fruition..

It seems age and experience do count for something..

He has a point? Where?

Most of GAF is shitting on MS because of their policies on DRM, used games, and lending, not because of "HAHAHA MS is shit because it is a black box! PS4 RULEZ!"

Rage is justified and MS deserves it.
 
But lowering the clock for the eSRAM would mean effective bandwidth would fall.

What would be the point of eSRAM with ~80GB/s bandwidth? That's only a little more than the 68 for DDR3.

Latency of the ESRAM is ridiculously low compared to off-chip DDR or GDDR. It isn't just bandwidth. It is how quickly you can get the data after the CPU/GPU requests it. The compute resource, if it is an in-order design (not sure if Jaguar is in order or out of order on the GPU side), will end up doing nothing while it waits for the data to come back. The lower the latency, the better.
 
Latency of the ESRAM is ridiculously low compared to off-chip DDR or GDDR. It isn't just bandwidth. It is how quickly you can get the data after the CPU/GPU requests it. The compute resource, if it is an in-order design (not sure if Jaguar is in order or out of order on the GPU side), will end up doing nothing while it waits for the data to come back. The lower the latency, the better.

It's too limited in size to do anything impressive. This isn't secret sauce. This is subpar to the PS4's GDDR5 solution. It's just not as significantly inferior as it would be without the eSRAM.
 
As in Brad Shoemaker?

If so then I think he's trolling us since Giantbomb is quite sour on MS's current tactics.

There's been an unfortunate dynamic: overall the GiantBomb guys have been negative toward Microsoft, but they've also been making a point to include industry reference points, and justifications where they think the situation deserves them. People in the particularly heated XB1 GAF threads will then toss that justification, out of context, into a room full of people looking for information or attitudes to be frustrated by regarding the XB1, and suddenly people here are reacting to that, and sometimes send messages to the GiantBomb people. Which get reacted to.

It's a vitriolic environment. Gerstmann's "Still A Threat" feed might give a good sense of it. I'm finding myself in the odd position of agreeing with a lot of the negativity, but finding the way people are acting about it (or for what reasons they're arriving at some conclusions) to be out of line. Like I tried to call out earlier, there's a LOT of personal attacks toward people who disagree, or who go out of their way to try to explain a situation. With a few exceptions, people have been polite to me though, and I'm in a "reactionary" role as much as anything else, so it's not the Wild West yet.
 
You lower the clock so you can run it at lower voltage, if it's not stable at a higher voltage. It doesn't have to be heat related.

Yeah, and by lowering the voltage it also runs cooler. That's why the PS4 memory was a little bit downclocked, with its bandwidth going from 192GB/sec. to 176GB/sec., and in that case I remember how the rumor was confirmed by some people here and it turned out to be true. So, it's possible this rumor about the eSRAM is also real.
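The PS4 numbers cited here do check out arithmetically, assuming the widely reported 256-bit GDDR5 bus (per-pin data rates are my inference from the bandwidth figures):

```python
# Bandwidth = bus width in bytes * effective data rate per pin (Gbps).
# A 256-bit bus is 32 bytes per transfer.

bus_bytes = 256 // 8

print(bus_bytes * 6.0)   # 192.0 GB/s at 6.0 Gbps per pin
print(bus_bytes * 5.5)   # 176.0 GB/s at 5.5 Gbps per pin
```

So the reported 192 -> 176 GB/s change corresponds to the effective GDDR5 data rate dropping from 6.0 to 5.5 Gbps per pin, with the bus width unchanged.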
 
Latency of the ESRAM is ridiculously low compared to off-chip DDR or GDDR. It isn't just bandwidth. It is how quickly you can get the data after the CPU/GPU requests it. The compute resource, if it is an in-order design (not sure if Jaguar is in order or out of order on the GPU side), will end up doing nothing while it waits for the data to come back. The lower the latency, the better.

I don't see the connection to what I said. You mean that if that were the case (what Jeff said), latency would be worse, or that it would stay the same? We know latency is a strong point of SRAM.
 
There's been an unfortunate dynamic: overall the GiantBomb guys have been negative toward Microsoft, but they've also been making a point to include industry reference points, and justifications where they think the situation deserves them. People in the particularly heated XB1 GAF threads will then toss that justification, out of context, into a room full of people looking for information or attitudes to be frustrated by regarding the XB1, and suddenly people here are reacting to that, and sometimes send messages to the GiantBomb people. Which get reacted to.

It's a vitriolic environment. Gerstmann's "Still A Threat" feed might give a good sense of it. I'm finding myself in the odd position of agreeing with a lot of the negativity, but finding the way people are acting about it (or for what reasons they're arriving at some conclusions) to be out of line. Like I tried to call out earlier, there's a LOT of personal attacks toward people who disagree, or who go out of their way to try to explain a situation. With a few exceptions, people have been polite to me though, and I'm in a "reactionary" role as much as anything else, so it's not the Wild West yet.

Thanks for the link.
 
Latency of the ESRAM is ridiculously low compared to off-chip DDR or GDDR. It isn't just bandwidth. It is how quickly you can get the data after the CPU/GPU requests it. The compute resource, if it is an in-order design (not sure if Jaguar is in order or out of order on the GPU side), will end up doing nothing while it waits for the data to come back. The lower the latency, the better.
Jaguar is OoOE, and latency is not as important in 3D rendering as it is in CPU operations, because there is an ENORMOUS amount of computation going on in the GPU.
 
It's too limited in size to do anything impressive. This isn't secret sauce. This is subpar to the PS4's GDDR5 solution. It's just not as significantly inferior as it would be without the eSRAM.

I guess you don't think that caches can do anything impressive, either, since they are so small? Might as well not include them? And I didn't say it wasn't an inferior solution to utilizing GDDR5.
 
MS having eSRAM yield problems on Xbox One, RUMOR: downclocking GPU

You normally downclock because of heat issues. Why would you lower the clock for a smaller than predicted yield (less than 32 meg of good eSRAM)? Edit: prag16's post is a possibility, but I wouldn't want to buy an Xbox One with that chip, because hot spots tend to erode and expand. Running at a slower clock can just increase the time to failure.

It's not heat. The shmoo chart may have indicated that a good number of eSRAM chips fail at a higher clock, which happens to a lot of chips out there.
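For anyone unfamiliar with the term, a shmoo chart is a pass/fail grid swept over operating conditions such as voltage and frequency. A toy sketch, where the pass model (extra voltage buys clock headroom) is entirely made up for illustration:

```python
# Minimal shmoo-style pass/fail grid. The model is hypothetical:
# each extra 0.01 V of supply buys ~10 MHz of clock headroom.

def passes(freq_mhz, voltage, max_mhz_at_nominal, nominal_v=1.10):
    headroom = (voltage - nominal_v) * 1000
    return freq_mhz <= max_mhz_at_nominal + headroom

def shmoo(max_mhz_at_nominal):
    freqs = [700, 750, 800, 850]
    for v in [1.05, 1.10, 1.15]:
        row = "".join("P" if passes(f, v, max_mhz_at_nominal) else "."
                      for f in freqs)
        print(f"{v:.2f} V  {row}")

shmoo(800)   # a die that meets the 800 MHz target at nominal voltage
print()
shmoo(760)   # a marginal die that fails 800 MHz at nominal voltage
```

The marginal die's grid shows exactly the situation the post describes: a band of parts that fail at the target clock but pass a notch below it, which pushes the vendor toward either higher voltage (more heat) or a downclock.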
 
Before I fell back into PC gaming around 2 years into the 360/PS3 gen, I mained a 360 simply because lol PS3 no games. I wasn't a fanboy for the console, it was just the better choice at the time. Didn't own a PS2 until the launch of the current gen, owned a GameCube instead. I'm all over the map with no "loyalty" (I had Nintendo loyalty, but they utterly destroyed that).

Now, I'd like the following to happen:

1. The WiiU to live a short life and get discontinued next year
2. The Xbone to never make it out of the gates

If the PS4 isn't totally anti-consumer like the Xbone and to a much lesser extent (though still present) the WiiU, I hope it rains gold upon Sony. No, not piss.

Or change [bolded] :)
 
This isn't how this works. They test all chips before they leave the factory. They are specced to work at a particular temperature, and they are tested at this temperature. If they don't work at this temperature (which should be higher than they will see in real life), they don't get used. There's no overheating.

The trouble with the 360 was that they screwed up the cooling of the system because the heatsink for one of the chips was way too small and UNDER the DVD drive where it got no airflow.

From looking at the XBox One, there is no such issue.

What you and I are saying isn't mutually exclusive. If the yield is bad (meaning an undesirable percentage of the chips can't handle the expected temps) that means the "bad" chips are overheating, in terms of what THOSE CHIPS can handle. You're splitting hairs.
 
What you and I are saying isn't mutually exclusive. If the yield is bad (meaning an undesirable percentage of the chips can't handle the expected temps) that means the "bad" chips are overheating, in terms of what THOSE CHIPS can handle. You're splitting hairs.

No, that is not true. What you are saying is false. There is no overheating. The chips simply are not fast enough, but heat is not the issue. Hell, SRAM usually runs pretty cool since such a small part of it is active at any one time.
 
This is what Brad Shoemaker from Giant Bomb had to say about a majority of GAF after this topic came up.

"GAF has such a hard-on to see the Xbox One fail, I'd reserve judgement. Place is full of frothing Sony fans."

Link here

Dumb thing to say; it only betrays his own bias, which, to be fair, was pretty evident even before his brain fart.
 
I'm not surprised he said that. It's clear he has no understanding of GAF.

GAF tends to skew Sony, in part because the majority of GAF's userbase, sadly, are console fans, many of whom have been console gaming for a very long time. As such, they're more likely to be invested in Sony or Nintendo. It's always seemed fairly obvious to me that in a normal GAF discussion, the response to criticizing a Microsoft or Nintendo game is more likely to elicit agreement, whilst a negative comment towards a Sony game will readily garner a fairly consistent wall of general "fuck you," "you're insane" and other, similar remarks.

Buttocks claimed that Dead Space 3 was a straight-up co-op shooter. Except for the first hour or so at the beginning, DS3 was a generic third-person shooter. Anyone who played DS3 could tell you that this was the case. There was no horror outside of maybe a meta commentary on how far the series had changed from its initial goals. Also, publish dates get moved around constantly due to other games, or developmental or legal issues. Ask Wii owners waiting for Rayman: Legends, or anyone waiting for a release of a less popular title during the release timeframe of massive franchises such as WoW, GTA, or CoD.

--------------

Remember Me was a Sony funded exclusive when leaked by Buttocks. Later down the line Sony wasn't happy with the quality and cancelled the title. It was then picked up by Capcom and switched to a PC/360/PS3 release. You can ask the developers if you want, they'll tell you the same thing.


Crazy hasn't been wrong about any announcements at the time they were made.

Dead Space 3 was just as much a horror game as the previous games. The first hour is when it was at its least horrific ('cause it was just "hey, run away from men who are shooting you"). After that, it put players in space, with necromorphs, and got more alien from there.

Feel free to argue that it wasn't very terror-focused, or that it was bad. I'll back you up on those counts. But it was just as much a horror game as the original Dead Space.

It's possible that Cboat saw early takes of the game, particularly that first hour, and came to that assumption.

My point, I think, still stands: sometimes, Cboat ends up being wrong. Cboat might be right on at the time, but sometimes, that information isn't correct.
 
GAF tends to skew Sony, in part because the majority of GAF's userbase, sadly, are console fans, many of whom have been console gaming for a very long time. As such, they're more likely to be invested in Sony or Nintendo. It's always seemed fairly obvious to me that in a normal GAF discussion, the response to criticizing a Microsoft or Nintendo game is more likely to elicit agreement, whilst a negative comment towards a Sony game will readily garner a fairly consistent wall of general "fuck you," "you're insane" and other, similar remarks.

Like many said, GAF is rather balanced and stomped on Sony whenever it had the chance (the PS3's beginning, the PSN hacks, shitty marketing, multiplats, and many other things).

If Sony were in MS's place now, people would stomp on Sony and praise MS for the indie support, GDDR5 RAM, and game focus. And Sony fanboys would cry that GAF is MS-biased.
Personally I don't want either of them to dominate a gen. Competition is healthy for the consumer.
 
GAF tends to skew Sony
GAF skews whatever you're most sensitive about. I don't know where you're getting your data from other than your own idiosyncrasies regarding what you notice most, but for example for multi-platform MP console games on GAF, the larger communities are almost always 360, by far. Those Halo and other 360-specific topics also seem to do alright.
 
Dumb thing to say; it only betrays his own bias, which, to be fair, was pretty evident even before his brain fart.

GAF really hates the Xbox right now, and I'm sure some people are getting a little too much glee out of it; Sony's lack of clarity regarding its own policies is also VERY troubling, and we aren't talking enough about that.

That being said, the Xbox is really upsetting gamers, me included. I get the Steam argument, but I'm not buying. These policies hurt the consumer, and that's me.

Plus, GAF eviscerated Sony when the $599 thing was announced, and the PS3 is generally acknowledged as sub-optimal on many fronts on this forum.
 