TRUTHFACT: MS having eSRAM yield problems on Xbox One

I dunno if this is true, but I just wondered if any of the insiders could find out whether a rumour I heard is accurate. Simply put, the rumour is that MS has underestimated how much RAM is needed for the multiple OSes on the Xbox One, and instead of reserving 3GB they may now have to reserve 4GB.

Just to reiterate, I have no idea if this is true; I just wondered if anyone could shed any light on whether it's true or bogus.
 
Sounds bogus to me. I don't have any insider anything, but at this point I have to assume they have their minimum RAM usage locked down.

Speaking of which, I really wonder how well that Twitch integration is going to work with Xbone. It's handling it via software instead of the custom streaming hardware that the PS4 has, so I wonder just how much in resources something like that is gonna take up.
 
Really? There's no dedicated hardware in the Xbone for this?
 
Is it possible that this 3GB reserved RAM is partly due to video streaming, done without a separate streaming card and thus using actual main RAM itself?

I don't think MS knew Sony would include a separate video streaming card until the actual announcements, at which point they had to have something similar, but by then they probably already had the hardware in production and couldn't add a card.

No, the amount of RAM needed for encoding a stream is not very large, about 100-200MB.

The PS4 does not have a streaming card; it has an encoder chip that most likely writes its output to main RAM, same as the Xbone.

Actually, with dedicated silicon like on the PS4, the memory requirement is much less. Much, MUCH less.

AbeMC, I've described how this works in several posts before, so forgive me if I cut-and-paste an explanation from another thread:

The PS4 is confirmed to have a separate, dedicated chip for AV compression/decompression. For Sony, this is likely to be an off-the-shelf chip (or based on an off-the-shelf chip) from the Digital Imaging division of their own company, which makes world-class video cameras, etc. So they probably have a lot of designs to choose from to find the best fit for the PS4.

These dedicated chips are purpose-built, highly efficient streaming processors that have their own fast local store on the video chip itself (SRAM) or next door (DRAM) for reference frame storage, etc. To give you a frame of reference for how efficiently and quickly these processors chew through the data, a typical chip of this type for 1080p/60 H.264 processing can get away with around 256KB of this kind of local store. (In fact, if you read any number of white papers on processors of this type, you'll see this is actually a larger footprint than many of them use. Remember, these chips don't need to work on a whole frame at a time. They operate super fast, so the task is broken up into the tiniest pieces.)

From there, it's a simple write to the HDD to store the file for later.

Think about it this way: in an HD camera like the one in your nicer smartphones, a GoPro, or Sony's AS15 "action camera," do you think they have a whole bunch of RAM on board? Of course not.

Knowing how this works, and knowing that they have this dedicated chip aboard, it's reasonable to assume that any OS activity reserved for this function would be extremely negligible to the OS/memory footprint.

Hope that helps. If the Xbone has dedicated hardware for this task, it would be foolish to use the system RAM to do it. It would slow the whole process down.
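
To put some rough numbers behind that, here is an illustrative back-of-envelope sketch. None of this is confirmed about either console's encoder; the reference-frame and lookahead settings in particular are hypothetical, x264-style values, just to show where figures like "100-200MB for software encoding" versus "~256KB of local store for a hardware block" come from:

```python
# Rough, illustrative arithmetic only -- not confirmed figures for either console.

WIDTH, HEIGHT = 1920, 1080        # 1080p
BYTES_PER_PIXEL_420 = 1.5         # 4:2:0 chroma subsampling (e.g. NV12)

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL_420
print(f"One raw 1080p 4:2:0 frame: {frame_bytes / 2**20:.1f} MiB")    # ~3.0 MiB

# A dedicated encoder streams through the frame a few macroblock rows at a
# time, so its on-chip working set is tiny compared to a whole frame:
MB_SIZE = 16                      # an H.264 macroblock is 16x16 pixels
mb_row_bytes = WIDTH * MB_SIZE * BYTES_PER_PIXEL_420
print(f"One macroblock row: {mb_row_bytes / 2**10:.0f} KiB")          # ~45 KiB

# A software encoder keeps several full reference frames plus lookahead
# buffers in main RAM; hypothetical x264-style settings for illustration:
ref_frames, lookahead_frames = 4, 40
sw_bytes = (ref_frames + lookahead_frames) * frame_bytes
print(f"Software-encoder working set (illustrative): {sw_bytes / 2**20:.0f} MiB")
```

The point being: a hardware block that chews through a macroblock row at a time needs kilobytes of local store, while a software encoder holding whole frames in main RAM needs a couple of orders of magnitude more.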
 
I thought that was for the TV stuff? Again, I'm not knowledgeable about tech stuff like this. :p

Video is video. It's going to take the video of you playing (the game feed), encode it using the GPU and send it to Twitch. If you're watching a stream, it will use the decoding hardware.

This same hardware is also used for their TV stuff like you mention, and it's also used for next-gen SmartGlass with their remote rendering (render something on the console and send it to a tablet/phone, like the Wii U).
 
Remember how the 360 was 300-some GFLOPS and the PS3 was 500-some GFLOPS? It amounted to very little.

It's not going to be 30fps on X1 while 60fps on PS4.

It's going to be slightly better AA solutions.

Slightly better shadows.

Slightly better particle effects, etc.

What is truly amazing is that both consoles, while being quite pathetic compared to gaming PCs, are pumping out so many 1080p/60 games. Clearly the raw specs mean little.

You cling onto that bit of hope.
 
Why don't Sony push the DDR5 in their press releases or conferences?

Most of the tech sites and podcasts have no clue about the difference between DDR3 and DDR5.

You answered your own question. At the end of the day, nobody gives a shit. Sony has tried marketing bleeding edge tech specs for years, and it's simply a waste of marketing efforts.

The proof is going to be with third party game comparisons, which we are months away from seeing, and STILL nobody is going to care except the few people who get off using it as fanboy fodder.
 
Actually, with dedicated silicon like on the PS4, the memory requirement is much less. Much, MUCH less.

AbeMC, I've described how this works in several posts before, so forgive me if I cut-and-paste an explanation from another thread:

[...]

Hope that helps. If the Xbone has dedicated hardware for this task, it would be foolish to use the system RAM to do it. It would slow the whole process down.

I was going on how much RAM would be needed for software encoding as an upper bound.

You answered your own question. At the end of the day, nobody gives a shit. Sony has tried marketing bleeding edge tech specs for years, and it's simply a waste of marketing efforts.

The proof is going to be with third party game comparisons, which we are months away from seeing, and STILL nobody is going to care except the few people who get off using it as fanboy fodder.

Kind of ironic that Telepathy also does not understand the difference between DDR5 (which is not even in development yet, AFAIK) and GDDR5.
 
Any insiders hear anything more one way or another, or is Microsoft zipping the leaky lips in a panic? It seems very dangerous for MS to play a waiting game on yields while the PlayStation is beating the new Xbox pretty handily, even in America and the UK. If they don't downclock, any word on the scale of the yield problems?
 
You answered your own question. At the end of the day, nobody gives a shit. Sony has tried marketing bleeding edge tech specs for years, and it's simply a waste of marketing efforts.

The proof is going to be with third party game comparisons, which we are months away from seeing, and STILL nobody is going to care except the few people who get off using it as fanboy fodder.

Even in the first few months we won't see much between them; it's going to be the 3rd and 4th generation of software where we'll see the PS4's raw power in third-party games, I would guess.
 
Even in the first few months we won't see much between them; it's going to be the 3rd and 4th generation of software where we'll see the PS4's raw power in third-party games, I would guess.

We will see a difference DAY 1. IQ on the Xbone will be a mess compared to PS4 games. It's either that, far fewer effects on the Xbone, or a better framerate on PS4.
 
Sounds bogus to me. I don't have any insider anything, but at this point I have to assume they have their minimum RAM usage locked down.

Speaking of which, I really wonder how well that Twitch integration is going to work with Xbone. It's handling it via software instead of the custom streaming hardware that the PS4 has, so I wonder just how much in resources something like that is gonna take up.

That's why I wanted an insider to look at it as I was unsure about that rumor too.
 
You answered your own question. At the end of the day, nobody gives a shit. Sony has tried marketing bleeding edge tech specs for years, and it's simply a waste of marketing efforts.

The proof is going to be with third party game comparisons, which we are months away from seeing, and STILL nobody is going to care except the few people who get off using it as fanboy fodder.

Xbox fans clearly cared about it before, when they used to brag about how much better multiplats performed on the 360 than on the PS3.
 
Any insiders hear anything more one way or another, or is Microsoft zipping the leaky lips in a panic? It seems very dangerous for MS to play a waiting game on yields while the PlayStation is beating the new Xbox pretty handily, even in America and the UK. If they don't downclock, any word on the scale of the yield problems?


Nahh.. Based on the small number of pre-sale units they are giving retailers, it sounds like the yield issues were solved by making fewer machines, NOT by downclocking as people were speculating. That might be another reason they didn't mind coming in at $499: they know they will still sell out for a while at that price, since it will take longer to ramp production up with lower yields.
 
Even in the first few months we won't see much between them; it's going to be the 3rd and 4th generation of software where we'll see the PS4's raw power in third-party games, I would guess.

The initial difference may not be noticeable to the average user, but there will be a definite difference to those who fixate on IQ and frame rate. Quite simply, the PS4 can just do more in 16 or 33 ms. So you would either need to intentionally throttle the PS4 or be completely incompetent to achieve parity.
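
Just to put numbers on the "16 or 33 ms" bit: those are simply the per-frame time budgets at 60fps and 30fps, so a GPU with more throughput gets to do proportionally more work inside the same window. A trivial sketch:

```python
# Per-frame time budgets at the two common console framerate targets.
for fps in (60, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 30 fps -> 33.3 ms per frame
```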
 
Nahh.. Based on the small number of pre-sale units they are giving retailers, it sounds like the yield issues were solved by making fewer machines, NOT by downclocking as people were speculating. That might be another reason they didn't mind coming in at $499: they know they will still sell out for a while at that price, since it will take longer to ramp production up with lower yields.

Isn't it more likely that the yield issue is what's forcing them to have the high price and the low stock levels?
 
Even in the first few months we won't see much between them; it's going to be the 3rd and 4th generation of software where we'll see the PS4's raw power in third-party games, I would guess.

I highly doubt it will take that long to see the difference. This is completely different than the PS3/360 gen where the PS3's power had to be coaxed out by experienced developers who had to spend thousands of man-hours figuring out how to use the SPU's effectively.

The PS4 simply has more raw GPU power and memory bandwidth, by quite a large margin. Most of the third parties are also experienced PC developers, and their x86 engines are designed around scaling various visual attributes for differently-specced machines.

There will be a difference right away in most games, and it will be apparent when Digital Foundry posts side-by-side videos. Most people probably won't care, but many of us will.
 
On the bright side, surely the fact that Intel will be pushing eDRAM on their 'R' variants of Haswell might get devs optimising their engines for these kinds of intermediary RAM scratchpads..

Still, you can't argue with the KISS approach..
 
The initial difference may not be noticeable to the average user, but there will be a definite difference to those who fixate on IQ and frame rate. Quite simply, the PS4 can just do more in 16 or 33 ms. So you would either need to intentionally throttle the PS4 or be completely incompetent to achieve parity.
Those who fixate on IQ and framerate don't play multiplatform games on a console ;)
 
1000+? You don't have to spend that kind of money to get a PC game to look and run well.
If you want graphics to look on par with the next-gen consoles, I think $1000 is a fair estimate. I paid $2000 for mine, and I expect games like Uncharted 4 to be almost on par with what I'm getting now from this machine.
 
Xbox fans clearly cared about it before, when they used to brag about how much better multiplats performed on the 360 than on the PS3.

It wasn't bragging, not in all cases anyway. I was one of "those" people.

The majority of multiplats gave a superior experience on the 360. This is not even a point of debate, and it's really why I chose to play the vast majority on my 360 instead of the PS3. I poked my head into threads from time to time to warn people about how shitty the PS3 version was if I'd had a chance to play both myself.

This gen, the hardware between the two systems is the same basic architecture, so I don't foresee vast differences this time. I am starting with the PS4 this gen, so there had better not be, at least on "my" side! ;)
 
Gemüsepizza said:
He is talking about a PC capable of much better image quality than the next-gen consoles. You won't get that for a fraction of $1000...

oh, right.

ah well. There's a reason why I feel that buying a PS4 is a better use of my money than upgrading my PC. For now, anyway.
 
oh, right.

ah well. There's a reason why I feel that buying a PS4 is a better use of my money than upgrading my PC. For now, anyway.

Same. I have an older i5 system with CrossFired ATI 6870s, and even if I just replaced the video cards with something better, that is most of the price of a PS4 right there.

In a few years when the divide is too great to ignore, I'll transition back to PC gaming, much as I did this gen.
 
With all the talk about a possible downgrade of the clock speed on the XB1, why couldn't Sony have upgraded their GPU clock speed to 1000MHz and gotten an extra 20% performance for no additional cost?
 
With all the talk about a possible downgrade of the clock speed on the XB1, why couldn't Sony have upgraded their GPU clock speed to 1000MHz and gotten an extra 20% performance for no additional cost?

Why would they do that? Yields would be lower, and they would need a more expensive cooling solution or have to settle for higher noise. And for what? They're already beating MS in performance by a wide margin.
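
For a rough sense of the trade-off, here is an illustrative calculation using the widely reported 18 CU / 800 MHz PS4 GPU figures; the scaling assumptions are back-of-envelope, not anything Sony has stated:

```python
# Illustrative only: how a GPU clock bump scales theoretical throughput,
# using the widely reported 18 CU / 800 MHz PS4 GPU figures.
CUS, LANES_PER_CU, FLOPS_PER_LANE = 18, 64, 2   # an FMA counts as 2 FLOPs

def tflops(clock_mhz):
    return CUS * LANES_PER_CU * FLOPS_PER_LANE * clock_mhz * 1e6 / 1e12

base, bumped = tflops(800), tflops(1000)
print(f"800 MHz:  {base:.2f} TFLOPS")                                        # ~1.84
print(f"1000 MHz: {bumped:.2f} TFLOPS (+{(bumped / base - 1) * 100:.0f}%)")  # +25%

# Dynamic power scales at least linearly with frequency (worse if voltage has
# to rise too), so the extra throughput also means more heat to dissipate and
# tighter binning, i.e. lower yields -- hence the answer above.
```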
 
It wasn't bragging, not in all cases anyway. I was one of "those" people.

The majority of multiplats gave a superior experience on the 360. This is not even a point of debate, and it's really why I chose to play the vast majority on my 360 instead of the PS3. I poked my head into threads from time to time to warn people about how shitty the PS3 version was if I'd had a chance to play both myself.

This gen, the hardware between the two systems is the same basic architecture, so I don't foresee vast differences this time. I am starting with the PS4 this gen, so there had better not be, at least on "my" side! ;)

I don't understand this.

The X360 versions were nearly always better because it was easier to make that version perform better than the PS3 one. The PS3 was capable enough; it just needed a lot of work that third parties understandably didn't bother with.

The PS4 is, by all accounts, 50% more powerful on the GPU side than the Xbone. How is that not going to show in some way in multiplat games? Since they share a similar architecture, as you mention, developers could fairly easily exploit that extra power.
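
For what it's worth, the "50%" figure tracks the publicly reported compute-unit counts, assuming both GPUs run at the ~800 MHz clock people were quoting at the time. A quick illustrative check:

```python
# Back-of-envelope check on the "50% more powerful" GPU claim, using reported
# CU counts and an assumed ~800 MHz clock for both (illustrative only).
def tflops(cus, clk_mhz, lanes_per_cu=64, flops_per_lane=2):
    return cus * lanes_per_cu * flops_per_lane * clk_mhz * 1e6 / 1e12

ps4, xb1 = tflops(18, 800), tflops(12, 800)
print(f"PS4: {ps4:.2f} TFLOPS, Xbone: {xb1:.2f} TFLOPS")   # 1.84 vs 1.23
print(f"PS4 advantage: {(ps4 / xb1 - 1) * 100:.0f}%")      # 50%
```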
 
It wasn't bragging, not in all cases anyway. I was one of "those" people.

The majority of multiplats gave a superior experience on the 360. This is not even a point of debate, and it's really why I chose to play the vast majority on my 360 instead of the PS3. I poked my head into threads from time to time to warn people about how shitty the PS3 version was if I'd had a chance to play both myself.

This gen, the hardware between the two systems is the same basic architecture, so I don't foresee vast differences this time. I am starting with the PS4 this gen, so there had better not be, at least on "my" side! ;)

Since the architecture will be the same, being 50% faster will mean a lot.
 
It wasn't bragging, not in all cases anyway. I was one of "those" people.

The majority of multiplats gave a superior experience on the 360. This is not even a point of debate, and it's really why I chose to play the vast majority on my 360 instead of the PS3. I poked my head into threads from time to time to warn people about how shitty the PS3 version was if I'd had a chance to play both myself.

This gen, the hardware between the two systems is the same basic architecture, so I don't foresee vast differences this time. I am starting with the PS4 this gen, so there had better not be, at least on "my" side! ;)
I think you are really looking at this the wrong way. The fact that architectural differences are smaller should only make performance advantages of one platform easier to leverage.
 
The PS4 is, by all accounts, 50% more powerful on the GPU side than the Xbone. How is that not going to show in some way in multiplat games? Since they share a similar architecture, as you mention, developers could fairly easily exploit that extra power.

My guess is, since the architectures are similar, the code bases will be largely the same. The difference will be like running a PC game at medium settings vs. high.
 
1000+? You don't have to spend that kind of money to get a PC game to look and run well.

If you have to include a monitor + case etc, $1k isn't overspending.

oh, right.

ah well. There's a reason why I feel that buying a PS4 is a better use of my money than upgrading my PC. For now, anyway.

That's where I'm at. The next PC I build will either be an 8-core Intel (if they ever come) or a tiny, silent APU-based one. I'm leaning towards option #2 because, while I like gfx, having a powerful computer does nothing to change game design that is built around console power and/or lowest-common-denominator PC specs. I.e., powerhouse computers = great-looking games, not better games.
 
So what would it have cost Microsoft extra to upgrade to the same GPU as Sony? I would have to assume that, with leaks of both consoles out in public, MS must have known what GPU Sony was going with.
I can't believe MS would have gone in so low with their GPU, as they have always been good about the power of their consoles.
 