eSRAM question

bGanci

I don't know a lot about this stuff so bear with me. I know there is a lot of controversy about MS going with eSRAM. My question is does eSRAM work better for the future "Cloud gaming"? Is that why they went with eSRAM? It sounds like Microsoft may be showing off the potential of the cloud at E3. I guess we will see. Thanks for reading.
 
No, it has nothing to do with the cloud.

The ESRAM is a small, fast memory. It's good for quickly switching between small pieces of work, but not that great for running one dedicated high-resource-usage program.
This is a very simplified analogy that isn't 100% correct, but it's serviceable.
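To make the idea a bit more concrete, here's a very rough sketch of the general "small fast pool next to a big slow pool" pattern. It's just illustrative Python pseudocode; the names, sizes, and workflow are made up and are not how the actual Xbox One APIs look.

    # Rough sketch of the scratchpad idea: copy hot data into the small fast pool,
    # work on it there, write it back to the big slow pool.
    # All names and sizes are illustrative, not real Xbox One figures.
    SCRATCHPAD_BYTES = 32 * 1024 * 1024   # the small fast pool (eSRAM-sized, for context)

    def process_frame(big_slow_buffer, tile_size):
        # Work one tile at a time so the hot data always fits in the fast pool.
        for start in range(0, len(big_slow_buffer), tile_size):
            tile = big_slow_buffer[start:start + tile_size]    # slow -> fast
            tile = [x * 2 for x in tile]                        # bandwidth-heavy work "in" fast memory
            big_slow_buffer[start:start + tile_size] = tile     # fast -> slow

    frame = list(range(1024))
    process_frame(frame, tile_size=256)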
 
"Cloud" is just a generic term they use to befuddle less tech-savvy people to hide the power difference between PS4 and XB1.

It won't amount to anything substantial and can't make up for any power differences.
 
Was there a controversy over the eSRAM?

I was under the impression that they needed the eSRAM as a band-aid fix because their main RAM is so slow.

Also, the "cloud" can be used with anything, including your phone; it's not platform exclusive.
 
They went with eSRAM because they couldn't get the superior eDRAM manufactured. eSRAM was the shitty fallback option. It is a giant clusterfuck of a design. There is no silver lining to it, they done fucked up.
 
Nothing to do with the cloud. What makes cloud a possibility for xbone is availability.
Nope

They went with it because their main ram is too slow, so it's a crutch/mitigation for that

No, it's not.

Esram was chosen before any other memory choice. It was a deliberate choice to be an evolution of the 360 design. Every other memory decision, including the capacity and type of the main RAM, came after esram was chosen.
 
No, they went for it to save money. They chose cheap DDR3, which is too slow in combination with a GPU. ESRAM is a way to mitigate the flaw. An inefficient way.

Ironically, the choice of DDR3 prevents future decreases in manufacturing costs, because the price of DDR3 is already low and can't go much lower.
 
Nothing to do with the cloud. What makes cloud a possibility for xbone is availability.


No, it's not.

Esram was chosen before any other memory choice. It was a deliberate choice to be an evolution of the 360 design. Every other memory decision, including the capacity and type of the main RAM, came after esram was chosen.

Pretty sure they decided that they needed 8 gigs of RAM, saw no other way to achieve that than DDR3, and as a result decided on ESRAM to increase bandwidth/speed.
 
eSRAM is a band-aid to the slow DDR3 RAM.

Cloud means scalable and cheaper dedicated servers.

Nothing related.

Pretty sure they decided that they needed 8 gigs of RAM, saw no other way to achieve that than DDR3, and as a result decided on ESRAM to increase bandwidth/speed.
That's exactly what happened... Sony chose 4GB at the start, but near the PS4 reveal the industry finally delivered 512MB (4Gbit) GDDR5 modules... so they could increase to 8GB because they got lucky.
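For anyone curious, the arithmetic behind that is simple, assuming the commonly reported 256-bit bus made of sixteen 32-bit GDDR5 chips (the densities here are the widely quoted ones, not official figures):

    # Capacity back-of-the-envelope for a 256-bit GDDR5 bus with 16 x 32-bit chips.
    chips = 16
    def total_gb(megabytes_per_chip):
        return chips * megabytes_per_chip / 1024

    print(total_gb(256))   # 2Gbit (256MB) chips -> 4.0 GB (what was available early on)
    print(total_gb(512))   # 4Gbit (512MB) chips -> 8.0 GB (what the PS4 shipped with)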
 
Nothing to do with the cloud. What makes cloud a possibility for xbone is availability.


No, it's not.

Esram was chosen before any other memory choice. It was a deliberate choice to be an evolution of the 360 design. Every other memory decision, including the capacity and type of the main RAM, came after esram was chosen.

Gonna need some sources for this cos it sounds like revisionist history. I can see 8GB for 3 OS's as the 'first' memory choice. The simplest memory choice is always unified. If they went for ESRAM first...truly they have regressed since the Allard days.
 
No. eSRAM is used to make up for slower main memory.

Now as to why MS went with slower main memory: MS from the start wanted 8GB of RAM in order to accommodate their vision of having a box multitasking so many different apps/services. When they were planning their console, it looked like DDR3 was the only way to guarantee 8GB at a reasonable price.

Sony went with GDDR5 as the main system memory since they knew it would perform better for gaming and they didn't think they needed 8GB. Later in the dev cycle they were able to get 8GB of GDDR5 at an affordable cost so they upped their spec.

Such is the story of GDDR5, DDR3, and eSRAM as it relates to this gen. (If I'm wrong or have grossly oversimplified things, I am sorry.)
 
Pretty sure they decided that they needed 8 gigs of RAM, saw no other way to achieve that than DDR3, and as a result decided on ESRAM to increase bandwidth/speed.

Again that is completely wrong. Esram was chosen before they decided to go with ddr3, and it was chosen before they decided to go with 8GB.

You are mixing up cause and consequence. DDR3 was a viable option *because* of the esram. It's not like they were stuck with DDR3 and came up with the esram hack to try to improve performance.
 
Gonna need some sources for this cos it sounds like revisionist history. I can see 8GB for 3 OS's as the 'first' memory choice. The simplest memory choice is always unified. If they went for ESRAM first...truly they have regressed since the Allard days.

Same. Why on earth would they choose esram and use up valuable APU real estate without even knowing how much or what type of main ram they would use? Makes no sense, especially as the esram isn't that fast.

For example, if MS had gone with GDDR5, the esram could have been slower than main ram, reducing GPU power for no reason.

They must have known they were going with DDR3, informing the choice of esram. The other way round doesn't make sense.
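For reference, the peak-bandwidth math people usually quote (treat these as ballpark figures, not gospel): DDR3-2133 on a 256-bit bus works out to roughly 68 GB/s, the PS4's GDDR5 to 176 GB/s, and the eSRAM is usually quoted at about 109 GB/s each way (~204 GB/s combined read/write). A quick sketch of the arithmetic:

    # Peak bandwidth = bus width in bytes x transfer rate. Commonly quoted figures only.
    def peak_gb_per_s(bus_bits, transfers_per_second):
        return bus_bits / 8 * transfers_per_second / 1e9

    print(peak_gb_per_s(256, 2.133e9))   # XB1 DDR3-2133 on a 256-bit bus -> ~68 GB/s
    print(peak_gb_per_s(256, 5.5e9))     # PS4 GDDR5 at 5.5 GT/s          -> 176 GB/s
    # The XB1 eSRAM is usually quoted at ~109 GB/s each way (~204 GB/s peak combined),
    # i.e. lower one-way than GDDR5 main memory would have been.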
 
Again that is completely wrong. Esram was chosen before they decided to go with ddr3, and it was chosen before they decided to go with 8GB.

You are mixing up cause and consequence. DDR3 was a viable option *because* of the esram. It's not like they were stuck with DDR3 and came up with the esram hack to try to improve performance.
He is right.

You need some source to back up this other story.

When MS chose 8GB, GDDR5 could only support 4GB with the 256MB (2Gbit) modules available... so they chose DDR3 plus eSRAM to try to avoid the bottlenecks caused by the low bandwidth for games.
 
720p

I'm sorry. I didn't read the questions.

No, it's not.

Esram was chosen before any other memory choice. It was a deliberate choice to be an evolution of the 360 design. Every other memory decision, including the capacity and type of the main RAM, came after esram was chosen.
Interesting, got a link to where this was stated?
 
Again that is completely wrong. Esram was chosen before they decided to go with ddr3, and it was chosen before they decided to go with 8GB.

You are mixing up cause and consequence. DDR3 was a viable option *because* of the esram. It's not like they were stuck with DDR3 and came up with the esram hack to try to improve performance.

Sorry, but how do you know this? Of course my theory is just a theory too, but to me it makes more sense to first decide on the amount of RAM you want to work with and then start looking for what is best suited for the job (or cheapest).
 
Nothing to do with the cloud. What makes cloud a possibility for xbone is availability.


No, it's not.

Esram was chosen before any other memory choice. It was a deliberate choice to be an evolution of the 360 design. Every other memory decision, including the capacity and type of the main RAM, came after esram was chosen.

Source? I had always heard it the other way around (i.e., the 8GB DDR3 was chosen first, the ESRAM as mitigation).
 
"I don't know a lot about this stuff so bear with me." OK

"I know there is a lot of controversy about MS going with eSRAM. My question is does eSRAM work better for the future "Cloud gaming"?" No

"Is that why they went with eSRAM?" No

"It sounds like Microsoft may be showing off the potential of the cloud at E3." You got info?

"I guess we will see. Thanks for reading." You welcome.
 
Again that is completely wrong. Esram was chosen before they decided to go with ddr3, and it was chosen before they decided to go with 8GB.

You are mixing up cause and consequence. DDR3 was a viable option *because* of the esram. It's not like they were stuck with DDR3 and came up with the esram hack to try to improve performance.

Although I believe you, I can't understand the choice of ESRAM. It's like building a racecar and the first thing the designer says is, "Well, I think we'll start with the chassis of a VW Beetle and build the racecar around it, everyone cool with that?" (RacingMattrick: "Yeaaaah, nice, let's do this!")
 
Has this been posted?

Cloudgine is Microsoft’s secret Xbox One sauce

http://www.developer-tech.com/news/2014/may/20/cloudgine-microsofts-secret-xbox-one-sauce/

They talk a bit about the Esram.

Thread worthy?


That article is an example of everything wrong with the internet i.e. people/websites passing off complete and utter bullshit as fact without so much as a shred of evidence and other people believing said bullshit because "it must be true because why would someone pull stuff out their arse just for clicks?".
 
"I don't know a lot about this stuff so bear with me." OK

"I know there is a lot of controversy about MS going with eSRAM. My question is does eSRAM work better for the future "Cloud gaming"?" No

"Is that why they went with eSRAM?" No

"It sounds like Microsoft may be showing off the potential of the cloud at E3." You got info?

"I guess we will see. Thanks for reading." You welcome.

He got his info from a Tim Doge tweet.

https://twitter.com/PNF4LYFE/status/468833337528705024
 
Reads like a fanfic that MisterXMedia might have written.

Here's his other article from May 14, 2014: Opinion: Xbox One is losing its “Next Gen” vision

http://www.developer-tech.com/news/2014/may/14/opinion-xbox-one-losing-its-next-gen-vision/

Before I bought a “next-gen” console I wanted to be sold on how I can be given different experiences from the previous generation. The PS4 looked like a PS3 with enhanced graphics – nothing impressed me in the slightest. It remains a dumb PC in my eyes.

I’m both a PlayStation and an Xbox fan before I start a flame war – I’ve always bought every generation of each console.

Xbox One caught my attention right-off-the-bat for its talk of Cloud processing, the ability to digitally share games, and for cool Kinect functionality like auto sign-in via facial recognition.

He's an opinion writer that loves xbox.

If you follow the crowd and say “but games aren’t in 1080p” then first I’ll show you Forza 5, which is, then I'll show you Ryse and you can see for yourself how ridiculous you sound. Crytek’s title is widely-regarded as the most graphically advanced thus far, it's 900p, and it’s only available on Xbox One. Although if this is still really going to cause you an issue; remember by the time PS4 starts getting its real killer titles such as God of War and Uncharted in 2015 – DirectX 12 will be out which is expected to boost performance twice what it can achieve today through enhanced multi-core processing.
 
The eSRAM was supposed to be a solution to a problem (providing enough bandwidth for the GPU) but introduced a very large bottleneck, and that's the size of the pool of memory. 32MB is just not enough for a true next-gen console, so developers are making sacrifices to match the PS4 (resolution, AA... etc.).


If the XO were equipped with GDDR5 memory like the PS4 (even a slower variant), the difference between them would not have been this big.
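The 32MB problem is easy to see with some simple render-target math (assuming plain uncompressed 32-bit targets, which is a simplification):

    # Size of a simple uncompressed 32-bit render target at various resolutions.
    def rt_megabytes(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / (1024 * 1024)

    print(rt_megabytes(1920, 1080))   # ~7.9 MB per 1080p target
    print(rt_megabytes(1600, 900))    # ~5.5 MB per 900p target
    print(rt_megabytes(1280, 720))    # ~3.5 MB per 720p target
    # A deferred G-buffer with three or four of these plus depth already pushes
    # past 32MB at 1080p, which is one reason sub-1080p targets keep showing up.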
 
"I don't know a lot about this stuff so bear with me." OK

"I know there is a lot of controversy about MS going with eSRAM. My question is does eSRAM work better for the future "Cloud gaming"?" No

"Is that why they went with eSRAM?" No

"It sounds like Microsoft may be showing off the potential of the cloud at E3." You got info?

"I guess we will see. Thanks for reading." You welcome.

http://www.developer-tech.com/news/2014/may/20/cloudgine-microsofts-secret-xbox-one-sauce/

I don't know how reliable this is, hence why I said maybe.
 
Prophet you say?

Prophet of truth from Halo :D

Sorry bro, I'm new here. Don't know him/her :)

Lol.... it's kinda hard to impart any sense of legitimacy when you're the source and... I did just sign up yesterday.

I can tell you that there is a deal between Microsoft and Cloudgine for the project (codenamed Nimbus) that was struck back last year.

There is a bunch of other stuff I can maybe say that would make me look like I'm telling the truth too, but it's pretty awkward just strolling into a thread and blowing my load, you know?

He leaked a bunch of info about MS back in Jan I think
 
Gonna need some sources for this cos it sounds like revisionist history. I can see 8GB for 3 OS's as the 'first' memory choice. The simplest memory choice is always unified. If they went for ESRAM first...truly they have regressed since the Allard days.

I don't like linking to other forums, but here:

http://forum.beyond3d.com/showpost.php?p=1838682&postcount=9369

http://forum.beyond3d.com/showpost.php?p=1838379&postcount=9324

If you look at that forum you'll find many other posts regarding the subject, and the answer is always the same: esram was the foundational point of the xbone design, and the other choices came after.
 
Again that is completely wrong. Esram was chosen before they decided to go with ddr3, and it was chosen before they decided to go with 8GB.

You are mixing up cause and consequence. DDR3 was a viable option *because* of the esram. It's not like they were stuck with DDR3 and came up with the esram hack to try to improve performance.

No, give sources cos what you say does not make any sense. Going ESRAM first means you totally gimp your GPU from the start by hogging up a huge chunk of the die. Why would you do this?

The reason they went for ESRAM is the 8GB and OS solution. Nothing else makes sense. They wanted 8GB, couldn't risk GDDR5 coming through, so had to come up with a workaround solution. The ESRAM was not first. If it was, then whoever architected this mess should be summarily fired for incompetence.

So bkiliian is the source. Fine, I'll stick with my statement, total incompetence from Microsoft.
 
That article is an example of everything wrong with the internet i.e. people/websites passing off complete and utter bullshit as fact without so much as a shred of evidence and other people believing said bullshit because "it must be true because why would someone pull stuff out their arse just for clicks?".
The mainstream press was doing this years before we had the Internet. I think the bigger issue is the people who believe this stuff and share it around. Folks need to think more critically and be more open-minded.
 
I don't like linking to other forums, but here:

http://forum.beyond3d.com/showpost.php?p=1838682&postcount=9369

http://forum.beyond3d.com/showpost.php?p=1838379&postcount=9324

If you look at that forum you'll find many other posts regarding the subject, and the answer is always the same: esram was the foundational point of the xbone design, and the other choices came after.

The first link says nothing except that maybe it was <8GB earlier in the development. It doesn't say anything about when the type of ram was chosen vs timing of esram - both were already there based on that quoted post.


The second link is the same - it says DDR3 was chosen before 8GB, which would have then meant esram to mitigate the bandwidth. That they then used esram, allowing them to increase the amount of DDR3, is a side issue.
 