Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

BocoDragon said:
CES? Is this 1993?
Lol.

My thoughts are the same as the majority:
Hex-core processor sounds good.
2GB (alone) is too low, and developers will have to do tricks to maintain a great 1080p image, just like they're doing tricks now with 512MB at 720p. 4GB would have removed such a bottleneck.
Dual GPUs in the devkits indicate a strong (singular) GPU for the final console.
 
gatti-man said:
Let's compare in-game Gears 3 to maxed in-game BF3 on PC. Yeah, I'd say that's a huge jump. Sub-HD to true HD with AA and AF will be a similar improvement to last gen's jump. Not only that, but you finally get HD textures swoooon

Since I have both games you mention, I'm in agreement.

I play Gears 3 on my Xbox 360 on a 50" Samsung 1080p plasma.

BF3 is on a 24" monitor in 1080p; my rig is 8GB RAM, an i5 2500K, and a GTX 560 Ti.

BF3 shits all over Gears 3 graphically on my setup, and I don't even have a high end video card.

Gears 3 in comparison to it, is quite ugly and looks almost outdated. Most noticeable are the textures. Gears 3 texturing looks last-gen compared to BF3 on my particular PC rig.
 
Specs don't mean shit; the lack of good artists is the biggest problem. This gen exposed so many developers that it's not even funny.
 
Things I learned from this thread:

1) A shocking number of people think console RAM is the same as PC RAM.

2) A good number of the above believe the costs associated w/ console RAM are the same as PC RAM.

3) Someone will pick apart my post for saying "console RAM" instead of the more technical and correct name.

4) A surprising number of people are mostly concerned with graphics, and things such as animations, AI, scale, yadda, yadda, yadda are not necessarily being discussed.

Look, my post is not intended to insult or "troll", as I've always considered the majority of gaffers to be more knowledgeable than the average gamer/hobbyist on such matters; however, I do find many of the posts itt shocking.
 
Nealand Liquor said:
Things I learned from this thread:

1) A shocking number of people think console RAM is the same as PC RAM.

2) A good number of the above believe the costs associated w/ console RAM are the same as PC RAM.

3) Someone will pick apart my post for saying "console RAM" instead of the more technical and correct name.

4) A surprising number of people are mostly concerned with graphics, and things such as animations, AI, scale, yadda, yadda, yadda are not necessarily being discussed.

Look, my post is not intended to insult or "troll", as I've always considered the majority of gaffers to be more knowledgeable than the average gamer/hobbyist on such matters; however, I do find many of the posts itt shocking.
That's the console gaming community for you!
 
Nealand Liquor said:
Things I learned from this thread:
[...]
However, I do find many of the posts itt shocking.
What I find more shocking is that this meme has taken root that all the ram used in consoles is some kind of magical creation of silicon handed down directly from the heavens, and incomparable in price to what is used in PCs.

It's not.

DDR3 is DDR3. There are different timings and frequencies, but I doubt they'd be higher than (or even as high as) the highest tiers on PC. Same thing for GDDR5: it's the same as the one used in GPUs. Maybe slightly different timings and frequencies, but there's no fundamental cost difference. The only type of memory where that is somewhat true is the XDR in the PS3, since there's no mainstream hardware that uses it.
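For a rough sense of scale, peak bandwidth is just the effective transfer rate times the bus width. A quick sketch, with generic 2011-era example parts rather than anything confirmed for this console:

```python
# Rough theoretical peak bandwidth: effective transfer rate (MT/s)
# times bus width in bytes. Both example parts below are generic,
# illustrative components, not leaked console specs.

def peak_bandwidth_gb_s(mega_transfers_per_s, bus_width_bits):
    """Peak bandwidth in GB/s from MT/s and bus width in bits."""
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

# DDR3-1600 on a standard 64-bit channel
print(peak_bandwidth_gb_s(1600, 64))    # 12.8 GB/s per channel

# GDDR5 at 4.0 Gbps per pin on a 256-bit bus (mid-range GPU)
print(peak_bandwidth_gb_s(4000, 256))   # 128.0 GB/s
```

The order-of-magnitude gap is clocks and bus width, not a different kind of silicon, which is the point above.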
 
Is the dual GPU realistic?

Wouldn't it be cheaper and better for a console to use a single, powerful GPU?

Also microstutter might be an issue.
 
Kerub said:
Is the dual GPU realistic?

Wouldn't it be cheaper and better for a console to use a single, powerful GPU?

Also microstutter might be an issue.
Might be a dual-core GPU.

That is something that has not been introduced in the gaming world.
 
Kerub said:
Is the dual GPU realistic?

Wouldn't it be cheaper and better for a console to use a single, powerful GPU?

Also microstutter might be an issue.
It still might be. Dual GPUs in a devkit might indicate one really strong GPU in the final console.
 
Any clue what the CPU will be? We know it is 6-core, but will it also be PowerPC-derived? I hope so. That would mean full backward compatibility.
 
PopcornMegaphone said:
You don't understand VRAM vs DDR. Check the thread title. Notice VRAM is not confirmed, which is the far more important RAM spec.
It isn't confirmed that there will be VRAM to begin with, either. Probably not.
 
So I think next year is the year. There's tons of rumours coming out of the woodwork now.

BTW, just because the devkit has these specs doesn't mean it's final. It's probably just what MS can put together from off-the-shelf parts for now.
 
Nealand Liquor said:
Things I learned from this thread:

1) A shocking number of people think console RAM is the same as PC RAM.

2) A good number of the above believe the costs associated w/ console RAM are the same as PC RAM.

3) Someone will pick apart my post for saying "console RAM" instead of the more technical and correct name.

4) A surprising number of people are mostly concerned with graphics, and things such as animations, AI, scale, yadda, yadda, yadda are not necessarily being discussed.

Look, my post is not intended to insult or "troll", as I've always considered the majority of gaffers to be more knowledgeable than the average gamer/hobbyist on such matters; however, I do find many of the posts itt shocking.
thisisneogaf.gif
 
Nealand Liquor said:
Things I learned from this thread:
1) A shocking number of people think console RAM is the same as PC RAM.
... GDDR3 has been used in PC video cards as well. You're not too well-informed.

Anyways, in regard to the RAM issue: it's important for Microsoft to have the same amount of RAM in their new console as the PS4 does. Having 512MB instead of 256MB on the 360 ensured they could compete with the PS3. Without it, I'm sure it would have been a massively different story.
 
Luckyman said:
Nintendo fans have very rose-tinted glasses on. The Wii U will get support as long as the 360 is supported, if it sells games.

With a huge disparity in the number of threads, memory, and GPU power, Wii U versions of true next-gen titles would be awful.

What are the specs for the Wii U then?
 
Kerub said:
Is the dual GPU realistic?

Wouldn't it be cheaper and better for a console to use a single, powerful GPU?

Also microstutter might be an issue.
A powerful single GPU, or weaker dual GPUs that offer better performance? I'll take the latter. Also, I wouldn't worry about micro stutter. It happens on PC because there are many different CF/SLI setups which developers can't test, so we have to wait for profiles and fixes from the video card manufacturers. On a console, the hardware is set in stone, and developers would have plenty of time to test their games before release. And the added GPU scaling should be almost 200%.
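To make "micro stutter" concrete: with alternate-frame rendering, the average frame rate can look fine while frames arrive unevenly, which is what players actually feel. A toy sketch with invented frame times:

```python
# Toy illustration of AFR microstutter: same average frame rate,
# very different pacing. Frame times (ms) are invented for the example.

smooth = [16.7] * 8          # single GPU: even frame delivery
afr    = [8.0, 25.4] * 4     # dual-GPU AFR: uneven alternation

for label, times in (("smooth", smooth), ("AFR", afr)):
    avg_fps = 1000 * len(times) / sum(times)
    worst_fps = 1000 / max(times)   # what the slow frames feel like
    print(f"{label}: avg {avg_fps:.0f} fps, worst-frame {worst_fps:.0f} fps")
# smooth: avg 60 fps, worst-frame 60 fps
# AFR:    avg 60 fps, worst-frame 39 fps
```

On fixed hardware, a developer can tune frame pacing directly, which is the argument above.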
 
Kerub said:
Is the dual GPU realistic?

Wouldn't it be cheaper and better for a console to use a single, powerful GPU?

Also microstutter might be an issue.

I think power consumption and heat would be a real issue as well.
 
For fuck's sake. One last time.

AMD graphics cards have never used DDR3 for RAM. The DDR3 is system RAM only; the GPUs will 100% have their own VRAM.

The retards thinking the GPUs will use DDR3 for VRAM. SMH.
 
Nealand Liquor said:
Things I learned from this thread:
1) A shocking number of people think console RAM is the same as PC RAM.

The 360 shared the 512MB of RAM between graphics and central processing. The thread title of this rumor suggests a separate pool of VRAM, so I would expect that they are talking about basic PC memory for central processing.
 
Proelite said:
For fuck's sake. One last time.

AMD graphics cards have never used DDR3 for RAM. The DDR3 is system RAM only; the GPUs will 100% have their own VRAM.

The retards thinking the GPUs will use DDR3 for VRAM. SMH.
I love it when people say something wrong and then call others retards. <3

Unless you meant to throw "high end" in there.
 
Hari Seldon said:
The 360 shared the 512MB of RAM between graphics and central processing. The thread title of this rumor suggests a separate pool of VRAM, so I would expect that they are talking about basic PC memory for central processing.

I think they are referring more to the fact that people keep talking about 4-8GB of memory because you can get it on Newegg for $5.
 
guek said:
What in the world do you need 16GB of RAM for? Unless you're doing high-end video processing or graphic design, there's pretty much no reason for you to have that much RAM.

Games, video encoding, possibly streaming my game live, being a file/print server, running an encrypted full disk backup, web browsing.

That's a typical weekday evening for me.
 
BurntPork said:
I love it when people say something wrong and then call others retards. <3

Unless you meant to throw "high end" in there.

Ha.

Just trolled myself.

I still stand by my comment that the AMD GPU will 100% not use the DDR3 RAM as VRAM.
 
Anyone know what the price differential between DDR3 and GDDR3/GDDR5 is? If the new box is using unified memory again, including more than 2GB of GDDR5 may be a costly move.

If they're not using a unified memory setup, using cheaper DDR3 for system RAM will probably be the way they go.
 
bill0527 said:
Gears 3 in comparison to it, is quite ugly and looks almost outdated. Most noticeable are the textures. Gears 3 texturing looks last-gen compared to BF3 on my particular PC rig.

But if the 360 had 1 gig of RAM, and hence Gears 3 had better textures, the gap wouldn't be nearly so great, right?

RAM is really important. It's the best thing a console maker can do to future-proof their system. Shit, the 360 might've shipped with 256 megs if not for Epic's Gears 1 side-by-side.

A next gen system with 2 gigs of RAM would do a lot, and yea, 2 gigs on console is not at all analogous to 2 gigs on PC. But 4 gigs would be better, if it is at all possible. Considering component prices, it might not be.
 
I believe this... especially if Microsoft plans to release a console in 2013... this gives them ample time to tease and make adjustments based on developer feedback.

But I am curious about the RAM issue... Microsoft typically went with a unified RAM system... but to now have dedicated RAM... I can't imagine it having any negative effects on backward compatibility, but wouldn't this bring future bottlenecks, considering devs can't use as much RAM as the system has, like before?
 
knitoe said:
A powerful single GPU, or weaker dual GPUs that offer better performance? I'll take the latter. Also, I wouldn't worry about micro stutter. It happens on PC because there are many different CF/SLI setups which developers can't test, so we have to wait for profiles and fixes from the video card manufacturers. On a console, the hardware is set in stone, and developers would have plenty of time to test their games before release. And the added GPU scaling should be almost 200%.

Micro stutter has fuck all to do with developers.
 
Personally I think you can get a faster setup with dedicated VRAM. You clump that VRAM real close to the GPU cores and don't have to dick with having 2 memory bus masters (GPU and CPU). Plus, if you now have 6 CPU cores and 2 GPU cores, that is a lot of friggen cores all sharing 1 memory bus in a unified architecture.
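Back-of-the-envelope on what sharing one bus costs (every figure here is hypothetical, just to show the shape of the problem):

```python
# Hypothetical contention math for one shared pool vs. a split design.
# None of these bandwidth figures are confirmed specs.

unified_bw = 25.6        # GB/s: a shared dual-channel DDR3-1600 pool
cpu_demand = 6 * 1.5     # six CPU cores each pulling ~1.5 GB/s
gpu_share = unified_bw - cpu_demand
print(f"unified: GPU left with ~{gpu_share:.1f} GB/s")  # ~16.6 GB/s

# Split design: the CPU keeps the DDR3, the GPU gets dedicated GDDR5.
gddr5_bw = 128.0
print(f"split:   GPU gets ~{gddr5_bw:.0f} GB/s to itself")
```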
 
Dipswitch said:
Anyone know what the price differential between DDR3 and GDDR3/GDDR5 is? If the new box is using unified memory again, including more than 2GB of GDDR5 may be a costly move.

If they're not using a unified memory setup, using cheaper DDR3 for system RAM will probably be the way they go.

Currently I don't. But when the 360 was released, the GDDR3 at launch cost them approx. $16.25 per 1Gbit chip (128MB).
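Taking that figure at face value, the launch-era arithmetic is simple, and it shows why the amount is so constrained:

```python
# Launch-cost arithmetic using the quoted figure: approx. $16.25
# per 1Gbit (128MB) GDDR3 chip at the 360's launch.

price_per_chip = 16.25
chip_mb = 128

def ram_cost(total_mb):
    return (total_mb // chip_mb) * price_per_chip

print(ram_cost(512))    # 360's 512MB -> 4 chips  -> $65.00
print(ram_cost(2048))   # 2GB at that chip price -> 16 chips -> $260.00
```

Prices obviously fall every year, but the console maker has to budget for what the chips cost at launch, not what they cost on Newegg later.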
 
Proelite said:
For fuck's sake. One last time.

AMD graphics cards have never used DDR3 for RAM. The DDR3 is system RAM only; the GPUs will 100% have their own VRAM.

The retards thinking the GPUs will use DDR3 for VRAM. SMH.
How much VRAM did Xbox or Xbox 360 have?
 
K.Jack said:
Was it really?

Technically, I guess. It was a 7-series derivative, which came out a month after E3 2005. The problem is the 7-series wasn't a 100% increase over the 6-series, more like 50%. Vertex shaders went from 6 to 8 and pixel shaders from 16 to 24.
 
bgassassin said:
Currently I don't. But when the 360 was released, the GDDR3 at launch cost them approx. $16.25 per 1Gbit chip (128MB).

Wasn't the 360's release around the same time memory manufacturing was offline due to earthquakes and memory prices were sky-high?
 
A hex-core and 2GB of RAM? Sounds like music to my ears! Look at what they have done with a tri-core and 512MB of RAM in the Xbox 360. Keeping the price down and performance up.
 
kitch9 said:
Micro stutter has fuck all to do with developers.
Developers can tell Microsoft their game is micro stuttering for some reason, and then they or Microsoft can work to fix the problem before release.
 
joshwaan said:
I think it will have 4GB of memory if CliffyB steps in again and shows a "Gears of War 4: Off Its Tits Edition" :P

Seriously though, I hope it's 6-core with at least 4GB of DDR4 and 1GB of VRAM based on 6000-series AMD.
Holy moly, the prospect of Gears 4 on Unreal 4, drool. I have no idea about tech specs, but if the consensus is that 2GB of RAM is too little, I hope MS see sense, add £50 to the asking price, and throw more grunt at it. I know enough to know that RAM has been a big sticking point for devs this gen; I would hate to see them face a similar problem.
 
People need to stop judging RAM based off your computer.

Your computer needs more RAM simply due to the operating system and background tasks. A game console's OS and almost nonexistent background tasks mean it needs far less.

Most modern gaming PCs only need 4GB of RAM right now. I can max out Skyrim, The Witcher 2, or Battlefield 3 while streaming, with a browser, Skype, Ventrilo, and foobar open, with only 4GB of RAM on Windows 7.

Would it be nice for the system to have 4? Yes. But it almost certainly means re-engineering any plans they currently have. You don't just stick a 2GB stick where you have a 1GB stick. The smaller the stick, the faster the performance.
 
Mudkips said:
Games, video encoding, possibly streaming my game live, being a file/print server, running an encrypted full disk backup, web browsing.

That's a typical weekday evening for me.

You don't need 16GB of RAM to do this. I do the same thing every day and use less than 4.
 
I am a tech noob hardware-wise; could anyone elaborate on how 'big' the jump from X360 > X Loop/3 would be? (It's been asked before, without getting answers in this thread.)
Witcher 2 / Battlefield 3 / Crysis 1 (?) graphics (PC, on ultra), or would it be a 'small' upgrade compared to Xbox 1 > Xbox 360?
 
Baron_Calamity said:
Wasn't the 360's release around the same time memory manufacturing was offline due to earthquakes and memory prices were sky-high?

I don't know. The little bit of research I did gave me nothing.

-NinjaBoiX- said:
but if the consensus ...

Don't trust the consensus.
 
A high-clocked hex-core system with a powerful ATI GPU and 3GB of RAM sounds reasonable, and honestly, there isn't a need for 4GB of RAM in a closed system like a console. Look at what developers are cranking out of 512MB, or actually less, considering OS overhead takes a good 30-67 megs of that. 2GB sounds practical; sure, 4 would be nice, but I don't see it happening. 2GB of very fast RAM could work, but I'm not betting on it being DDR3.

I'm more interested in hearing about this ATI GPU. Will it have eDRAM again, and if so, how much? 256? 512?
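For context on why the eDRAM amount matters: a render target with MSAA has to fit in it, or the GPU has to tile, as the 360 does. A quick sketch using the usual assumption of 4 bytes color plus 4 bytes depth/stencil per sample:

```python
# Framebuffer footprint with MSAA: width x height x samples x bytes
# per sample (4 bytes color + 4 bytes depth/stencil assumed).

def fb_size_mb(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

print(f"{fb_size_mb(1280, 720, 4):.1f} MB")    # ~28.1 MB: 720p, 4xMSAA
print(f"{fb_size_mb(1920, 1080, 4):.1f} MB")   # ~63.3 MB: 1080p, 4xMSAA
# The 360's 10MB of eDRAM can't hold either in one piece, hence tiling.
```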
 
Dreaver said:
I am a tech noob hardware-wise; could anyone elaborate on how 'big' the jump from X360 > X Loop/3 would be? (It's been asked before, without getting answers in this thread.)
Witcher 2 / Battlefield 3 / Crysis 1 (?) graphics (PC, on ultra), or would it be a 'small' upgrade compared to Xbox 1 > Xbox 360?

Until we see the specs on the GPU and CPU, nobody knows. It is sounding like a massive leap, though. You are more than doubling your CPU power. If each GPU is twice as powerful as what's in the 360 (which $120 PC GPUs are now), you are quadrupling it. Same goes for RAM.
 
Proelite said:
Ha.

Just trolled myself.

I still stand by my comment that the AMD GPU will 100% not use the DDR3 RAM as VRAM.
That's nothing short of commendable. You really show those retards that even when utterly clueless, real gaffers never back off an inch.
 