Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

Status
Not open for further replies.
flyinpiranha said:
You don't think developers have gotten better? I think it's amazing the stuff that has come out this generation. Or am I misreading that last part?

I just mean team sizes will continue to expand, and with that production time and cost, and with cost more bean-counters pushing back on the creatives to not rock the boat too hard so they can protect their investment...

Technology can be an empowering, democratizing force, but in the case of console "power" it's actually the reverse, because it takes a lot of skilled people to actualize that power. The Samaritan demo is very impressive, but practically speaking, making a whole game to that standard is a major creative and technical engineering project. It's massively expensive, which automatically restricts the undertaking to a handful of large, well-funded companies.

tl;dr: I don't want to come across as cynical here, but in my opinion, all the next gen can offer is technology with which we can finesse what we already have. And finesse is expensive in terms of manpower and money.

Also, from a creative standpoint, the defining feature of this gen has been the rise of online interconnectivity; that's been the key transformative innovation, not the ability to push and shade more polys.

Looking ahead, I can't see anything upcoming that's going to have the same impact. Admittedly I'm a Kinect skeptic, but I can't see even a massive improvement to that technology (short of a direct neural link!) revolutionizing the way we play games as much as online services have.
 
Nirolak said:
Tim Sweeney and/or CliffyB (I forget which) was talking about making graphics code that scaled across a lot of cores.

Six CPUs and two GPUs would actually make sense for fitting that.

The main question would be what the VRAM is; that's probably the main bargaining point for developers if the rumor is true.
A multi GPU setup from AMD or Nvidia makes no sense at all.
 
Orayn said:
I propose we go back to measuring console power in arbitrary numbers of "bits" in addition to buzzwords like BLAST PROCESSING!

Failing that, we can come up with an exponential scale for how many Gamecubes you'd have to tape together to achieve the same performance.
In this case, measure how many X360s duct-taped together.

I bet the power of the 360 will be compared in the Next Xbox's reveal whenever that is.
 
Jonm1010 said:
I agree, but as a side note I think it's naive on Microsoft's part to think the Loop will be the only console for the next ten years. It may still be manufactured while a new system is being phased in, but I think reality and external factors will push Microsoft to introduce something else before those ten years are up.

Whether it's the emergence of the Apple HDTV that will no doubt have a method to play HD games, or Apple putting an HDMI out on the iPad and selling a cheap controller to use with your TV, or the emergence of fiber technology that will increase demand for things like OnLive. I don't think Microsoft, or Sony for that matter, will in the end be able to rest on their laurels and assume their next console will go ten years before needing to innovate.
I wouldn't take the "life span" stuff too seriously. The PS2 is still alive and that didn't stop the PS3 from releasing.
 
Durante said:
What's that picture of the 360 PCB supposed to tell me?
(I hope it's not just that consoles don't use the PC RAM "stick" form factor, because that much should be obvious and is beside the point)
The chips are the same but the considerations are different. More RAM chips means more motherboard traces and a larger and more complex PCB.
 
gofreak said:
It's possible devs etc. would tell them to use 2GB of GDDR5 instead of 3GB (split DDR3/GDDR5). That might be true, but I can also entertain the possibility they'd prefer the latter. It's not that outrageous.

Never meant to suggest that your idea wasn't perfectly plausible - just presenting another viewpoint.

gofreak said:
...Let's say MS wants to reserve 256MB of RAM for the OS/Kinect processing/whatever.

Under the unified 2GB of GDDR5, that leaves devs with 1.75GB to play with.

Under the other approach, it leaves them with 1.75GB of DDR3 and 1GB of GDDR5.

In any scenario where non-graphics RAM requirements in a game exceed 750MB (quite plausible), the non-unified setup with DDR3 wins.

Obviously things only get more favourable for that setup if the OS requirements are more than that.

Based upon MS's memory usage in the prior Xbox consoles, I think we'll see them continue to run lean and mean as far as their own requirements go. 128MB of RAM would be a smorgasbord for them. I'd say 64 - 96MB is more likely this time around (given that they use 32MB currently). As such, I think having 2GB of fixed system RAM is likely overkill for most developers. Not to say they wouldn't use it if they had it - I just think they'd prefer the ratio to be flipped in favor of VRAM if possible.

But I agree that in the case of a non unified memory setup, 2GB of DDR3 and 1GB of GDDR VRAM is the most likely breakdown.
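gofreak's budget math can be sketched numerically. This is a rough model using the rumored pool sizes and the post's hypothetical 256MB OS reservation; none of these figures are confirmed specs.

```python
OS_RESERVE = 256  # MB reserved for OS/Kinect (the post's hypothetical figure)

def unified_graphics_budget(non_graphics_mb, total=2048):
    """2GB unified GDDR5: graphics gets whatever game logic leaves over."""
    return total - OS_RESERVE - non_graphics_mb

def split_graphics_budget(non_graphics_mb, ddr3=2048, gddr5=1024):
    """2GB DDR3 + 1GB GDDR5: graphics always keeps the full GDDR5 pool,
    provided game logic fits in DDR3 after the OS cut."""
    assert non_graphics_mb <= ddr3 - OS_RESERVE
    return gddr5

# Graphics RAM left over as non-graphics needs grow:
for need in (500, 768, 1000):
    print(need, unified_graphics_budget(need), split_graphics_budget(need))
# 500  -> 1292 vs 1024 (unified wins)
# 768  -> 1024 vs 1024 (break-even)
# 1000 ->  792 vs 1024 (split wins)
```

The exact break-even is 2048 - 256 - 1024 = 768MB, which matches the ~750MB crossover gofreak eyeballs.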
 
Proelite said:
It makes sense if it's a 6990 crossfire solution made to emulate the single-card solution under development.
I highly doubt we're looking at anything like a 6990, much less two of them.
 
purple cobra said:
In theory, are these specs enough to give us graphics at the quality of Battlefield 3 on PC's now? Or the equivalent of Crysis 2 on high(?) at 60fps? That would be fine by me.

that would depend on the GPU but seeing that this is a closed box & the devs can code with the specs in mind I would say yes.
 
did people talk this much shit and nonsense about ram back in 2005? How has everyone suddenly become such a huge expert in such a short amount of time?

I wasn't on gaf back then but were people in 05 yelling "2gbs of ram or bust! anything else is a complete joke!"?

Because that's what people sound like now clamoring for 4gb of ram absolute bare minimum in a system that simply does not need it.
 
Futureman said:
http://www.engadget.com/2011/11/15/microsoft-xbox-turns-x-years-old-today-celebrates-decade-of-con/

Engadget article today saying MS's next console won't be out for "4 more years." Haha.

I guess Engadget doesn't really deal much with rumors, but it's still pretty hilarious to see them following MS's "4 more years" PR-speak while, the same day, rumors are flowing over the internet that MS will reveal details in as little as 2 months from now (CES 2012).

Why people continue to interpret the "10 year lifespan" comments from MS as meaning that there'll be ten years between Xbox iterations is beyond me. It's not like Sony hasn't demonstrated how these transitions work two generations in a row now.

Drives me crazy.....
 
Dipswitch said:
Why people continue to interpret the "10 year lifespan" comments from MS as meaning that there'll be ten years between Xbox iterations is beyond me. It's not like Sony hasn't demonstrated how these transitions work two generations in a row now.

Drives me crazy.....
Yes, but this gen is different. Xbox 360 is selling really well and showing 'growth'. And even Bleszinski said, "Expect cool stuff in 2014." Thinking logically, a 2012 launch doesn't make sense from a business POV.
 
guek said:
did people talk this much shit and nonsense about ram back in 2005? How has everyone suddenly become such a huge expert in such a short amount of time?

I wasn't on gaf back then but were people in 05 yelling "2gbs of ram or bust! anything else is a complete joke!"?

Because that's what people sound like now clamoring for 4gb of ram absolute bare minimum in a system that simply does not need it.
6 years, in the videogame industry no less, is "such a short space of time"?
 
A multi-GPU Xbox would mean that ATI would have to figure out a solution for microstuttering.

Or we'll just get stuttery games, whatever. :(
 
-NinjaBoiX- said:
6 years, in the videogame industry no less, is "such a short space of time"?
I think he's saying "A lot of people that don't know what they're talking about are making pie in the sky demands that have no basis in reality."
 
guek said:
did people talk this much shit and nonsense about ram back in 2005? How has everyone suddenly become such a huge expert in such a short amount of time?

I wasn't on gaf back then but were people in 05 yelling "2gbs of ram or bust! anything else is a complete joke!"?

Because that's what people sound like now clamoring for 4gb of ram absolute bare minimum in a system that simply does not need it.

Oh, RAM was a huge sticking point in discussions back then. There were pure meltdowns when the 256MB number was originally announced by Microsoft.

On another note, can someone break down for me in detail why 2GB of RAM is or isn't needed?

Why is 2GB optimal? Why is it "enough"? Specifically, I want to know how it will or will not bottleneck things like huge open-world games and textures or physics. Is 2GB enough to get high-end graphics in an open-world game with all the bells and whistles - say, make it completely destructible like Red Faction, with scores of NPCs and quality AI, so that the destruction doesn't disappear once you go two miles away? Or say they put in a beast CPU and a beast GPU but skimp on the RAM. Why would that not affect certain types of games?
 
Meisadragon said:
Yes, but this gen is different. Xbox 360 is selling really well and showing 'growth'. And even Bleszinski said, "Expect cool stuff in 2014." Thinking logically, a 2012 launch doesn't make sense from a business POV.

Not really. Sony introduced the PS3 when the PS2 was still selling very well. MS will continue to do the same with the 360, albeit at a lower price point. They'll compete with the Wii at the lower price points (Likely taking market share away from Nintendo in this market segment) using the 360 and Kinect. And they'll start the rat race all over again with the high end model.

There's no reason to believe MS won't continue to shift a substantial amount of 360 units over the next few years, even if they already have the successor machine on the market. And I dare say that arrangement will work out just fine for them, as they'll be able to offset the losses associated with the new models with the profits from the current models.
 
StevieP said:
And what's with all this graphics talk? If you people cared about graphics *this* much as to be complaining about a next gen console releasing with 2GB of RAM (which is fantastic, btw)... why aren't you all playing your multiplat games on PC?
StevieP said:
If you guys care *this* much about visuals (LOL again at the people expecting Samaritan) you'd have PC rigs already.
You keep repeating that even though people (like me) tell you we are of course already playing everything possible on PC. I'm even using my PC to improve the graphics of console games! However, while immaculate IQ and 60+ FPS are great and almost a full "generational" difference by themselves, this doesn't help raise the minimum spec most developers sadly have to work with.

brain_stew said:
The chips are the same but the considerations are different. More RAM chips means more motherboard traces and a larger and more complex PCB.
I am aware of that, but the difference in cost isn't nearly as significant as people make it out to be. They used 8 (!) chips on 360 at launch, and they could easily get 4 GB in 8 chips whenever the next console launches (and probably soon in 4).
 
Thunder Monkey said:
I think he's saying "A lot of people that don't know what they're talking about are making pie in the sky demands that have no basis in reality."

I've been following this thread very closely and I haven't seen a breakdown, in detail, of why 2GB is "enough."

I'm not saying I'm taking the other position; I would just like to hear details from both sides of the argument so I can form a better opinion. So far it's a lot of snottiness and assertions.
 
brain_stew said:
A multi GPU setup from AMD or Nvidia makes no sense at all.
Well yes, that part doesn't make sense, but I'm talking solely from a code perspective if we were to ignore the large amount of issues I have with this rumor in general.
 
Thunder Monkey said:
I think he's saying "A lot of people that don't know what they're talking about are making pie in the sky demands that have no basis in reality."
Oh yeah, I got that. But he also said people seem to have become RAM experts since '05, which is apparently a really short space of time. Just seemed an odd comment about such a fast moving industry.
 
-NinjaBoiX- said:
Nice avatar dude. Melissa Clarke is truly one of the most beautiful girls in the world. If I had the computer from Weird Science, she would be the result.
Thanks, she's pretty nice huh. ;)

onQ123 said:
that would depend on the GPU but seeing that this is a closed box & the devs can code with the specs in mind I would say yes.
Ooooh, that sounds nice! I've seen video on gamersyde of Crysis 2 running on PC at 720p and 60fps and it was beautiful. From vids I've watched Battlefield 3 also looks beautiful running at 60. I'd seriously be happy with that quality on the next consoles.
 
Dipswitch said:
Not really. Sony introduced the PS3 when the PS2 was still selling very well. MS will continue to do the same with the 360, albeit at a lower price point. They'll compete with the Wii at the lower price points (Likely taking market share away from Nintendo in this market segment) using the 360 and Kinect. And they'll start the rat race all over again with the high end model.

There's no reason to believe MS won't continue to shift a substantial amount of 360 units over the next few years, even if they already have the successor machine on the market. And I dare say that arrangement will work out just fine for them, as they'll be able to offset the losses associated with the new models with the profits from the current models.
When Sony introduced the PS3, PS2's software sales were declining rapidly. I don't think that's the case here, but yes, what you're saying is a totally valid strategy used by companies. I just don't think third parties are ready for a transition to newer hardware right now, especially if you notice Bethesda, Epic and the like commenting that 2014 would be a good time. I guess the release timing could be based on how well the Wii U does, but this 2012 rumour isn't that credible compared to some of the better ones out there, which say 2014.
 
Lonely1 said:
But the power of a CPU isn't solely determined by the number of cores. Is there anything planned with ARM that can compete with years-old PC parts? Smartphones might have quad-cores next year, but they will still be considerably behind a five-year-old Core 2 Duo in performance.
[Image: NVIDIA Tegra roadmap slide (tegraroadmap.jpg)]


Note the "Core 2 Duo" level. Sure, ARM A9s are not as powerful as x86 CPUs, but they're not completely out of the picture either. Plus you have to consider that ARMs in smartphones operate under completely different power and heat dissipation requirements than would be available in a console's box.

Plus, just how much general-purpose processing power does a console need? ARM is used in supercomputers now, to feed data to GPU compute clusters. Thus having a good GPU may mean that you don't need as strong a CPU anymore.

DCKing said:
You should've checked out the Wii U thread. My theory is they used an RV770 because it has a GDDR3 memory controller, and it has the power they wanted. It needed a GDDR3 memory controller because it seems they're using the Xbox 360 CPU in devkits as well.
RV770 was the first one to use GDDR5. Every Radeon since the X800 (R420) has had a GDDR3 controller, and current Radeons can use GDDR3 too. They could've easily used Cayman in their devkits, but it seems they're using RV770. Why? I see only one explanation: the Wii U GPU will be on the RV770 level, feature-wise.

DCKing said:
More cores doesn't mean more power. There's a point at which it is not useful anymore (Amdahl's Law) and there aren't any more tasks to thread out. Fast multicore POWER7 > many simple parallel cores. It simply doesn't get faster than POWER7 at this point.
In a console environment, 8 ARM cores would be preferable to 4 POWER7 cores while being much, much simpler. This was already shown this gen with Cell and the XCPU. Btw, the ARM A15 is an OoOE design.
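The Amdahl's Law point can be made concrete with a quick sketch. The 90% parallel fraction below is an illustrative assumption, not a figure from the thread.

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's Law: max speedup when only part of a workload parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# With 90% of the work parallelizable:
print(round(amdahl_speedup(0.9, 4), 2))   # 3.08 (4 big cores)
print(round(amdahl_speedup(0.9, 8), 2))   # 4.71 (8 small cores)
# The 10% serial remainder caps the speedup at 10x no matter how many
# cores you add, which is why per-core speed still matters.
```

So whether 8 simple cores beat 4 fast ones hinges on the serial fraction of the workload and each core's single-thread speed, not the core count alone.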

DCKing said:
Every chip is customized. I agree. No chip in the history of gaming has not had its roots in some other application. Cell (and PPE) came as close as you get as something new, and that didn't even turn out that great. At the moment, there's nothing in the works that I know of that rivals high-end x86 and POWER7 in performance. That was not the case last gen.
Cell is the only reason why the PS3 is technically ahead of the competition at the moment. It's a great example of many simpler, harder-to-program cores winning against several more complex but easier-to-program-for cores.

Going into next generation, though, one thing should be considered: most of what the Cell SPEs are doing should now be done on the GPU (via CUDA/OpenCL). And thus it may now be better to use smaller (meaning cheaper) CPU cores in not-so-high numbers (2 POWER7s, 6 PPCs or 8 ARM A15s) while shifting the SPE workload to a next-gen GPU (AMD's GCN/Tahiti, NV's Kepler). That's the main reason why going with a small number of OoOE cores may well be an option for Xb3/PS4.

DCKing said:
POWER7 is similar to, but not backwards compatible with the old PowerPC G3 architecture used in Wii and GC. The CPU in the Wii/GC had some customizations that would need to be ported over.
I'm quite sure that this can either be added to POWER7 relatively simply or emulated without much performance loss.

DCKing said:
Furthermore, the GameCube and Wii GPUs have a peculiar shading solution that is not easy to emulate.
Nothing peculiar about it; they're both fixed-function GPUs (pre-DX8 era). Anything done in fixed function can be run on current unified-shader hardware.

DCKing said:
Then there's some fast timed caches, microcode here and there, as well as a funky sound system. I actually think there's no way Nintendo is going for software emulation. Less than 100% BC is unacceptable to them, and I think for many of the Wii buyers as well.
Well, including Wii's hardware is an option, but I'd really prefer s/w emulation. It just allows you to do much more with the old code than the old h/w ever allowed.
 
Thunder Monkey said:
I think he's saying "A lot of people that don't know what they're talking about are making pie in the sky demands that have no basis in reality."

Bingo bango mingo mango...tango flango

Just about everyone here is basing their ram requirements off of current PC demands (which really don't require more than 4gb system ram for most people anyway).

But consider this:

Battlefield 2 (came out in 2005) recommended specs:
CPU: 2.4 Ghz
RAM: 1 Gb
Video Card with at least 256 Mb of RAM

Oblivion (came out in spring 2006)
3 Ghz Intel Pentium 4 or equivalent processor
1 GB System RAM
ATI X800 series, NVIDIA GeForce 6800 series, or higher video card (about 256mb ram)

These high end games that came out around the launch of the 360 required at least 2.5x the total ram found in the HD twins. How do these old games compare visually to today's high end 360/PS3 games? It's like they're from two completely different generations. They look like fucking garbage compared to present day console games.

BF3, likely the most visually impressive game we have to date, recommends a total of 5gb of ram. What most people don't understand is the ram recommended for BF3 is likely much much slower than the ram they'll make custom for all next gen consoles. What need is there for 8gb of ram? Or more than 4gb? Around 3gb is what we should be expecting, 4 if they're really going all out.

You guys should be more concerned about the speed of your ram than simple gb amounts.
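The "2.5x" figure in the post is simple arithmetic over the recommended specs quoted above (512MB being the 360's unified pool, or the PS3's 256MB + 256MB split):

```python
hd_twins_total = 512     # MB: 360's unified pool, or PS3's 256 + 256 split

# Recommended specs quoted in the post: 1GB system RAM + 256MB VRAM each
bf2_total = 1024 + 256
oblivion_total = 1024 + 256

print(bf2_total / hd_twins_total)       # 2.5
print(oblivion_total / hd_twins_total)  # 2.5
```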
 
Are we expecting to see another iteration of the R700 series chip?

Can anyone link me to AMD's next complete redesign? I know they finally got the first bit of press out there.
 
Shambles said:
Lol @ a dual GPU console. That makes no technical, or economic sense. Won't happen.
I guess I was of the opinion that it really meant dual core. Or is that still nonsensical? I don't know anything about hard tech like this.
 
Just played Castlevania: Lords of Shadow, and this game BEGS for 1080p, 60fps and 4x AA.
It's just stupid that devs spend so much time (and money) working around crippled hardware. 6 years is so long! My PS3 is old and obsolete. Games are plateauing: my first game was GoW 3 and it still hasn't been beaten (haven't played UC3, but I didn't like UC2).
Move on: I'll buy any reasonable hardware in 2012; 2013 is much too far away.
 
CES is going to be all about Nokia's product reveal to the US market. It's hard to believe they would upstage that with a new Xbox announcement, especially considering phones and Xbox share the same division.
 
Jtwo said:
I guess I was of the opinion that it really meant dual core. Or is that still nonsensical? I don't know anything about hard tech like this.
Unless designed around it, a true dual-GPU solution would run too hot and have too high a power budget for a gaming console. You could always downclock or remove transistors from that dual-GPU setup to get it within the power budget, but at that point it would just be smarter to use an older GPU with a lower power budget.
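The downclocking trade-off above follows from the standard CMOS dynamic-power model, P ≈ C·V²·f. The voltage and clock numbers below are made up purely for illustration.

```python
def dynamic_power(capacitance, voltage, freq_hz):
    """Dynamic switching power of a chip: P = C * V^2 * f."""
    return capacitance * voltage**2 * freq_hz

base = dynamic_power(1.0, 1.10, 800e6)               # hypothetical stock part
down = dynamic_power(1.0, 1.10 * 0.9, 800e6 * 0.8)   # -10% volts, -20% clock

# A 20% downclock plus the voltage drop it allows cuts dynamic power
# by roughly 35%, because voltage enters squared:
print(round(down / base, 3))  # 0.648
```

The savings are real, but at that point a smaller GPU designed for the lower budget is usually cheaper and cooler, which is the post's point.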
 
Theoretically speaking, would a next-gen console be more powerful than current high end gaming rigs? I still can't picture the generation leap here.
 
For the people asking about ARM: the ARMv8 64-bit architecture will be powerful enough for consoles.

An article yesterday stated that the X-Gene (the first 64-bit ARM server chip) would have half the power of an Intel Sandy Bridge Xeon at considerably less power consumption.

So, imagine a solution with 4 to 6 of those cores integrated into an AMD CPU.
 
guek said:
Bingo bango mingo mango...tango flango

Just about everyone here is basing their ram requirements off of current PC demands (which really don't require more than 4gb system ram for most people anyway).

But consider this:

Battlefield 2 (came out in 2005) recommended specs:
CPU: 2.4 Ghz
RAM: 1 Gb
Video Card with at least 256 Mb of RAM

Oblivion (came out in spring 2006)
3 Ghz Intel Pentium 4 or equivalent processor
1 GB System RAM
ATI X800 series, NVIDIA GeForce 6800 series, or higher video card (about 256mb ram)

These high end games that came out around the launch of the 360 required at least 2.5x the total ram found in the HD twins. How do these old games compare visually to today's high end 360/PS3 games? It's like they're from two completely different generations. They look like fucking garbage compared to present day console games.

BF3, likely the most visually impressive game we have to date, recommends a total of 5gb of ram. What most people don't understand is the ram recommended for BF3 is likely much much slower than the ram they'll make custom for all next gen consoles. What need is there for 8gb of ram? Or more than 4gb? Around 3gb is what we should be expecting, 4 if they're really going all out.

You guys should be more concerned about the speed of your ram than simple gb amounts.

This makes sense to me.
Also, I'm more concerned about future allocation towards behaviors and AI. If anything, the iPad 2 has shown me that you can have tons of flashy graphics but an empty experience if there's not enough processing power put towards routines. I'm no tech guy by ANY measure, but I'm starting to feel that the next generation of games is going to have to become a lot more engaging beyond the visuals in order to hold my attention. The most impressive thing about, say, RAGE, wasn't that it was gorgeous (which it was!); it was that the A.I. of the enemies made the fights incredible.

I would honestly trade prettier textures and lighting for smarter enemies and more 'on-the-fly' unscripted intelligence within future games. If the future simply holds prettier bigger action Modern Warfare (insert gorgeous AAA titles) instead of a richer and more believable Skyrim world, then I'll pass...
 
bigjimmystyle said:
Theoretically speaking, would a next-gen console be more powerful than current high end gaming rigs? I still can't picture the generation leap here.
Not in a pure brute-force sense.

But then again, neither were the PS3 or 360. Or the GCN, PS2, or Xbox.

Mainly because they don't work under the same constraints.

Actual usable-for-gaming power? Much more so.
 
Maybe MS wants 2GB of DDR3 so they can launch a new console faster, in 2020, and not make the same mistake they made this gen with 8 years of life for the Xbox 360. The faster they launch a new console, the more money they will make. Next gen will last 5 years. You will never see 8 years again.
 
WTF is a "dual-core GPU" and what makes people think that MS is going to use a 200W chip?

Proelite said:
No. I think it'll be more like two 6990s in crossfire mode.

:D
Then it should be called the XBox Fireplace. ;)
 
bigjimmystyle said:
Theoretically speaking, would a next-gen console be more powerful than current high end gaming rigs? I still can't picture the generation leap here.
Well, they can't match current high-end rigs spec-wise, but since developers can code to the metal and optimize things well on consoles, the graphical fidelity will be way better than whatever we see right now for at least 2 years, until the PC catches up. It will be the same thing that happened this gen. Considering the games we are seeing on current consoles, I simply can't wait to see what they can achieve when most of the bottlenecks are removed.
 
Dual GPU won't happen with all the heat and energy it consumes. If it does, expect a new model that will use 1 GPU that gives the same power down the line.
 
woober said:
Dual GPU won't happen with all the heat and energy it consumes. If it does, expect a new model that will use 1 GPU that gives the same power down the line.

Dual GPUs don't even work well in PCs, whether SLI or Crossfire. I made the mistake of building an SLI rig in the early days (I know it's gotten better), and I later made the mistake of recommending that a very good friend of mine buy a Crossfire rig. I've spent countless hours helping him troubleshoot it.

We came up with an infallible way of proving this technology is fundamentally broken.

Google: *Name of Game* + *Crossfire or SLI*

The entire front page will be troubleshooting, with maybe a few YouTube links of gameplay videos sprinkled in. Also, for bonus points, visit the official forums of any recently released PC game, go to the troubleshooting subforum, and count the number of dual-GPU problems.
 