Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

Eh, I don't think that anything above Thames XT will fit into a console power- and heat-wise. Hell, perhaps even the size would be an issue.


... I'm going to be hurt for this opinion.


But you're looking at it the wrong way. These PC GPUs aren't just plugged into the console motherboard and called a day. AMD will make a chip optimised to run without all the heat and power that a PC card needs. There isn't a fan running the GPU as well as the RAM, since the RAM will be separated out on the motherboard as unified memory.

Can't wait to see the specs for the GPU in the Next-Xbox.
 
But you're looking at it the wrong way. These PC GPUs aren't just plugged into the console motherboard and called a day. AMD will make a chip more optimised to run without all the heat and power that a PC card needs. There isn't a fan running the GPU as well as the RAM, since the RAM will be separated out on the motherboard as unified memory.

Can't wait to see the specs for the GPU in the Next-Xbox.

A lot of people forget this point. The G5s that were used for the early Xbox 360 devkits were absolute power hogs compared to even the original 360.
 
But you're looking at it the wrong way. These PC GPUs aren't just plugged into the console motherboard and called a day. AMD will make a chip more optimised to run without all the heat and power that a PC card needs. There isn't a fan running the GPU as well as the RAM, since the RAM will be separated out on the motherboard as unified memory.

Can't wait to see the specs for the GPU in the Next-Xbox.

That doesn't mean just any chip can be customized. The chip you're looking at uses three times as much power as the card Xenos is most likely based on did at launch; that's as much power as the entire 360 drew at launch. Are you honestly so deluded that you think it can be customized so much that the power is cut in half? The days of consoles using the top GPU are over. You're still going to get your full generation leap with the fastest mid-range card, so it's not like you're really losing anything.

A lot of people forget this point. The G5s that were used for the early Xbox 360 devkits were absolute power hogs compared to even the original 360.

That has nothing to do with anything. Those "dev kits" barely resembled the final unit at all.
 
This card sounds absolutely beast mode. I love how all the juicy details are finally starting to come together, and the next few months should be a rollercoaster.
 
That doesn't mean just any chip can be customized. The chip you're looking at uses three times as much power as the card Xenos is most likely based on did at launch; that's as much power as the entire 360 drew at launch. Are you honestly so deluded that you think it can be customized so much that the power is cut in half? The days of consoles using the top GPU are over. You're still going to get your full generation leap with the fastest mid-range card, so it's not like you're really losing anything.


That has nothing to do with anything. Those "dev kits" barely resembled the final unit at all.

If you're saying the 360 GPU was 110-120W, then no, I don't expect a 50% decrease in power. Just cutting from 190W to the 120W the 360 uses for its GPU would only need a roughly 35-40% decrease, not 50%.
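For what it's worth, the percentage is easy to sanity-check. A minimal sketch in Python, using the wattage figures claimed in this thread (not confirmed specs):

```python
def percent_decrease(old_watts: float, new_watts: float) -> float:
    """Percentage drop going from old_watts to new_watts."""
    return (old_watts - new_watts) / old_watts * 100

# Figures as claimed above: a ~190W PC card vs. the ~120W attributed to the 360's GPU.
print(percent_decrease(190, 120))  # ~36.8 -- roughly a third, not half
print(percent_decrease(190, 95))   # 50.0 -- what a true halving would look like
```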
 
PS3 caught up because of the xenophobic Japanese market, which is becoming less relevant to the overall Asian market as a whole. Microsoft has a chance in this market if they can get Korea and China thinking it is OK to own their machine.
If we remove Japan from the equation completely, MS and Sony would still be neck and neck worldwide, even though Sony launched a year later at a $200 higher price point and had a huge backlash because of it.

Reducing it to "it only sells because Japan is racist" is a little simple-minded.

It's obvious Sony still has strong brand recognition in the gaming market if they can make such a comeback after the whole $599 fiasco, and they are going to be in a stronger position next gen unless they screw up badly.
 
This card sounds absolutely beast mode. I love how all the juicy details are finally starting to come together, and the next few months should be a rollercoaster.

It does, but it doesn't mean the next Xbox will use the exact same GPU inside.

We don't know if MS is going to take a hit on the system or try to sell it at a profit from day one. There is no way this GPU will be in a $399 system being sold for a profit. Even if the system isn't sold at a profit, this GPU might not be in it. It would be awesome to have something like the 7970 in the system, but I really don't see it being financially feasible.
 
Eh, I don't think that anything above Thames XT will fit into a console power- and heat-wise. Hell, perhaps even the size would be an issue.


... I'm going to be hurt for this opinion.

You're right, but there might be (and probably are) some architecture/feature similarities between the Next-Xbox's GPU and these new cards.
 
If we remove Japan from the equation completely, MS and Sony would still be neck and neck worldwide, even though Sony launched a year later at a $200 higher price point and had a huge backlash because of it.

Reducing it to "it only sells because Japan is racist" is a little simple-minded.

It's obvious Sony still has strong brand recognition in the gaming market if they can make such a comeback after the whole $599 fiasco, and they are going to be in a stronger position next gen unless they screw up badly.

No.

Europe alone isn't enough to make up for the 8+ million lead the 360 has over the PS3 in NA.
 
We don't know if MS is going to take a hit on the system or try to sell it at a profit from day one. There is no way this GPU will be in a $399 system being sold for a profit. Even if the system isn't sold at a profit, this GPU might not be in it. It would be awesome to have something like the 7970 in the system, but I really don't see it being financially feasible.

List prices for standalone discrete PC GPUs aren't reflective of the price MS can get one for in their consoles. Xenos by itself was a third of the price of comparable ATI GPUs in 2005.
 
Regarding the entire dual-GPU thing ... it just ain't happening. There are so many reasons why that would be horrible in a console, it's not even worth the speculation. If there is any truth to something being in the wild with such a setup, it's because they are alpha development kits. IIRC, the 360 alpha kits also had dual GPUs. It was a modified Mac G5 running god knows what as the OS and dev environment (a dev would have to chime in for that info). It doesn't reflect what the real HW will be at all.

So the question becomes: why, at this point in time, would they use a dual-GPU setup in lieu of an ATi 6970? Does that mean the system will actually be more powerful than their current top single-card part? I really doubt it, unless MS plans to wait for the follow-up to Southern Islands. Given how Southern Islands has already slipped, though, I really can't imagine the next series hitting until 2014 ... which may be too late for MS's purposes.



Some speculation/possible reasons why there are dual GPUs right now:
  • The Alpha kits are mostly off-the-shelf parts running who knows what for an OS and environment. Whatever it is, it obviously ain't optimized for a console. In order to get anywhere near the ballpark for what the performance will be in the final closed system, you need to shoot way high in raw performance on the kit.
  • The raw power isn't actually reflective of targets for the card, but is needed in order to ballpark the performance for some specific features that are currently lacking in today's cards. For example, tessellation really pushes current DX11 cards, and unfortunately ATi is behind the curve compared to nVidia in the current series. Northern Islands simply can't match Fermi in this metric. One would assume ATi is working hard to make strides with Southern Islands, but in order to get anywhere near the expected performance today ... they need dual GPU's.
 
If it were a dual GPU... I would look at roughly a slightly underclocked Thames XT x2.

Would be good enough to be more powerful than the highest-end GPU (if you were to ignore some of the PC limitations), at least in pixel shaders, which is probably the most important thing for MS.

Yields should be very high, allowing for very decent pricing, and wattage would be good (they'd be aiming for a 300-350W system).
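For the curious, here's a rough back-of-envelope of how a dual-GPU box could land in that 300-350W range. Every number below is an illustrative guess (desktop 7850-class cards are rated around 130W; an underclocked console variant would presumably draw less), not a leaked spec:

```python
# Hypothetical power budget for a dual-GPU console. All values are
# illustrative guesses, not leaked or confirmed figures.
budget_watts = {
    "GPU 0 (underclocked)": 110,
    "GPU 1 (underclocked)": 110,
    "CPU": 60,
    "memory, board, I/O": 40,
    "optical drive, HDD, fans": 20,
}

total = sum(budget_watts.values())
print(f"Estimated system draw: {total}W")  # 340W -- inside the claimed 300-350W window
```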
 
If you're saying the 360 GPU was 110-120W, then no, I don't expect a 50% decrease in power. Just cutting from 190W to the 120W the 360 uses for its GPU would only need a roughly 35-40% decrease, not 50%.

The card with the GPU that Xenos is believed to be based on was a ~67W card.

What you're saying now is what people were saying before the 360's release.

There weren't 190W single-GPU cards in 2005. -.-
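Putting the two in-thread figures side by side (the ~67W card above and the ~190W card being floated for the next Xbox), the earlier "three times as much power" point roughly checks out:

```python
# Ratio check on figures claimed in this thread, not confirmed specs.
xenos_ancestor_watts = 67   # the ~67W card Xenos is believed to be based on
floated_card_watts = 190    # the card being floated for the next Xbox

print(f"{floated_card_watts / xenos_ancestor_watts:.1f}x")  # ~2.8x, i.e. roughly 3x
```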

If it were a dual GPU... I would look at roughly a slightly underclocked Thames XT x2.

Would be good enough to be more powerful than the highest-end GPU (if you were to ignore some of the PC limitations), at least in pixel shaders, which is probably the most important thing for MS.

Yields should be very high, allowing for very decent pricing, and wattage would be good (they'd be aiming for a 300-350W system).

AHAHAHA

Guys, seriously, do you have any idea what Microsoft's and Sony's goals are? Hint: it's not to satisfy gamers' wet dreams.
 
The card with the GPU that Xenos is believed to be based on was a ~67W card.



There weren't 190W single-GPU cards in 2005. -.-



AHAHAHA

Guys, seriously, do you have any idea what Microsoft's and Sony's goals are? Hint: it's not to satisfy gamers' wet dreams.

So a 100W increase is considered a wet dream? Wattage has been increasing by leaps and bounds every single generation, dude. I'd prefer a 7990 over my suggestion, but I stand by the assessment if it were to be a dual-GPU solution; any lower than that and it's not worth doing. My thought process is based on IF.

Edit: and the Xbox 360 GPU is based on a card that didn't arrive until 1.5 years later, FYI. Xenos was 90nm; its PC card was 80nm...
 
So a 100W increase is considered a wet dream? Wattage has been increasing by leaps and bounds every single generation, dude. I'd prefer a 7990 over my suggestion, but I stand by the assessment if it were to be a dual-GPU solution; any lower than that and it's not worth doing. My thought process is based on IF.

Sorry, but all bets are on power consumption dropping next gen.
 
So a 100W increase is considered a wet dream? Wattage has been increasing by leaps and bounds every single generation, dude. I'd prefer a 7990 over my suggestion, but I stand by the assessment if it were to be a dual-GPU solution; any lower than that and it's not worth doing. My thought process is based on IF.

MS and Sony want these machines to be fixtures of the living room. Having huge, loud consoles would get in the way of that, big time. If either Sony or MS makes a 300-350W console, I'll personally find a crow, kill it, record myself eating it, and upload it to YouTube. I really mean it. I would sign a fucking contract over it. That's how sure I am that there's no chance of it happening.

And a 7990? Really? Let me guess: if it launches in 2013, anything less than a 7990 will be "not worth doing," right? Really, someone like you should be strongly in favor of 10-year generations and upgradeable consoles.
 
Sorry, but all bets are on power consumption dropping next gen.

Not likely.

^^^ And to the dude above... Read my post again... Actually, everyone read it again... I was saying Thames XT would be the only worthwhile thing to put in dual for perf/watt reasons; any lower and you're wasting money.

And as a gamer, I'd love a 7990... That's why I said PREFER, because I want the best of the best.
 
MS and Sony want these machines to be fixtures of the living room. Having huge, loud consoles would get in the way of that, big time. If either Sony or MS makes a 300-350W console, I'll personally find a crow, kill it, record myself eating it, and upload it to YouTube. I really mean it. I would sign a fucking contract over it. That's how sure I am that there's no chance of it happening.
Seconded.

And a 7990? Really? Let me guess: if it launches in 2013, anything less than a 7990 will be "not worth doing," right? Really, someone like you should be strongly in favor of 10-year generations and upgradeable consoles.
Something something Master Race something...
 
Edit: and the Xbox 360 GPU is based on a card that didn't arrive until 1.5 years later, FYI. Xenos was 90nm; its PC card was 80nm...

There's debate over exactly what Xenos is. When you get down to it, though, it's most similar to a card that was already out in 2005, but it used some features that weren't available in that card, and that's where the card you're thinking of comes in.
 
Not likely.

^^^ And to the dude above... Read my post again... Actually, everyone read it again... I was saying Thames XT would be the only worthwhile thing to put in dual for perf/watt reasons; any lower and you're wasting money.

And as a gamer, I'd love a 7990... That's why I said PREFER, because I want the best of the best.

You might as well stick with PC gaming. What you want simply won't happen. At all. Or even come close.
 
There's debate over exactly what Xenos is. When you get down to it, though, it's most similar to a card that was already out in 2005, but it used some features that weren't available in that card, and that's where the card you're thinking of comes in.

No. The cards in 2005 were similar in performance but not in architecture.

That architecture wasn't used on PC until 2007.


You might as well stick with PC gaming. What you want simply won't happen. At all. Or even come close.

Sigh... Nobody is paying attention or reading my posts, just looking at keywords and responding :S
 
The card with the GPU that Xenos is believed to be based on was a ~67W card.



There weren't 190W single-GPU cards in 2005. -.-



AHAHAHA

Guys, seriously, do you have any idea what Microsoft's and Sony's goals are? Hint: it's not to satisfy gamers' wet dreams.


^^ The TDP on GPUs didn't start getting really crazy until 2008/2009, when the GeForce 200 series came out: the midrange part, the GTX 260, pushed close to 200W, and ATi followed with the 4870 at 150W and the 4890 at 190W.

The 3870 X2 and GeForce 9800 GX2 dual-GPU parts were both under 200W, at 165W and 190W respectively. There were a couple of super-highly-clocked GPUs in 2007ish that sucked a lot of power as well, the Radeon 2900 XT and the 8800 Ultra, but at the time the 360 and PS3 came out, even the super-high-end GPUs were only around 100W TDP. The 8800 GT, released two years after the 360, was only a 125W part.
 
Not likely.

^^^ And to the dude above... Read my post again... Actually, everyone read it again... I was saying Thames XT would be the only worthwhile thing to put in dual for perf/watt reasons; any lower and you're wasting money.

And as a gamer, I'd love a 7990... That's why I said PREFER, because I want the best of the best.

Oh, right, you're still stuck on the 2 GPUs part of the rumor. The final unit will have only one GPU no matter what. If MS is stupid and crazy enough to attempt two GPUs, I'm staying far away from that giant fire hazard.
 
Oh, right, you're still stuck on the 2 GPUs part of the rumor. The final unit will have only one GPU no matter what. If MS is stupid and crazy enough to attempt two GPUs, I'm staying far away from that giant fire hazard.

I'm not stuck on anything. I came in and posted my 2 cents about what Microsoft would use IF THEY WERE USING TWO, and somehow I'm stuck on two GPUs? What?

Apparently you guys are completely oblivious to what speculation based on hypotheticals is?
 
No. The cards in 2005 were similar in performance but not in architecture.

That architecture wasn't used on PC until 2007.

What are you talking about? The HD 3000 series used a totally different shader architecture. Xenos was clearly the older architecture. I mean, it took cues from both, but a quick look at the specs makes it clear that R520 was the starting point, not R600.

I'm not stuck on anything. I came in and posted my 2 cents about what Microsoft would use IF THEY WERE USING TWO, and somehow I'm stuck on two GPUs? What?

Apparently you guys are completely oblivious to what speculation based on hypotheticals is?

Okay, but that really just goes to show why this rumor is iffy.
 
What are you talking about? The HD 3000 series used a totally different shader architecture. Xenos was clearly the older architecture. I mean, it took cues from both, but a quick look at the specs makes it clear that R520 was the starting point, not R600.

HD2xxx

Xenos is R600.
 
No. The cards in 2005 were similar in performance but not in architecture.

That architecture wasn't used on PC until 2007.




Sigh... Nobody is paying attention or reading my posts, just looking at keywords and responding :S

Huh? You've got it backwards. The first R500 PC parts were released in Oct. 2005, but the high-end R500 parts that could actually outperform Xenos (X1900/X1950) didn't release until a year later.

Sure, you could argue that it had unified shaders, which did not come until later on PC, but it was an R500 chip in every other way.


HD2xxx

Xenos is R600.

No it isn't, it's an R500 chip in every way aside from having unified shaders.
 
So a 100W increase is considered a wet dream? Wattage has been increasing by leaps and bounds every single generation, dude. I'd prefer a 7990 over my suggestion, but I stand by the assessment if it were to be a dual-GPU solution; any lower than that and it's not worth doing. My thought process is based on IF.
AHAHAHAHAHAHAHAHAHAHAHAHAH





Not likely.
Actually, it is likely. They want to position these as set-top boxes. Power consumption plays a role in that. No one wants a 300+W space heater running all day.
 
Sorry, but all bets are on power consumption dropping next gen.

My prediction is around 125W for the next Xbox. The current slim with a 45nm SoC draws around 90W.

On a refined 32nm process, an SoC with six PowerPC cores and an AMD GPU could fit nicely in that power envelope and provide a tremendous upgrade over the current 360.

The good news is that I don't think the CPU has to scale up in performance as much as the GPU does, so that would leave more transistors in the budget for the GPU. I'm expecting 6850-level performance (in raw FLOPS).
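For reference, "6850-level (in raw FLOPS)" pencils out like this; the math below uses the HD 6850's public specs (960 stream processors at 775MHz, with a multiply-add counting as two FLOPs per processor per clock):

```python
# Peak single-precision throughput of a Radeon HD 6850, as a reference target.
shaders = 960        # stream processors
clock_ghz = 0.775    # core clock
flops_per_clock = 2  # a fused multiply-add counts as two FLOPs

peak_gflops = shaders * clock_ghz * flops_per_clock
print(f"{peak_gflops:.0f} GFLOPS")  # ~1488 GFLOPS, i.e. ~1.5 TFLOPS
```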
 
No it isn't, it's an R500 chip in every way aside from having unified shaders.

It's very much its own unique design.


ATI are probably fairly keen not to use the R500 name as this draws parallels with their upcoming series of PC graphics processors starting with R520, however R520 and Xenos are very distinct parts. R520's aim is obviously designed to meet the needs of the PC space and have Shader Model 3.0 capabilities as this is currently the highest DirectX API specification available on the PC, and as such these new parts still have their lineage derived from the R300 core, with discrete Vertex and Pixel Shaders; Xenos, on the other hand, is a custom design specifically built to address the needs and unique characteristics of the game console. ATI had a clean slate with which to design on and no specified API to target. These factors have led to the Unified Shader design, something which ATI have prototyped and tested prior to its eventual implementation (with the rumoured R400 development?), with capabilities that don't fall within any corresponding API specification.

http://www.beyond3d.com/content/articles/4/2
 
AHAHAHAHAHAHAHAHAHAHAHAHAH

Sigh. Once again, people not understanding. I said I'd prefer a 7990 as a gamer because, just like all gamers, I want the most powerful thing on the face of the planet.

But you guys just look at it and go "OMG, you need a 50,000 RPM fan" without understanding what the fuck I was even talking about.

I said dual 7850s would be the likely target for a theoretical dual-GPU scenario... but as a gamer, a 7990 would be my "wet dream."

Seriously, pay attention.

Actually, it is likely. They want to position these as set-top boxes. Power consumption plays a role in that. No one wants a 300+W space heater running all day.

No, it's not likely. It's absolutely not likely. Power has to be at manageable levels where being a set-top box is possible, and 300W is a manageable level. My PC is smaller than a 360, is quieter than a 360, puts out less heat than a 360, and pulls about 290W at load. My HDTV cable box pulls 250W. Electronics in general are escalating in power draw. Microsoft won't be putting hundreds of millions of dollars into R&D for a next-generation Nuon. It's just not a likely scenario. They won't go overboard, but it's not going to be retarded. 300-350W is not retarded, but again, that's based on my dual-GPU scenario. My inkling would be 250-300W as their power target.

Actually, it's the other way around: R600 was based on Xenos. However, Xenos doesn't use VLIW5 stream processors like R600 does.

I hate having this debate, though. It always goes in circles. :/

Xenos is based on the R600. Think of it like an early release of the design. Just because Xenos released first doesn't mean it was the initial "foray" into unified shaders or that the R600 was based on Xenos. People have been getting constantly confused by this...

Huh? You've got it backwards. The first R500 PC parts were released in Oct. 2005, but the high-end R500 parts that could actually outperform Xenos (X1900/X1950) didn't release until a year later.

Sure, you could argue that it had unified shaders, which did not come until later on PC, but it was an R500 chip in every other way.

Once again, Xenos is not R500.


No it isn't, it's an R500 chip in every way aside from having unified shaders.

You have no idea how confused this quote is making me. Seriously. It's hurting my head. Absolutely. Hurting. My. Head.

Unified shaders aren't something you just "slap" onto a GPU...

No, seriously. R500 and R600/Xenos are not even similar. Feature set (like DX support)? Sure. But how things are handled is radically different.
 
Sigh. Once again, people not understanding. I said I'd prefer a 7990 as a gamer because, just like all gamers, I want the most powerful thing on the face of the planet.

But you guys just look at it and go "OMG, you need a 50,000 RPM fan" without understanding what the fuck I was even talking about.

I said dual 7850s would be the likely target for a theoretical dual-GPU scenario... but as a gamer, a 7990 would be my "wet dream."

Seriously, pay attention.
Oh, I was aware of what you were saying ... and it's still laughable ... as is a dual 7850.

No, it's not likely. It's absolutely not likely. Power has to be at manageable levels where being a set-top box is possible, and 300W is a manageable level. My PC is smaller than a 360, is quieter than a 360, puts out less heat than a 360, and pulls about 290W at load. My HDTV cable box pulls 250W. Electronics in general are escalating in power draw. Microsoft won't be putting hundreds of millions of dollars into R&D for a next-generation Nuon. It's just not a likely scenario. They won't go overboard, but it's not going to be retarded. 300-350W is not retarded, but again, that's based on my dual-GPU scenario. My inkling would be 250-300W as their power target.
No.
 
No, it's not likely. It's absolutely not likely. Power has to be at manageable levels where being a set-top box is possible, and 300W is a manageable level. My PC is smaller than a 360, is quieter than a 360, puts out less heat than a 360, and pulls about 290W at load. My HDTV cable box pulls 250W. Electronics in general are escalating in power draw. Microsoft won't be putting hundreds of millions of dollars into R&D for a next-generation Nuon. It's just not a likely scenario. They won't go overboard, but it's not going to be retarded. 300-350W is not retarded, but again, that's based on my dual-GPU scenario. My inkling would be 250-300W as their power target.

Is that serious? I thought cable boxes pull 25W.
 
Is that serious? I thought cable boxes pull 25W.
He's loony tunes

[Image: set-top box power consumption chart]

Newer FiOS DVRs use the following:

6416: approx 35-40W
7216: approx 33-35W
7232: approx 24-25W



SD cable boxes are typically around 12-15W, and I'd imagine HD ones are < 20W.
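Those wattages are what make the always-on argument matter. A quick sketch of annual running cost, assuming a box left on 24/7 and a hypothetical $0.12/kWh electricity rate:

```python
# Annual energy cost of an always-on set-top box.
# The $0.12/kWh rate is an assumed ballpark, not a quoted figure.
def yearly_cost(watts: float, dollars_per_kwh: float = 0.12) -> float:
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * dollars_per_kwh

for label, watts in [("~35W FiOS DVR", 35), ("claimed 250W cable box", 250)]:
    print(f"{label}: ${yearly_cost(watts):.0f}/year")
# ~$37/year at 35W vs. ~$263/year at 250W
```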
 
I still don't understand why a console can't launch at $499 and be successful. I understand that it won't work, I just don't get why. The iPad costs more than $500, and it is updated every year. Buying a console is at least a five-year investment, much more bang for your buck.
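The bang-for-your-buck claim is just amortization. A tiny sketch using the post's round numbers (a $499 console kept five years vs. a ~$500 iPad replaced yearly; the yearly replacement cycle is the post's premise, not a rule):

```python
# Cost per year of ownership, using the round numbers from the post above.
console_per_year = 499 / 5  # one console kept for five years
ipad_per_year = 500         # a new ~$500 iPad every year (the post's premise)

print(f"Console: ~${console_per_year:.0f}/year vs. iPad: ${ipad_per_year}/year")
```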
 
I still don't understand why a console can't launch at $499 and be successful. I understand that it won't work, I just don't get why. The iPad costs more than $500, and it is updated every year. Buying a console is at least a five-year investment, much more bang for your buck.

Because all you can do is play nerdy games. With the iPad you can be creative, not to mention that you can replace a lot of things with your iPad, including your mobile game systems.
 
I still don't understand why a console can't launch at $499 and be successful. I understand that it won't work, I just don't get why. The iPad costs more than $500, and it is updated every year. Buying a console is at least a five-year investment, much more bang for your buck.

People are predisposed to consoles being family toys. People do not want to spend $500 on a toy and then another $60 minimum to enjoy it.
 
I still don't understand why a console can't launch at $499 and be successful. I understand that it won't work, I just don't get why. The iPad costs more than $500, and it is updated every year. Buying a console is at least a five-year investment, much more bang for your buck.

The big difference is people don't see consoles as necessities yet. And while I'm of the opinion that most people buy an iPad to use as a high-tech fancy toy more than anything else, they're definitely marketed as a "change your entire fucking lifestyle with this over-priced and soon-to-be-outdated tablet." Having sold a ton of iPads last year for Best Buy, most people I sold them to believed it would replace their laptop (despite how vehemently I told them it wouldn't). Maybe once consoles reach set-top box status, people's attitudes will change.


An iPad gets the "Apple premium" and can sell for a lot more. Every other tablet is forced to sell at $200-300.

That's because Apple is an asshole company that takes advantage of its loyal customers. Not to say they don't make solid products, just that they get away with shit other manufacturers only dream of.
 
People are predisposed to consoles being family toys. People do not want to spend $500 on a toy and then another $60 minimum to enjoy it.

I sincerely think this changed the moment baby boomers started watching Netflix on them.

For the younger crowd, it's not a toy.
 