Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

Good to see some logic in here...

For me the decision is clear: they need to not pack in Kinect. I'm almost sure they won't, either. Why give up $150 per standalone Kinect sale? It makes no sense.
It makes plenty of sense if they feel universal usage is more important to the Windows ecosystem overall than the relatively small profit margins they'd see selling them separately.
 
I am no expert on this, but I believe that R&D is an ongoing process. So when the 720 eventually launches, the bulk of those costs will already have been absorbed in previous fiscal statements and offset by the profit of other projects like the 360.

Those costs don't have to be made back. They have already been dealt with.

So that entire line of thought about R&D money needing to be made back is wrong. I believe it's just us who think that way.

Bingo. There isn't a 20 billion dollar "Xbox deficit" that needs to be paid back eventually; those losses were covered back in the fiscal years they happened by MS's overall profitability, and a company like MS pisses away money on ongoing R&D anyway.

The whole "Xbox is in the hole" thing comes from posters that think companies are run like their childhood lemonade stand.
 
I am no expert on this, but I believe that R&D is an ongoing process. So when the 720 eventually launches, the bulk of those costs will already have been absorbed in previous fiscal statements and offset by the profit of other projects like the 360.

Those costs don't have to be made back. They have already been dealt with.

So that entire line of thought about R&D money needing to be made back is wrong. I believe it's just us who think that way.

They don't just spend money on R&D without a budget; they set a conservative sales target, set an amount per console they want to spend on R&D, and then multiply the two together to come up with an overall budget (an extremely simplified version of what goes on). The goal is to break even on that number of consoles by having future profits be greater than past losses.
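To make that concrete, here is a minimal sketch of the simplified budgeting math described above, with purely made-up illustrative numbers (neither the sales target nor the per-console figure comes from this thread):

```python
# Hypothetical, purely illustrative figures -- not actual Microsoft numbers.
conservative_sales_target = 40_000_000  # consoles expected to sell over the generation
rd_spend_per_console = 100              # dollars of R&D budgeted per console sold

# "Multiply the two together to come up with an overall budget."
overall_rd_budget = conservative_sales_target * rd_spend_per_console
print(f"Overall R&D budget: ${overall_rd_budget:,}")
# Overall R&D budget: $4,000,000,000
```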

Bingo. There isn't a 20 billion dollar "Xbox deficit" that needs to be paid back eventually; those losses were covered back in the fiscal years they happened by MS's overall profitability, and a company like MS pisses away money on ongoing R&D anyway.

The whole "Xbox is in the hole" thing comes from posters that think companies are run like their childhood lemonade stand.

We aren't talking about past losses though; we are talking about an ongoing project that hasn't even been released yet. They are, without a doubt, expecting the project's future profits to pay for the current R&D losses.
 
So is nVidia out of the console game altogether for next gen?

That's what a lot of rumors seem to be pointing to. I hope they're true, not because I have anything against nVidia but because AMD seems better at delivering higher-quality GPUs without as big a power draw. It would also help keep the architecture pretty similar between the consoles in terms of GPU rendering, so multiplatform stuff might be a bit easier to do, if MS and Sony have similar goals for their GPUs.
 
They don't just spend money on R&D without a budget; they set a conservative sales target, set an amount per console they want to spend on R&D, and then multiply the two together to come up with an overall budget (an extremely simplified version of what goes on). The goal is to break even on that number of consoles by having future profits be greater than past losses.
The goal was actually to prevent losing Windows sales to Sony.
 
They don't just spend money on R&D without a budget; they set a conservative sales target, set an amount per console they want to spend on R&D, and then multiply the two together to come up with an overall budget (an extremely simplified version of what goes on). The goal is to break even on that number of consoles by having future profits be greater than past losses.

Well yeah, there isn't a bottomless money pit they can use for R&D ;)

What I meant was that the costs will already have been spent and accounted for by the time the first 720 gets sold. All of the 720's R&D costs have been put on the balance sheet and were offset in previous years by 360 profits.

The costs the 720 incurs now will be dealt with in their Q3 2011 report and will be offset by the profits the 360 makes now. Those costs won't exist anymore when the console launches...
 
I thought they had a somewhat murky relationship due to certain agreements for RSX.

Maybe they do, I don't really know TBH. I just have a feeling Sony will want/need to keep things as simple as possible. Better the devil you know?

Whatever CPU/GPU combo they've gone for, I'd imagine it's a done deal. Just waiting for leaks..........*Taps fingers*
 
Cmd. Pishad'aç said:
I'm curious to see how such a machine with these specs would stand out graphically against the Wii U...

Wii-U rumored Specs:

3 OOE Cores with SMT
at least 1 GB of GDDR5 memory + some amount of other types of RAM and eDRAM
1 teraflop AMD GPU

Xbox rumored Specs:

6-core CPU
2 GB of fast DDR3 RAM, most likely 2133-class
2 AMD GPUs with at least 1 GB of GDDR5 memory and at least 3 teraflops of processing power.

3-4 times the real world performance, less if Nintendo sticks with 720p.
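Rough back-of-the-envelope math behind that claim, using only the rumored figures quoted above (so treat it as a sketch, not a measurement):

```python
# Rumored figures from the post above -- none of this is confirmed hardware.
wiiu_tflops = 1.0   # "1 teraflop AMD GPU"
xbox_tflops = 3.0   # "at least 3 teraflops of processing power"

pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

raw_ratio = xbox_tflops / wiiu_tflops
print(f"Raw throughput advantage: {raw_ratio:.1f}x")  # 3.0x

# If the Xbox spends its extra power rendering 1080p while the Wii U stays at 720p,
# the advantage per pixel (roughly what you see on screen) shrinks:
per_pixel_ratio = raw_ratio / (pixels_1080p / pixels_720p)
print(f"Per-pixel advantage at 1080p vs. 720p: {per_pixel_ratio:.2f}x")  # ~1.33x
```

In other words, at matched resolutions the gap is the raw ~3x, but if the stronger box also has to push 2.25x the pixels, the visible per-pixel gap drops to roughly 1.3x, which is presumably why the "less if Nintendo sticks with 720p" caveat is there.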
 
Will the next console ship with HDMI cables?
does anyone actually care about this anymore?





Wii-U rumored Specs:

3 OOE Cores with SMT
at least 1 GB of GDDR5 memory + some amount of other types of RAM and eDRAM
1 teraflop AMD GPU

Xbox rumored Specs:

6-core CPU
2 GB of fast DDR3 RAM, most likely 2133-class
2 AMD GPUs with at least 1 GB of GDDR5 memory and at least 3 teraflops of processing power.

3-4 times the real world performance, less if Nintendo sticks with 720p.
Remember that whatever value is listed for VRAM, the usable amount is cut in half if they are using dual GPUs.

they aren't
 
does anyone actually care about this anymore?



Remember that whatever value is listed for VRAM, the usable amount is cut in half if they are using dual GPUs.

they aren't

Hehe, nothing gets delivered with cables anymore. And if they use a standard HDMI cable, it doesn't matter anymore. I won't be surprised if HDMI is the only output...

Wii-U rumored Specs:

3 OOE Cores with SMT
at least 1 GB of GDDR5 memory + some amount of other types of RAM and eDRAM
1 teraflop AMD GPU

Xbox rumored Specs:

6-core CPU
2 GB of fast DDR3 RAM, most likely 2133-class
2 AMD GPUs with at least 1 GB of GDDR5 memory and at least 3 teraflops of processing power.

3-4 times the real world performance, less if Nintendo sticks with 720p.

I still don't believe the dual-GPU setup. It was probably taken from an early dev kit used to simulate the performance of the future chip...
 
Hehe, nothing gets delivered with cables anymore. And if they use a standard HDMI cable, it doesn't matter anymore. I won't be surprised if HDMI is the only output...



I still don't believe the dual-GPU setup. It was probably taken from an early dev kit used to simulate the performance of the future chip...


I don't know why anybody gave these rumours any credence in the first place. DDR3 and dual GPUs should have set off warning signs for everybody.
 
Is there any specific reason NVidia seems to be so disinterested in this market? I understand it's probably not the most lucrative niche around, but there is still money to be made.

Perhaps it's just me making too much out of something.
I suspect it has less to do with NVidia's interest and more to do with a lack of interest from the console manufacturers.

The pricing model for Xbox left a bad taste in MS's mouth, and I suspect the performance and cost of RSX may have had a similar effect on Sony. Well, maybe not quite how things ended with MS, but on the other hand I doubt Sony felt it went so well they'd just skip testing the waters with alternatives.

The fact NVidia has been having issues competing vs ATI (AMD) in terms of performance/watt probably doesn't help things. We have no reason to assume things will change this upcoming GPU gen. ATI simply seems to be the better fit for set-top boxes.
 
We aren't talking about past losses though, we are talking about an ongoing project that hasn't even been released yet. They without a doubt are expecting the projects future profits to pay for the current R&D losses.

This has been covered before:

Losses are not cumulative in the way people here seem to understand them. The loss is only per fiscal year -- that's the running total, and once a new fiscal year starts, the loss counter resets to $0 (for the purposes of our discussion in this thread). So, by this virtue, the division isn't "approaching $4bn in debt" for these purposes; they're only as much in debt as they are in the current fiscal year.

Like everything else, the Xbox-Next's ongoing (probably since 2005) R&D budget gets reset every year when the new fiscal year starts; it doesn't add up to one giant bill that the Xbox division has to pay back "or else" after launch.
 
Is there any specific reason NVidia seems to be so disinterested in this market? I understand it's probably not the most lucrative niche around, but there is still money to be made.

Perhaps it's just me making too much out of something.


It's not that they're disinterested, it's the fact that they've burnt so many bridges over the years. If you were Sony, would you have been happy with the outdated POS Nvidia pawned off on you?

It's well documented how Nvidia royally screwed over MS with the original Xbox, and considering the lengths AMD went to in delivering above and beyond with Xenos, why exactly would Microsoft consider anyone else?

CELL's probably going to bork up BC for Sony anyway, might as well make a clean break on the GPU side, given the chance.

At this point, the safest bet is that AMD will be in all three consoles next generation, just like IBM was in all three this generation. For Sony's direction, you need to look at what they're doing with Vita.
 
It'd be really funny if MS opted for an updated Cell CPU with an OOE PPE in their next Xbox.
They would have three or more of them to keep backwards compatibility with the 360. The SPEs were the stars of the PS3's architecture (saved Sony's ass when it comes to graphics).

Maybe Msnerd is getting somewhere with the Loop drivel, since the assistive ARM cores do sound a lot like SPEs to me.
 
It's not that they're disinterested, it's the fact that they've burnt so many bridges over the years. If you were Sony, would you have been happy with the outdated POS Nvidia pawned off on you?

It's well documented how Nvidia royally screwed over MS with the original Xbox, and considering the lengths AMD went to in delivering above and beyond with Xenos, why exactly would Microsoft consider anyone else?

CELL's probably going to bork up BC for Sony anyway, might as well make a clean break on the GPU side, given the chance.

At this point, the safest bet is that AMD will be in all three consoles next generation, just like IBM was in all three this generation. For Sony's direction, you need to look at what they're doing with Vita.

I wonder, setting aside the conflict of interest (if any were to exist), how PS4 will ensure a GPU different from Xbox 3's, or vice versa, if both get their GPUs designed by AMD (assuming it isn't a fully custom design, that is)?
 
I wonder, setting aside the conflict of interest (if any were to exist), how PS4 will ensure a GPU different from Xbox 3's, or vice versa, if both get their GPUs designed by AMD (assuming it isn't a fully custom design, that is)?

Since MS will own the IP for their GPU, Sony should be careful not to get sued, and vice versa.
 
Wii-U rumored Specs:

3 OOE Cores with SMT
at least 1 GB of GDDR5 memory + some amount of other types of RAM and eDRAM
1 teraflop AMD GPU

The parts with a strikethrough I would consider educated guesses rather than rumors. The parts without are either actual rumors or things that have been confirmed.
 
One thing that does seem to be the case is that PC textures probably won't go past 2048*2048*4 anytime soon, so the next systems will probably be fine with 2-4 GB of RAM. That range of RAM will be enough to allow PC-level texturing next gen. Combined with enough eDRAM to do free AA'd deferred rendering, the games will look great for a few years.
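To put that 2048*2048*4 figure in perspective, here is a quick sketch of the memory math (the block-compression ratios are my own assumption for illustration, not something from the post):

```python
# One uncompressed 2048x2048 RGBA texture at 4 bytes per texel:
texels = 2048 * 2048
uncompressed_mb = texels * 4 / (1024 ** 2)
print(f"Uncompressed: {uncompressed_mb:.0f} MB per texture")  # 16 MB

# With typical block compression (roughly 4:1 for DXT5, 8:1 for DXT1),
# and ignoring the ~33% extra that a full mip chain adds:
for label, ratio in [("4:1 compressed", 4), ("8:1 compressed", 8)]:
    per_texture_mb = uncompressed_mb / ratio
    fit_in_2gb = int(2048 / per_texture_mb)
    print(f"{label}: {per_texture_mb:.0f} MB each, ~{fit_in_2gb} textures in 2 GB")
```

Even after leaving a good chunk of that 2-4 GB for geometry, audio, and game code, that is a lot of headroom for 2048-class textures, which is the point being made above.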
 
This has been covered before:



Like everything else, the Xbox-Next's ongoing (probably since 2005) R&D budget gets reset every year when the new fiscal year starts; it doesn't add up to one giant bill that the Xbox division has to pay back "or else" after launch.
For tax, financial statement, etc. purposes, sure ... but as a business things aren't quite that simple.

Now, for the people keeping a running tally that includes the original Xbox losses, certainly that is silly ... but of course a company looks back at the current generation as a whole when planning the design, marketing, and targeting of their next console. Obviously, if market analysis shows the current generation is going to end in the red, they don't just go 'hey guys, let's do the same thing this time'. That's crazy talk.

Similarly, if successive generations are in the red, they're going to reconsider whether it makes sense to go on, unless their continued market presence (even at a loss) is a net positive for the company due to its impact on other sectors of the company.
 
I wonder, setting aside the conflict of interest (if any were to exist), how PS4 will ensure a GPU different from Xbox 3's, or vice versa, if both get their GPUs designed by AMD (assuming it isn't a fully custom design, that is)?
ATi has been working with Nintendo for generations.

More to the point, IBM did everyone's CPU last gen ... and likely this as well. Companies as big as IBM and AMD are well equipped at handling IP concerns and the like.
 
Good to see some logic in here...

For me the decision is clear: they need to not pack in Kinect. I'm almost sure they won't, either. Why give up $150 per standalone Kinect sale? It makes no sense.

For all the criticism, I think MS handled Kinect well by separating it. I'm sure it's resulted in a lot more profit for them as well. It's simple: IF you want Kinect, pay for it.

If anything, get the component costs down and cut Kinect to $99; that will drive adoption while still providing the benefits of being standalone, as detailed above.


It's actually not logical at all. Basic corporate strategy is that long-term competitive advantage doesn't necessarily mean maximizing profitability. If Kinect is where MS seeks to build its unique niche audience and is their key competitive strength then it should be packed-in and used as a centerpiece. If it isn't then it will only be what it can be, a sideshow, a mere accessory. Now GAF may like that line of thinking, but simply speaking it won't happen. Kinect is now a part of MS strategy and will be integrated as a full component of the Xbox brand and system from now on.
 
It's actually not logical at all. Basic corporate strategy is that long-term competitive advantage doesn't necessarily mean maximizing profitability. If Kinect is where MS seeks to build its unique niche audience and is their key competitive strength then it should be packed-in and used as a centerpiece. If it isn't then it will only be what it can be, a sideshow, a mere accessory. Now GAF may like that line of thinking, but simply speaking it won't happen. Kinect is now a part of MS strategy and will be integrated as a full component of the Xbox brand and system from now on.
This is before considering its usefulness in the "three screens and the cloud" ecosystem. Everyone using Bing to aggregate media, etc. = wins for MS as a whole.
 
Idk why MS is still beating around the bush in the console wars.

Just release a box with Windows 10 that has a noob-friendly upgradeable video card (basically a PC for newbs, just like what consoles are anyways) and it's game over.
 
Idk why MS is still beating around the bush in the console wars.

Just release a box with Windows 10 that has a noob-friendly upgradeable video card (basically a PC for newbs, just like what consoles are anyways) and it's game over.

Why? People seem happy rebuying gadgets like the iPad. More importantly, next-next gen will likely see the introduction of a cloud service that makes upgradeable boxes a quaint notion.
 
Why? People seem happy rebuying gadgets like the iPad. More importantly, next-next gen will likely see the introduction of a cloud service that makes upgradeable boxes a quaint notion.

You answered your own question. Someone will make the console of forever eventually anyways. The way I see it, it's always been a race to see who can do it first.
 
I expect Sony and MS to enforce 1080p in the TRC, possibly 3D as well (most likely only Sony on the latter point).

If MS does not enforce 1080p, Sony won't. They would be shooting themselves in the foot.
 
I expect Sony and MS to enforce 1080p in the TRC, possibly 3D as well (most likely only Sony on the latter point).

If MS does not enforce 1080p, Sony won't.

Enforcing 1080p is stupid. In action-heavy sequences you won't notice the difference between 720p and 1080p. I'd rather they enforce 60fps.
 
Idk why MS is still beating around the bush in the console wars.

Just release a box with Windows 10 that has a noob-friendly upgradeable video card (basically a PC for newbs, just like what consoles are anyways) and it's game over.

Windows 10?

kutaragi4d.jpg


I thought only Sony had 4D technology?

500x_kenfuntimes.jpg
 
Enforcing 1080p is stupid. In action-heavy sequences you won't notice the difference between 720p and 1080p. I'd rather they enforce 60fps.

Same with 60fps: both have to enforce it, or at least MS does. Sony can't afford to have games on the PS4 looking worse just to run better if MS doesn't do it.
 
I wouldn't count on it

It was rumored a while ago that AMD had won contracts on all 3 Next Gen Consoles.

And after a quick Google search, this was the first mention of it I found (July 7, 2011):
http://www.hardocp.com/article/2011/07/07/e3_rumors_on_next_generation_console_hardware
The Big GPU News
What looks to be a "done deal" at this point is that AMD will be the GPU choice on all three next generation consoles. Yes, all the big guns in the console world, Nintendo, Microsoft, and Sony, are looking very much to be part of Team AMD for GPU. That is correct, NVIDIA, "NO SOUP FOR YOU!" But NVIDIA already knew this, now you do too.

There are going to be game spaces that NVIDIA does succeed in beyond add in cards and that will likely be in the handheld device realm but we do not see much NVIDIA green under our TV sets. NVIDIA was planning to have very much underwritten its GPU business with Tegra and Tegra 2 revenues by now, but that is moving much slower than the upper brass at NVIDIA wishes. Tegra 2 penetration has been sluggish to say the least.

AMD has always been easier to work with than NVIDIA on the console front. Well that may not be exactly true, but Microsoft did not spend months in arbitration with NVIDIA over Xbox 1 GPU and MCP costs back in 2002 and 2003. I always felt as though that bridge was burned.

All 3 are already using IBM for their CPUs, so it would only make things easier on devs if they all used the same manufacturer for their GPUs too.
 
This article claims that Nvidia confirmed it is working on a console:

As far as console business is concerned, Jen-Hsun confirmed that the company is working on a next-generation console. Chances are, the only chips inside the unnamed console should come from Santa Clara. Sony or Microsoft? We'll leave that guessing game to others.

http://www.brightsideofnews.com/new...r-is-a-64-bit-arm-processor-architecture.aspx

The article is on Project Denver/Maxwell, a 64-bit ARM CPU with a 256 CUDA core GPU, with the implication that it is for the PS4.

Two major reasons why nVidia wanted to develop the processor - with or without Microsoft's dedication to base Windows 8 on ARM architecture - were Tesla and console business.
 
I expect Sony and MS to enforce 1080p in the TRC, possibly 3D as well (most likely only Sony on the latter point).

If MS does not enforce 1080p, Sony won't. They would be shooting themselves in the foot.

I don't expect them to enforce anything, maybe 720p, and Sony might do 3D for its first party but nothing else.
 
Enforcing 1080p is stupid. In action-heavy sequences you won't notice the difference between 720p and 1080p. I'd rather they enforce 60fps.
You've got to be kidding.

I'm not advocating requiring it, but you've got to be nuts if you think native 1080p vs 720p content is not noticeable. This isn't a movie that's down-sampled from a higher resolution source. Unless you have a tiny TV or sit in another room while playing (yes, I'm exaggerating for effect), it's going to be quite obvious.


Granted, my hope is they actually make games more like PC titles where you can configure performance. That way, everyone wins.
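For what it's worth, the raw pixel counts make the gap easy to quantify (simple arithmetic on the two standard resolutions, nothing more):

```python
pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(f"1080p has {ratio:.2f}x the pixels of 720p")  # 2.25x
# A native 720p frame shown on a 1080p panel also has to be upscaled
# by 1.5x in each dimension, which is where the visible softness comes from.
```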
 
You've got to be kidding.

I'm not advocating requiring it, but you've got to be nuts if you think native 1080p vs 720p content is not noticeable. This isn't a movie that's down-sampled from a higher resolution source. Unless you have a tiny TV or sit in another room while playing (yes, I'm exaggerating for effect), it's going to be quite obvious.


Granted, my hope is they actually make games more like PC titles where you can configure performance. That way, everyone wins.

Rage drops its resolution by half during intense gameplay. I haven't heard too many complaints about the resolution.
 