Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

Probably a little less than the 360. And that came in at $525 on day one.

It's going to cost a hell of a lot less than that. Pretty much the entire point of this discussion is how important it's going to be to keep the future systems relatively close in cost to their sale price and be able to price drop them quickly.

That's where an important source of money is. If they can sell $100 of apps to each console owner over the lifespan of the console, they have their $35 back.
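A quick sanity check of that math (a minimal sketch; the ~35% platform cut is implied by the poster's own numbers, not an official figure):

```python
# Hypothetical per-console economics: recouping a hardware loss via software margin.
hardware_loss_per_console = 35.0  # assumed loss ($) taken on each unit sold
app_spend_per_owner = 100.0       # assumed lifetime app/digital spend per owner
platform_cut = 0.35               # assumed platform share of each sale (implied above)

recouped = app_spend_per_owner * platform_cut
print(f"Recouped per console: ${recouped:.2f}")  # $35.00 -> the loss is covered
```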

Well, a) you can't, really, and b) this is still an entirely wrong way to think about the cost of manufacture on a newly-launching console.

Pretty sure the E&D division has been profitable for the past few years, or maybe it was just Xbox. Those losses are in the past, sunk costs incurred to get into the business. Nobody is counting that against them any more.

That's not entirely right. Costs on the OG Xbox are indeed entirely sunk and irrelevant at this point, yes. Costs on the 360, however, are still quite relevant. Game consoles operate on a generational cycle where companies take on massive losses at the start in terms of R&D, manufacturing costs, loss on hardware, and other upfront costs which are then (if they're successful) counteracted by profits at the high-yield tail years of the gen. Proper accounting for the business is going to (at least in part) consider each generation as a whole and determine the return-on-investment that the late-gen profits produce on those upfront costs.

In the case of the 360, they'll probably wind up being very slightly net positive for the 360 project as a whole, but not producing a profit at the level Microsoft really needs its divisions to produce going forward. The per-year profitability of the department these last few years (in the console's naturally profitable endgame years) won't really matter for their status next gen unless they can continue to lower costs and increase revenues for the next console cycle as a whole.
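A toy illustration of that point about judging the generation as a whole rather than year by year (all figures invented for the example, not real Microsoft numbers):

```python
# Invented per-year results for one console generation, in $B: heavy early
# losses (R&D, hardware subsidies) followed by profitable tail years.
yearly_profit_bn = [-3.0, -1.5, -0.5, 0.5, 1.0, 1.5, 2.2]

recent = sum(yearly_profit_bn[-3:])   # the "profitable endgame years"
lifetime = sum(yearly_profit_bn)      # the whole project's return
print(f"Last three years: {recent:+.1f}B, whole generation: {lifetime:+.1f}B")
# Last three years: +4.7B, whole generation: +0.2B -- profitable lately,
# but only slightly net positive once the upfront costs are counted.
```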

And only fanboys/us do the accounting that way. In the real world, the losses have been written off and it makes a profit.

Long-term investments always have to be accounted for as a whole when judging their value and rate of return.

It's true that on a quarterly basis, past losses are sunk costs, but all that means is that Microsoft isn't going to shut down the Xbox division while it's still making a quarterly profit. The actual success of the 360 as a system is still rightfully judged in a manner amortizing over its whole lifespan.

You need to factor in R&D.

Err, no I don't. I'm not talking about R&D, and it's entirely irrelevant to this discussion. R&D is a fixed cost and the entire issue of packing in an extra peripheral has to do with how it increases marginal cost.
 
It makes plenty of sense if they feel universal usage is more important to the Windows ecosystem overall than the relatively small profit margins they'd see selling them separately.

I'm not sure they think that. Kinect is 'accessible' gaming. That's for when they've sold to and exhausted most core gamers and want to refresh the platform to keep the money coming in.

In the early days of a new console you don't need stuff like that; the core tech will sell itself to the early adopters.

Regardless of what MS says about Kinect being the future of gaming, it's just PR guff to push more accessories mid-cycle.
 
Bingo. There isn't a 20 billion dollar "Xbox deficit" that needs to be paid back eventually; those losses were covered back in the fiscal years they happened by MS's overall profitability, and a company like MS pisses away money on ongoing R&D anyway.

The whole "Xbox is in the hole" thing comes from posters that think companies are run like their childhood lemonade stand.


Right. So can we use the same argument for Sony please? All their massive losses, and PS3 being so behind - all written off by now, and PS4 starts with a clean slate too.
 
I expect 3D to be a huge selling point next generation so taking that into account I could see games having two modes: 720p, 30fps in 3D and for 2D either 720p, 60fps or 1080p, 30fps.
 
I suspect it may be less to do with NVidia interest and more to do with a lack of interest from the console manufacturers.

The pricing model for Xbox left a bad taste in MS's mouth, and I suspect the performance and cost of RSX may have had a similar effect on Sony. Well, maybe not quite how things ended with MS, but on the other hand I doubt Sony felt it went so well they'd just skip testing the waters with alternatives.

The fact NVidia has been having issues competing vs ATI (AMD) in terms of performance/watt probably doesn't help things. We have no reason to assume things will change this upcoming GPU gen. ATI simply seems to be the better fit for set-top boxes.


How much of the performance/price of RSX was down to NVidia, and how much down to Sony being in a hurry after realising 2xCELL wouldn't work? If the latter, Sony might still be open to working with Nvidia, and Nvidia might be more flexible regarding customisation etc. when facing the alternative of an AMD whitewash.

And I thought recent Nvidia cards were drastically improving the performance per watt? The 560 seems to be pretty good in that regard.
 
I expect 3D to be a huge selling point next generation so taking that into account I could see games having two modes: 720p, 30fps in 3D and for 2D either 720p, 60fps or 1080p, 30fps.

1080p/60 would be the target for 2D games. That should be far easier to hit than this generation's targets thanks to the new tech. As usual some devs will push things too far and you'll end up with 1080p/30, which is fine too.

If 1080p/60 is fairly standard and achievable, then for 3D modes you can just switch down to 720p. Fillrate is about the same for 1x1080p vs. 2x720p screens.
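The raw pixel counts bear that out; this is plain arithmetic on the standard resolutions, nothing more:

```python
# Pixels per frame at the two standard HD resolutions.
px_1080p = 1920 * 1080  # 2,073,600
px_720p = 1280 * 720    #   921,600

print(px_1080p)      # 2073600
print(2 * px_720p)   # 1843200 -> two 720p views need ~11% fewer pixels
                     # than a single 1080p frame, so fillrate is comparable
```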
 
Right. So can we use the same argument for Sony please? All their massive losses, and PS3 being so behind - all written off by now, and PS4 starts with a clean slate too.

Sony operates exactly the same way. That doesn't mean they aren't going to make some major changes to how they approach the PS4, but it's not going to start its life getting angry letters from creditors over unpaid PS3 debt.
 
Right. So can we use the same argument for Sony please? All their massive losses, and PS3 being so behind - all written off by now, and PS4 starts with a clean slate too.

Uh, it's a little different when Sony seems to have a hard time turning a profit at all. So no, it's not exactly the same.
 
I don't see why it would be DX11 based. I'm sure it'll have a very symbiotic relationship with Windows 8. Could easily see it based off of DX12. Although how much any of that matters is debatable.

Just looking at history, both Xboxes were fitted with GPUs capable of some DX features not available at the time of release, so yes, it won't be a plain DX11 part (11.5 or even 12.5 is more likely).
 
1080p/60 would be the target for 2D games. That should be far easier to hit than this generation's targets thanks to the new tech. As usual some devs will push things too far and you'll end up with 1080p/30, which is fine too.

If 1080p/60 is fairly standard and achievable, then for 3D modes you can just switch down to 720p. Fillrate is about the same for 1x1080p vs. 2x720p screens.
Yeah, but then there's the polygon count to take into account... it's not as easy to halve that as it is to halve the resolution.

Essentially, going from one 1080p screen updated at 30fps to two 720p screens costs about the same in terms of shading pixels, but the polygon count is doubled because two separate images are being drawn. Apparently some developers are working on clever rendering tricks to mitigate this load, though. We might end up with some developers opting for something like 640p in 3D mode just to get a like-for-like version aside from the resolution (because with unified shaders, if you're shading fewer pixels you have more shader time available for geometry). Also, I've heard that resolution matters less when viewing in 3D for some reason.

I don't know about 1080p/60 though... I think there will be as many games doing that as there are doing true 720p/60 this generation, which isn't many. 480p is to 720p as 720p is to 1080p, and the jump from this generation to the next might not be as big as the jump from last generation to this one.
 
Yeah, but then there's the polygon count to take into account... it's not as easy to halve that as it is to halve the resolution.

Yeah, fair enough, I was simplifying.


I don't know about 1080p/60 though... I think there will be as many games doing that as there are doing true 720p/60 this generation, which isn't many. 480p is to 720p as 720p is to 1080p, and the jump from this generation to the next might not be as big as the jump from last generation to this one.

Nah, no problem. I think people are taking this gen's experience and worrying too much. In the case of 1080p/60 the polycount doesn't change; it's purely fillrate, and that shouldn't be a problem at all. Worst case IMO will be 1080p/30, like this gen's 720p/30. Hardly any major titles won't be 1080p next gen.
 
Nah, no problem. I think people are taking this gen's experience and worrying too much. In the case of 1080p/60 the polycount doesn't change; it's purely fillrate, and that shouldn't be a problem at all. Worst case IMO will be 1080p/30, like this gen's 720p/30. Hardly any major titles won't be 1080p next gen.
From what I understand that's the problem though... at 60fps the poly count per second doubles, as you render twice as many frames... not to mention shading four and a half times as many pixels per second (1080p/60 vs. 720p/30: roughly 124 million versus 27 million).

So as a really rough example: if the new consoles are, let's say, 8 times more powerful in terms of unified shader ops (looking at shader ops in a vacuum), then Gears of War 4 for Xbox 3 running at 1080p/60 would only have roughly 2x "prettier" pixels (twice the shader ops per pixel per frame) and about 4 times the triangles of the 360 version... not a huge jump IMO, and not enough to last another 10 years (though launch games may all start at 1080p/60 and slowly shift to lower frame rates and resolutions to up the eye candy over the years, to keep pace with PC gaming hardware).

Of course this is all very crude and I'm not a developer so I may be way off.
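For what it's worth, the per-second numbers above do check out, and the same arithmetic reproduces the "leftover" per-pixel budget in the 8x example (the 8x multiplier is the poster's hypothetical, not a known spec):

```python
px_1080p = 1920 * 1080
px_720p = 1280 * 720

next_gen_px_s = px_1080p * 60  # 124,416,000 pixels/s at 1080p/60
this_gen_px_s = px_720p * 30   #  27,648,000 pixels/s at 720p/30
ratio = next_gen_px_s / this_gen_px_s  # 4.5x more pixels shaded per second

budget = 8  # hypothetical overall shader-power multiplier from the post
print(f"{ratio:.1f}x pixels/s, {budget / ratio:.2f}x shader work per pixel")
# 4.5x pixels/s, 1.78x per-pixel headroom -- roughly the "2x prettier pixels"
```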
 
How much of the performance/price of RSX was down to NVidia, and how much down to Sony being in a hurry after realising 2xCELL wouldn't work? If the latter, Sony might still be open to working with Nvidia, and Nvidia might be more flexible regarding customisation etc. when facing the alternative of an AMD whitewash.

And I thought recent Nvidia cards were drastically improving the performance per watt? The 560 seems to be pretty good in that regard.

It depends on what it's doing.

The GTX 560 draws anywhere from 294W to 320W in Crysis, versus 292W for the 6950. In FurMark the GTX 560 draws anywhere from 349W to 391W, compared to 320W for the 6950.

In Crysis at 1920x1200 the highest-clocked GTX 560 tested got 44fps while the 6950 got 50fps.

http://www.anandtech.com/show/4344/nvidias-geforce-gtx-560-top-to-bottom-overclock/4

In fact, at that resolution the Radeon 6870 falls between the two top GTX 560 cards, yet the 6870 draws just 277W while playing Crysis, versus 294W for the base GTX 560 and 320W for the highest.

AMD is certainly ahead on power consumption.
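Turning those review figures into frames per watt makes the gap clearer (these are whole-system power draws from the linked article, so treat the ratios as indicative only):

```python
# (fps in Crysis at 1920x1200, total system watts) from the AnandTech numbers above.
cards = {
    "GTX 560 (highest clocks)": (44, 320),
    "Radeon 6950": (50, 292),
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.3f} fps per system watt")
# GTX 560 (highest clocks): 0.138 fps per system watt
# Radeon 6950: 0.171 fps per system watt
```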
 
Well, I'd take 720p with a rock solid 60fps over 1080p with a flaky, variable frame rate any day...

For me nothing is more distracting whilst gaming than an inconsistent frame rate...apart from the girlfriend walking through the room naked, that is.
 
I'm not sure they think that. Kinect is 'accessible' gaming. That's for when they've sold to and exhausted most core gamers and want to refresh the platform to keep the money coming in.
I think they see it as more than gaming now. They may wish for it to be the main interface into Bing as part of 3 screens and the cloud.

How much of the performance/price of RSX was down to NVidia, and how much down to Sony being in a hurry after realising 2xCELL wouldn't work? If the latter, Sony might still be open to working with Nvidia, and Nvidia might be more flexible regarding customisation etc. when facing the alternative of an AMD whitewash.
From all reports I've read, they never realistically considered 2xCELL. As for how they feel about NVidia, you may be right (I tried to allude to that but didn't do a good job)... it's likely they don't have the same reservations as MS. What I was trying to say, though, is that I doubt things went so swimmingly that they feel tied to NVidia in any way. I think Sony will go with whatever seems like the best deal, regardless of who it's from. Had Tegra 3 worked out for Vita, maybe things would be different.

As for NVidia being more flexible ... I think they need to be. That's been one of the issues with them from what I've read. And if history is anything to go by, I suspect ATi will continue to be much more open to custom silicon.

And I thought recent Nvidia cards were drastically improving the performance per watt? The 560 seems to be pretty good in that regard.
They still trail behind ATi.
 
It depends on what it's doing.

The GTX 560 draws anywhere from 294W to 320W in Crysis, versus 292W for the 6950. In FurMark the GTX 560 draws anywhere from 349W to 391W, compared to 320W for the 6950.

In Crysis at 1920x1200 the highest-clocked GTX 560 tested got 44fps while the 6950 got 50fps.

http://www.anandtech.com/show/4344/nvidias-geforce-gtx-560-top-to-bottom-overclock/4

In fact, at that resolution the Radeon 6870 falls between the two top GTX 560 cards, yet the 6870 draws just 277W while playing Crysis, versus 294W for the base GTX 560 and 320W for the highest.

AMD is certainly ahead on power consumption.

Just a small clarification: the wattage you're referring to is for the whole system (CPU, motherboard, etc.), not the GPU alone.

It doesn't make your comment any less relevant; it's just for the sake of clarification. Console hardware would certainly run lower (with fewer parts that are "useless" for its specific purpose, and a CPU architecture that would most probably be based on ARM, with much lower consumption than x86 CPUs for the same computing power).

And we're not even discussing the fact that a console manufacturer would probably never go for a design that draws a lot of power, in order to keep the form factor and PSU at a decent, cost-savvy level (which gives them the choice to either go for a smaller circuit board or scale down the performance).
 
It doesn't make your comment any less relevant; it's just for the sake of clarification. Console hardware would certainly run lower (with fewer parts that are "useless" for its specific purpose, and a CPU architecture that would most probably be based on ARM, with much lower consumption than x86 CPUs for the same computing power).

Are you saying power consumption limits mean next-gen console CPUs will be ARM-based? Maybe I'm not following your example.
 
Nintex said:
True, but not on everything. That's like Dirt 2, which used tessellation... for the flags.

The terrain in Halo Wars was tessellated too. That's two Halo games with tessellation. Viva Pinata makes extensive use of tessellation too and uses a deferred renderer.
 
The terrain in Halo Wars was tessellated too. That's two Halo games with tessellation. Viva Pinata makes extensive use of tessellation too and uses a deferred renderer.

Forza 3 also used tessellation for the garage stuff and all; I imagine Forza 4 does the same. But this generation we haven't seen tessellation like in that Nvidia vid, or what you'll get from Crysis 2 (flat concrete walls with 2 billion polygons sucking away performance, lol).
 
Nintex said:
Forza 3 also used tessellation for the garage stuff and all; I imagine Forza 4 does the same. But this generation we haven't seen tessellation like in that Nvidia vid, or what you'll get from Crysis 2 (flat concrete walls with 2 billion polygons sucking away performance, lol).

Nvidia sponsored both the ups and the downs :p
 
If graphics are designed for ~1080p30 on most games in the future, there should just be an option to flip to 720p for maximum framerate.
 
Oh boy.

Bring on Halo 5 with aliens like that to kill in glorious tessellation magic. Oh my, I don't even...

Problem is, I remember hearing about this sort of damage system (not using tessellation, mind you) when the PS3 was gearing up for release. Some zombie tech demo or something.

I just wonder if developers are going to be willing to spend resources on this sort of thing when they can forgo it and make slightly prettier backgrounds, or prettier but less interactive and more superficial character models.
 
Problem is, I remember hearing about this sort of damage system (not using tessellation, mind you) when the PS3 was gearing up for release. Some zombie tech demo or something.

I just wonder if developers are going to be willing to spend resources on this sort of thing when they can forgo it and make slightly prettier backgrounds, or prettier but less interactive and more superficial character models.

Depends on the company? Epic's Samaritan demo had something similar with the character's rock-skin ability. Gears 3 features mutating enemies, though that isn't from tessellation.
 
Microsoft is not going with Nvidia again. Did everyone forget what happened with the original Xbox?
Stupid Microsoft lawyers who didn't bother to check how hardware is licensed in the console world before making a deal with NV? BTW, NV is working with SCE in the very same way AMD is working with MS right now. Considering that NV and MS have a lot of contact on the ARM/Tegra and PC GPU fronts, I wouldn't rule out MS choosing an NV GPU for their console again just because of their history with the original Xbox.
 
Stupid Microsoft lawyers who didn't bother to check how hardware is licensed in the console world before making a deal with NV? BTW, NV is working with SCE in the very same way AMD is working with MS right now. Considering that NV and MS have a lot of contact on the ARM/Tegra and PC GPU fronts, I wouldn't rule out MS choosing an NV GPU for their console again just because of their history with the original Xbox.

The GPU will be from AMD, 90% of it because of BC and ease-of-forward-programming reasons.
 
Stupid Microsoft lawyers who didn't bother to check how hardware is licensed in the console world before making a deal with NV? BTW, NV is working with SCE in the very same way AMD is working with MS right now. Considering that NV and MS have a lot of contact on the ARM/Tegra and PC GPU fronts, I wouldn't rule out MS choosing an NV GPU for their console again just because of their history with the original Xbox.
It's been said that none of the next consoles will have an Nvidia GPU in them. I believe it too.
 
Doubtful. Since the majority of consoles are built around TV tech, they will stick to 720 and 1080. Also, most TVs are still 16:9, so 16:10 is pretty much reserved for PC users.

No one is suggesting the console will output at 1280x1080, but games with the 3D rendered at ~1280x1080 and a native 1080p HUD are something I expect to be common. It will scale well to both 720p and 1080p, and with a decent AA solution I doubt many consumers will care about the difference. It's still a pretty decent step up from the sub-720p-plus-garbage-AA that is almost the norm on current consoles.

Much better use of AF is going to make as big a difference as anything.
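For scale, here is what that sub-native width saves in pixel work (plain arithmetic on the resolutions mentioned above):

```python
px_native = 1920 * 1080  # full 1080p frame
px_scaled = 1280 * 1080  # 3D rendered at the reduced width, HUD still native

print(f"{px_scaled / px_native:.2%} of native pixel work")  # 66.67%
# i.e. about a third of the per-frame shading cost saved before scaling
```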
 
What does GAF think of this possibility? (It's Pachter, but still...)

http://www.industrygamers.com/news/...il-2014-rumors-for-2012-are-just-plain-silly/

An updated Xbox 360 to keep the Wii U at bay, and then a full-fat next-gen machine in 2014 at 22nm?

Possible?

Doubt it, for the simple fact that I highly doubt MS wants to go up against Sony directly. Launching a year early this gen definitely caused MS some problems, but I don't think they attribute that just to launching early; rather, to being unprepared and rushing. With 7 years under their belt, a 2012 launch wouldn't be rushed at all. Not to mention that launching at least a year earlier than the PS4 and tying the Xbox 720 launch to Windows 8 would pretty much be the 1-2-3 combo against Sony.

My bet is still on 2012, with Summer 2013 being the absolute latest and a sound "No way in HELL" to 2014.


EDIT: Also, how many different iterations has the Xbox 360 had by this point? Wouldn't releasing yet another modified Xbox 360 with different internals to allow Windows 8 integration just be more trouble than it's worth? They might as well release the next-gen Xbox at that point...
 