Graphene transistors...
SIGH... alas, graphene doesn't have a natural band gap, so the kind of transistors used in traditional computing chips can't be made with it. *prays for silicene*
No, no it's not.
Yes it is. It will likely cost them over $10 billion to ship the first 10 million consoles between per-unit manufacturing costs, marketing and R&D. Adding $35 per unit is $350 million, which is almost meaningless, especially if it's the difference between selling or not.
You can bet your ass that Nintendo would have loved to be able to add $35 per unit in costs if it meant selling every 3DS at $250. When you throw in the fact that the next 360 will probably retail for around $400, it's not even close.
I personally don't think launch customers will see added value in Kinect (they likely already own one if they have any interest), so I don't think they add it initially, but if there is value in it, $35 wouldn't stand in the way.
What in the hell are you talking about?
How would 2 GB of RAM have anything to do with the operating performance of the software? LOD is usually indicative of processing power rather than memory constraints. Tearing is also a result of power (in the sense that the developer couldn't get the game code running well enough to enable Vsync) - same with framerate...
2 GB is over a 4x improvement for textures (as I have explained before), which would basically be as good as it needs to get for the next generation.
It would cost more than double to go to 4 GB - and considering the very minute improvement you'd get from 4 GB over 2, it's simply not worth the price.
So I ask again - what the hell are you talking about? lol
Let them bleed more money! More is meaningless!
ahahahahahahaha no.
$35 on 6 million sales during the launch window is $210,000,000. $35 over 50m lifetime sales is nearly $2 billion.
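A quick sanity check of that arithmetic in Python - the unit counts are just the figures quoted above, not official numbers:

```python
# Quick check of the per-unit cost argument; unit counts are the figures
# quoted in the thread, not official numbers.
kinect_bom_delta = 35                 # assumed extra cost per console, USD
launch_window_units = 6_000_000
lifetime_units = 50_000_000

print(f"Launch window: ${kinect_bom_delta * launch_window_units:,}")  # $210,000,000
print(f"Lifetime:      ${kinect_bom_delta * lifetime_units:,}")       # $1,750,000,000
```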
As a comparison, this generation we've regularly seen companies shave off components that cost less than $5 in order to eke out extra margin on their consoles, because the economics on these systems are just that tight.
Now, again, that doesn't mean that we won't see the 720 be packed with Kinect -- but if we do, it's a direct tradeoff against either the system's stats or the MSRP.
You're right! My bad. How stupid of me to think that nearly 10% of the retail price is significant!
But you are completely right. There is no such thing as free.
Right, that's my entire point. If people are trying to work out the maximum power you could squeeze out of a $399 box in 2012, they're shooting too high if they think Kinect will also be included.
I actually don't have a position about what the better choice for MS would be. It might honestly be more profitable for them to sell a model without Kinect and let people use the one from their 360 or buy it separately when they want it -- or it might be better in the long run to force it into place on all units so it can see broader use by software. Not sure which is the better call.
If this is true, wtf is going on? These are $800+ cards. Microsoft is going to try and utterly obliterate the Wii U and PS4 before they even go into production.
EDIT: The speculation is still sound.
Not sure why the retail price matters, since we are talking about the costs, and $35 is going to be less than 5% of that.
Well to be fair...
1) Those $35 are likely to drop even further over time, especially when you can order parts in extremely large quantities...
2) It gives you a common factor across all SKUs. Something you can build on - a simple-to-use 'touch' interface on every TV. (Great for Metro apps on Win 8 and the 720)...
3) It gives you a competitive edge, which results in more sales...
4) It could, to a much lesser extent, allow for a longer sales life...
But you are completely right. There is no such thing as free. You are paying for it somehow. I personally think, though, that MS is willing to take an extra hit for it because it will enable a common gesture-based interface. An interface that's easy to use for everyone. Controllers scare people who don't normally use them. It will be much easier to navigate the console and Win 8-style apps through Kinect.
The discussion was based around the idea that Kinect adds enough value to sell consoles at a higher price. If they can sell more at a higher price, then yep, it's meaningless.
Except that's not true - and I'm pretty sure we've discussed this before. 1080p60 FP is already supported for single link under the HDMI 1.4 optional formats specification. The 3.4 Gbit/s max bandwidth of HDMI 1.3/1.4 has always been enough to do it.
The current issue is I'm not sure Silicon Image has released Tx/Rx cards clocked high enough to support it as yet. However I would expect them to be available for next gen.
The question then becomes which TV manufacturers will support it. You'll need to be in the market for a new high-endish TV in a year or two to get the initial ones. I'm probably waiting on a new TV for this very reason. Hopefully next year's or the following year's Sharp Elites support it. Then I'll sell a kidney and get one.
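For anyone who wants to check that, here's a rough back-of-the-envelope in Python using the nominal frame-packing timings (2750x2250 total for 1080p24 FP, 2200x2250 for 1080p60 FP); treat the exact totals as approximations:

```python
# Back-of-the-envelope check: does frame-packed 1080p fit under the
# 340 MHz TMDS clock ceiling of HDMI 1.3/1.4?
# Total (active + blanking) timings are nominal frame-packing figures;
# treat them as approximations.
TMDS_CLOCK_LIMIT_MHZ = 340          # HDMI 1.3/1.4 max TMDS clock

formats = {
    # name: (total pixels per line, total lines per frame, refresh Hz)
    "1080p24 frame-packed": (2750, 2250, 24),
    "1080p60 frame-packed": (2200, 2250, 60),
}

for name, (h_total, v_total, hz) in formats.items():
    pixel_clock_mhz = h_total * v_total * hz / 1e6
    fits = pixel_clock_mhz <= TMDS_CLOCK_LIMIT_MHZ
    print(f"{name}: {pixel_clock_mhz:.1f} MHz pixel clock "
          f"({'fits' if fits else 'exceeds'} the {TMDS_CLOCK_LIMIT_MHZ} MHz limit)")
```

297 MHz comes in under the 340 MHz ceiling, which is why single-link 1080p60 frame packing is possible on paper; whether shipping transmitter/receiver silicon is actually clocked for it is the separate question raised above.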
That's just it though ... BD3D is 1080p24 FP. A lot less bandwidth than 1080p60 FP. Current TVs don't support the latter.
What? The 720 is not going to cost $700 to manufacture.
[Nintex] said: MS bleeding money? You're joking, right? Even if they lose in a big way they still win - they're making money off Android, of all things.
The entertainment division is $6 or $7 billion in the red, and Xbox is about $5 billion of that. They might be in the black every month now, but Xbox lost a shitload of money over the past 10 years...
There's no such thing as selling more consoles in the launch window; you're supply constrained and every extra dollar you spend on manufacturing is a dollar lost. Once you get out of launch window, it might have an effect, but you still have to balance it against other costs: Kinect might add sales, but in that case why not cut $35 from the silicon budget to offset it, thereby earning the extra Kinect sales and still not losing millions of extra dollars?
Like, if your thought process here was even remotely accurate, we'd see companies throwing around $50 price cuts far more than they do.
Oh man I love these pre-announcement times and the faulty reasoning it brings...
There are a couple of things at play here... for argument's sake, let's say MS does this the conventional route, the smart route...
They contract AMD to design a chip for them. This is most likely what is now a higher mid-range next-gen chip (not on sale at the moment) with elements of the chip after that.
AMD pockets money for that, straight up in the pocket. I believe I read somewhere that it was $100 million for designing Xenos (could be significantly less - the other figure in my head was only $20 million) and MS got the IP of the chip for that price. They do have to pay some royalties, though. Basically what happens then is that AMD/ATI have done their job (although they might give support for die shrinks). It's up to Microsoft to produce the chip. So the 'only' costs for MS are the wafers of silicon and royalties.
What happened with Xenos, the chip in the 360, was that they designed a chip in the R500 range (X1800/X1900) with elements that were only introduced in the chip after that. MS got a say in what they wanted, so for instance the chip was tailored to include the 10 MB of eDRAM and, comparatively, a lot of shader units (but moderately clocked to keep power consumption reasonable).
Comparable cards like the X1800 XT and X1950 GT would run you somewhere in the ballpark of $400 when the 360 launched.
The same goes for the main CPU. Xenon, in the 360's case, was contracted out by MS to IBM. IBM designed the chip for a certain amount of money and gave MS the IP, so MS could do with it what it wanted.
Those one-time fees notwithstanding, the main thing that determines the price of those chips is the yield (plus some royalties to be paid). In the beginning it won't be cheap (lots of broken processing units), but as the process matures yields will get better. That $400 graphics card cost MS $140 when it launched, and it dropped 40% in the first year according to estimates.
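To make the yield point concrete, here's a minimal sketch of the usual per-die cost arithmetic; every number in it is invented for illustration and has nothing to do with MS's actual contracts:

```python
import math

# Minimal sketch of why yield dominates per-chip cost once the one-time
# design fee is paid. All numbers below are invented for illustration.

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross dies per wafer with a standard edge-loss correction."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(defect_density_per_cm2, die_area_mm2):
    """Simple Poisson yield model: exp(-D0 * A)."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, d0):
    gross = dies_per_wafer(wafer_diameter_mm, die_area_mm2)
    good = gross * poisson_yield(d0, die_area_mm2)
    return wafer_cost / good

# Immature process (high defect density) vs. the same part a year later.
print(round(cost_per_good_die(5000, 300, 250, d0=0.6)))   # expensive early on
print(round(cost_per_good_die(5000, 300, 250, d0=0.2)))   # much cheaper later
```

Same wafer price, same die - the only thing that changed is the defect density, which is why launch silicon costs fall so quickly.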
In 2005 there was similar talk about how the box would be a million bucks because of the parts. But it won't be... The costs for MS aren't anywhere near the price you as a consumer would pay for similar PC parts.
great post. Hopefully provides a little food-for-thought for those thinking MS are going to newegg.com and buying graphics cards off the shelf. They've already shown they can get tech ahead of time, and customise it significantly for the task they want for it.
Assuming heavy customisation this time around again - what could be done to make 6970-level performance cost less, be smaller and consume less power? I'm thinking you only need to drive one TV at 1080p, so there's no need for Eyefinity or 2560x1600 monitor resolutions, so fillrate can be cut dramatically and it would still perform well.
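On the fillrate point, here's a naive Python comparison of what a single 1080p60 target needs versus the raw fillrate of a 6970-class part; it ignores overdraw, MSAA, post-processing and multiple render targets, so read it as an upper bound on the headroom:

```python
# Naive fillrate headroom check for a single 1080p60 display. Overdraw,
# MSAA, post-processing and multiple render targets are all ignored.
ROPS = 32                     # Radeon HD 6970
CORE_CLOCK_HZ = 880e6         # Radeon HD 6970

peak_fillrate = ROPS * CORE_CLOCK_HZ          # ~28.2 Gpix/s
target_1080p60 = 1920 * 1080 * 60             # ~0.12 Gpix/s

print(f"Peak fillrate:  {peak_fillrate / 1e9:.1f} Gpix/s")
print(f"1080p60 target: {target_1080p60 / 1e9:.2f} Gpix/s")
print(f"Raw headroom:   {peak_fillrate / target_1080p60:.0f}x")
```

Real frames touch each pixel many times, so the practical margin is far smaller, but it illustrates why a part sized for Eyefinity resolutions has room to shed ROPs and fillrate when it only has to feed one 1080p screen.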
So, you think the next XBox will cost more than $700 to make, but will be sold for $400?
What exactly is the function of a big, hot, powerful multicore CPU w/ high floating point capability in a next-gen console if AI and physics can be handled by the GPU?
You need to factor in R&D. When companies spend billions of dollars developing a product, they have a preset number of units they will distribute those costs over. Once sales get past that number, costs really shrink, and that's when you see them starting to nickel-and-dime things to reduce manufacturing costs.
Manufacturing costs are only one piece of the puzzle; marketing and R&D end up being huge chunks of the initial costs for consoles. The PS3 and 360 are now at the point where manufacturing is almost the entire cost of shipping a unit, so adding $35 to their manufacturing costs at the moment would be a huge deal, but for launch consoles manufacturing is only a piece of the puzzle.
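As a sketch of how that amortisation works - all figures below are invented for illustration, not estimates of anyone's actual budget:

```python
# How fixed costs (R&D + marketing) get spread over planned units, and how
# little a $35 BOM change moves the total. All figures are invented.
def cost_per_unit(bom, fixed_costs, planned_units):
    return bom + fixed_costs / planned_units

fixed_costs = 3_000_000_000      # hypothetical R&D + marketing budget
planned_units = 10_000_000       # units the fixed costs are amortised over

base = cost_per_unit(500, fixed_costs, planned_units)          # $800 per unit
with_kinect = cost_per_unit(535, fixed_costs, planned_units)   # $835 per unit
print(f"{base:.0f} -> {with_kinect:.0f} ({(with_kinect - base) / base:.1%} bump)")
```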
The point I was alluding to is that the initial design is always contingent on expectations for future cost reductions (die shrinks). While, as DopeyFish pointed out, the following generation is at even greater risk, there is concern for this gen in terms of die shrinks. Expectations do not appear to be in line with last gen. That affects the planned transistor budget.
I see 200W as the maximum TDP regardless of fab size. Is it possible to go higher next generation? Maybe with advances in cooling tech and a better-designed ASIC they can push that to 250W.
Can you even do 4GB in 8 chips? Before even factoring in the actual cost of the extra RAM, the impact on mobo costs is already pretty significant.
4 GB wouldn't cost a LOT. Would it cost more? Of course. But not THAT much.
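On the "4GB in 8 chips" question, a quick sketch; the per-chip density is my assumption about what was commonly available at the time (2 Gbit parts), not something from the rumour:

```python
# How many DRAM chips a given capacity needs at a given per-chip density.
# The 2 Gbit default is an assumed commonly-available density, not a fact
# from the rumour being discussed.
def chips_needed(total_gbytes, chip_density_gbit=2):
    return total_gbytes * 8 // chip_density_gbit

print(chips_needed(2))   # 8 chips
print(chips_needed(4))   # 16 chips -> wider bus or clamshell, pricier board
```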
dude ... that's like $2 billion in costs
$35 is an almost meaningless addition to the total costs of the system. If it's really the difference between selling or not selling, they will just add it on to whatever the current stats are.
No, don't worry. They're spending so much already that spending more is perfectly fine!
dude ... that's like $2 billion in costs
Just a random thought here, but if MS are now going to push the 720 more as a living room device/entertainment hub, I'd have thought they'd start targeting a lower price. Simply because they will be competing directly with Apple TVs and the like. Sure, the ATV doesn't do games, but are families going to pay more just to play games?
I realise they have the gamer market to help push the console via word of mouth but if they are going to market it as some sort of entertainment device, I'd have thought somewhere like US$400 is really, really pushing the limits.
The story's also rumouring that the machine is to have a hex-core CPU with 2GB of DDR3 RAM, and has been in the works since 2005, the year the Xbox 360 released.
The cost will be offset by the amount of money saved from buying memory on Newegg.
Assuming the Tx cards are available at system launch ... it would be nice to offer it as an option with reduced detail.
Interesting, didn't realise that. However, if it's optional, and you're starting from scratch with TVs, I don't think it'd get much traction. And I'd be happy with 3D sticking to 720p at the same equivalent quality as the 2D 1080p version of the game. Stretching to 1080p for 3D might mean sacrificing detail etc.
I doubt that. I bet Microsoft went to AMD or whoever last year and was like, "OK, what ya got? We wanna buy something."
You seem to be missing the 'FP' (frame packing) in my post. It was listed after the 1080p24 and the 1080p60. 1080p60 3D would be sending 1920x2205 @ 60Hz, which obviously requires significantly more bandwidth than 24Hz. That's what I was referring to in what you quoted.
Except that's not entirely true. When 3D is encoded on Blu-ray the two frames are placed on top of each other with 45 black pixels between them. So you're really sending a 1920x2205 image at 24fps. Which is less than the 1080p/60 bandwidth, but it's really close.
There isn't a hard limit like that. They don't cut things that close. I'm pretty sure all HDMI 1.4a Tx/Rx are clocked fast enough that they can handle an extra 45 rows of 0's (which I think is compressed, BTW) @ 60Hz. The bigger issue is whether it's designed to sync to it, and whether a TV's processor even knows what to do with it.
So if you did 3D at 1080p and 30fps you'd be exceeding the 1080p/60 bandwidth limit. Which most Category 2 cables should be able to handle; it's the current HDMI transmitters/receivers that probably can't.
PS3 normally uses FP ... though I think there are some games that use alternate methods (particularly on 360).
I'm not sure whether 3D games do the frames-on-top-of-each-other thing; not sure how the PS3/360 push out their 3D to the TV.
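For what it's worth, comparing just the raw active pixels per second of the three formats being argued about (blanking ignored, so the actual on-wire rates are somewhat higher):

```python
# Raw active pixels per second for the formats under discussion.
# Blanking intervals are ignored, so real link rates are somewhat higher.
formats = {
    "1080p60 (2D)":            1920 * 1080 * 60,
    "1080p24 frame-packed 3D": 1920 * 2205 * 24,
    "1080p60 frame-packed 3D": 1920 * 2205 * 60,
}
for name, pixels_per_second in formats.items():
    print(f"{name}: {pixels_per_second / 1e6:.0f} Mpix/s")
```

Which matches the argument above: frame-packed 1080p24 comes in a bit under plain 1080p60, while frame-packed 1080p60 needs roughly double.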
I was just thinking about Microsoft's decision to go with 512MB of GDDR3 instead of 256MB.
If they stuck with 256MB would this generation have lasted this long?
The cost will be offset by the amount of money saved from buying memory on Newegg.
Free shipping!
Drive performance is the primary limiter in a streaming engine, not memory size.
Then talk to Yerli from Crytek and ask him why Crysis 2 on PS3 and Xbox 360 has that much pop-in, and he will explain to you that the reason is the small amount of RAM - they had to rely on slow streaming, while the PC doesn't have those LOD problems because it can preload everything into RAM.
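A tiny sketch of why both sides of that argument have a point - the throughput figures are illustrative guesses, not measurements of any actual drive:

```python
# How long it takes to turn over a RAM working set from the drive.
# Throughput figures are illustrative guesses, not measured numbers.
def seconds_to_fill(ram_budget_mb, drive_mb_per_s):
    return ram_budget_mb / drive_mb_per_s

for ram_mb in (512, 2048):
    for label, mb_per_s in (("optical ~15 MB/s", 15), ("HDD ~40 MB/s", 40)):
        print(f"{ram_mb} MB via {label}: {seconds_to_fill(ram_mb, mb_per_s):.0f} s")
```

A bigger pool lets you keep more resident and stream less aggressively, but the drive still caps how quickly you can swap what isn't.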
The question is what the hell you are talking about. LOL.