What is Project Denver?
Project Denver is NVIDIA's combined CPU/GPU architecture due out in 2013. It pairs ARM's powerful Cortex-A15 CPU with NVIDIA's Maxwell GPU architecture on one unified chip. It's one of the biggest undertakings NVIDIA has ever attempted.
While this will be shopped around as a standalone GPU, the A15 integration means the chip will be more than capable of running a full-featured OS, calculating physics and AI, and handling all the general-purpose work a gaming console needs to perform.
Here is additional information about the capabilities of the Cortex-A15 CPU being embedded into Maxwell:
http://arm.com/about/newsroom/arm-u...r-to-dramatically-accelerate-capabilities.php
Here is additional information about the Maxwell GPU:
http://techreport.com/discussions.x/19675
http://raytracey.blogspot.com/2011/03/some-details-about-project-denver.html
"In theory, Project Denver cores inside the Maxwell GPU die should enjoy access to 2+ TB/s of internal bandwidth, and potentially go beyond the currently possible 320 GB/s of external memory bandwidth (using a 512-bit interface and high-speed GDDR5 memory). If NVIDIA delivers this architecture as planned, we might see quite a change in the market, given that no CPU from AMD or Intel has system bandwidth as high as contemporary graphics cards."
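As a rough sanity check, the 320 GB/s external figure in the quote falls straight out of the bus math. The 512-bit width is from the quote; the 5 GT/s effective per-pin rate is an assumed GDDR5 speed chosen to match, not a confirmed spec:

```python
# Back-of-the-envelope check of the 320 GB/s external bandwidth figure.
# Assumptions: 512-bit memory interface (from the quote) and GDDR5 at an
# effective 5 GT/s per pin (a plausible high-speed GDDR5 rate, assumed here).

BUS_WIDTH_BITS = 512
EFFECTIVE_RATE_GTPS = 5.0  # giga-transfers per second, per pin

bytes_per_transfer = BUS_WIDTH_BITS / 8              # 64 bytes move per transfer
bandwidth_gbps = bytes_per_transfer * EFFECTIVE_RATE_GTPS

print(f"{bandwidth_gbps:.0f} GB/s")  # -> 320 GB/s
```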
With such extremely fast memory bandwidth between the ARM CPU and the Maxwell GPU (both on the same die), real-time ray tracing of dynamic scenes would benefit greatly, because building and rebuilding/refitting acceleration structures (such as BVHs) is still best handled by the CPU (although parallel GPU implementations already exist; see the HLBVH paper by Pantaleoni and Luebke, or the real-time kd-tree construction paper by Rui Wang et al.).
David Luebke (Nvidia graphics researcher and GPU ray tracing expert) said in a chat session preceding the GTC 2010 conference in September:
"I think Jacopo Pantaleoni's "HLBVH" paper at High Performance Graphics this year will be looked back on as a watershed for ray tracing of dynamic content. He can sort 1M utterly dynamic triangles into a quality acceleration structure at real-time rates, and we think there's more headroom for improvement. So to answer your question, with techniques like these and continued advances in GPU ray traversal, I would expect heavy ray tracing of dynamic content to be possible in a generation or two."
This would imply that the Maxwell generation of GPUs should be able to ray trace highly dynamic scenes, and that path tracing of dynamic scenes could be feasible as well. A pretty exciting thought, and much sooner than expected.
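To make the "CPU builds the acceleration structure" point concrete, here is a toy median-split BVH builder. This is only an illustration of the general idea, not the HLBVH algorithm from the paper above (HLBVH sorts primitives along a Morton-code space-filling curve, largely on the GPU):

```python
# Toy BVH construction: recursively split triangles at the median centroid
# along the widest axis. This is the simplest possible builder, shown only
# to illustrate the CPU-side rebuild work the post describes.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Tri = Tuple[Tuple[float, float, float], ...]  # three 3D vertices

@dataclass
class Node:
    lo: Tuple[float, float, float]        # bounding-box min corner
    hi: Tuple[float, float, float]        # bounding-box max corner
    tris: Optional[List[Tri]] = None      # leaf payload (None for inner nodes)
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def bbox(tris: List[Tri]):
    pts = [v for t in tris for v in t]
    lo = tuple(min(p[i] for p in pts) for i in range(3))
    hi = tuple(max(p[i] for p in pts) for i in range(3))
    return lo, hi

def centroid(t: Tri):
    return tuple(sum(v[i] for v in t) / 3.0 for i in range(3))

def build(tris: List[Tri], leaf_size: int = 4) -> Node:
    lo, hi = bbox(tris)
    if len(tris) <= leaf_size:
        return Node(lo, hi, tris=list(tris))
    # Split along the widest axis at the median centroid.
    axis = max(range(3), key=lambda i: hi[i] - lo[i])
    tris = sorted(tris, key=lambda t: centroid(t)[axis])
    mid = len(tris) // 2
    return Node(lo, hi,
                left=build(tris[:mid], leaf_size),
                right=build(tris[mid:], leaf_size))
```

For a dynamic scene this whole tree must be rebuilt (or refitted) every frame, which is exactly why fast CPU-GPU bandwidth, or a GPU-side builder like HLBVH, matters.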
Gaming hardware this powerful could easily sustain Sony for another decade.
How much would it cost?
Fermi retailed for $299 at launch. Retail prices always include a significant wholesaler markup, a large retailer markup, marketing costs, and shipping and packaging costs, which combined usually account for about half the final retail price of any product. For a major partner buying (or licensing) the chips in bulk direct from the factory, the cost would be significantly lower, perhaps as low as $150 per chip. All Sony would need to do is add a small $20 SSD and 3-4 GB of XDR or GDDR5 RAM and they would be set.
Not having to go out and purchase a separate CPU shaves a huge amount of the cost from the PS4. A $399 price tag for the PS4 wouldn't be out of the question if the console packs nothing more than this chip acting as both CPU and GPU, a Blu-ray drive, and a few inexpensive extras like Wi-Fi and Bluetooth. And by licensing a third-party-developed GPU and having that third party swallow all the R&D cost, Sony can do the same thing they did with the Vita: release an enormously powerful device at a reasonable price point while still not taking a significant loss on each console sold.
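The cost arithmetic above can be sketched in a few lines. The chip and SSD figures come from the post itself; the catch-all for RAM, Blu-ray drive, and other parts is a pure placeholder guess:

```python
# Rough bill-of-materials sketch for the hypothetical single-chip PS4.
# Only the $299 retail price, the ~50% channel-markup claim, and the $20 SSD
# come from the post; OTHER_PARTS is an invented placeholder.

FERMI_RETAIL = 299.0
CHANNEL_MARKUP_SHARE = 0.5            # post's claim: ~half of retail is markup
chip_cost = FERMI_RETAIL * CHANNEL_MARKUP_SHARE   # ~$150 per chip in bulk

SSD_COST = 20.0                        # small SSD, per the post
OTHER_PARTS = 120.0                    # hypothetical: RAM, Blu-ray drive, Wi-Fi, case, PSU

bom = chip_cost + SSD_COST + OTHER_PARTS
TARGET_PRICE = 399.0
print(f"BOM ~ ${bom:.0f}, headroom at $399: ${TARGET_PRICE - bom:.0f}")
```

Even with generous padding on the placeholder, the math leaves room under a $399 price, which is the post's point.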
When will it come out?
This chip is due out in 2013. If I were Sony or Microsoft, I would target a fall 2014 release date for the next console: it hedges against unanticipated delays, gives devs more time with final dev kits, and reduces costs by waiting until production is at full steam and all the kinks are worked out before jumping in.
The timeline for Project Denver makes sense for Sony. Sony's first-party development is still very active on the PS3, the platform is showing regular year-on-year growth, and they can still lower the price of the PS3 to boost sales. I don't think they will be ready to move on to the PlayStation 4 until 2014 at the earliest, once the 22 nm process is available and lets them make a significant technological leap at minimal cost. Games like Killzone and Uncharted look fantastic already. Sony also has the Vita to focus on making successful, and the longer they wait, the closer they get to bringing their PS3 investment back into the black.
2013 also ties in with a dramatic die shrink to 22 nm that is expected that year for ARM, NVIDIA, AMD, and PowerVR. As an aside, such a dramatic die shrink for ARM and PowerVR could easily coincide with a whole host of other devices, from an iPhone Mini to a PlayStation Vita redesign to an Xperia Play 2 smartphone packing the same internals as the Vita. So 2013 should be a very interesting year to keep an eye on for releases in general. Letting the dust settle on this die shrink and releasing the console at the end of 2013 or in early 2014 would be a wiser move than rushing something to market in 2012 built on a dated 32 nm process.
Why would Nvidia license out their newest architecture to a game console?
Money, for one thing. Subsidizing their investment, for another. But most importantly, it would be a surefire way to ensure that developers write engines specifically to take advantage of this architecture and the enormous bandwidth it offers. On the PC, where games have to support a dozen different architectures, engines usually ignore unique features like this. On a console, that isn't the case: the best developers code down to the metal and build engines that specifically exploit the architecture's strengths. Those same engines could then be ported over to the PC to produce stunning-looking games that would ensure this architecture succeeds.
Why would Sony opt for Project Denver?
Sony has a long tradition of pushing technology to its limits and, in turn, leveraging the PlayStation brand to help its own technologically superior formats succeed. This is why the PlayStation brand consistently draws in tech enthusiasts and early adopters, and gets retailers like Best Buy to bundle Sony's consoles with their newest televisions. If Sony had had the PlayStation brand to leverage back when they launched Betamax, that format war might have gone in a different direction as well.
People take away the wrong lesson from the PS2 and PS3. The mistake Sony made with both systems was investing in a brand-new processor developed largely on their own (the Emotion Engine in the PS2 and the Cell in the PS3) rather than simply approaching ATI or NVIDIA well ahead of time and working with them to deliver the best possible processor. Both the EE and the Cell were ridiculously expensive to develop and a nightmare to program for, and I think the PS2 and PS3 would have been just as successful without them. Without the Cell, the PS3 could probably have launched at around $499 for the premium model and come roaring out of the gate.
Blu-ray by itself was a net gain for Sony. HDMI was a net gain for Sony. The only investment that didn't pay off for them was the Cell. Yet the PS3, despite all the criticism, the obscene launch price, and giving the 360 a full year's head start, is still neck and neck with the 360 in sales and is beloved by most gamers. There is no need to throw out Sony's playbook. All Sony needs to do is contain costs so the console launches under $499 for the premium model and $399 for the barebones model.
Project Denver would also be powerful enough to decode HVDs (Holographic Versatile Discs) at 4K resolution, i.e., film-resolution video. It makes far too much sense for movie studios to use these HVDs to distribute their films to movie theaters rather than the massive, expensive film spindles they use now. The more movie theaters adopt these discs, the more the costs fall and the more viable the format becomes for consumers. Initially, people with projectors in their basements will want it. Soon after, people with 4K TVs (which should be fairly common, since that is the easiest way to do glasses-free 3D at 1080p) will want it too.
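For a rough sense of why 4K playback is demanding, the uncompressed data rate for DCI 4K (4096x2160) at 24 fps with 24-bit color works out to about 5 gigabits per second, all of which a decoder has to produce from the compressed stream in real time:

```python
# Uncompressed data rate for DCI 4K film-resolution video.
# 4096x2160 is the DCI 4K container; 24 fps and 24-bit RGB are assumed
# here as the simplest film-like case.
W, H, FPS, BITS_PER_PIXEL = 4096, 2160, 24, 24

raw_gbps = W * H * FPS * BITS_PER_PIXEL / 1e9  # gigabits per second
print(f"~{raw_gbps:.1f} Gb/s uncompressed")    # ~5.1 Gb/s
```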
They could easily pair the PS4 with a "DualMove" plus an HD Eye camera (a DualShock that splits in half into two Move controllers), target a 2014 release date, and charge $399 for the basic SKU and $499 for the premium model, all without losing a penny on each console sold.