
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

onQ123

Member
I didn't say it, it came from Fudzilla. Reception is mixed on PS4's power; one developer said it was on par, which is all we have really heard about for Wii U too, except for named developers saying it's more powerful. Still, any system that isn't 10x more powerful could be considered on par with current gen.

lol what?


but something to remember is when the Xbox 360 first came out some games looked like HD Xbox/PS2 games, so a dev saying so far it's on par with current gen could just mean he hasn't figured out how to push it past current gen.
 

Tarin02543

Member
Has there been some kind of breakthrough in cooling solutions these past 6 years that will give us cool and quiet consoles next gen?
 

z0m3le

Banned
lol what?


but something to remember is when the Xbox 360 first came out some games looked like HD Xbox/PS2 games, so a dev saying so far it's on par with current gen could just mean he hasn't figured out how to push it past current gen.

Exactly what I mean, that 10x thing is just the idea that xbox360 was 10x more powerful than xbox. I think you nailed exactly what is going on right now.
 

SkylineRKR

Member
Those rumors are ridiculous anyway, a just slightly more powerful PS4 that cannot even play PS3 games? lol


You make good points, but I wouldn't discount Microsoft's online strength. I think Live is a big reason Sony can't consistently outsell the 360 despite arguably stronger exclusive software. I don't think brand loyalty is especially strong for either company, so next gen could be different. But right now in the US Microsoft has a positive feedback loop through its online community.

When going up against MS, Sony doesn't have the benefit of the Live services, CoD timed-exclusive DLC... those kinds of things. The PS2 ignored online play while MS was already laying a foundation with the original Xbox; they are a step ahead. And now they are a step ahead with their motion controls as well. Will Sony innovate, or pull off a Move?

MS also proved that you don't need a whole batch of exclusives to succeed. They dish out a Gears of War, and the next year a Halo, and then a Forza. That, along with strong third party support and some exclusive promotion for the right titles (CoD), is seemingly enough to overpower Sony in the USA.

Sony could, for example, make GT a 2 year franchise in the year that Uncharted does not get a release. Having said that, Sony could also stick with Uncharted instead of letting ND create another IP that has yet to find its audience first. I like how Sony is keeping with the core gamer by releasing stuff like Twisted Metal, but it seems it does not really help them.
 

theBishop

Banned
IMO, Wii U is gonna be a tough sell as things stand.

It basically seems like it threw the motion control idea right out of the window; games-wise it will be competing directly with 360 and PS3, probably with less 3rd party content, and visuals that don't justify the leap.

I don't know if that touchscreen controller is gonna turn people's heads around like the Wiimote did.

The odd thing is they're still talking about the WiiMote as supported hardware. Remember, as far as we know today, WiiU only supports one tablet. Maybe they'll get multiple tablets working before release, but everything we're hearing says player 2 will be using a Wiimote.

It seems to me like Nintendo is trying to straddle both audiences with WiiU. They're shooting for parity with 3rd party PS3/360 support basically day-1. So hopefully that will keep 'core' Nintendo fans satisfied.

But the Wii-branding, the (very) similar hardware design, and Wiimote support presumably for new games suggest a continued commitment to the Wii Sports audience.
 

Pistolero

Member
Wii U already is clearly superior to PS360, and it will definitely grow a large userbase in 2 years if no competition exists. Quickly looking at known specs, we have an out-of-order CPU with 3 multithreaded cores, over 1GB of RAM, and 32MB of embedded RAM on the GPU. Modern shaders for much better lighting and tessellation as well... I thought the bird demo was pretty impressive since it was rendered twice, on early underclocked devkits.

Unless Nintendo proves their ability to invest in an online offering that matches those of its rivals, I'll keep thinking that they do not represent a threat to either Sony or Microsoft. They could launch earlier and sell quite a lot, but it is my firm belief that it wouldn't jeopardize Durango's and Orbis' success...
 
Do you realize the nonsense of this?
Wii U on par with/moderately superior to the current systems = okay
But PS4 = PS3 in terms of power?
So for you, PS4 is a super slim PS3?

The irony is that you're spewing the same nonsense... why would either scenario (Wii U simply being on par, or PS4 = PS3) be any more likely than the other? Neither really makes sense.
 
IMO, Wii U is gonna be a tough sell as things stand.

It basically seems like it threw the motion control idea right out of the window; games-wise it will be competing directly with 360 and PS3, probably with less 3rd party content, and visuals that don't justify the leap.

I don't know if that touchscreen controller is gonna turn people's heads around like the Wiimote did.

The Wii U has the Wiimote, and the pad has a mini sensor bar in it (the black bar with the camera).
 

Lord Error

Insane For Sony
Whatever the specs are, Epic has already set my expectations for the minimum I want out of a high profile next gen game with Samaritan

Will be disappointed with any less
This is very true. The good news is that there was already a report that the demo was running on next Xbox dev kits. If that level of visuals simply won't be achievable on PS4 even at 720p (which is definitely true with these specs), and if it launches at the same time as MS, they will be making a big mistake. Rather than create a product that practically no one will care about, they shouldn't even bother doing hardware; it would be better for everyone.

"except we wanted to do it in realtime. on a consumer gpu. on vanilla directx 9. no cuda/compute, no geometry shaders." -from that video link you gave me. -basically you could probably do that with some OpenCL code to work right on the GPU, maybe you haven't heard of PPUs, but that is basically what I am talking about, it's not even a problem for these systems to have the compute cores, it's built into HD4000 series and up, and is really good in the HD7000 series cards.
I don't know what else to tell you except that I saw it running at about 5FPS on my weaker DX9/early DX10 GPU, and he shows that his approach, slow as it may sound, is still much faster than any available Open CL solution at the time. I have no doubt that 6670 or the A3850 APU would be able to do this in some capacity, but it will be insufficient for anything serious. When it comes to these complex simulations you'd obviously want them running concurrently with tons of other things the game normally has to do, so the power needs to be there for it as well.
 

z0m3le

Banned
This is very true. The good news is that there was already a report that the demo was running on next Xbox dev kits. If that level of visuals simply won't be achievable on PS4 even at 720p (which is definitely true with these specs), and if it launches at the same time as MS, they will be making a big mistake. Rather than create a product that practically no one will care about, they shouldn't even bother doing hardware; it would be better for everyone.


I don't know what else to tell you except that I saw it running at about 5FPS on my weaker DX9/early DX10 GPU, and he shows that his approach, slow as it may sound, is still much faster than any available OpenCL solution at the time. I have no doubt that the 6670 or the A3850 APU would be able to do this in some capacity, but it will be insufficient for anything serious.

Actually... a GPU computing physics is vastly superior to a CPU doing the same; the entire concept of PhysX PPUs, and Nvidia's buyout of said company, was because of this. OpenCL can do this far better with far less power, thanks to parallel processing making GPUs math monsters.

He even says that they purposefully didn't want to use CUDA or OpenCL code, meaning the GPUs didn't process the physics in that demo.
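To make the argument concrete: the physics work in question is data-parallel, one thread per particle with every thread running the same instructions, which is exactly what a GPU is built for. A minimal sketch of such a kernel in OpenCL C (hypothetical names, just to show the shape of the work, not code from the demo):

// One work-item integrates one particle; thousands of these
// run in parallel across the GPU's compute units.
__kernel void integrate(__global float4* pos,
                        __global const float4* vel,
                        const float dt,
                        const uint count)
{
    uint i = get_global_id(0);
    if (i >= count) return;
    pos[i] += vel[i] * dt;   // simple Euler step
}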
 

onQ123

Member
This is very true. The good news is that there was already a report that the demo was running on next Xbox dev kits. If that level of visuals simply won't be achievable on PS4 even at 720p (which is definitely true with these specs), and if it launches at the same time as MS, they will be making a big mistake. Rather than create a product that practically no one will care about, they shouldn't even bother doing hardware; it would be better for everyone.


I don't know what else to tell you except that I saw it running at about 5FPS on my weaker DX9/early DX10 GPU, and he shows that his approach, slow as it may sound, is still much faster than any available OpenCL solution at the time. I have no doubt that the 6670 or the A3850 APU would be able to do this in some capacity, but it will be insufficient for anything serious.

Have you seen a game made for a console with an APU + GPU of these specs? Because if not, there is no way that you could know that.
 

theBishop

Banned
When going up against MS, Sony doesn't have the benefit of the Live services, CoD timed-exclusive DLC... those kinds of things. The PS2 ignored online play while MS was already laying a foundation with the original Xbox; they are a step ahead. And now they are a step ahead with their motion controls as well. Will Sony innovate, or pull off a Move?

I'm very curious to see Microsoft's vision for online gaming. Sony started this generation with poor online support, but now Vita is doing things that Xbox Live does not. And they're still offering these features for free. If PS4 does nothing but match Vita, Sony will be starting in a pretty good position. Unless Microsoft has some game-changing vision, which is definitely possible.

Despite Kinect's commercial success, its software support is still pretty weak. I'm not at all convinced by motion controls based on this generation. But I don't think there's anything fundamentally bad about Move compared to other implementations. The software just isn't there.

MS also proved that you don't need a whole batch of exclusives to succeed. They dish out a Gears of War, and the next year a Halo, and then a Forza. That, along with strong third party support and some exclusive promotion for the right titles (CoD), is seemingly enough to overpower Sony in the USA.

I don't agree. Microsoft was very strategic about tying up exclusives in the first 2 years. Bioshock, Mass Effect, Splinter Cell, etc. Exclusives mattered to MS early in the cycle, but they basically stopped seeking exclusives after 2008.
 

Lord Error

Insane For Sony
He even says that they purposefully didn't want to use CUDA or OpenCL code, meaning the GPUs didn't process the physics in that demo.
He uses the GPU for that in the demo as well, not the CPU. So far, Nvidia has failed to demonstrate anything better performing even on their higher-end GPUs.

I'm bringing these complex simulations up for discussion as they are one clearly obvious thing that games could benefit from, with no extra production cost, and something that would separate things visually and technically from the old machines, if there's enough grunt to get them working. But to make them work, hardware has to be able not just to render them in some controlled demo situations, but to run them in parallel with tons of other things games normally have to do. It all boils down to convincing people whether the new hardware is worth spending money on. If you can't convince them that visually it's worth the upgrade, there should be something else, like a good new control option was when the Wii came out. Otherwise most people, who are not visually discerning enthusiasts, will think 'what's the point really'.
 
Actually... a GPU computing physics is vastly superior to a CPU doing the same; the entire concept of PhysX PPUs, and Nvidia's buyout of said company, was because of this. OpenCL can do this far better with far less power, thanks to parallel processing making GPUs math monsters.

He even says that they purposefully didn't want to use CUDA or OpenCL code, meaning the GPUs didn't process the physics in that demo.
My understanding of OpenCL is that it makes writing code for GPUs and SPUs easier, as well as allowing the same code to run on multiple different types of processors. It's higher-level but is supposed to run as fast as native, and allows forward compatibility.

You do not need OpenCL or CUDA to use the GPUs for math. AMD provides libraries, so compilers have support for OpenCL as well as C++.
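The portability point is easy to see in the host API: the same kernel source builds for a CPU or a GPU just by changing the device-type flag when you pick a device. A minimal sketch (one platform assumed, error handling omitted):

#include <CL/cl.h>

// Pick the first device of the requested type. The same kernel
// source can then be compiled for and run on that device, which
// is OpenCL's write-once, run-on-CPU-or-GPU pitch.
cl_device_id pick_device(cl_device_type type)
{
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, NULL);

    cl_device_id device;
    clGetDeviceIDs(platform, type, 1, &device, NULL);
    return device;
}

// pick_device(CL_DEVICE_TYPE_GPU) or pick_device(CL_DEVICE_TYPE_CPU):
// nothing else in the program has to change.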
 

z0m3le

Banned
He uses the GPU for that in the demo as well, not the CPU. So far, Nvidia has failed to demonstrate anything better performing even on their higher-end GPUs.

I'm bringing these complex simulations up for discussion as they are one clearly obvious thing that games could benefit from, with no extra production cost, and something that would separate things visually and technically from the old machines, if there's enough grunt to get them working. But to make them work, hardware has to be able not just to render them in some controlled demo situations, but to run them in parallel with tons of other things games normally have to do. It all boils down to convincing people whether the new hardware is worth spending money on. If you can't convince them that visually it's worth the upgrade, there should be something else, like a good new control option was when the Wii came out. Otherwise most people, who are not visually discerning enthusiasts, will think 'what's the point really'.

I agree with what you are saying, but the physics in that demo can EASILY be offloaded to the APU's GPU, freeing up the CPU and GPU to handle the other things. It actually can be handled better this way than on an ultra high end PC that doesn't have a CUDA or OpenCL option.

Basically these specs are deceptively low; really, this would be a very big step for stuff like physics and AI.

My understanding of OpenCL is that it makes writing code for GPUs and SPUs easier, as well as allowing the same code to run on multiple different types of processors. It's higher-level but is supposed to run as fast as native, and allows forward compatibility.

You do not need OpenCL or CUDA to use the GPUs for math. AMD provides libraries, so compilers have support for OpenCL as well as C++.

OMG, I actually don't understand where you guys have been living. Go look up a PhysX video; that is basically what OpenCL will allow. Physics is usually handled on the CPU, and what CUDA and OpenCL do is allow the GPU to do things normally only possible on CPUs; physics is something that GPUs are vastly superior at doing.

http://www.youtube.com/watch?v=6GyKCM-Bpuw here is Batman with PhysX tacked on.
 

Lord Error

Insane For Sony
OMG, I actually don't understand where you guys have been living. Go look up a PhysX video; that is basically what OpenCL will allow. Physics is usually handled on the CPU, and what CUDA and OpenCL do is allow the GPU to do things normally only possible on CPUs; physics is something that GPUs are vastly superior at doing.
I think he knows that (and so do I). I don't think that's what he was saying, and it's definitely not what I was saying (that the CPU was to be used for physics calculations).

This is a nice and seemingly well informed article, but it neglects the fact that Vita used fairly old hardware, just the high-end version of it. He's assuming Sony would use whatever latest 'budget' hardware is available at the time of the launch, but what he neglects is that even such hardware is going to be more expensive than the years-old budget hardware they'd have according to these specs.
 
IMO, Wii U is gonna be a tough sell as things stand.

It basically seems like it threw the motion control idea right out of the window; games-wise it will be competing directly with 360 and PS3, probably with less 3rd party content, and visuals that don't justify the leap.

I don't know if that touchscreen controller is gonna turn people's heads around like the Wiimote did.

I agree. I don't think they can capture that "blue ocean" market again. Not only that, but I don't think they can get the "core" audience without an established online service. People are already on either XBL or PSN.
 

i-Lo

Member
As GPUs become more capable and acquire better software to handle general-purpose tasks, how far are we from having a dual GPU system without the CPU?
 

gaming_noob

Member
I agree. I don't think they can capture that "blue ocean" market again. Not only that, but I don't think they can get the "core" audience without an established online service. People are already on either XBL or PSN.

The elementary kids who grew up with the Wii are entering high school just as the PS4/720 come out, and will be looking to move on to different styles of games... at least that's what I did when I jumped from NES/SNES/N64 to PlayStation. I would kill to relive those times when I realized Nintendo games weren't the only good games out there.
 
The elementary kids who grew up with the Wii are entering high school just as the PS4/720 come out, and will be looking to move on to different styles of games... at least that's what I did when I jumped from NES/SNES/N64 to PlayStation. I would kill to relive those times when I realized Nintendo games weren't the only good games out there.

Most of the people I know who have Wiis either don't play them except for Brawl or GC games, or they are within the first 3 grades of school. Xenoblade has caught the attention of a few I know, but some are just dicks that pirate it even though it's available in NA now... That's another story though.
 
As GPUs become more capable and acquire better software to handle general-purpose tasks, how far are we from having a dual GPU system without the CPU?
Never happening. While GPUs are good for calculations, it'd probably be bad to have one handling IRQs from other devices, such as the I/O controller for game controllers.

Physics, lighting, and calculating data trends = perfect for the GPU. Handling events from multiple devices, OS scheduling, and other general items = best served by the CPU.
 

Melchiah

Member
That's not the case anymore. Maybe in the PS2 days, and even then it didn't have those sales per game.

http://www.edge-online.com/news/singstar-sales-hit-20-million
December 03, 2009

Sony’s SingStar karaoke series has reached 20 million copies sold worldwide.

Developed by SCEE London Studios and first launched in 2004, SingStar is the platform holder’s most successful social franchise.

The series is most popular in Europe, with PAL sales topping 16 million units.
 

i-Lo

Member
Never happening. While GPUs are good for calculations, it'd probably be bad to have one handling IRQs from other devices, such as the I/O controller for game controllers.

Physics, lighting, and calculating data trends = perfect for the GPU. Handling events from multiple devices, OS scheduling, and other general items = best served by the CPU.

I see. Thanks for breaking it down for a lay person such as myself.
 
Do you think Sony revealing the PS4 this E3 will benefit them? Sort of like how Microsoft revealed the 360 first? Maybe even release it first; that is what MS did, since they knew there was no chance against the PS2.
 
This is very true. The good news is that there was already a report that the demo was running on next Xbox dev kits. If that level of visuals simply won't be achievable on PS4 even at 720p (which is definitely true with these specs), and if it launches at the same time as MS, they will be making a big mistake. Rather than create a product that practically no one will care about, they shouldn't even bother doing hardware; it would be better for everyone.

Epic quotes 2.5 TFLOPs as the requirement for Samaritan at 1080p. Since 1080p pushes 2.25x the pixels of 720p, that scales down to roughly 1.1 TFLOPs, so 1.2 TFLOPs is enough to do Samaritan at 720p.

This is a nice and seemingly well informed article, but it neglects the fact that Vita used fairly old hardware, just the high-end version of it. He's assuming Sony would use whatever latest 'budget' hardware is available at the time of the launch, but what he neglects is that even such hardware is going to be more expensive than the years-old budget hardware they'd have according to these specs.

It makes sense that a mid-range Sea Islands GPU will offer more bang per buck in 2013 than a 2011 Southern Islands-based GPU. Same goes for the APU; the capabilities are increasing rapidly. The Trinity APU's GPU is something like 50% faster than the current generation's. I think the final silicon will inevitably exceed the Orbis rumor specs. You want to play it safe with the first dev kits and put in the bare minimum that will be in the box. Well, there were rumors of Wii U devs grumbling about downgrades.


Although, as I've said before, Samaritan is not a benchmark for the max the next gen systems will achieve. It's not a highly optimized game engine; it's just some project they came up with in a short time frame. It's not even that impressive to me.
 

KageMaru

Member
lol what?


but something to remember is when the Xbox 360 first came out some games looked like HD Xbox/PS2 games, so a dev saying so far it's on par with current gen could just mean he hasn't figured out how to push it past current gen.

That's because they were ps2/xbox games in HD. No one was mistaking games like PGR3, Condemned, and CoD2 for HD ps2 games.

Also no one bases a console's power on potential. If any system is just marginally more powerful than the PS360, it will always be marginally more powerful, even years down the line.
 

theBishop

Banned
That's because they were ps2/xbox games in HD. No one was mistaking games like PGR3, Condemned, and CoD2 for HD ps2 games.

Also no one bases a console's power on potential. If any system is just marginally more powerful than the PS360, it will always be marginally more powerful, even years down the line.

Heh, I didn't want to bite on this, but yeah. It's pretty absurd to think any engineer employed by a known (or unknown) studio wouldn't understand that 360 is dramatically more powerful than Xbox1 even if their own game is an uprezzed port.

As if Japanese studios are trying desperately to make their PSP games look better on Vita and just can't crack the nut.
 

Raistlin

Post Count: 9999
Yeah, I'd imagine software coded for a CPU+GPU+APU setup would be very impressive, but I was thinking about whether it would be hard to write code for a setup like that, in terms of what would the GPU be doing, what would the CPU be doing, and what would be offloaded to the APU while the other two are doing their thing. Wouldn't it be somewhat similar to what some of Sony's first-party devs do when offloading GPU tasks to the CELL SPUs? And considering how most third party developers don't bother with this due to the difficulty, would it be any easier on the PS4?

It isn't a CPU+GPU+APU setup. The CPU is in the APU. This is an APU + discrete GPU setup.

In terms of coding, ATi has already implemented load balancing to automatically distribute GPU rendering between the two (essentially an asymmetric CrossFire). So Sony could simply work with them to derive an optimized custom variant, since this is a fixed HW setup. Or, as some have postulated, they could always set it up such that the APU's GPU is dedicated to compute and/or some other specific type of rendering (plus I suspect the OS will use some of the resources for multitasking).
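If the split really is APU GPU for compute, discrete GPU for rendering, the compute half is already expressible today: one context over both devices, with a dedicated command queue on the integrated GPU for simulation kernels (rendering would live in the graphics API and isn't shown). A hedged sketch, assuming the devices enumerate with the integrated GPU first:

#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, NULL);

    // Assumption: two GPU devices, [0] the APU's integrated GPU
    // and [1] the discrete card.
    cl_device_id gpus[2];
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 2, gpus, NULL);

    // One context spanning both devices so buffers can be shared.
    cl_context ctx = clCreateContext(NULL, 2, gpus, NULL, NULL, NULL);

    // Dedicated queue on the integrated GPU for physics/AI kernels;
    // the discrete GPU is left alone to render.
    cl_command_queue compute_q = clCreateCommandQueue(ctx, gpus[0], 0, NULL);

    /* ... enqueue simulation kernels on compute_q ... */

    clReleaseCommandQueue(compute_q);
    clReleaseContext(ctx);
    return 0;
}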
 

onQ123

Member
That's because they were ps2/xbox games in HD. No one was mistaking games like PGR3, Condemned, and CoD2 for HD ps2 games.

Also no one bases a console's power on potential. If any system is just marginally more powerful than the PS360, it will always be marginally more powerful, even years down the line.

Call of Duty 2 is one of the games that I was thinking about when I was typing that, & I didn't say anyone was mistaking the games for HD PS2 games.

Trust me, some of the first demos of the PS4/Xbox Next are not going to be leaps & bounds above The Last of Us at first sight, because it's going to take some time to get used to the new hardware & get in touch with what can be done.
 

onQ123

Member
Heh, I didn't want to bite on this, but yeah. It's pretty absurd to think any engineer employed by a known (or unknown) studio wouldn't understand that 360 is dramatically more powerful than Xbox1 even if their own game is an uprezzed port.

As if Japanese studios are trying desperately to make their PSP games look better on Vita and just can't crack the nut.


That's because the devs have been making games above Vita specs for over 7 years now.
 

Raistlin

Post Count: 9999
but something to remember is when the Xbox 360 first came out some games looked like HD Xbox/PS2 games, so a dev saying so far it's on par with current gen could just mean he hasn't figured out how to push it past current gen.
If some of the dev statements are to be believed, while the Wii U GPU may be more powerful overall, it appears to be implementing a custom shader model that's missing some of the features found in PS3/360.

That means certain effects are either more difficult or potentially not possible. I suspect that's where the differing opinions are coming from.





The odd thing is they're still talking about the WiiMote as supported hardware. Remember, as far as we know today, WiiU only supports one tablet. Maybe they'll get multiple tablets working before release, but everything we're hearing says player 2 will be using a Wiimote.
IIRC, when questioned there was some talk from Nintendo regarding a 2nd tablet being a possibility, but they weren't sure of the viability. So it may come to pass, but yes ... certainly multiplayer games beyond 2 players imply WiiMotes (though I suspect they'll be somewhat upgraded in order to get more accessory sales).

Pushing beyond 2 tablets just doesn't seem possible. There are issues in terms of processing power (though depending on how it's being used that may not be an issue) ... but I suspect the real long-pole is wireless bandwidth. We're only now seeing low latency wireless tech that supports 1080p (and not with all the bells and whistles), and it's still pretty pricey. There are simply limits in what they can transmit at a reasonable cost.





As GPUs become more capable and acquire better software to handle general-purpose tasks, how far are we from having a dual GPU system without the CPU?
GPUs are great at doing parallel processing. Previously they were much more specialized for T&L and other graphics processing, but we've moved to a point where they are now much more programmable ... so non-graphics things can also be done (i.e. 'compute'). However, they are still only really optimal for computing things that are inherently parallelizable.

For processes that need very fast linear processing, optimized branch prediction, out-of-order operations, etc ... a CPU is needed.
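Side by side, the difference described above looks like this: the first loop has fully independent iterations (GPU food), while the second is one long dependency chain with a branch per step (CPU food). A toy illustration:

// GPU-friendly: every iteration is independent, so thousands can
// run at once across wide SIMD hardware.
void scale(float* out, const float* in, int n, float k)
{
    for (int i = 0; i < n; ++i)
        out[i] = in[i] * k;
}

// CPU-friendly: each step depends on the previous one (pointer
// chasing plus a data-dependent branch), so nothing parallelizes;
// caches and branch prediction are what make it fast.
struct Node { int value; Node* next; };

int sum_positive(const Node* n)
{
    int total = 0;
    while (n) {
        if (n->value > 0)      // branch a GPU would stall on
            total += n->value;
        n = n->next;           // serial dependency chain
    }
    return total;
}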
 

theBishop

Banned
That's because the devs have been making games above Vita specs for over 7 years now.

Well, yeah, but that's not the point. Having to come to grips with a new generation of development is a challenge; you're right about that. But that doesn't mean the raw hardware capabilities are completely mysterious. Programmers have tons of hardware profiling tools available. It's pretty easy to see if you're using hardware effectively.

That's how Naughty Dog is able to say after Drake's Fortune "we're only using x % of the PS3 on this game".
 

theBishop

Banned
IIRC, when questioned there was some talk from Nintendo regarding a 2nd tablet being a possibility, but they weren't sure of the viability. So it may come to pass, but yes ... certainly multiplayer games beyond 2 players imply WiiMotes (though I suspect they'll be somewhat upgraded in order to get more accessory sales).

Pushing beyond 2 tablets just doesn't seem possible. There are issues in terms of processing power (though depending on how it's being used that may not be an issue) ... but I suspect the real long-pole is wireless bandwidth. We're only now seeing low latency wireless tech that supports 1080p (and not with all the bells and whistles), and it's still pretty pricey. There are simply limits in what they can transmit at a reasonable cost.

Yep, Nintendo hasn't shed much light on the wireless tech powering the tablet, and I haven't seen any leaks/rumors about it either. Considering the range is restricted to the same room, I expect it to be based on WirelessHD. That tech should have ample bandwidth to support two 854x480 screens.

But reports from last year suggest Nintendo was struggling to get the streaming tech to work over a single tablet. If the report is correct, dev units were tethered. That's not a huge shock for any development hardware, but if one tablet is taking a lot of effort, 2 might be a bridge too far.
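The back-of-the-envelope math supports that, assuming worst-case uncompressed 24-bit video at 60Hz (any compression only helps):

854 x 480 pixels x 24 bits x 60 Hz ≈ 0.59 Gbit/s per screen
two screens ≈ 1.2 Gbit/s, versus the several Gbit/s a first-generation WirelessHD link is specified to carry

So raw bandwidth for two tablet streams isn't the problem; latency and cost are the likelier constraints.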
 
The elementary kids who grew up with the Wii are entering high school just as the PS4/720 come out, and will be looking to move on to different styles of games... at least that's what I did when I jumped from NES/SNES/N64 to PlayStation. I would kill to relive those times when I realized Nintendo games weren't the only good games out there.

Oh wow, we're really still getting posts like this. Feels like I've jumped back a decade
 
Oh wow, we're really still getting posts like this. Feels like I've jumped back a decade

I'm not sure what the particular objection is to that notion.

It's no secret the brand cultivated by Nintendo for the Wii appealed primarily to both expanded audience demographics and younger demographics.

A usage study of console demographics found the audience skewed younger.

It's part of the reason for their success.
 

Combichristoffersen

Combovers don't work when there is no hair
It isn't a CPU+GPU+APU setup. The CPU is in the APU. This is an APU + discrete GPU setup.

In terms of coding, ATi has already implemented load balancing to automatically distribute GPU rendering between the two (essentially an asymmetric CrossFire). So Sony could simply work with them to derive an optimized custom variant, since this is a fixed HW setup. Or, as some have postulated, they could always set it up such that the APU's GPU is dedicated to compute and/or some other specific type of rendering (plus I suspect the OS will use some of the resources for multitasking).

I see. Should make it easier to code for, I guess, if there's not a separate CPU in addition to the APU and GPU. Thanks :)
 

TTP

Have a fun! Enjoy!
Yep, Nintendo hasn't shed much light on the wireless tech powering the tablet, and I haven't seen any leaks/rumors about it either. Considering the range is restricted to the same room, I expect it to be based on WirelessHD. That tech should have ample bandwidth to support two 854x480 screens.

But reports from last year suggest Nintendo was struggling to get the streaming tech to work over a single tablet. If the report is correct, dev units were tethered. That's not a huge shock for any development hardware, but if one tablet is taking a lot of effort, 2 might be a bridge too far.

Searching around for WirelessHD stuff, I ended up reading about this wireless electricity tech developed and patented by WiTricity, which basically allows power to be transferred over a much greater distance than magnetic induction allows. "From a centimeter to several meters", apparently. And it's totally safe too, as it basically uses magnetic fields the same strength as Earth's.

That got me thinking (again) about a stereoscopic head mounted display as a possible component of the PS4. WirelessHD + wireless electricity would allow building a light (no big battery/connectors needed) wireless HMD that charges continuously.

Wonder if it's feasible.

*keepsondreaming*
 

Raistlin

Post Count: 9999
Searching around for WirelessHD stuff, I ended up reading about this wireless electricity tech developed and patented by WiTricity, which basically allows power to be transferred over a much greater distance than magnetic induction allows. "From a centimeter to several meters", apparently. And it's totally safe too, as it basically uses magnetic fields the same strength as Earth's.

That got me thinking (again) about a stereoscopic head mounted display as a possible component of the PS4. WirelessHD + wireless electricity would allow building a light (no big battery/connectors needed) wireless HMD that charges continuously.

Wonder if it's feasible.

*keepsondreaming*

While that would be pushing things on so many levels ... one has to wonder if wireless/battery-free controllers could be viable the gen after next. Or at least in 3rd party controllers
 

onQ123

Member
Searching around for WirelessHD stuff, I ended up reading about this wireless electricity tech developed and patented by WiTricity, which basically allows power to be transferred over a much greater distance than magnetic induction allows. "From a centimeter to several meters", apparently. And it's totally safe too, as it basically uses magnetic fields the same strength as Earth's.

That got me thinking (again) about a stereoscopic head mounted display as a possible component of the PS4. WirelessHD + wireless electricity would allow building a light (no big battery/connectors needed) wireless HMD that charges continuously.

Wonder if it's feasible.

*keepsondreaming*

By the time we find out the side effects it will be too late. Electric babies everywhere.
 
AMD Bets the Farm on CPU-GPU Integration Strategy

AMD's future Heterogeneous System Architecture (HSA) will start appearing in AMD products in 2013.

AMD is working to make its next-generation architecture an open industry standard for the developer community. Rather than attach its own branding to HSA, the company says it wants to make it clear as day that this is an open platform, even if the new architecture actually fuses x86 cores with its graphics technology more than ever before.

"If you truly want to make something stick, you have to build the entire ecosystem around it," said Lisa Su, senior vice president and general manager of AMD's Global Business Units. "HSA is really trying to bring the software ecosystem together with the hardware ecosystem. It's a big step and it will take a lot of energy across the industry, but we feel it's the right thing to do."

Patrick Moorhead, president and principal analyst of Moor Insights and Strategy, called HSA a "swing for the fences move." "If they hit," he said, "they hit big, because there's nobody who can pull these two [processor technologies] together other than AMD."

Chips based on the Heterogeneous System Architecture will be a lot more than just a CPU slapped next to a GPU. The first graphics chips with very early HSA capabilities are supposed to come out this year but the HSA party really gets started in 2013.

That's when AMD plans to release a pair of APUs code-named Kaveri and Kabini that will sport a CPU-GPU combo sharing a unified memory cache. Those chips will also feature a unified address space for the CPU and GPU components, the latter of which will use pageable system memory via CPU pointers.

In 2014, AMD intends to take HSA from the architectural integration stage to the system integration stage. In simple terms, that means computers that will know how to throttle up the CPU portion of the APU that runs them and dial down the graphics component for scalar processing tasks, while doing the opposite for parallel processing work that's more suited to the GPU.
And the following supports AMD wanting game consoles to use the AMD APU and HSA, which they are backing with OpenCL and C++ compiler libraries.
Still, HSA's viability is chained to AMD's ability to nurture and grow a dedicated developer ecosystem around it, he warned. "If it doesn't work out, what you're left with is this amazing hardware platform without software and applications to take advantage of it," Moorhead said.
I'd guess Sony and possibly Microsoft ("there's nobody who can pull these two [processor technologies] together other than AMD") got a really, really good deal on HSA APUs. They'd just need some really fast memory.
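What that unified, pageable address space buys you is the removal of staging copies. A hedged before/after sketch; the OpenCL calls show today's discrete-GPU model, and the HSA comments describe the model the article promises rather than any shipping API:

#include <CL/cl.h>
#include <stdlib.h>

void today(cl_context ctx, cl_command_queue q, size_t bytes)
{
    // Today: host memory and device memory are separate worlds,
    // so data is staged into a device buffer before a kernel runs.
    float* host = (float*)malloc(bytes);
    cl_mem dev = clCreateBuffer(ctx, CL_MEM_READ_WRITE, bytes, NULL, NULL);
    clEnqueueWriteBuffer(q, dev, CL_TRUE, 0, bytes, host, 0, NULL, NULL);
    /* ... run kernel on 'dev', then read results back ... */
    clEnqueueReadBuffer(q, dev, CL_TRUE, 0, bytes, host, 0, NULL, NULL);
    clReleaseMemObject(dev);
    free(host);
}

// HSA model per the article: CPU and GPU share one address space,
// so a kernel could take 'host' directly -- no staging buffer, no
// enqueued copies, with the GPU paging memory in on demand.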

AMD-supplied PDF showing features in the APU chipset with MANY future game console related must-have features. Carefully read all the features!

DisplayPort 1.2 => Can direct-drive a head mounted display or glasses; inexpensive DP to HDMI adaptors available.
USB 3.0
Low power idle: AMD ZeroCore Power technology
DISCRETE DIGITAL MULTI-POINT AUDIO: the first GPUs that can simultaneously output multiple, independent audio streams
Eyefinity view with multiple monitors

AMD “TRINITY” DOCKING SOLUTION – ONE CABLE, MANY BENEFITS
- Single cable – Mini-DisplayPort connector
- Four simultaneous displays
- Blu-ray protected content
- GPU acceleration
- 3D stereoscopic capable
- Full bandwidth USB 3.0
- Dock powers notebook
- Non-proprietary
- Cross-platform compatible
- Low cost
- Targeted 2H12 production*

200+ apps including:

- Sony Vegas™ Pro 11 is one of the top video editing applications used by professionals and enthusiasts worldwide
- Vegas™ Pro 11 features OpenCL™ GPU acceleration for accelerated video effects, video preview and project rendering
- Get up to 5x faster video preview and video render with a system powered by an AMD FX-8150 processor and AMD Radeon™ HD 6970 graphics

- BlueStacks runs all your favorite Android apps on your Windows PC and syncs apps from your Android phone to your PC
- AMD announced investment in BlueStacks as part of AMD Fusion Fund
- BlueStacks App Player being optimized for AMD VISION Technology

AMD HSA PDF

COMMITTED TO OPEN STANDARDS
- AMD drives open standards
- Compete on the best implementation
- Open standards are the basis for large ecosystems
- Open standards always win over time
- SW developers want their applications to run on multiple platforms from multiple hardware vendors

HETEROGENEOUS SYSTEM ARCHITECTURE – AN OPEN PLATFORM
- Open architecture, published specifications
  - HSAIL virtual ISA
  - HSA memory model
  - HSA system specification
- ISA agnostic for both CPU and GPU
- Inviting partners to join us, in all areas
  - Hardware companies
  - Operating systems
  - Tools and middleware
  - Applications
- HSA Foundation to guide the architecture

MAKE GPUs EASIER TO PROGRAM: PRIMARY PROGRAMMING MODELS
Khronos OpenCL™
- Premier programming environment for heterogeneous computing today
- AMD is a key contributor to OpenCL™ at Khronos
- HSA features and architecture make OpenCL™ more efficient
- May initially enable some HSA features with extensions
Microsoft® C++ AMP (see the sketch after this list)
- Integrated in Visual Studio and Windows® 8 Metro
- Addresses the huge population of Visual Studio developers
- Elegant extension of C++ through two new keywords: “restrict”, “array_view”
- HSA provides a natural roadmap for relaxing “restrict” and using all of C++
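For a feel of what that elegant extension looks like, here is a minimal C++ AMP sketch (a toy vector add, not taken from the slides): array_view wraps existing host data and restrict(amp) marks the lambda as GPU-executable.

#include <amp.h>
using namespace concurrency;

void vector_add(float* a, const float* b, int n)
{
    // array_view exposes existing host memory to the accelerator;
    // any copies are managed implicitly by the runtime.
    array_view<float, 1> av(n, a);
    array_view<const float, 1> bv(n, b);

    // restrict(amp) is the "restrict" the slide refers to: it limits
    // the lambda to the GPU-compatible subset of C++.
    parallel_for_each(av.extent, [=](index<1> i) restrict(amp) {
        av[i] += bv[i];
    });

    av.synchronize();  // make results visible on the host again
}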

OPENCL™ – WHAT IS IT?
- OpenCL™ – Open Computing Language
- Industry standard programming language for parallel computing
- Specification by Khronos (open, royalty-free like OpenGL™)
- Provides a unified programming model for CPUs, GPUs, smartphones, tablets, servers (cloud)…
- Allows software devs to write software once and run it cross-platform
- Supported by all major hardware & software vendors
  - AMD, Intel, Nvidia, Apple, ARM, Imagination Technologies, etc.

OPENCL™ AND HSA
- HSA is an optimized platform architecture for OpenCL™
- Not an alternative to OpenCL™
- OpenCL™ on HSA will benefit from:
  - Avoidance of wasteful copies
  - Low latency dispatch
  - Improved memory model
  - Pointers shared between CPU and GPU
- HSA also exposes a lower level programming interface, for those that want the ultimate in control and performance
  - Optimized libraries may choose the lower level interface
 

joshwaan

Member
DisplayPort 1.2 => Can direct-drive a head mounted display or glasses; inexpensive DP to HDMI adaptors available.


Low power idle: AMD ZeroCore Power technology
DISCRETE DIGITAL MULTI-POINT AUDIO: the first GPUs that can simultaneously output multiple, independent audio streams


Sounds good, I can see why Sony is going with the tech :)

Thanks for posting that jeff_rigby, sounds like great tech from AMD right there :)
 
AMD's HSA memory model is what will ultimately allow GPGPU to rival Cell in utility. GCN cores also make the most sense, with their architectural enhancements for compute-based workloads. If there is also a discrete GPU, it almost doesn't matter what vendor it comes from or what its particular architecture is; ideally it will be dedicated to pure rendering tasks.

Assuming these rumors have validity, my hope would be for a 4 module, 8 thread Steamroller APU with 384-512 GCN shaders and at least 2GB of high speed memory. The GPU could be a Pitcairn variant, or even a straight 28nm shrink of the 6970 with at least 1GB of VRAM. I think that could be an exceptional platform for next gen games that will differentiate themselves both in terms of visuals and in the scope, variety and fidelity of experiences.
 

Globox_82

Banned
AMD's HSA memory model is what will ultimately allow GPGPU to rival Cell in utility. GCN cores also make the most sense, with their architectural enhancements for compute-based workloads. If there is also a discrete GPU, it almost doesn't matter what vendor it comes from or what its particular architecture is; ideally it will be dedicated to pure rendering tasks.

Assuming these rumors have validity, my hope would be for a 4 module, 8 thread Steamroller APU with 384-512 GCN shaders and at least 2GB of high speed memory. The GPU could be a Pitcairn variant, or even a straight 28nm shrink of the 6970 with at least 1GB of VRAM. I think that could be an exceptional platform for next gen games that will differentiate themselves both in terms of visuals and in the scope, variety and fidelity of experiences.

Sounds meh. They need a more powerful machine if they want it to last 7-8 years on the market. Otherwise, 2-3 years from now everyone will start moving to PCs.
 

McHuj

Member
Sounds meh. They need a more powerful machine if they want it to last 7-8 years on the market. Otherwise, 2-3 years from now everyone will start moving to PCs.

I'm not sure a 7-8 year plan is in the best interest of the companies involved. Nor do I think it will happen again; I think this generation was an anomaly.

The tech industry can change so quickly, and if you're waiting until year 3 of your product to make money, you can be really fucked if some disruptive technology enters the market.
 
DisplayPort 1.2 => Can direct-drive a head mounted display or glasses; inexpensive DP to HDMI adaptors available.


Low power idle: AMD ZeroCore Power technology
DISCRETE DIGITAL MULTI-POINT AUDIO: the first GPUs that can simultaneously output multiple, independent audio streams


Sounds good, I can see why Sony is going with the tech :)

Thanks for posting that jeff_rigby, sounds like great tech from AMD right there :)
I can see Microsoft including an AMD APU also. Rumors may be partially correct and the CORE of next generation consoles from Microsoft AND Sony will be an AMD APU. Beyond that, a second GPU, memory choices and additional DSPs for accessories may be custom.
 