yep but sorry PC wont remain in the future
The article basically talks about AMD's magic lamp that allows seamless sharing of resources between GPU and CPU
What does this mean in layman's terms?
Which AMD is planning to release for PCs in 2H 2013. I don't see your point.
I have a feeling you think hUMA is a Playstation exclusive technology? It is far from it.
This is not just for PC. This is for APUs, for both PS4 and PC.
junior
http://news.msn.com/science-technology/research-firm-pc-sales-fall-14-percent-in-first-quarter
Don't just post a random pic to seek attention.
So PS4 will match Titan?
The CPU and GPU are able to access a shared pool of memory instead of the memory having to be partitioned between the two. That means if the GPU needs more memory than the CPU in a given situation, it can simply take more of the pool, and vice versa.
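To make that concrete, here's a minimal sketch of how a shared pool can look to a programmer, using OpenCL 2.0 shared virtual memory as one plausible API route (the kernel, sizes, and flags here are illustrative only, and error checking is omitted):

```c
/* Minimal sketch, assuming an OpenCL 2.0 stack on a hUMA-capable APU.
 * One allocation is visible to both CPU and GPU at the same address;
 * no staging copies in either direction. Error checking omitted. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void scale(__global float *data) {"
    "    data[get_global_id(0)] *= 2.0f;"
    "}";

int main(void) {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, NULL, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    /* On a discrete card you'd clCreateBuffer, clEnqueueWriteBuffer the
     * data across PCIe, run the kernel, then clEnqueueReadBuffer it back.
     * With a shared pool, the CPU just writes and the GPU just reads: */
    size_t n = 1024;
    float *data = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                             n * sizeof(float), 0);
    for (size_t i = 0; i < n; i++) data[i] = (float)i;  /* CPU writes in place */

    clSetKernelArgSVMPointer(k, 0, data);               /* hand over the raw pointer */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clFinish(q);

    printf("data[10] = %f\n", data[10]);                /* CPU reads the GPU's result */
    clSVMFree(ctx, data);
    return 0;
}
```

The point isn't the API specifics but that the partition disappears: whoever needs the memory just uses it.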
What I'm confused about is why this is news. Cerny's been talking about the PS4 having a unified memory architecture since it was announced.
So PS4 will match Titan?
Which is what I am saying. I have no idea what this: "yep but sorry PC wont remain in the future" means
let's place bets on how many ps4s north korea buys to power their missiles
Lol if you think Titan has a chance; the PS4 will blow it out of the water.
no pc devs
let's place bets on how many ps4s north korea buys to power their missiles
What's the funny part about that? It's a platform that's on its demise and soon to be replaced by tablets and smartphones. Where do you see PC games on the NPD/PAL charts?
But AMD still can't compete in the PC market; they were destroyed by Intel. I don't see how this will turn that around, though.
What does this mean, though? Is it like having GDDR6 RAM? Would it feel like there's more than 8GB of RAM?
The feel of 30GB of GDDR8
Lol if you think Titan has a chance; the PS4 will blow it out of the water.
It'll be the most efficient 30fps possible.
Old news.
Sharing the memory space is not something new AMD just came up with; it has been known about for ages.
Idk if that's sarcasm, but hUMA is not going to magically make the PS4 as powerful as a 4.5 TFLOPS Titan (which also has higher memory bandwidth than the PS4); it's just going to make the PS4 a lot more efficient than a setup shuffling data over PCIe.
But no...
Sony and Microsoft have no chance at making a system themselves these days. AMD and Nvidia are sitting on tens of thousands of man-years and patents and are so far ahead of everybody else it's not funny.
I remember some company tried to create a high-end GPU a few years ago (not Intel, a smaller company) and the thing was a disaster. The drivers alone for AMD/Nvidia surely have who knows how many man-years invested in them.
The PS2 days of Sony/Toshiba being able to design major parts of a system are long gone. It's way too complex now. Hell, arguably the PS3 showed that: Sony basically designed only half the CPU, along with IBM, and none of the GPU. And the CPU was the bad part of the PS3 according to most.
It was difficult, not bad.
The Cell is what carried the PS3.
What does this mean in layman's terms?
Doesn't the 360 already have a unified architecture?
This got a good chuckle out of me.
Skyrim launching without proper CPU optimizations, which made it lose 40% performance (yet still perform well on decent PCs), until some modder fixed it and Bethesda adopted his work in a patch (rather than do it properly with their means to access the game engine, which would have yielded even further boosts).
Well, I think they used the mod, because it had the exact same performance boost and introduced the exact same lighting bug on my PC (lights appear to dim a tiny bit as you first approach them within a certain distance, but don't do it again if you go back and forth, so it's not a LOD issue; I know the mod first introduced it because I tested with and without, and actually held off on the mod because I found it jarring, but in the end had to settle for it). I guess it could be a different implementation still. But the modder did say he expected Bethesda to deliver way bigger performance boosts (100% was cited, vs the mod's rough 40%) if they did it right.

The problem was that the idiots at Bethesda did not set the compiler flags to use SSE2 and above (a beyond-simple thing to do), forcing the game to run with only scalar floating-point support (x87).
I doubt they just used the mod to fix it (that would have taken more work than just setting the flags).
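For anyone wondering what "setting the compiler flags" actually involves: it's a build setting, not a source change. Here's a toy sketch (hypothetical file, nothing to do with Skyrim's actual code), assuming a 32-bit MSVC of that era, which targeted x87 by default:

```c
/* lerp.c -- toy example, not Skyrim code. The same scalar float math
 * compiles to slow x87 stack instructions or to SSE2 scalar instructions
 * purely depending on a compiler flag, with zero source changes:
 *
 *   cl /O2 lerp.c            -> x87:  fld / fmul / faddp ...
 *   cl /O2 /arch:SSE2 lerp.c -> SSE2: movss / mulss / addss ...
 */
#include <stdio.h>

static float lerp(float a, float b, float t) {
    return a + t * (b - a);   /* ordinary scalar float math */
}

int main(void) {
    float acc = 0.0f;
    /* Enough iterations that the codegen difference is measurable. */
    for (int i = 0; i < 100000000; i++)
        acc = lerp(acc, (float)i, 0.5f);
    printf("%f\n", acc);
    return 0;
}
```

The GCC equivalent would be -msse2 -mfpmath=sse. Either way the source stays identical; only the instructions the compiler emits change.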
So, is this just for GPUs on CPU dies, like Intel's HD crap? Why even bother, unless they intend to eventually offer their flagship GPU cores in such solutions for gaming setups? Would those even be small enough? Not that I'd jump on that and lose the ability to upgrade the components separately and at will. My current PC should last me through the next gen well enough with just a GPU and cooling/OC upgrade somewhere down the line.
It seems like the PC will have this memory bottleneck for as long as it remains the open, custom platform it is, unless they find a way to share memory quickly across the motherboard. It's not like consoles are without bottlenecks, though, so they're not going to be as superior by utilizing such technology as some seem to believe; look at the PS4's CPU. I suppose with many titles being primarily for consoles, the game engines could end up unoptimized for PC hardware.
That always happens in some cases though, like GTA IV running like shit on PC (if you want it to look good, unlike on consoles), or Skyrim launching without proper CPU optimizations, which made it lose 40% performance (yet still perform well on decent PCs), until some modder fixed it and Bethesda adopted his work in a patch (rather than do it properly with their means to access the game engine, which would have yielded even further boosts).
This is AMD's tech...
This is not just for PC. This is for APUs, for both PS4 and PC. I have a feeling that you're reading this as APU-only, right?
junior
http://news.msn.com/science-technology/research-firm-pc-sales-fall-14-percent-in-first-quarter
Don't just post a random pic to seek attention.
But whose box has it first?
GPUs have had MMUs capable of reading pages as seen from the OS/app for generations now. I suppose HSA has some extra sauce, otherwise this is non-news.

On the original Xbox and such (which had unified memory), I believe you had to portion up the memory so that the GPU accessed one part and the CPU accessed another; they couldn't both read/write the same range, which an HSA system can.
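A rough C sketch of that practical difference, where gpu_sum_list() is a hypothetical stand-in for a real kernel dispatch (implemented here as a CPU stub so the sketch actually runs):

```c
#include <stdio.h>
#include <stdlib.h>

/* A plain linked list built with ordinary malloc on the CPU. */
struct node { float value; struct node *next; };

/* Hypothetical stand-in for a GPU kernel dispatch. On an old-style
 * unified-memory box (original Xbox etc.) this could not work: the GPU
 * only saw its own pre-carved range, so you had to copy node data into
 * it first. Under hUMA/HSA the GPU's MMU walks the same page tables as
 * the OS, so it can chase these malloc'd pointers directly. */
static float gpu_sum_list(const struct node *head) {
    float sum = 0.0f;                   /* CPU stub standing in for the GPU */
    for (; head; head = head->next)
        sum += head->value;
    return sum;
}

int main(void) {
    struct node *head = NULL;
    for (int i = 1; i <= 100; i++) {
        struct node *n = malloc(sizeof *n);
        n->value = (float)i;
        n->next = head;                 /* ordinary CPU virtual addresses */
        head = n;
    }
    printf("sum = %f\n", gpu_sum_list(head));  /* pointer handed over as-is */
    return 0;
}
```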
GPUs have had MMUs capable of reading pages as seen from the OS/app for generations now. I suppose HSA has some extra sauce, otherwise this is non-news.
...but it's Sony's RAM.
Old news.
Sharing the memory space is not something new AMD just came up with; it has been known about for ages.
It was never implemented. This year's APUs will be the first to fully use this architecture [together with the next-gen consoles].
Really hard to take AMD seriously anymore.
Before anyone gets hyped, just remember Bulldozer. Remember AMD's CPU history for the past 5 years or so.
They have been making mediocre CPUs and GPUs for the past 5 years; they need to get their stuff together.
Nvidia and Intel need real competition.