Looking at that whole conversation, I'm not sure why aegies is discouraging comparison of the GPUs by pointing at that slide, or by suggesting the numbers being put out about Durango and Orbis are as airy-fairy as those in that slide.
1080p is all you need! When 4K is right around the corner.
32MB of ram is all anyone will need! No one will ever use more than that.
DDR3 on a 384-bit bus would be crazy [~quad the speed of the X360's main RAM pool]... but the math is correct.
As for ESRAM, theoretically it's slower than eDRAM, but it has better latency. Maybe it will have 102GB/s after all...
I know, I'm just posting "math" that fits. I fully expect 128-bit DDR3.
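For anyone checking the math being fitted here, a quick sketch (the DDR3-2133 speed grade is my assumption to make the numbers line up; the 360's 22.4GB/s main pool is the known spec):

```c
#include <stdio.h>

/* Peak DDR3 bandwidth in GB/s: (bus width in bits / 8) * transfer rate in MT/s / 1000. */
static double ddr3_gbs(int bus_bits, int mts)
{
    return bus_bits / 8.0 * mts / 1000.0;
}

int main(void)
{
    double x360 = 22.4;                 /* Xbox 360 main pool: 128-bit GDDR3 @ 1400 MT/s */
    double wide = ddr3_gbs(384, 2133);  /* the "crazy" 384-bit case: ~102.4 GB/s */
    double slim = ddr3_gbs(128, 2133);  /* the expected 128-bit case: ~34.1 GB/s */

    printf("384-bit DDR3-2133: %.1f GB/s (%.1fx the 360 pool)\n", wide, wide / x360);
    printf("128-bit DDR3-2133: %.1f GB/s\n", slim);
    return 0;
}
```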
It's not even eDRAM, apparently. It's the sister; yeah, I don't know if she's any hotter.
If I had to guess, this won't be UMA. OS RAM will be OS RAM (2GB of slow DDR3? Maybe for Kinect and the OS), and app RAM will be app RAM (6GB of faster DDR3?).
Well, I think that's enough for 2xMSAA "free," but I doubt it would be used for AA; more likely just for getting timely renders out without bogging down the system.

Well, was anybody expecting more than 1080p? Sounds like that res will be the ceiling, at least for big games.
I don't know how they'd add up, but the ESRAM would be what communicates with the CPU/GPU, while the DDR3 would be communicating with the CPU and transferring data to the ESRAM. So you can't just add those two numbers together.

Well, I believe you, but you could explain, man!
You could've left the sarcasm out.
Stereo 1080p or 4K.
The point here is: why settle for something that should be standard in 2013? 1080p @ 60fps should be the minimum going forward, or the console will already be yesterday's news in 2014. And I don't mean that in terms of "something is always going to be better"; in technology, it's pretty much already outdated as soon as you buy it. Ideally, this is what we need to be shooting for.
The raw teraflop numbers for either console aren't going to tell you much about what they're actually capable of.
Not considering bottlenecks... I guess 170GB/s looks very, very close to 192GB/s. On paper.
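Roughly where those two figures come from, using the rumored numbers floating around this thread (the 256-bit buses and 6Gbps GDDR5 are assumptions, and note the earlier point that naively adding DDR3 and ESRAM bandwidth is optimistic, since they carry different traffic):

```c
#include <stdio.h>

int main(void)
{
    /* Durango (rumored): DDR3 main pool plus 102.4 GB/s of ESRAM. The
       256-bit DDR3-2133 figure is an assumption that makes the math work. */
    double ddr3  = 256 / 8.0 * 2133 / 1000.0;  /* ~68.3 GB/s */
    double esram = 102.4;

    /* Orbis (rumored): one unified GDDR5 pool at 6.0 Gbps on a 256-bit bus. */
    double gddr5 = 256 / 8.0 * 6000 / 1000.0;  /* 192.0 GB/s */

    printf("Durango: %.1f + %.1f = %.1f GB/s (naive sum)\n", ddr3, esram, ddr3 + esram);
    printf("Orbis:   %.1f GB/s\n", gddr5);
    return 0;
}
```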
thuway, please tell me what you know. PM me. I promise I won't tell anyone!!
Really dumb question, but tessellation: will Orbis have anything similar to it?
Yes, since they both seem to be based on the same line of GPUs.
Ah okay, thanks!
Thanks.

The Orbis GPU fully supports DX11-class features. Obviously Sony will use OpenGL, but it's the same thing.
I genuinely lol'd.
Btw, can anyone be kind enough to explain in layperson's terms what exactly the function of ESRAM or eDRAM is? I mean, is it the last stop for rendering info to travel to before it actually gets shown on screen, and then, within a 30th or 60th of a second, the emptied ESRAM/eDRAM is once again filled with the final rendering data for the next frame? Does it mean that the memory consumed by each frame, including mesh data, physics data, animation data, lighting data, and texture data, can amount to at most 32MB, ergo resolution and AA solutions would have to be adjusted to stay within that limit?
Does it go for both PS4 and XB3?
Not a fair game! My inbox is waiting before yours!
I just really hope E3 is exciting. I remember being glued to gametrailers.com during E3 2006. Every night I'd download all of the videos and just watch them for hours on end. I was fucking blown away by Rainbow Six Vegas, and that game ended up totally delivering for me (unfortunately, the sequel did not).
How about very basic:
It's like a memory supercharger, allowing very small, super fast bursts of data to the output. Used correctly and wisely, it allows additional effects to be applied to the image before the frame hits the screen.
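To put rough numbers on the 32MB question above, a back-of-the-envelope sketch (the 32-bit color and depth formats are my assumptions):

```c
#include <stdio.h>

/* Size in MiB of one 1080p render target at a given bytes-per-sample and MSAA level. */
static double target_mib(int bytes_per_sample, int msaa)
{
    return 1920.0 * 1080.0 * bytes_per_sample * msaa / (1024.0 * 1024.0);
}

int main(void)
{
    /* Assumed formats: 32-bit color plus 32-bit depth/stencil, both at 2xMSAA. */
    double color = target_mib(4, 2);  /* ~15.8 MiB */
    double depth = target_mib(4, 2);  /* ~15.8 MiB */

    printf("1080p 2xMSAA color: %.1f MiB\n", color);
    printf("1080p 2xMSAA depth: %.1f MiB\n", depth);
    printf("Total: %.1f MiB of 32 MiB\n", color + depth);  /* ~31.6 MiB: it just fits */
    return 0;
}
```

Which is presumably where the "2xMSAA for free" estimate earlier in the thread comes from. Mesh, physics, animation, and texture data would stay in main RAM; it's mainly the render targets that want to live in the fast pool.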
OK, but I'm serious, thuway.
I'll just share the info with shagg. And shagg will share only with me, deal??
You should be happy.
Isn't Durango's GPU GCN2? DX 11.1 or whatever.
Sorry to disappoint, no ninjas here. I had a feeling Jehova was lying, and I did what I did to expose him. Next time, don't believe random dudes.
1080p with 4xSSAA requires about the same pixel processing as 4K with no AA (1920×1080 at 4 samples per pixel is the same sample count as 3840×2160), and has better IQ in all likelihood. Let's get there first before we worry about bumping up the resolution.
It's enough.
I'll show you mine if you show me yours. Expect a phone call.
How does this work?
You've told him his source was in trouble. If he was lying and he doesn't have any source, how could your message be effective in any way?
Unless I'm missing something, it seems more believable that the info was indeed from some "legit" source (though the info could still be wrong).
Now that you think about it... yeah, you're right. Explain yourself, deanos.
Oh, is Orbis supposedly based on HD7xxx? It's getting hard to keep track of some of these rumors. Either way, tessellation is supported.
Also had only 6 posts on b3d and is also banned there now.
As I said - tinfoil hats.
I don't think 4k gaming is something you're going to see much of. Not natively, anyway. 4K is next gen's 1080p.
So Sony will support it more? (I'm sure they would anyway... nothing major by any means.) I wonder if they can manage GT6 in 4K at 60fps, at least as an option... Surely PS4 is at least more powerful than four PS3s networked.
These numbers are not like the '2 TFlop' PS3 number though, I wouldn't dismiss them as similar. They are a reflection of programmable resources, of what will actually run the shaders programmers write. Nothing in isolation tells you anything about performance but that part of a GPU is responsible for the performance of a large part of the render pipeline. If we are comparing GCN compute units to GCN compute units or even to a slightly lesser extent GCN to GCN2, those parts of the GPUs will be more directly comparable than any two console GPUs before.
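For reference, the raw figures fall straight out of the GCN layout: 64 ALUs per compute unit, two ops per clock for a fused multiply-add. A sketch with the rumored CU counts and clock (not confirmed specs):

```c
#include <stdio.h>

/* Peak GFLOPS for a GCN GPU: CUs * 64 lanes * 2 ops per clock (FMA) * clock in GHz. */
static double gcn_gflops(int cus, double ghz)
{
    return cus * 64 * 2 * ghz;
}

int main(void)
{
    printf("18 CUs @ 0.8 GHz: %.0f GFLOPS\n", gcn_gflops(18, 0.8)); /* ~1843 */
    printf("12 CUs @ 0.8 GHz: %.0f GFLOPS\n", gcn_gflops(12, 0.8)); /* ~1229 */
    return 0;
}
```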
What? GT6, a graphics powerhouse, running at 4K?

Such a big bump in res would have to be traded for less detail in the game world.
GT6 at 4K is possible. But would graphics fidelity in that mode be compromised?
As far as I know, Orbis compute units are GCN, not GCN2 derivative.
No one really wants GT5 at 4K next gen. It's just a checkbox on a feature list. The mainstream is going to want something that looks like Project Cars in the PC screenshot thread.