DF: Orbis vs Durango Spec Analysis

Unless you know the full components of both consoles... It's not that easy.

Because I sure as hell don't.

If we're gonna believe the leaks about special DMEs, we might as well believe the rest of what they tell us. That paints a picture of Orbis being significantly more powerful AND more efficient.
 
If we're gonna believe the leaks about special DMEs, we might as well believe the rest of what they tell us. That paints a picture of Orbis being significantly more powerful AND more efficient.

In the end I still think it will be a wash, with the PS4 edging it slightly. But if your theory is true, let's see if it holds up.

No one's been considering them secret magic sauce.

People have, however, been trying to interpret the line in the leaked docs to mean that they can't be used for graphical tasks for some reason.

The user ultragpu considers it. His "alter ego" on System Wars goes off on another tangent of craziness with it... But that's another story.
 
In the end I still think it will be a wash, with the PS4 edging it slightly. But if your theory is true, let's see if it holds up.

Imagine it will be whatever you want to imagine it will be. We're discussing the rumored specs and your constant interjections about the parity you imagine must exist despite all indications to the contrary contribute nothing.
 
Doesn't matter which is more powerful. This will be the microtransaction generation. They have to make up for development costs somehow, and small studios need to stay afloat. We will also more than likely see ads that offset costs or are used in free versions of games. Yeah, it will make in-game purchases easier and quicker, but that's not worth it for us gamers.
 
Imagine it will be whatever you want to imagine it will be. We're discussing the rumored specs and your constant interjections about the parity you imagine must exist despite all indications to the contrary contribute nothing.

Well, like I said, I really hope you're right about that too.

Since none of our opinions has been proven yet.
 
The user ultragpu considers it. His "alter ego" on System Wars goes off on another tangent of craziness with it... But that's another story.

Regardless, even if those 4 CUs can't be used for rendering, they will be dedicated to functions that would otherwise need GPU resources to handle their load.

Those 1.2 TFLOPS on the Durango GPU will have precious resources consumed by compute tasks.

Regardless, I've heard that these compute units can certainly help in areas like physics, lighting, etc... Stuff that doesn't directly impact rendering, but can have a huge impact on graphical quality.
 
Regardless, even if those 4 CUs can't be used for rendering, they will be dedicated to functions that would otherwise need GPU resources to handle their load.

Those 1.2 TFLOPS on the Durango GPU will have precious resources consumed by compute tasks.

Regardless, I've heard that these compute units can certainly help in areas like physics, lighting, etc... Stuff that doesn't directly impact rendering, but can have a huge impact on graphical quality.

I'm sure first-party studios will amaze us using those 4 CUs.
 
In the end I still think it will be a wash, with the PS4 edging it slightly. But if your theory is true, let's see if it holds up.



The user ultragpu considers it. His "alter ego" on System Wars goes off on another tangent of craziness with it... But that's another story.

32 ROPs + GDDR5 should mean the 360's traditional advantages this gen (AA, transparency) will at least apply to the PS4.
 
Regardless, even if those 4 CUs can't be used for rendering, they will be dedicated to functions that would otherwise need GPU resources to handle their load.

Those 1.2 TFLOPS on the Durango GPU will have precious resources consumed by compute tasks.

Regardless, I've heard that these compute units can certainly help in areas like physics, lighting, etc... Stuff that doesn't directly impact rendering, but can have a huge impact on graphical quality.

I just hope they're not reserving those CUs for their next Dual Camera stuff!
 
I just hope they're not reserving those CUs for their next Dual Camera stuff!

That would only make sense in games that actually used the camera, which few probably will.

I can see Sony using an on-chip solution for the Dual Camera, though; it shouldn't be too difficult or expensive.
 
Damn... B3D just shot down a thread speculating on the 4 CUs in PS4 being used as "magic sauce".

http://beyond3d.com/showthread.php?t=63049

Does everything have to be straightforward with Durango or Orbis? We need quirks. Without them, the architecture of both of these consoles would be boring in design.

I don't think we've ever expected them to be magic in themselves, just that having 'extra' computing power on tap is a good thing.


With the talk of target resolutions etc., how do the differing numbers of ROPs affect things? Seems like twice the number of ROPs on Orbis would be more than enough for 1080p (almost overkill), but will they potentially be starved as complexity goes up and the shaders are doing more? I don't really understand the relationship between shaders and ROPs - where does your fillrate come from?
 
I don't think we've ever expected them to be magic in themselves, just that having 'extra' computing power on tap is a good thing.


With the talk of target resolutions etc., how do the differing numbers of ROPs affect things? Seems like twice the number of ROPs on Orbis would be more than enough for 1080p (almost overkill), but will they potentially be starved as complexity goes up and the shaders are doing more? I don't really understand the relationship between shaders and ROPs - where does your fillrate come from?

Fillrate comes from the ROPs; it describes how fast the GPU can output the data of a shader into memory.
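
To put rough numbers on that: peak fillrate is just ROPs times GPU clock. A quick sketch using the rumored 800 MHz clock and the leaked ROP counts (all of these figures are leak-dependent, not confirmed):

```python
# Peak pixel fillrate = ROPs x clock. Assumes the rumored 800 MHz
# GPU clock and the leaked 32 (Orbis) vs 16 (Durango) ROP counts.
CLOCK_HZ = 800e6

def fillrate_gpix_per_s(rops, clock_hz=CLOCK_HZ):
    """Peak pixels written per second, in gigapixels."""
    return rops * clock_hz / 1e9

print(fillrate_gpix_per_s(32))  # Orbis:   25.6 Gpix/s
print(fillrate_gpix_per_s(16))  # Durango: 12.8 Gpix/s

# For comparison, one full-screen pass at 1080p60:
print(1920 * 1080 * 60 / 1e9)   # ~0.12 Gpix/s per pass
```

The headroom matters because overdraw, alpha blending, and MSAA multiply that per-pass cost many times over, which is where extra ROPs stop looking like overkill.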
 
After buying myself a monster PC and playing maxed-out games in 1080p, my opinion has changed. I want every next-gen game to be in native 1080p. All of them. It should be mandatory, even.

I also have a latest-tech gaming PC and a 27-inch quality panel (omg Dark Souls in 1080p @ 60fps..). But 720p on my PC vs 720p on my Xbox 360 looks completely different. I don't think it can be compared.

On my PC 720p looks like a mess, and only native 1080p looks clean, while on my Xbox 360 (and PS3) 720p looks pretty nice and clear. The only thing I can think of is that the scalers in the consoles are pretty fucking awesome..

If so, I don't think 1080p should be mandatory in itself.
 
On my PC 720p looks like a mess, and only native 1080p looks clean, while on my Xbox 360 (and PS3) 720p looks pretty nice and clear.

The same here. In fact, sub-720p can also look stellar - the 360 version of Alan Wake looks simply gorgeous and as sharp as it needs to be, despite the grabs looking awful when viewed on my monitor.
 
I also have a latest-tech gaming PC and a 27-inch quality panel (omg Dark Souls in 1080p @ 60fps..). But 720p on my PC vs 720p on my Xbox 360 looks completely different. I don't think it can be compared.

On my PC 720p looks like a mess, and only native 1080p looks clean, while on my Xbox 360 (and PS3) 720p looks pretty nice and clear. The only thing I can think of is that the scalers in the consoles are pretty fucking awesome..

If so, I don't think 1080p should be mandatory in itself.
Are you playing both on the same display device? If so, then it's not a PC/console difference, it's a difference between your display devices and seating arrangements. If not, then it must be the power of auto-suggestion -- the scaling is just scaling, after all (as could be demonstrated by 1080p captures directly from the consoles).
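
To illustrate the "scaling is just scaling" point, here's a minimal sketch of a 720p-to-1080p upscale, assuming plain bilinear filtering (a real console scaler may add sharpening or noise filtering on top, but it's still just resampling; nothing below is from the leaks):

```python
# Bilinear 720p -> 1080p upscale: each output pixel is a weighted
# average of its four nearest source pixels. No new detail is created.
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame_720p = np.random.rand(720, 1280, 3)   # stand-in for a rendered frame
frame_1080p = bilinear_upscale(frame_720p, 1080, 1920)
```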
 
Interesting stuff from a dev(?). Edit: he is not a dev, so it's most definitely BS. I'll keep it here nevertheless.

Source: http://beyond3d.com/showpost.php?p=1704979&postcount=949 / http://forum.teamxbox.com/showthread.php?t=681841&page=107

The most likely rumors/leaks we have atm tell us in no uncertain terms that Durango's graphics setup is wholly incomparable to Orbis/PC setups. Not because one or the other is dramatically more powerful, but because the former is geared towards a much more sophisticated, targeted approach to rendering and the latter is going for effectively brute force.

It is now clear why the various insiders said what they did about the flops and bandwidth arguments being worthless and about efficiency. The 'special sauce' is the entire design philosophy targeting the removal of highly redundant processing inherent in modern graphics engines in the first place.

Make no mistake, the DMEs and eSRAM are most certainly NOT there primarily to alleviate DDR3 bandwidth issues, as we all previously had assumed. DDR3 was NOT chosen just to be cheap; it was more likely chosen because higher bandwidth was simply not needed, and to help with the thermals. Think about it like this...

Sony went with a largely off-the-shelf, simple design for decent power at reasonable cost. MS could have very easily gone with the exact same setup. But they didn't. They evidently looked at the obvious setup and opted to spend more money engineering a very elaborate, highly customized architecture that will be challenging to manufacture. Why? None of that helps the OS or even the Kinect stuff one bit, as far as anyone has been able to tell. So even in purely gaming terms they opted out of the cheap, easy, straightforward approach. They wouldn't spend all that extra money and effort and take on that risk if both their engineers and AMD's didn't feel it was worthwhile.

It's not about one magical piece of hardware making up the bandwidth or FLOPS difference as a 'secret sauce'. Their ace in the hole is designing the whole architecture to be 100% catered towards virtualized texturing, robust tiling, virtual geometry, low-cost post-processing fx, etc.



Zombie, read up on Durango's display planes via these patents:

http://www.faqs.org/patents/app/20120159090
http://www.faqs.org/patents/app/20110304713

MS is all but guaranteed to lock down system-wide standards for both framerate and pseudo-resolution. You will likely get a 1080p HUD and foreground as standard, with 30fps locked, also standard. The way they get there is to significantly reduce the processing needed in most modern game engines by allowing dynamic res scaling for the background, which can also be blurred via advanced DoF for 'free' (as far as the GPU is concerned).

No. They aren't taking shortcuts; they are implementing methods for system-wide QoS options that otherwise can't exist with any reasonable flexibility. We were all thinking backwards on this. They didn't add this stuff to make up for any disadvantage against Orbis, which was still super weak long after they made their design (circa summer 2012, devs apparently were livid about its weakness, according to insiders). Their decisions were made long before that. You can't really use brute force to give devs flexibility and still retain system-wide standards for fps/res. There has to be a built-in method for handling that at the hardware level, which is what the display planes are.

It is no more a shortcut than not having it. The difference is that with the display planes you save yourself processing power when ya don't need/want it, as you can get DoF and other stuff for free, and you don't get stuck with the entire frame at a lower res/clip. Modern games are all going to be using DoF of some sort anyhow, even if it's just in the background artwork. This is a more flexible, vastly more efficient approach that allows for QoS guarantees without significantly altering the actual fidelity of the stuff you are focused on in-game (foreground).

Hell, in theory they could take this to its logical extreme with Kinect 2.0 packed in and possibly even do their foveated rendering stuff (a 5-6 fold boost in performance). Hmmm... in fact, perhaps the reported performance gains in that study could roughly guide expectations for what these display planes might yield. I wonder how pared back those gains will be compared to that (5-6 fold). Starting to wonder if that foveated rendering stuff might be similar in implementation here. Might have to cross-check that paper with the display plane patents tomorrow.
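
For what it's worth, here's how I read the display-plane idea in those patents, as a minimal sketch: the 3D scene and the HUD are rendered as separate planes, the scene plane can drop to a dynamic resolution under load, and the hardware scales and composites both at native res. All names and numbers below are illustrative, not taken from the patents or the leaks:

```python
# Display-plane composition sketch: a dynamic-res scene plane is
# scaled to native res and alpha-blended under a native-res HUD plane.
import numpy as np

NATIVE_H, NATIVE_W = 1080, 1920

def nearest_upscale(img, out_h, out_w):
    h, w = img.shape[:2]
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return img[ys][:, xs]

def compose(scene_plane, hud_plane, hud_alpha):
    scene = nearest_upscale(scene_plane, NATIVE_H, NATIVE_W)
    return hud_plane * hud_alpha + scene * (1.0 - hud_alpha)

scene = np.random.rand(810, 1440, 3)         # scene dropped to 75% res
hud = np.random.rand(NATIVE_H, NATIVE_W, 3)  # HUD stays native 1080p
alpha = np.zeros((NATIVE_H, NATIVE_W, 1))
alpha[:120, :, :] = 1.0                      # opaque HUD bar at the top
frame = compose(scene, hud, alpha)           # final 1080p output
```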
 
It's FUD to say 720p upscaled to 1080p is inherently bad.

I didn't state the PS4 is capable of Pixar IQ. A Pixar Blu-ray at 720p blows away any game rendered natively at 1080p.

I merely proposed that resolution alone is not an indicator of IQ, which is not FUD. With enough horsepower dedicated to AA, AF, etc, a 720p title can look cleaner and better than 1080p. That is not FUD, it is fact.

This "720p = bad" shit needs to go away; the quality of the source material is the most important facet.

Unless you're proposing console games downsampling from 4K or above, you don't have a point to make.
 
What are the odds Orbis's better specs will translate into higher resolutions and/or framerates, and not all-new gameplay experiences, when compared to Durango?
 
Wow, whole new ways of rendering to make up for the differences in power between the platforms? Why am I reminded of the cube mapping argument for Wii?
 
NO

Insanely, unrealistically high resolutions downsampled to 720p are as far removed from 720p as it can possibly be. Calling that 720p and comparing it to rendering something in 720p is just stupid.

By using Pixar movies as some kind of evidence that 720p can look good, and insinuating that it has any bearing on video games or any realtime graphics, you were being (very, very) disingenuous.
Actual 720p never looks good in games.

Rendering a game at a resolution way above 1080p and then downsampling to 720p to get 'great image quality' makes no sense either, since the cost is exactly the same as outputting it at that high res; it would be cheaper to render at 1080p instead, and you wouldn't have to lose fine detail by displaying fewer pixels.
And by just rendering it at 1080p (or 1080p with AA) you wouldn't get scaling artifacts or added input lag... You downsample to native res, and you downsample to native res only because your monitor can't display the higher resolutions you downsample from.

And no, shitty post-process AA will not make something look as good as or better than 1080p; it will only serve to degrade image quality even further.

What you are doing is arguing semantics with the stupid Pixar comparison, which has no bearing on how games are rendered. Aka FUD.
Unless you plan to offline render the game at some hilarious resolution and then watch a video of the game being played... then by all means enjoy your video of a game.

Unless you're proposing console games downsampling from 4K or above, you don't have a point to make.

You're both indirectly proving my point. Rendering technique has a massive impact on IQ, more so than the output resolution. The point I'm making (and both of you are choosing to ignore) is that 720p isn't inherently bad; depending upon the rendering techniques it can look as good, if not better than 1080p. As you both state, rendering at 4K and downsampling to 720p will produce superior results. It doesn't matter whether or not that's possible, the best choice, etc; that's not the issue being debated. Resolution is just one piece of the IQ jigsaw.

It's a provable premise and has been proved on this forum (can't remember the thread). Sure, once a certain difference in resolution is reached it's not possible, but 720p versus 1080p is close enough. Gaming on a 50" 1080p plasma from 8 feet away, some might say GT5 in 720p (4xMSAA) has better IQ than in 1080p (2xQAA + TA).

And "lag"? Sounds like someone needs to invest in a better TV.

I guess this is drifting off topic and maybe the reference to Pixar was a bad idea (that always seems to set folks off), so I'll leave it now.

Edit: I don't even know why this is being debated. This is a console thread, consoles have fixed hardware, and it's a fact that you can do more at 720p than 1080p.
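
For reference, the raw pixel counts behind the cost argument in this exchange (shading cost scales roughly with pixels rendered, ignoring fixed per-frame costs):

```python
# Pixel counts behind the downsampling cost argument.
res = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}

print(res["4K"] / res["720p"])     # 9.0  -> 4K downsampled to 720p shades 9x the pixels
print(res["4K"] / res["1080p"])    # 4.0  -> and 4x what native 1080p shades
print(res["1080p"] / res["720p"])  # 2.25 -> native 1080p vs native 720p
```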
 
Here's the major problem that I see.

MS is really going to press a specific rendering setup on most developers.

The more information that comes out, the more it looks to be pushing tile-based forward rendering.

Here are some tech demos.

https://www.youtube.com/watch?v=C6TUVsmNUKI
https://www.youtube.com/watch?v=6DyTk7917ZI
https://www.youtube.com/watch?v=M04SMNkTx9E
https://www.youtube.com/watch?v=5cLOLE9Tn-g (this is the old Larrabee demo, but it's believed to be a tile-based renderer as well)

The major benefits: lots of dynamic lights at a reduced memory bandwidth cost, plus compatibility with MSAA and transparent geometry that tiled deferred renderers struggle with.

If MS's machine is specialized for this type of rendering pipeline, then the ports could potentially suffer. The problem is that it limits developer freedom; that said, I think they want to specialize D3D so that it can work more homogeneously across platforms.

I'd appreciate some clarifications on this matter. I'm not too sure how Durango would get more benefits from this rendering approach than Orbis. Is it because ESRAM has less latency than GDDR5?

Comments on B3D focus on how quick data transfers would enable tiles to feed the ESRAM. I can understand that, but with Orbis such transfers wouldn't be needed in the first place. The tiles would already be available in an address space the GPU would be able to operate on, so I don't think DMEs would be a great benefit over Orbis here. I guess the main structural advantage that remains for Durango in that regard would be the ESRAM's low latency. Low latency would allow Durango's GPU to perform super-quick access to tile data, i.e. more processing operations. Right? Wrong?
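
For anyone unfamiliar with the technique being discussed, the heart of tiled forward (Forward+) rendering is a per-tile light-culling pass that runs before the normal forward shading pass. A minimal CPU-side sketch, assuming 16x16-pixel screen tiles and lights already projected to screen space; in a real engine this is a compute shader, and all names here are illustrative:

```python
# Forward+ light culling sketch: bin each light's screen-space circle
# of influence into 16x16-pixel tiles; the shading pass then reads
# only the light list for its own tile.
TILE = 16
WIDTH, HEIGHT = 1280, 720

def cull_lights(lights):
    tiles_x, tiles_y = WIDTH // TILE, HEIGHT // TILE
    tile_lights = [[] for _ in range(tiles_x * tiles_y)]
    for li, (cx, cy, radius) in enumerate(lights):  # screen-space x, y, radius
        x0 = max(int(cx - radius) // TILE, 0)
        x1 = min(int(cx + radius) // TILE, tiles_x - 1)
        y0 = max(int(cy - radius) // TILE, 0)
        y1 = min(int(cy + radius) // TILE, tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lights[ty * tiles_x + tx].append(li)
    return tile_lights

# Each pixel now loops over a handful of lights instead of all of them.
per_tile = cull_lights([(640, 360, 120), (100, 100, 40)])
```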
 
I'm not too sure how Durango would get more benefits from this rendering approach than Orbis. Is it because ESRAM has less latency than GDDR5?
One main benefit a Forward+ renderer has over tiled deferred rendering (BF3 etc.) is reduced memory bandwidth usage, which Durango lacks compared to Orbis.
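
Rough, assumption-heavy arithmetic on where that bandwidth gap comes from (the G-buffer layout below is a generic example, not BF3's actual format):

```python
# Deferred rendering writes a fat G-buffer and then reads it back in
# the lighting pass; Forward+ shades into a single target instead.
# Assume four 32-bit render targets plus 32-bit depth at 720p, 30 fps.
pixels = 1280 * 720
gbuffer_bytes = pixels * (4 * 4 + 4)   # 4 RTs @ 4 bytes + 4-byte depth
per_frame = gbuffer_bytes * 2          # written once, read once

print(per_frame / 1e6)                 # ~36.9 MB of G-buffer traffic per frame
print(per_frame * 30 / 1e9)            # ~1.1 GB/s at 30 fps, before textures etc.
```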
 
I think he is referring to Kinect 2.0.

I'm sure he is. I just found his statement rather amusing. I guess that's where these console wars are headed once the hardware battle is settled.

The Orbis version runs smoother and has better image quality, but the Durango version offers "new gameplay experiences!".
 
Durango lets you jack into the Matrix. All you need is 3 Kinects, a pair of Augmented Reality glasses, wireless surround sound headphones and a 360 degree Fun-jector.

Just be sure you don't move out of the 2 foot square that all these devices converge on. You could lose a toe.
 
Did we get this much analysis prior to the launch of both consoles at the start of the current gen?

It makes me nervous to rely on this information, but it's great reading nevertheless.
 
I'm sure he is. I just found his statement rather amusing. I guess that's where these console wars are headed once the hardware battle is settled.

The Orbis version runs smoother and has better image quality, but the Durango version offers "new gameplay experiences!".

Kinect 2.0 + (Timed?) Exclusive DLC vs better IQ and frame rate. Kind of like what happened a lot this gen, only the other way around lol.
 
All this stuff on Durango seems like Band-Aids made to compensate for the downsides of DDR3...

It seems like the desire to have 8GB of RAM was the primary objective and everything else was an afterthought.

This large pool of RAM drove the entire design of the machine, it seems.
 