Question: with these specs, how good could ray tracing theoretically be?
Unless you know the full components of both consoles... It's not that easy.
Because I sure as hell don't.
Damn... B3D just shot down a thread speculating on the 4 CUs in PS4 being used as "magic sauce".
http://beyond3d.com/showthread.php?t=63049
If we're gonna believe the leaks about special DMEs, we might as well believe the rest of what they tell us. That paints a picture of Orbis being significantly more powerful AND more efficient.
No one's been considering them secret magic sauce.
People have, however, been trying to interpret the line in the leaked docs to mean that they can't be used for graphical tasks for some reason.
In the end I still think it will be a wash with the PS4 edging slightly. But if your theory is true let's see if it holds up.
Imagine it will be whatever you want to imagine it will be. We're discussing the rumored specs and your constant interjections about the parity you imagine must exist despite all indications to the contrary contribute nothing.
The user ultragpu considers it. His "alter ego" on system wars goes into another tangent of craziness with it... But that's another story.
Regardless, even if those 4 CUs can't be used for rendering, they will be dedicated to functions that would otherwise need GPU resources to handle their load.
Durango's 1.2 TF GPU, meanwhile, will have to give up precious resources to those same compute tasks.
Regardless, I've heard that these compute units can certainly help in areas like physics, lighting, etc... Stuff that doesn't directly impact rendering, but can have a huge impact on graphical quality.
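For rough scale, here is the back-of-envelope math, assuming the rumored 800 MHz clock and standard GCN CUs (64 ALUs, 2 FLOPs per cycle via FMA); all of this is speculation based on the leaked numbers, not confirmed spec:

clock_hz = 800e6                      # rumored clock, not confirmed
flops_per_cu = 64 * 2 * clock_hz      # ~102.4 GFLOPS per GCN CU
print(12 * flops_per_cu / 1e12)       # Durango rumor, 12 CUs -> ~1.23 TFLOPS
print(18 * flops_per_cu / 1e12)       # Orbis rumor, 18 CUs   -> ~1.84 TFLOPS
print(4 * flops_per_cu / 1e9)         # the 4 CUs in question -> ~410 GFLOPS of compute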
32 ROPS + GDDR5 should mean the 360's traditional advantages this gen (AA, transparency) should at least apply to PS4.
Higher filtering advantage too.
I just hope they're not reserving those CUs for their next Dual Camera stuff!
Does everything have to be straightforward with Durango or Orbis? We need quirks. A completely straightforward design would make the architecture of both consoles boring.
I don't think we've ever expected them to be magic in themselves, just that having 'extra' computing power on tap is a good thing.
With the talk of target resolutions etc., how do the differing numbers of ROPs affect things? Seems like twice the number of ROPs on Orbis would be more than enough for 1080p (almost overkill), but will they potentially be starved as complexity goes up and the shaders are doing more? I don't really understand the relationship between shaders and ROPs - where does your fillrate come from?
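A rough way to frame the raw numbers (assuming the rumored ~800 MHz clock; this is peak fill only, and real throughput drops a lot once blending, MSAA and memory bandwidth get involved):

clock_hz = 800e6
px_per_sec_1080p60 = 1920 * 1080 * 60        # ~124M pixels/s written for one full-screen 1080p60 layer
print(32 * clock_hz / px_per_sec_1080p60)    # 32 ROPs: raw rate for ~206 such full-screen layers
print(16 * clock_hz / px_per_sec_1080p60)    # 16 ROPs: ~103
# Fillrate comes from the ROPs, which write/blend the final pixels; the shaders decide how long
# each pixel takes to produce, so as shading gets heavier the bottleneck tends to move to the
# shader array and the ROPs sit idle, rather than the ROPs being the part that gets starved.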
After buying myself a monster PC and playing maxed-out games in 1080p, my opinion has changed. I want every next-gen game to be in native 1080p. All of them. It should be mandatory, even.
On my PC 720p looks like a mess, and only native 1080p looks clean, while on my Xbox 360 (and PS3) 720p looks pretty nice and clear.
Are you playing both on the same display device? If so, then it's not a PC/consoles difference, it's a difference between your display devices and seating arrangements. If not, then it must be the power of auto-suggestion -- the scaling is just scaling after all (as could be demonstrated by 1080p captures directly from the consoles).
I also have a latest-tech gaming PC and a 27-inch quality panel (omg Dark Souls in 1080p @ 60fps..). But 720p on my PC vs 720p on my Xbox 360 looks completely different. I don't think they can be compared. The only thing I can think of is that the scalers in the consoles are pretty fucking awesome.
If so, I don't think 1080p should be mandatory in itself.
The most likely rumors/leaks we have atm tell us in fairly certain terms that Durango's graphics setup is wholly incomparable to Orbis/PC setups. Not because one or the other is dramatically more powerful, but because the former is geared towards a much more sophisticated, targeted approach to rendering and the latter is going for effectively brute force.
It is now clear why the various insiders said what they did about the flops and bandwidth arguments being worthless and about efficiency. The 'special sauce' is the entire design philosophy targeting the removal of highly redundant processing inherent in modern graphics engines in the first place.
Make no mistake, the DMEs and eSRAM are most certainly NOT there primarily to alleviate DDR3 bandwidth issues, as we all previously had assumed. DDR3 was NOT chosen just to be cheap; it was more likely chosen because higher bandwidth was simply not needed, and to help with thermals. Think about it like this...
Sony went with a largely off-the-shelf, simple design for decent power at reasonable costs. MS could have very easily gone with the exact same setup. But they didn't. They evidently looked at the obvious setup and opted to spend more money engineering a very elaborate, highly customized architecture that will be challenging to manufacture. Why? None of that helps the OS or even Kinect stuff one bit as far as anyone has been able to tell. So even in purely gaming terms they opted out of the cheap, easy, straightforward approach. They wouldn't spend all that extra money and effort and risk if both their engineers and AMD's didn't feel it was worthwhile.
It's not about one magical piece of hardware to make up the bandwidth or FLOPS difference as a 'secret sauce'. Their ace in the hole is designing the whole architecture to be 100% catered towards virtualized texturing, robust tiling, virtual geometry, low-cost post-processing fx, etc.
Zombie, read up on the Durango's display planes via these patents:
http://www.faqs.org/patents/app/20120159090
http://www.faqs.org/patents/app/20110304713
MS is all but guaranteed to lock down system wide standards for both framerates and pseudo resolution. You will likely get 1080p HUD and foregrounds standard with 30fps locked, also standard. The way they get there is to significantly reduce the processing needed in most modern game engines by allowing dynamic res scaling for the background which can also be blurred via advanced DoF for 'free' (as far as GPU is concerned).
No. They aren't taking shortcuts, they are implementing methods for system-wide QoS options that otherwise can't exist with any reasonable flexibility. We were all thinking backwards on this. They didn't add this stuff to make up for any disadvantage against Orbis, which was super weak long after they made their design (circa summer 2012 devs apparently were livid about its weakness according to insiders). Their decisions were made long before that. You can't really use brute force to give devs both flexibility and retain system wide standards for fps/res. There has to be a built in method for handling that at the hardware level, which is what the display planes are.
It is no more a shortcut than not having it. The difference is with the display planes you save yourself processing power when ya don't need/want it as you can get DoF and other stuff for free and you don't get stuck with the entire frame at a lower res/clip. Modern games are going to all be using DoF of some sort anyhow, even if it's just in the background artwork. This is a more flexible, vastly more efficient approach that allows for QoS guarantees without significantly altering the actual fidelity of the stuff you are focused on in-game (foreground).
Hell, in theory they could take this to its logical extreme with Kinect 2.0 packed in and possibly even do their foveated rendering stuff (5-6 fold boost in performance). Hmmm...in fact, perhaps their reported performance gains in that study could guide expectations roughly for what these display planes might yield. I wonder how pared back those gains will be compared to that (5-6 fold). Starting to wonder if that foveated rendering stuff might be similar in implementation here. Might have to cross check that paper with the display plane patents tomorrow.
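To make the display-plane idea a bit more concrete, here's a toy mock-up of what those patents seem to describe: planes rendered at independent resolutions, scaled and composited into the final output. The plane sizes and the Pillow-based sketch are mine, purely illustrative, not anything taken from the patents:

from PIL import Image

# Illustrative only: the 3D scene plane renders at a reduced (possibly dynamic) resolution,
# the display hardware scales it up for "free", and a native-res HUD/foreground plane is
# composited on top, so text and UI stay sharp even when the scene resolution drops.
OUTPUT = (1920, 1080)

scene_plane = Image.new("RGBA", (1280, 720))            # background/3D plane at reduced res
hud_plane = Image.new("RGBA", OUTPUT)                   # HUD/foreground plane at native 1080p

scaled_scene = scene_plane.resize(OUTPUT)               # the per-plane scaling the hardware would do
frame = Image.alpha_composite(scaled_scene, hud_plane)  # final 1080p frame sent to the display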
It's FUD to say 720p upscaled to 1080p is inherently bad.
I didn't state PS4 is capable of Pixar IQ. A Pixar bluray at 720p blows away any game rendered natively at 1080p.
I merely proposed that resolution alone is not an indicator of IQ, which is not FUD. With enough horsepower dedicated to AA, AF, etc, a 720p title can look cleaner and better than 1080p. That is not FUD, it is fact.
This "720p = bad" shit needs to go away; the quality of the source material is the most important facet.
Interesting stuff from a dev(?). Source: http://beyond3d.com/showpost.php?p=1...&postcount=949 / http://forum.teamxbox.com/showthread...81841&page=107
Not a dev, nor a reliable source according to B3D.
NO
Insanely, unrealistically high resolutions downsampled to 720p are as far removed from 720p as you can possibly get. Calling that 720p and comparing it to rendering something natively at 720p is just stupid.
You used Pixar movies as some kind of evidence that 720p can look good, and by insinuating that it has any bearing on video games or any realtime graphics you were being (very, very) disingenuous.
Actual 720p never looks good in games.
Rendering a game at a resolution way above 1080p and then downsampling to 720p to get 'great image quality' makes no sense either, since the cost is exactly the same as outputting it at that high res. It would be cheaper to render at 1080p instead, and you wouldn't have to lose fine detail by displaying fewer pixels.
And by just rendering at 1080p (or 1080p with AA) you wouldn't get scaling artifacts or added input lag. You downsample to native res, and only to native res, because your monitor can't display the higher resolutions you're downsampling from.
And no, shitty post-process AA will not make something look as good as or better than 1080p; it will only serve to degrade image quality even further.
What you are doing is arguing semantics with the stupid Pixar comparison, which has no bearing on how games are rendered. AKA FUD.
Unless you plan to offline render the game at some hilarious resolution, then watch a video of the game being played... then by all means enjoy your video of a game.
Unless you're proposing console games down sampling from 4k or above, you don't have a point to make.
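The shaded-pixel arithmetic behind that point, ignoring AA and per-pixel cost differences, so take it as a rough comparison only:

p720 = 1280 * 720      # ~0.92 Mpixels, native 720p
p1080 = 1920 * 1080    # ~2.07 Mpixels, native 1080p
ss720 = 2560 * 1440    # ~3.69 Mpixels, 2x2 supersampled and then downscaled to 720p
print(ss720 / p1080)   # ~1.78: the downsample-to-720p route costs ~78% more shading than native 1080p
print(ss720 / p720)    # 4.0: and four times the cost of the 720p image you actually display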
Here's the major problem that I see.
MS is really going to press a specific rendering setup on most developers.
The more information that comes out, the more it looks like they're pushing tile-based forward rendering.
Here are some tech demos:
https://www.youtube.com/watch?v=C6TUVsmNUKI
https://www.youtube.com/watch?v=6DyTk7917ZI
https://www.youtube.com/watch?v=M04SMNkTx9E
https://www.youtube.com/watch?v=5cLOLE9Tn-g (this is the old Larrabee demo, but it's believed to be a tile-based renderer as well)
the major benefits
If MS's machine is specialized for this type of rendering pipeline, then ports could potentially suffer. The problem is that it limits developer freedom; that said, I think they want to specialize D3D so that it can work more homogeneously across platforms.
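For anyone who hasn't run into it, tile-based forward ("Forward+") boils down to: cut the screen into small tiles, cull the scene's lights down to a short list per tile, then shade each pixel against only its tile's list. A stripped-down CPU-side sketch of the culling step follows; real engines do this in a compute shader, and the data layout here is made up purely for illustration:

TILE = 16  # pixels per tile side, a common choice

def cull_lights_for_tile(lights, tile_min, tile_max):
    """Keep only the lights whose screen-space bounding circle touches this tile."""
    kept = []
    for (lx, ly), radius in lights:
        # distance from the light centre to the tile rectangle (0 if the centre is inside it)
        dx = max(tile_min[0] - lx, 0, lx - tile_max[0])
        dy = max(tile_min[1] - ly, 0, ly - tile_max[1])
        if dx * dx + dy * dy <= radius * radius:
            kept.append(((lx, ly), radius))
    return kept

def build_light_lists(width, height, lights):
    # one short light list per screen tile; pixels in a tile only ever shade against its list,
    # which is where the bandwidth/ALU savings over brute-force forward shading come from
    lists = {}
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            lists[(tx, ty)] = cull_lights_for_tile(lights, (tx, ty), (tx + TILE, ty + TILE))
    return lists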
What are the odds Orbis's better specs will translate into higher resolutions and/or framerates, and not all-new gameplay experiences, when compared to Durango?
One main benefit a forward+ renderer has over tiled deferred rendering (BF3 etc.) is reduced memory bandwidth usage, and bandwidth is exactly what Durango lacks compared to Orbis. I'm not too sure how Durango would get more benefit from this rendering approach than Orbis would. Is it because ESRAM has less latency than GDDR5?
What are the odds Orbis's better specs will translate into higher resolutions and/or framerates, and not all-new gameplay experiences, when compared to Durango?
Please tell us about these new Durango gameplay experiences.
I think he is refering to kinect 2.0.
I'm sure he is. I just found his statement rather amusing. I guess that's where these console wars are headed once the hardware battle is settled.
The Orbis version runs smoother and has better image quality, but the Durango version offers "new gameplay experiences!".
Much has happened :-O.
Welcome back, home skillit.
Tell me more...
Did someone drink all the sauce?