VGLeaks: Durango's Move Engines

So bandwidth and RAM aside, I take it the CPUs/GPUs are roughly even when compared to one another?

The GPU is more powerful on the Orbis, but from what devs and some of our own people here (NeoGAF, B3D) who program say, it seems like we aren't talking 30 versus 60 FPS or something like that. More like 2xAA versus 4xAA and/or 16xAF versus 8xAF.
 
The GPU is more powerful on the Orbis, but from what devs and some of our own people here (NeoGAF, B3D) who program say, it seems like we aren't talking 30 versus 60 FPS or something like that. More like 2xAA versus 4xAA and/or 16xAF versus 8xAF.

To be exact, we're talking, IIRC:

50% more ALU
100% more ROPs (aa)
50% more texture units (af)

The extra ALU is a lot harder to quantify than the tex / ROPs.
 
To be exact, we're talking, IIRC:

50% more ALU
100% more ROPs (aa)
100% more texture units (af)

The extra ALU is a lot harder to quantify than the tex / ROPs.

Also, 1.2 TF compared to 1.8 TF. The PS4 definitely has the leg up on Durango in terms of raw specs; how much this will translate into games, we don't know.
 
Also, 1.2 TF compared to 1.8 TF. The PS4 definitely has the leg up on Durango in terms of raw specs; how much this will translate into games, we don't know.

That's the 50% extra ALU, which I came up with a way to quantify.

At fully compute-bound loads it's going to be:

40 FPS versus 60 FPS
20 FPS versus 30 FPS
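The arithmetic behind those numbers is just the claimed 1.5x ALU ratio applied inversely to frame rate; a quick sketch (the `alu_ratio` value is the rumored 50% figure, not a confirmed spec):

```python
# If Orbis has 1.5x the ALU throughput, a fully compute-bound frame
# takes 1.5x longer on Durango, so framerate drops by the same ratio.

def compute_bound_fps(orbis_fps, alu_ratio=1.5):
    """Durango framerate for a frame that is 100% ALU-limited."""
    return orbis_fps / alu_ratio

print(compute_bound_fps(60))  # 40.0 -> the "40 FPS versus 60 FPS" case
print(compute_bound_fps(30))  # 20.0 -> the "20 FPS versus 30 FPS" case
```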
 
The GPU is more powerful on the Orbis, but from what devs and some of our own people here (NeoGAF, B3D) who program say, it seems like we aren't talking 30 versus 60 FPS or something like that. More like 2xAA versus 4xAA and/or 16xAF versus 8xAF.

I think with 2.5x the bandwidth for the main memory we could see some games that are 60 FPS on PS4 vs 30 FPS on the Xbox Next, but right now we really don't know how things will turn out.
 
I think with 2.5x the bandwidth for the main memory we could see some games that are 60 FPS on PS4 vs 30 FPS on the Xbox Next, but right now we really don't know how things will turn out.

I am only going on those who have been breaking down the actual numbers and drop-off. They could all be wrong; Durante was one of them. But the question was asked, so I jotted down what the people who seem to work on this stuff know. If you're locking at, say, 30 and 60, I could see that, though, of course. And I would assume some games would, of course.
 
To be exact, we're talking, IIRC:

50% more ALU
100% more ROPs (aa)
100% more texture units (af)

The extra ALU is a lot harder to quantify than the tex / ROPs.

As highlighted in the Edge article, another big factor in favour of the PS4 is that Durango development has to be done through an abstraction layer. Devs won't be able to code to the metal to eke out the true, closed platform performance.
 
As highlighted in the Edge article, another big factor in favour of the PS4 is that Durango development has to be done through an abstraction layer. Devs won't be able to code to the metal to eke out the true, closed platform performance.

If they are not merely referring to using DirectX, the only way I could see that being true is if MS wants all of the next Xbox's games to be available on any other device that has that same abstraction layer.
 
There's more to it than just that, though.

There's a lot of speculation that Microsoft chose their design to suit some VERY specific development techniques (mega meshes, mega/virtual texturing, tessellation, tiling, etc.) that were created this gen but were unable to be truly utilized due to hardware limitations.

It's pretty much the only way to explain why MS decided to do what they did with the overall Durango design, since there are several things that just make NO SENSE whatsoever from a traditional design (and transistor cost) trade-off standpoint.

The vgleaks doc cites tiling volume textures. That immediately made me think of some really nifty MS research projects like this one: http://research.microsoft.com/en-us/um/people/hoppe/proj/gim/ and this one: http://research.microsoft.com/en-us/um/people/hoppe/proj/perfecthash/

Hopefully some of this research will make it into some games some day.
 
The vgleaks doc cites tiling volume textures. That immediately made me think of some really nifty MS research projects like this one: http://research.microsoft.com/en-us/um/people/hoppe/proj/gim/ and this one: http://research.microsoft.com/en-us/um/people/hoppe/proj/perfecthash/

Hopefully some of this research will make it into some games some day.
I hear you.
Take a look at the PDF on the page before this one that I posted. I didn't realize I had posted the same stuff about five weeks ago, but the Lionhead stuff gives me goosebumps. I am just sort of a tech freak, and doing something radically different, even if it's, say, id's MegaTexture, is always fun stuff.
 
Also, 1.2 TF compared to 1.8 TF. The PS4 definitely has the leg up on Durango in terms of raw specs; how much this will translate into games, we don't know.

Simply put, at the least, we can in all likelihood say goodbye to piss-poor ports on PS4 for which the developers ask the same money as the proper version.
 
If they are not merely referring to using DirectX, the only way I could see that being true is if MS is wanting all of the next xbox games to be available on any other device that has that same abstraction layer.

The section from the Edge piece:

Though the architectures of the next-gen Xbox and PlayStation both resemble that of PCs, several development sources have told us that Sony’s solution is preferable when it comes to leveraging power. Studios working with the next-gen Xbox are currently being forced to work with only approved development libraries, while Sony is encouraging coders to get closer to the metal of its box. Furthermore, the operating system overhead of Microsoft’s next console is more oppressive than Sony’s equivalent, giving the PlayStation-badged unit another advantage.
 
Been following the enlightening discussion on B3D and started appreciating the uniqueness of Durango's setup: pretty smart design, powerful and cost-effective. The performance gulf some have been imagining will probably not materialize... ;)
 
Been following the enlightening discussion on B3D and started appreciating the uniqueness of Durango's setup: pretty smart design, powerful and cost-effective. The performance gulf some have been imagining will probably not materialize... ;)

Not in terms of third party releases.
 
Who cares, just ignore them, I'm soooo over the whole "power argument" by now... honestly, for me, overall the Durango is a lot more interesting from a current and future design standpoint, and for its potential outside of gaming.

We know literally NOTHING about the plans, features and services of either console, lmao.
You are just as bad as any Sony fanboys in here (or people disappointed that it isn't anything exciting).
 
The section from the Edge piece:

Yeah, I'd read that. I have a suspicion they are merely using the nVidia guy's blog post as their source for that bit. But if they are only allowing certain libraries, that definitely points to some sort of standard platform for multiple devices.

Been following the enlightening discussion on B3D and started appreciating the uniqueness of Durango's setup: pretty smart design, powerful and cost-effective. The performance gulf some have been imagining will probably not materialize... ;)

Link plz?
 
The section from the Edge piece:
Sounding very good for Orbis. Sony's development tools have apparently improved by large strides as well.

Been following the enlightening discussion on B3D and started appreciating the uniqueness of Durango's setup: pretty smart design, powerful and cost-effective. The performance gulf some have been imagining will probably not materialize... ;)
Does the latter bolded really follow from the former? Is the UMA of Orbis considered dumb design and ineffective? Has there been anything to suggest that these would actually bridge any potential performance deficit, rather than simply alleviate potential deficiencies due to the choice of DDR3, as suggested by posts on here?
 
Been following the enlightening discussion on B3D and started appreciating the uniqueness of Durango's setup: pretty smart design, powerful and cost-effective. The performance gulf some have been imagining will probably not materialize... ;)

With current specs, PS4 has:

150% processing power
200% ROPs
250% memory bandwidth

And you think we won't see a noticeable difference? I'll believe it when I see it.
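For what it's worth, the "150% processing power" figure (and the 1.2 TF vs 1.8 TF numbers quoted earlier in the thread) falls straight out of the rumored GCN configurations. A back-of-envelope sketch, assuming the rumored 12-CU Durango and 18-CU Orbis GPUs at 800 MHz (64 ALU lanes per CU, 2 flops per cycle via FMA):

```python
# Back-of-envelope GCN throughput: CUs x 64 lanes x 2 flops/cycle x clock.
# The CU counts and the 800 MHz clock are rumored figures, not confirmed.

def gcn_tflops(cus, clock_ghz=0.8, lanes=64, flops_per_cycle=2):
    """Peak single-precision TFLOPS for a GCN GPU with `cus` compute units."""
    return cus * lanes * flops_per_cycle * clock_ghz / 1000.0

durango = gcn_tflops(12)   # ~1.23 TF (the "1.2 TF" figure)
orbis = gcn_tflops(18)     # ~1.84 TF (the "1.8 TF" figure)
print(orbis / durango)     # 1.5 -> the "150% processing power" claim
```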
 
Yeah, I'd read that. I have a suspicion they are merely using the nVidia guy's blog post as their source for that bit. But if they are only allowing certain libraries, that definitely points to some sort of standard platform for multiple devices.



Link plz?

One of the three:
http://forum.beyond3d.com/showthread.php?t=62867&page=35

Last couple pages they walk through some of the things and how they could work together. Interesting tech talk, and not too hard to grasp. I don't see it impacting the small difference that many of them point out will occur due to the faster Orbis GPU, but it does show that MS wasn't just working on the bottlenecks and was (rumoured, of course) doing some stuff to add features. It will be interesting to see for sure.
 
Simply put, at the least, we can in all likelihood say goodbye to piss poor ports for which the developers ask the same money as the proper one, on PS4.
If anything, the power difference might even encourage multiplatform developers to be a little bit more ambitious with their games, even if it means making compromises on Durango versions.
 
If anything, the power difference might even encourage multiplatform developers to be a little bit more ambitious with their games, even if it means making compromises on Durango versions.

As big of a PS fan as I am, I do not want that to happen to XB3-only owners. I have seen PS3-only owners suffer because of this.
 
As big of a PS fan as I am, I do not want that to happen to XB3-only owners. I have seen PS3-only owners suffer because of this.
I dunno... As a games fan, I'd much rather see developers not be limited by the lowest common denominator. I'd be happy to see them implementing visual extras that add to the immersion (better physics, more particle effects, locked 60 fps, etc.) if they can avoid sacrificing the core gameplay on Durango ports.
 
amazing how many computer engineers we have here on GAF ;)

I'm excited about both consoles, and I'm sure games on both machines will be in the same ballpark. Neither Sony nor MS is new to this game. Although it does seem the rumored PS4 has a slight edge, which will mean insaaaaane 1st-party titles on the PS4.
 
Yeah, I'd read that. I have a suspicion they are merely using the nVidia guy's blog post as their source for that bit. But if they are only allowing certain libraries, that definitely points to some sort of standard platform for multiple devices.



Link plz?

http://beyond3d.com/showthread.php?t=62867&page=33

Scroll through the last 3 pages of the thread.

Does the latter bolded really follow from the former? Is the UMA of Orbis considered dumb design and ineffective? Has there been anything to suggest that these would actually bridge any potential performance deficit, rather than simply alleviate potential deficiencies due to the choice of DDR3, as suggested by posts on here?

Not a knock against Orbis. If the current specs are THE REAL ONES, then the next PlayStation will obviously have the edge, no doubt. What I'm saying is that Microsoft's solution is pretty elegant (if B3D's more knowledgeable guys are to be trusted) and could serve some neat purposes: tiling would be a perfect fit, apparently.
 
I dunno... As a games fan, I'd much rather see developers not be limited by the lowest common denominator. I'd be happy to see them implementing visual extras that add to the immersion (better physics, more particle effects, locked 60 fps, etc.) if they can avoid sacrificing the core gameplay on Durango ports.
Amen to that. However, they will still go for parity, no doubt.
 
It's just speculation, but there has to be a reason why MS would be insisting on standard DX APIs.

It could just be they want to make porting Xbox versions of games to PC even easier than 360 to PC, so 3rd parties can justify doing Xbox (+PC) exclusives.

Or maybe, make Xbox games work on Windows 8 PCs that meet a minimum spec based roughly on the Xbox.
 
Amen to that. However, they will still go for parity, no doubt.
I expect parity for multiplatform titles. Geared towards the lowest common denominator.

But hopefully one has more oomph and/or the systems are still different enough that in-house development studios can leverage the power/architecture to create some stunner first-party titles.
 
amazing how many computer engineers we have here on GAF ;)

I'm excited about both consoles, and I'm sure games on both machines will be in the same ballpark. Neither Sony nor MS is new to this game. Although it does seem the rumored PS4 has a slight edge, which will mean insaaaaane 1st-party titles on the PS4.

This is pretty much the way I feel, though replace PS4 with Xbox 3, and keep PS4 for the insane 1st party. Seeing what both have created with the current systems, I don't see any chance of not being mentally knocked about by what they create, both 3rd-party and 1st-party games.
 
That's the 50% extra ALU, which I came up with a way to quantify.

At fully compute-bound loads it's going to be:

40 FPS versus 60 FPS
20 FPS versus 30 FPS

And if that was the case, you would see Durango operating at a locked 30 fps, because that would be preferable to a fluctuating ~40 fps in most cases (e.g. the GoW devs prefer a variable framerate for their game). So in the end Orbis would be 60 fps and Durango 30 fps anyway, but not because Orbis can render at twice the framerate. BTW, I'm not basing this on my own opinion (even though it makes total sense to me), but on what Timothy Lottes said.

Now COD operates on perceived 60 fps gameplay. It goes between 50-60 fps, and I think averages 50-55 fps. So, in other words, I would think Durango would have to get over that average 50 fps "hump" in order to not be knocked down to a locked 30 fps.

I hope both systems use some kind of adaptive vsync. I dig it on the 680 I have.

How does that work?
 
And if that was the case, you would see Durango operating at a locked 30 fps, because that would be preferable to a fluctuating ~40 fps in most cases (e.g. the GoW devs prefer a variable framerate for their game). So in the end Orbis would be 60 fps and Durango 30 fps anyway, but not because Orbis can render at twice the framerate. BTW, I'm not basing this on my own opinion (even though it makes total sense to me), but on what Timothy Lottes said.

Now COD operates on perceived 60 fps gameplay. It goes between 50-60 fps, and I think averages 50-55 fps. So, in other words, I would think Durango would have to get over that average 50 fps "hump" in order to not be knocked down to a locked 30 fps.

I hope both systems use some kind of adaptive vsync. I dig it on the 680 I have.

Edit:
I don't even know all the specifics. There are some websites that discuss it.

http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review#.URMyiG8jNW8

Though that is Nvidia. I assume AMD has a similar idea, though I could be wrong. I just love these tricks.
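The gist of adaptive vsync is easy to sketch (the names here are illustrative, not any real driver API): sync to the refresh rate when the GPU keeps up, and drop vsync for frames that run long, instead of snapping down to the next divisor of the refresh rate the way plain double-buffered vsync does.

```python
REFRESH_HZ = 60
VSYNC_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def present(frame_time_ms):
    """Effective display interval for one frame under adaptive vsync."""
    if frame_time_ms <= VSYNC_INTERVAL_MS:
        return VSYNC_INTERVAL_MS   # fast frame: vsync on, wait for refresh
    return frame_time_ms           # slow frame: vsync off, show now (may tear)

# A 20 ms frame displays after 20 ms instead of being held to 33.3 ms
# (i.e. dropping to 30 fps) as plain vsync would do.
```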
 
I am not sure... what you're talking about exactly. But people take what they want from what they read, and there wasn't any desperate thread bump by him... or anyone. So...
I re-read those three pages again and there isn't even a remote suggestion that the data move engines would nullify the supposed gulf between the two consoles. While there could be multiple interpretations (it's cool!, meh, it's bad), the jump that Pisteloro took (the gulf is virtually gone) is his own. Claiming that B3D folks were implying that just falsely lends more credit to his assumption.

And I never said he bumped the thread, just that his attempt here was almost as poor as that thread bump.
 
And when they are all doing compute, they aren't doing graphics, so it's a trade-off.

Plus, I don't see anything forbidding Sony developers from using the remaining 14 CUs for compute if they wanted to; it's still a standard-looking GCN GPU, like the Durango one.


Durango
- 12CU for graphics, no compute performance.
- 8CU for graphics 4 for compute.
- 12 for compute, no graphics
Or any combination I guess

Orbis
- 14CU for graphics, 4CU for compute.
- 14 for graphics, 4 for more graphics (not quite getting full efficiency out of them that way maybe).
- 12 CU for graphics, 6 for compute

But it becomes the same as the shader issue on the current 360 vs PS3: you can get lower latency, and higher efficiency, offloading tasks between CPU and GPU during the downtime and cycles of one or the other.
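The CU allocations listed above are just ways of carving up a fixed budget; a trivial sketch of that trade-off (the function and names are purely for illustration):

```python
# Every CU allocated to compute is one not rendering graphics,
# on either GPU; only the budgets differ (12 CUs vs 18 CUs).

def cu_splits(total_cus):
    """All (graphics, compute) allocations for a given CU budget."""
    return [(total_cus - c, c) for c in range(total_cus + 1)]

assert (8, 4) in cu_splits(12)    # Durango's "8 for graphics, 4 for compute"
assert (14, 4) in cu_splits(18)   # Orbis's "14CU for graphics, 4CU for compute"
assert (12, 6) in cu_splits(18)   # Orbis's "12 CU for graphics, 6 for compute"
```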
 
It is my interpretation of why Microsoft would choose move engines, and I could be totally wrong.
The move engines also have some tiling logic built in, so I think they are going after the PowerVR model of tiled rendering, and what they had already implemented in software with Microsoft Talisman. I believe, if the wiki is right, Intel's Larrabee used tiled rendering.

I think that when they say "tiled" they are talking about the tiling that the GPU's memory controller does to optimize memory performance: http://fgiesen.wordpress.com/2011/01/17/texture-tiling-and-swizzling/

I could definitely see it being useful to take a tiled output buffer and move it via one of the move engines which can detile it back into linear order at the same time (or, for that matter, convert a linear buffer into tiled format during the move).
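To make that linear-vs-tiled distinction concrete, here is a toy software version of that kind of swizzle using Morton (Z-order) indexing, one common way GPUs keep nearby texels close in memory. This is only an illustration of the idea; Durango's actual tile format isn't public.

```python
def morton_encode(x, y):
    """Interleave the bits of x and y into a Z-order (Morton) index."""
    z = 0
    for i in range(16):  # 16 bits per axis covers textures up to 65536 wide
        z |= ((x >> i) & 1) << (2 * i)
        z |= ((y >> i) & 1) << (2 * i + 1)
    return z

def tile_buffer(linear, width, height):
    """Reorder a row-major texel buffer into Morton (tiled) order.

    width and height must be equal powers of two so the Morton
    indices cover the buffer exactly."""
    tiled = [0] * (width * height)
    for y in range(height):
        for x in range(width):
            tiled[morton_encode(x, y)] = linear[y * width + x]
    return tiled
```

Detiling is just the inverse mapping, which is exactly the kind of fixed-function reshuffling a DMA-style move engine could fold into a copy for free.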
 
I re-read those three pages again and there isn't even a remote suggestion that the data move engines would nullify the supposed gulf between the two consoles. While there could be multiple interpretations (it's cool!, meh, it's bad), the jump that Pisteloro took (the gulf is virtually gone) is his own. Claiming that B3D folks were implying that just falsely lends more credit to his assumption.

And I never said he bumped the thread, just that his attempt here was almost as poor as that thread bump.

So I guess I read his post as saying the optimizations would lessen that difference, which isn't a gulf to begin with.

I think that when they say "tiled" they are talking about the tiling that the GPU's memory controller does to optimize memory performance: http://fgiesen.wordpress.com/2011/01/17/texture-tiling-and-swizzling/

I could definitely see it being useful to take a tiled output buffer and move it via one of the move engines which can detile it back into linear order at the same time (or, for that matter, convert a linear buffer into tiled format during the move).
They talk a ton about it in the B3d forums and connect it to virtualized textures and so forth.
 
Wow, seems like a pretty good thing. Best of both worlds.

Yeah, for sure. I love, LOVE, this kind of stuff. I have a 680 for power, but when I hear about any little trick like this I like it even more. I love that weird shit. :) Feels like the first time I used SMAA in a game, or hacked my ATI 8500 LE card by turning on the extra bits.
 
I re-read those three pages again and there isn't even a remote suggestion that the data move engines would nullify the supposed gulf between the two consoles. While there could be multiple interpretations (it's cool!, meh, it's bad), the jump that Pisteloro took (the gulf is virtually gone) is his own. Claiming that B3D folks were implying that just falsely lends more credit to his assumption.

And I never said he bumped the thread, just that his attempt here was almost as poor as that thread bump.

The last two pages are not about the whole "console warrior" debate. Personally, I'm over that; I'm excited about both consoles. I actually have a life outside of video games and can afford to buy both consoles on launch day if I choose to do so.

The last few pages are about new development methods/technologies that match up with some of the otherwise unexplained, non-traditional decisions MS has made with the design.

Now, if you want to ignore all we know about the 360 and current DVR technologies (the ESRAM, move engines, caching, mega texturing, mesh technologies, tiling, bla bla bla) and believe the 8GB of RAM is there for Netflix and to run SQL and Microsoft Office, then go ahead and move on to other threads; these are not the droids you're looking for.
 
Now, if you want to ignore all we know about the 360 and current DVR technologies (the ESRAM, move engines, caching, mega texturing, mesh technologies, tiling, bla bla bla) and believe the 8GB of RAM is there for Netflix and to run SQL and Microsoft Office, then go ahead and move on to other threads; these are not the droids you're looking for.

Microsoft made a conscious decision to go with a large pool of slower RAM, precisely due to OS functions that you're discounting as being insignificant (they aren't to Microsoft).

All of these extra hardware functions are there to mitigate the issues with having slower RAM.
 
Microsoft made a conscious decision to go with a large pool of slower RAM, precisely due to OS functions that you're discounting as being insignificant (they aren't to Microsoft).

All of these extra hardware functions are there to mitigate the issues with having slower RAM.

OK, understood, fine... please explain to us what "OS functionality" requires 2-3GB of RAM while a game is being played?

Keep in mind even the most intensive DirecTV 1080p multi-interface HD DVRs use a max of 512MB of RAM, and the current XB360 uses 32MB of RAM to accomplish everything it does today.
 
The last two pages are not about the whole "console warrior" debate. Personally, I'm over that; I'm excited about both consoles. I actually have a life outside of video games and can afford to buy both consoles on launch day if I choose to do so.
Was that to zing me?

The last few pages are about new development methods/technologies that match up with some of the otherwise unexplained, non-traditional decisions MS has made with the design.

Now, if you want to ignore all we know about the 360 and current DVR technologies (the ESRAM, move engines, caching, mega texturing, mesh technologies, tiling, bla bla bla) and believe the 8GB of RAM is there for Netflix and to run SQL and Microsoft Office, then go ahead and move on to other threads; these are not the droids you're looking for.
Except for tiling and some mentions of MegaTexture, I don't see much technical discussion there. A lot of the discussion there also seems pretty meh; choice quote from there:
The move engines seem very underwhelming to me. They don't operate at full memory bus speed, and they all share bandwidth with each other and other on-chip devices. The decode/encode hardware all use terribly outdated algorithms. This has the smell of "better than nothing", with these specific algos presumably chosen because it could be implemented using a bare minimum of transistors.
But yes, as you implied, I firmly believe MS is going to preload every 720 with a one-year free subscription to Office 365 (8GB DDR3).
 