WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Isn't Injustice 1v1? How is that the same?

It doesn't have to be the same

But you're still missing my point. Injustice is able to run at 1080p because of the fixed world size, meaning fewer textures and polygons in total; other, more detailed open-world games cannot, and often have to dip below 720p. Just pointing to Smash and saying it runs at 1080p as proof that the Wii U can handle other games at that res is silly.

And again, the proof is in the games.

Even with a smaller node in play, if you compare 40w to 120w, you're going to be able to deduce something pretty quickly.

Well, yeah, but that gets you down to the ballpark range. How do you think this thread got to such a high page count if it was so simple? :P

Can you show how you arrived at this figure?

java.util.random
 
The fact that Megachips is the supplier of the video encoding/decoding LSI is old news (i.e. press release); however, the following, included in a patent application published today (filed Oct 13, 2011, with Nintendo/Megachips as assignees), raised some (possibly stupid) questions here.

[attached: two figure excerpts from the patent application]


It talks about controlling/setting/sacrificing gamepad image quality to eliminate input lag, depending on the style of game (fast-moving vs. more static image).

It seems to me (and please correct me if I'm wrong) that this just involves the encoding/decoding part and not the stuff the GPU has to process/render (i.e. same GPU usage).

Really, how does the WiiU GPU handle the TV and Gamepad screens? As two separate "monitors" similar to a dual monitor PC setup?

Do devs have control over other quality settings (AF,AA,etc) regarding what's outputted on the gamepad screen or just the above streaming-quality levels? (assuming what's proposed in the patent app. is real/being actually used)

Hope I make sense :P
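
For what it's worth, here's a toy sketch of the kind of trade-off I think the patent is describing: drop the encoder quality when the scene is fast-moving so the frame can still be encoded and sent inside the latency budget. Every number and name in it (the latency budget, link rate, encoder cost, quality ladder, pick_quality) is something I made up for illustration; none of it comes from Nintendo or Megachips.

Code:
LATENCY_BUDGET_MS = 16.7                                             # one 60 Hz frame
QUALITY_LADDER = [("high", 4000), ("medium", 2500), ("low", 1200)]   # kbit/frame, invented

def pick_quality(motion_score):
    """motion_score: 0.0 (static image) .. 1.0 (fast action)."""
    encode_ms = 2.0 + 4.0 * motion_score      # assumed encoder cost rises with motion
    for name, kbit in QUALITY_LADDER:         # try the best quality first
        tx_ms = kbit / 300.0                  # assumed ~300 kbit per ms of usable link
        if encode_ms + tx_ms <= LATENCY_BUDGET_MS:
            return name
    return QUALITY_LADDER[-1][0]              # worst case: lowest quality

print(pick_quality(0.1))   # static scene -> "high"
print(pick_quality(0.9))   # fast action  -> drops to "medium" to stay under budget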
 
I still say that is not true.

We do not know the true raw power of the GPU; it is still heavily speculated, and the CPU is overall more capable when used properly.

In raw power, it's over twice as strong as the last-gen consoles. That, combined with the modern feature set, would place it closer to the Xbox One in my book.


Well that's your choice I suppose. I'm just basing this off the facts we have in front of us.

Even in the most optimistic of speculations, it's closer to PS360. All evidence (i.e. reading this entire thread) points to it being basically impossible for it to be closer to the power of PS4Bone than to PS360. Plus, as StevieP pointed out: physics.

What evidence is there that it's closer to PS4Bone, honestly? And why does it matter anyway? Nintendo clearly didn't design this thing to compete in terms of raw power with the other two; Iwata said as much himself, I believe. They designed it (for better or worse) to compete on the gaming experience they can provide. And in any case, visually the gap will likely not be nearly as striking as last gen. (Warning: that last sentence contains subjectivity.)
 
Well that's your choice I suppose. I'm just basing this off the facts we have in front of us.

Even in the most optimistic of speculations, it's closer to PS360. All evidence (i.e. reading this entire thread) points to it being basically impossible for it to be closer to the power of PS4Bone than to PS360. Plus, as StevieP pointed out: physics.

What evidence is there that it's closer to PS4Bone, honestly? And why does it matter anyway? Nintendo clearly didn't design this thing to compete in terms of raw power with the other two; Iwata said as much himself, I believe. They designed it (for better or worse) to compete on the gaming experience they can provide. And in any case, visually the gap will likely not be nearly as striking as last gen. (Warning: that last sentence contains subjectivity.)

This gen will be much closer to the Gamecube/Xbox/PS2 era than to the PSWii60 gen in terms of noticeable visual differences between multiplatform games, from the information I've been able to glean. The worst visual gap will be between PS4-optimized games and Wii U-optimized games, but even that will not approach the difference between PS3 and Wii games, all things considered.

That's my opinion on the subject.
 
It would not be a simple comparison. The Wii U uses 45nm and 40nm fabrication processes for the CPU and GPU; the One and PS4 use 28nm. Even if you looked at the TDP of the APU versus the CPU+GPU, it wouldn't do them justice, as 28nm can pack a lot more transistors into smaller power envelopes.
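
Quick back-of-the-envelope on that point, using nothing but ideal geometric scaling (real chips never hit it, so treat it as an upper bound on how misleading a raw watts-to-watts comparison can be):

Code:
# Ideal transistor density gain going from the Wii U GPU's 40nm to 28nm.
node_old_nm = 40
node_new_nm = 28
ideal_density_gain = (node_old_nm / node_new_nm) ** 2
print(f"Ideal density gain, 40nm -> 28nm: ~{ideal_density_gain:.1f}x")
# ~2.0x: at the same power budget a 28nm part can in principle spend roughly
# twice the transistors, before even counting voltage/clock headroom.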

krizzx, I take your ignoring my point about Injustice running at 1080p on the 7th-gen dinosaurs to mean you now agree? Running a game like that isn't as hard as running large-world games.

Injustice in 1080p?

As in the 2013 game Injustice Gods Among Us?

I don't remember that. I have the Wii U version and that's 720p as far as I can tell. Granted, I don't have the PS360 version, but according to people in the various Injustice threads here the Wii U version is the better-looking one of the 3, so I really doubt that 1080p Injustice statement.
 
Injustice in 1080p?

As in the 2013 game Injustice Gods Among Us?

I don't remember that. I have the Wii U version and that's 720p as far as I can tell. Granted, I don't have the PS360 version, but according to people in the various Injustice threads here the Wii U version is the better-looking one of the 3, so I really doubt that 1080p Injustice statement.

I was honestly wondering that same thing (Injustice native 1080p since when?), but the comparison was already so off base that I didn't see the point in discussing it at all.

You can't get a more accurate correlation than PlayStation All-Stars Battle Royale to Super Smash Bros. U.
 
I also believe the Wii U is a little above PS360 in power, but the other features, if used correctly, will widen the perceived difference between current gen and Wii U.

I acknowledge the TDP and what has been revealed about the GPU in this thread.

The only thing I question and wonder is: what if the Wii U has this modest GPU and a solid CPU, but the memory bandwidth is the star of the system? Could it achieve more than the GPU alone would suggest?
 
Exactly, they look better while rendering at 1080p on the old 7th-generation consoles. How they move has no relevance to performance. My point still stands, then: those two consoles couldn't dream of doing something like Halo 4 or TLoU at 1080p, but since Injustice has a fixed world size there is less for the hardware to do.

"Dynamic and alive" backgrounds are missing my point entirely. Everything in a smaller space has less total detail than a large open-world game, does it not? I don't think the failure of communication is on my side, since you both seem to be playing into my point.

And I don't know if one of the above posts was directed at me, but I'm certainly not in the camp trying to make it seem weaker than the PS360. I'm just saying even those old, now-weak consoles can do a fixed-size side-scroller at 1080p, which confirms those games are easier to render at higher resolutions. This is not rocket science.

You're trying to pigeonhole the conversation to prove some point that everyone understands in the big picture of gaming overall, but that is disingenuous to the direct comparison krizzx was trying to make. If character models trump everything to the degree you suggest, then all I can say is Rise of the Robots says hello... The two games chosen for the comparison were chosen for a very good and fair reason.

Also, we should probably move away from the 'More realistic looking = better' train of thought. We as gamers have got to stop giving automatic points to graphics that attempt photorealism as though a character that tries to look like a real human is inherently better looking than a more fantasy based character.
 
I'm pretty sure if Injustice was 1080p native the developers would have trumpeted that fact from the rooftops, this wouldn't be the first thing we heard of it.
 
The two games chosen for the comparison were chosen for a very good and fair reason.

I'm not sure I can agree with that. Performance doesn't depend on what type of game it is but the graphics engine running it. If one is forward rendered and the other deferred then that alone makes them completely incomparable. Or, one could be rendered in HDR and the other not.

A quick googling suggests that All-Stars is using the God of War engine which both defers rendering and uses HDR. I've no idea what's in Smash Bros.
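
For anyone who hasn't run into the forward vs. deferred distinction before, here's a toy cost model I put together just to show why lots of dynamic lights push engines toward deferred. The overdraw factor, the G-buffer target count, and the whole idea of counting "ops" this way are my own simplifications, not how any real engine budgets its frame.

Code:
def forward_cost(pixels, overdraw, lights):
    # Forward: every rasterised fragment (including overdraw) shades every light.
    return pixels * overdraw * lights

def deferred_cost(pixels, overdraw, lights, gbuffer_targets=4):
    # Deferred: pay a flat G-buffer fill, then light each screen pixel once.
    gbuffer_fill = pixels * overdraw * gbuffer_targets
    lighting = pixels * lights
    return gbuffer_fill + lighting

px = 1280 * 720
for lights in (4, 16, 64):
    f = forward_cost(px, overdraw=2.5, lights=lights)
    d = deferred_cost(px, overdraw=2.5, lights=lights)
    print(f"{lights:3d} lights: forward {f/1e6:6.1f}M ops, deferred {d/1e6:6.1f}M ops")
# With few lights forward wins; by 16+ lights deferred pulls ahead, which is
# roughly why light-heavy games this gen (Killzone 2, BF3, etc.) went deferred.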
 
I'm not sure I can agree with that. Performance doesn't depend on what type of game it is but the graphics engine running it. If one is forward rendered and the other deferred then that alone makes them completely incomparable. Or, one could be rendered in HDR and the other not.

A quick googling suggests that All-Stars is using the God of War engine which both defers rendering and uses HDR. I've no idea what's in Smash Bros.

All-stars uses the Bluepoint engine.

http://bluepointgames.com/technology/bluepoint-engine/overview/
http://www.mobygames.com/game-group/game-engine-bluepoint-engine
 
I'm not sure I can agree with that. Performance doesn't depend on what type of game it is but the graphics engine running it. If one is forward rendered and the other deferred then that alone makes them completely incomparable. Or, one could be rendered in HDR and the other not.

A quick googling suggests that All-Stars is using the God of War engine which both defers rendering and uses HDR. I've no idea what's in Smash Bros.

Speaking of deferred rendering. wsippel commented recently that the Wii U seems to be set up with deferred rendering in mind. Correct me if I'm wrong, but I believe most current gen games are forward rendered (including the early Wii U ports).

Are games moving more towards the deferred rendering direction overall? Any word on this for early PS4/xbone games?
 
Yeah.

Gamecube:
[image]

OG Xbox:
[image]
Bear in mind, this is when Nintendo did give it their all for graphics but the Gamecube was still weaker than Xbox.
Edit: The arms were covering the Xbox so I added another pic.

No they didn't. The GC was a downgrade from the specs it originally had, including a ton of eDRAM that was unheard of for its time (or even now); Nintendo wanted 64MB of eDRAM. Neither system was fully pushed, least of all the Xbox, because MS pulled the plug on it too early.

Despite all the paper specs, both systems traded off in a bunch of areas, and when it comes to texturing or polys the Xbox didn't beat the GC in the products that actually came out.
 
Speaking of deferred rendering. wsippel commented recently that the Wii U seems to be set up with deferred rendering in mind. Correct me if I'm wrong, but I believe most current gen games are forward rendered (including the early Wii U ports).

Are games moving more towards the deferred rendering direction overall? Any word on this for early PS4/xbone games?

Many current generation games already use deferred rendering. Note that the Killzone 2 presentation is from '07 - lots of game engines support it now.

http://www.guerrilla-games.com/publications/dr_kz2_rsx_dev07.pdf
http://dice.se/publications/spu-based-deferred-shading-in-battlefield-3-for-playstation-3/
 
Speaking of deferred rendering. wsippel commented recently that the Wii U seems to be set up with deferred rendering in mind. Correct me if I'm wrong, but I believe most current gen games are forward rendered (including the early Wii U ports).

Are games moving more towards the deferred rendering direction overall? Any word on this for early PS4/xbone games?

Many current generation games already use deferred rendering. Note that the Killzone 2 presentation is from '07 - lots of game engines support it now.

http://www.guerrilla-games.com/publications/dr_kz2_rsx_dev07.pdf
http://dice.se/publications/spu-based-deferred-shading-in-battlefield-3-for-playstation-3/

No one claimed otherwise. I could be wrong, but I don't think that was what he was asking.

That information itself is of no use without more context.

Edit to add more information: Yes. A lot of engines use deferred rendering, people didn't wait for next gen to start on it though. "Most" games might still be forward rendered on current gen but it's not like 99%/1%.

I'm not sure I see anything in the Wii U that makes me think it was built for deferred rendering. It's nice that they have 32MB of eDRAM vs. the 10MB in the 360, and maybe it can render to more render targets at once? I guess I see those as general GPU improvements and not "it's MADE for deferred rendering"... although it is true the technique became popular while the system was in R&D. If there are limitations that force all buffers to be in eDRAM for the GPU to see them, it seems like that would be a drawback? We've never gotten much information about that, aside from the fact that devs do not have direct access to the eDRAM, right?
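
To put the 32MB vs. 10MB figures in context, here's a rough G-buffer footprint calculation. The layout (4 colour targets plus depth, 4 bytes each) is just a generic example I picked; actual games pack their G-buffers differently.

Code:
def gbuffer_mb(width, height, colour_targets=4, bytes_per_target=4, depth_bytes=4):
    per_pixel = colour_targets * bytes_per_target + depth_bytes   # 20 bytes/pixel here
    return width * height * per_pixel / (1024 * 1024)

print(f"720p G-buffer:  {gbuffer_mb(1280, 720):.1f} MB")    # ~17.6 MB
print(f"1080p G-buffer: {gbuffer_mb(1920, 1080):.1f} MB")   # ~39.6 MB
# ~17.6 MB fits in the Wii U's 32MB of eDRAM but not in the 360's 10MB; at
# 1080p even 32MB wouldn't hold this particular layout.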
 
Well that's your choice I suppose. I'm just basing this off the facts we have in front of us.

Even in the most optimistic of speculations, it's closer to PS360. All evidence (i.e. reading this entire thread) points to it being basically impossible for it to be closer to the power of PS4Bone than to PS360. Plus, as StevieP pointed out: physics.

What evidence is there that it's closer to PS4Bone, honestly? And why does it matter anyway? Nintendo clearly didn't design this thing to compete in terms of raw power with the other two; Iwata said as much himself, I believe. They designed it (for better or worse) to compete on the gaming experience they can provide. And in any case, visually the gap will likely not be nearly as striking as last gen. (Warning: that last sentence contains subjectivity.)

The Wii U is like a perfected 7th-gen console. The GPU is quite a piece of work for what, 15 or 20 watts at the max. Makes you wonder what they could have pulled off if they had dropped the gamepad and opted for a beefier GPU... or even just designed a more robust cooling system (making the system larger) and run the GPU faster.
 
Many current generation games already use deferred rendering. Note that the Killzone 2 presentation is from '07 - lots of game engines support it now.

http://www.guerrilla-games.com/publications/dr_kz2_rsx_dev07.pdf
http://dice.se/publications/spu-based-deferred-shading-in-battlefield-3-for-playstation-3/

Most, if not all, major engines are deferred nowadays. UE3, RAGE, Frostbite, and CryEngine3 being prominent examples.

Xbox 360's eDRAM was originally supposed to allow MSAA on all titles through the use of tiling. The move towards deferred rendering (which is incompatible with tiling) was pretty much the reason it stopped.
 

I don't get it.

He asked if "most" games were forward rendered.

You simply stated that there were many which didn't really answer that question. That's just pointing to the simple existence of such games, not an amount. No one questioned if such games existed. We are all aware of that I think.
 
I don't get it.

He asked if "most" games were forward rendered.

You simply stated that there were many which didn't really answer that question. That's just pointing to the simple existence of such games, not an amount. No one questioned if such games existed. We are all aware of that I think.

Going by this list:

http://en.wikipedia.org/wiki/Deferred_shading

I'd guess that today, the majority of AAA games use some form of deferred shading (light pre-pass or full deferred shading).
 
Most, if not all, major engines are deferred nowadays. UE3, RAGE, Frostbite, and CryEngine3 being prominent examples.

Xbox 360's eDRAM was originally supposed to allow MSAA on all titles through the use of tiling. The move towards deferred rendering (which is incompatible with tiling) was pretty much the reason it stopped.
Xenos did not officially support tiling until later versions of its software stack. Before that, tiling was all done by the game engines, in the exact same way that article describes; basically, MS put the algorithm in the system stack. Bottom line: Xenos is not a TBDR design, and all tiling on it incurs costs. Xenos' successor Yamato (aka Adreno) is more of a tiler, but again, it's hardware-assisted tiling, not a TBDR per se.
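
As a quick illustration of why tiling was needed on the 360 in the first place: the whole colour + depth buffer has to sit in the 10MB of eDRAM. Assuming the usual 8 bytes per sample (4 colour + 4 depth/stencil), the numbers at 720p look like this.

Code:
import math

EDRAM_MB = 10

def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * msaa_samples * bytes_per_sample / (1024 * 1024)

for samples in (1, 2, 4):
    size = framebuffer_mb(1280, 720, samples)
    tiles = math.ceil(size / EDRAM_MB)
    print(f"720p {samples}xMSAA: {size:5.1f} MB -> {tiles} tile(s)")
# 1x ~7.0 MB fits; 2x ~14.1 MB needs 2 tiles; 4x ~28.1 MB needs 3 tiles, and
# every extra tile means re-submitting geometry - which is exactly what fat
# deferred G-buffers made painful.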
 
Xenos did not officially support tiling until later versions of its software stack. Before that, tiling was all done by the game engines, in the exact same way that article describes; basically, MS put the algorithm in the system stack. Bottom line: Xenos is not a TBDR design, and all tiling on it incurs costs. Xenos' successor Yamato (aka Adreno) is more of a tiler, but again, it's hardware-assisted tiling, not a TBDR per se.

So uhh, out of curiosity, how much later are you talking here?
 
Going by this list:

http://en.wikipedia.org/wiki/Deferred_shading

I'd guess that today, the majority of AAA games use some form of deferred shading (light pre-pass or full deferred shading).

Going by that list, the majority "don't".

30 games in total used deferred lighting this gen.

16 used deferred shading.

9 game engines have support for it.

According to the page, it is heavily memory intensive and geared toward DX10+ hardware which the PS3/360 were not. They could do it, but only to a certain degree.

He was correct. The majority of games used forward rendering.
 
Comparing Bayonetta to Bayonetta 2 is inherently flawed simply because Bayonetta came out four years ago, and was Platinum's first major iteration of the Platinum Engine. On the 360/PS3 they had the chance to refine the engine three times over with Vanquish, Max Anarchy, and Revengeance. And they've further refined it for Wii U hardware with The Wonderful 101. It will look better on the principle that the engine itself is more capable these many years later.
 
Going by that list, the majority "don't".

30 games in total used deferred lighting this gen.

16 used deferred shading.

9 game engines have support for it.

He was correct. The majority of games used forward rendering.

For someone who keeps telling people that they have reading comprehension problems...from that link:

"Deferred lighting in commercial games

Use of the technique has increased in video games because of the control it enables in terms of using a large amount of dynamic lights and reducing the complexity of required shader instructions. Some examples of games using deferred lighting are:"

That's right, there are games that use deferred rendering techniques that are not on that list! I may have even worked on a couple of them!

But if I am wrong, please count the total number of games up and prove me wrong! After all, it's just a hunch on my part - I'll be waiting :)
 
For someone who keeps telling people that they have reading comprehension problems...from that link:

"Deferred lighting in commercial games

Use of the technique has increased in video games because of the control it enables in terms of using a large amount of dynamic lights and reducing the complexity of required shader instructions. Some examples of games using deferred lighting are:"

That's right, there are games that use deferred rendering techniques that are not on that list! I may have even worked on a couple of them!

But if I am wrong, please count the total number of games up and prove me wrong! I'll be waiting :)

Come again?

I read that quite clearly. That is where I got the list. Where is the error in my statement?

Also I don't tell people they have "reading comprehension problems". Asking someone to reread something or telling them they read it wrong is not the same as saying they have a problem with comprehension. The problem was with their interpretation, as some often completely miss the point and purpose of various statements.
 
The Wii U is like a perfected 7th-gen console. The GPU is quite a piece of work for what, 15 or 20 watts at the max. Makes you wonder what they could have pulled off if they had dropped the gamepad and opted for a beefier GPU... or even just designed a more robust cooling system (making the system larger) and run the GPU faster.

I think the 550MHz number for the GPU and the tri-core CPU are what make most people place it in the 7th gen. But if the GPU has compute cores, faster shader core clocks with 64KB SRAM blocks, a modern tessellation unit, and eDRAM that could very well beat PS4 GDDR5 in bandwidth (though not in clock), then I would say the GPU is misleading on first impression.
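
On the bandwidth point: peak bandwidth is just bus width times data rate, which is why clock alone says so little. The PS4 GDDR5 figure below is public; the Wii U's eDRAM bus width has never been confirmed, so the two eDRAM lines are pure what-ifs to show how a wide on-die bus at a modest clock can still post big numbers.

Code:
def bandwidth_gb_s(bus_bits, transfers_per_s):
    return bus_bits / 8 * transfers_per_s / 1e9

print(f"PS4 GDDR5, 256-bit @ 5.5 GT/s:    {bandwidth_gb_s(256, 5.5e9):.0f} GB/s")   # ~176
print(f"What-if 1024-bit eDRAM @ 550 MHz: {bandwidth_gb_s(1024, 550e6):.0f} GB/s")  # ~70
print(f"What-if 2048-bit eDRAM @ 550 MHz: {bandwidth_gb_s(2048, 550e6):.0f} GB/s")  # ~141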
 
I think the 550MHz number for the GPU and the tri-core CPU are what make most people place it in the 7th gen. But if the GPU has compute cores, faster shader core clocks with 64KB SRAM blocks, a modern tessellation unit, and eDRAM that could very well beat PS4 GDDR5 in bandwidth (though not in clock), then I would say the GPU is misleading on first impression.

Using numbers like those as markers is extremely misleading.

Cores and hertz are in no way the biggest measures of strength and capability anymore.

You can take 5 different models of dual- or quad-core processors and get a huge variance in performance, in the hundreds of percent.

The technology backing the units and the way in which they handle data is the most important thing in modern times. Take lostinblue's explanation, for example.

http://www.neogaf.com/forum/showpost.php?p=81275949&postcount=951


You quoted the list and quoted me saying I got the list from the link

?
 
Using numbers like those is infinitely misleading.

Cores and hertz are in no way the biggest measures of strength.

You can take 5 different models of dual-core processors and get a huge variance in performance, in the hundreds of percent.



You quoted the list and quoted me saying I got the list from the link

?

When comparing identical parts, clock speed is great. Ultimately in GPU's the Wii U, Xbone and PS4 are inherently quite similar. Having a low clock speed is a major detriment. We cannot use it to get an exact percentage of how less powerful it is, but we can tell just by the speed alone that the Wii U doesn't hold a candle to the more sophisticated chips.
 
Come again?

I read that quite clearly. That is where I got the list. Where is the error in my statement?


Going by that list, the majority "don't".

30 games in total used deferred lighting this gen.

16 used deferred shading.

9 game engines have support for it.

According to the page, it is heavily memory intensive and geared toward DX10+ hardware which the PS3/360 were not. They could do it, but only to a certain degree.

He was correct. The majority of games used forward rendering.

Are you seriously using a partial, non-comprehensive list of games and engines (and not even considering how many games are shipped on the engines listed that aren't listed in the list of games themselves) to try to prove your point about all games? That sounds like a pretty definitive statement on your part.

We may never know for sure unless the games reverse engineered, as many developers never give public statements about their rendering technology.

Also I don't tell people they have "reading comprehension problems". Asking someone to reread something or telling them they read it wrong is not the same as saying they have a problem with comprehension. The problem was with their interpretation, as some often completely miss the point and purpose of various statements.

???
 
When comparing identical parts, clock speed is great. Ultimately in GPU's the Wii U, Xbone and PS4 are inherently quite similar. Having a low clock speed is a major detriment. We cannot use it to get an exact percentage of how less powerful it is, but we can tell just by the speed alone that the Wii U doesn't hold a candle to the more sophisticated chips.

Not at all.

The GTX 780 has a significantly lower clock than the GTX 770 but beats it all around in all benchmarks.

As I've posted before:

A 3 GHz Athlon XP will outperform a 3 GHz Pentium D
A 3 GHz Core 2 Duo will outperform a 3 GHz Athlon XP
A 3 GHz Core iX will outperform a Core 2 Duo
A 3 GHz Core iX 2nd gen will outperform a Core iX 1st gen
A 3 GHz Core iX 3rd gen will outperform a Core iX 2nd gen

Clocks don't tell you much about capability anymore. If they did, the original Xbox CPU would have been stronger than the GC's, but it wasn't by a long shot; the GC CPU dwarfed the Xbox CPU even though it had a lower clock.

http://www.hwcompare.com/14625/geforce-gtx-770-vs-geforce-gtx-780/
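
To put numbers on that 770 vs. 780 example: single-precision throughput is shader count x 2 FLOPs x clock. I'm using Nvidia's base clocks and core counts below; real benchmark gaps also hinge on bandwidth, ROPs and so on, so this is only the headline figure.

Code:
def gflops(cores, clock_ghz):
    return cores * 2 * clock_ghz        # 2 FLOPs per core per cycle (FMA)

print(f"GTX 770: 1536 cores @ 1.046 GHz -> {gflops(1536, 1.046):.0f} GFLOPS")  # ~3213
print(f"GTX 780: 2304 cores @ 0.863 GHz -> {gflops(2304, 0.863):.0f} GFLOPS")  # ~3977
# The 780 clocks noticeably lower yet still comes out well ahead, which is the
# point: clock speed on its own doesn't tell you the whole story.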

Are you seriously using a partial, non-comprehensive list of games and engines (and not even considering how many games are shipped on the engines listed that aren't listed in the list of games themselves) to try to prove your point about all games?



???

Loaded question.

I made no such point.
 
When comparing identical parts, clock speed is great. Ultimately in GPU's the Wii U, Xbone and PS4 are inherently quite similar. Having a low clock speed is a major detriment. We cannot use it to get an exact percentage of how less powerful it is, but we can tell just by the speed alone that the Wii U doesn't hold a candle to the more sophisticated chips.

Nobody answered this when I asked.

I acknowledge the TDP and what has been revealed about the GPU in this thread.

The only thing I question and wonder is: what if the Wii U has this modest GPU and a solid CPU, but the memory bandwidth is the star of the system? Could it achieve more than the GPU alone would suggest?

If the GPU is fed faster and with less latency, can it punch above its weight?

Actually, thinking about this phrase "punch above its weight" used by some devs when talking about the Wii U, I think it's trying to answer the above question. I would say it means performing above what you'd expect just from looking at the hardware.
 
Not at all.

The GTX 780 has a lower clock than the GTX 770 but beats it all around in all benchmarks.

http://www.hwcompare.com/14625/geforce-gtx-770-vs-geforce-gtx-780/



Loaded question.

I made no such point.

The GTX 780 is a more sophisticated chip. We know that the Xbone and PS4 are more sophisticated chips but not the exact details. I mentioned they were more sophisticated chips. So just by judging clock speed we know the Wii U is less powerful.


When comparing identical parts, clock speed is great. Ultimately in GPU's the Wii U, Xbone and PS4 are inherently quite similar. Having a low clock speed is a major detriment. We cannot use it to get an exact percentage of how less powerful it is, but we can tell just by the speed alone that the Wii U doesn't hold a candle to the more sophisticated chips.
 
You quoted the list and quoted me saying I got the list from the link

?
You said 30 games this gen used deferred lighting. In other words, the games listed on the Wikipedia page are the only ones that used deferred lighting. But the page says these are just some examples of games that use it. Therefore you are wrong, or lying.
 
Going by that list, the majority "don't".

30 games in total used deferred lighting this gen.

16 used deferred shading.

9 game engines have support for it.

According to the page, it is heavily memory intensive and geared toward DX10+ hardware which the PS3/360 were not. They could do it, but only to a certain degree.

He was correct. The majority of games used forward rendering.

Loaded question.

I made no such point.

???

What DID you mean?
 
???

What DID you mean?

It was a list of console games.

I'm sure a ton of PC games use deferred rendering, as they have no memory limitations to worry about, or any of the other issues the page presented. The list clearly wasn't covering PC games, so I know it wasn't complete. There were also a few that clearly weren't on the list, like PlayStation All-Stars, but "you" provided the material so that is what I will go by.

The text made it clear that most console games would not have used deferred rendering due to various constraints, which is the original question that all of these posts spawned from, in case you forgot. You are the one who is trying to prove otherwise, and the info you have posted has failed to do so.

Most games did not use deferred rendering on the last-gen consoles.
 
I'm seriously close to just calling people out here... Deferred rendering has been used pretty much over this entire current generation. It's an incredibly well established fact.

It's the graphics technique that defined this generation.
 
Console games.

I'm sure a ton of PC games use deferred rendering, as they have no memory limitations to worry about. The list clearly wasn't covering PC games.

The text made it clear that most console games would not have used deferred rendering due to various constraints, which is the original question that all of these spawned from, in case you forgot.

The Wikipedia list again, with console games and engines bolded (might have missed a few):

Use of the technique has increased in video games because of the control it enables in terms of using a large amount of dynamic lights and reducing the complexity of required shader instructions. Some examples of games using deferred lighting are:
Alan Wake
Assassin's Creed 3[12]
Bioshock Infinite[13]
Blur
Brink
Crackdown and Crackdown 2[14]
Crysis 2[15]
Dead Space,[16] Dead Space 2[17] and Dead Space 3[18]
Deus Ex: Human Revolution [19]
Dragon's Dogma [20]
Grand Theft Auto IV
Halo: Reach [21]
inFamous and inFamous 2
LittleBigPlanet and LittleBigPlanet 2[22]
Shift 2 UNLEASHED [23]
Stalker: Shadow of Chernobyl, Clear Sky and Call of Prypiat[24]
Red Dead Redemption
Resistance series[25]
Rochard
StarCraft II [26]
Uncharted and Uncharted 2[27]
Vanquish [28]
Deferred shading in commercial games

In comparison to deferred lighting, this technique is not very popular due to high memory size and bandwidth requirements. Especially on consoles, where memory size and bandwidth are very limited and often a bottleneck.
Amnesia: The Dark Descent[29]
Battlefield 3[30]
Dungeons
Killzone 2 and Killzone 3[31]
Mafia 2
Miner Wars 2081[32]
Metro 2033[33]
Rift
Shrek[34]
Splinter Cell: Conviction
Tabula Rasa[35]
Trine [36]
Trine 2 [37]
Viva Pinata
Dota 2[38]
Deferred techniques in game engines

CryEngine 3 [39]
I-Novae [40]
Unity [41]
Frostbite 2 [42]
Unreal Engine 3 [43]
Chrome Engine
MT Framework
GameStart [44]
Source (game engine)[45]
 
I'm seriously close to just calling people out here... Deferred rendering has been used pretty much over this entire current generation. It's an incredibly well established fact.

It's the graphics technique that defined this generation.

I am not questioning whether it was used at all. It's clear that it was.

The question is did "most" games use it.


And so, how many games does that make, vs how many PS3/360 games were released?

Once again, the question is: did "most" games use forward rendering?
 