WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Jesus man calm down...

The guy was just pointing out that if Nintendo did improve the character models, then why haven't they mentioned it in Directs etc. His point about OoT 3DS is a very good one.

How is it any more off topic than you posting PS All Stars and LBP Karting screenshots?

For someone who moans about being attacked all the time, you don't half jump down people's throats when they post something that doesn't fall into your view of the GPU.

Anyway...

Iwilliams3 mentioned that Link's character model is 83,000 polygons; how many polygons are in the average main character model nowadays?

I also wonder if it's technically possible to have this game run at 60fps considering the extra effects it seems to be running. I know it was a design choice to be 30fps but it would be interesting to know if the console could handle it when it struggled to run MH3 Ultimate at anything above 50fps.

Jesus man, stop implying that I am raging and blowing my statement out of proportion when I'm not expressing any remote form of anger and am just asking a question...

That is really annoying.

Neither of those demos use ray tracing.
How do you know this?

Link's jaw-line doesn't look as pointy, but that could be due to jaggies on the GC version...

Now that you are mentioning it, I could very well be mistaking the aliasing reduction for added roundness.

MH3U was a first gen game. We already have confirmation that dev kit tools were unfinished even after launch, well after the game was done. I really don't know what you expect.

That is what they do. They limit the Wii U to its worst ports, as if it was maxed out at launch. It's nothing new, to be honest. They love using the launch ports as "proof" of it not being much stronger than the last-gen consoles, though only the bad ones. Ports like Trine 2, where the dev explicitly stated it had better frame rate, texture quality and geometry on top of the game no longer being able to run on the PS3/360 without downgrades, or NFS Most Wanted U, never come up.
 
You're joking, right? Then explain the real-time reflections on the fish as it comes out of the water, or the real-time reflections on the floor in the temple where Link is fighting Gohma.

I am all ears.

There's a trick that devs used to do in the GameCube era (and some still do it today).

Basically, they take the same model of the room and then they mirror it in whatever direction the mirrored image is supposed to be; then they make the "reflective" surface semi-transparent. I'll post an image example when I find one.

EDIT: https://www.youtube.com/watch?v=ILesaBQ8kos Note the reflections in the mirrors.
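Here's a rough sketch of the trick in Python-style pseudocode; it isn't taken from any actual game, and draw_mesh is just a stand-in for whatever draw call the renderer exposes. The idea is simply: duplicate the room geometry, flip it below the reflective plane, then draw the floor semi-transparently on top so the flipped copy reads as a reflection.

def mirror_about_floor(vertices, floor_y=0.0):
    # Reflect each (x, y, z) vertex across the plane y = floor_y.
    return [(x, 2.0 * floor_y - y, z) for (x, y, z) in vertices]

def draw_scene(draw_mesh, room_verts, floor_verts, floor_y=0.0):
    # 1. Draw the mirrored copy of the room first; it sits "under" the floor.
    draw_mesh(mirror_about_floor(room_verts, floor_y), alpha=1.0)
    # 2. Draw the semi-transparent floor over it; the alpha blend is what
    #    makes the flipped geometry look like a reflection.
    draw_mesh(floor_verts, alpha=0.6)
    # 3. Draw the real room normally, on top.
    draw_mesh(room_verts, alpha=1.0)

The obvious cost is that you pay for the mirrored geometry a second time, which is why the trick tends to show up in small, enclosed rooms.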
 
There's a trick that devs used to do in the GameCube era (and some still do it today).

Basically, they take the same model of the room and then they mirror it in whatever direction the mirrored image is supposed to be; then they make the "reflective" surface semi-transparent. I'll post an image example when I find one.

Honestly, how many types could it be? There are only a handful of reflective techniques.
There is sphere mapping, cube mapping or HEALPix.

What difference does this make, though? I keep hearing people say "it's just cube mapping" as if cube mapping is suddenly something bad, the same way they did with bloom on the Wii and the way I've seen people try to do with FXAA on the Wii U.
 
Honestly, how many types could it be? There are only a handful of reflective techniques.
There is sphere mapping, cube mapping or HEALPix.

Which are all common reflection maps.

What difference does this make, though? I keep hearing people say "it's just cube mapping" as if cube mapping is suddenly something bad, the same way they did with bloom on the Wii and the way I've seen people try to do with FXAA on the Wii U.

Because claiming it's ray tracing is disingenuous. Almost any device can do cube maps. Most things can't do ray tracing.
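For what it's worth, the cube-map lookup itself is tiny, which is exactly why almost anything can do it. A minimal sketch in plain Python (tuples for vectors, the standard dominant-axis face-selection rule; not any particular graphics API):

def reflect(incident, normal):
    # R = I - 2 (N . I) N, with N assumed to be unit length
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

def cube_map_face(direction):
    # Pick the cube face whose axis dominates the reflected direction.
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"

# Example: a view ray hitting an upward-facing floor
view_dir = (0.0, -1.0, 1.0)
normal = (0.0, 1.0, 0.0)
r = reflect(view_dir, normal)
print(r, cube_map_face(r))  # the reflected ray points up, so it samples the +y face

The expensive part isn't the lookup; it's keeping the cube map itself up to date when the scene changes.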
 
Ok, just to update what you said earlier:
My slightly different take :)

- Added non-anamorphic 16:9, renders 1080p
- Reworked graphical overhead from Flipper/TEV Pipeline to a modern GPU/programmable shader environment
- High Dynamic Range (HDR)
- Screen space Ambient Occlusion (SSAO)
- Self-shadowing
- Removed horrible fake-DOF blur filter
- Loads up entire ocean at once to improve traveling/pacing
- 32-bit color framebuffer (up from 16-bit with dithering)
- Redone sky and clouds
- Higher-resolution texture assets

Nothing wrong with the way you were wording it all, though.
Has the texture work also been upgraded with the resolution bump? Red Lion looks a bit smoother than the original.
It did, notice the background texture:

[image: u1kBCUU.gif]


It's not even the same asset, it was clearly painted over, reinterpreted; doesn't 100% match.


Other comparison, since we've been at it:

[image: E3Zznsv.jpg]


[image: 0144.png]



It's a crying shame they didn't spruce up geometry a little.
 
There's a trick that devs used to do in the GameCube era (and some still do it today).
I reckon the GameCube could pull it off without mirroring geometry (it supported cube mapping, for instance); the PS2 and earlier couldn't (they did it on the N64 in Banjo-Tooie too), so games like Metal Gear Solid 2 did that. Whether it was too costly, or whether the other method was advantageous, I don't know; but Rogue Squadron 2 and others used it.

As for WW in particular, I dunno.

EDIT: Hmm, yeah, probably doubled. Too crisp for GC's framebuffer. Such reflections were usually blurred samples (ie: low res).

[image: g7SkCkS.png]
 
The GameCube could pull it off without mirroring geometry; the PS2 and earlier couldn't (they did it on the N64 in Banjo-Tooie too), so games like Metal Gear Solid 2 did that.

As for WW in particular, I dunno.

TimeSplitters did it, MGS: The Twin Snakes did it, Twilight Princess did it IIRC (the Temple of Time I think, or there was somewhere with really reflective floors. I don't remember).
 
Cube/environment maps can do that.
http://en.wikipedia.org/wiki/Environment_mapping

Crash Bandicoot on PS1 had something similar for the ice.
http://www.youtube.com/watch?v=zV4BJju6Jd8&feature=player_detailpage#t=31

JordaN, how you confuse real ray tracing with cube mapping is beyond me. Cube mapping cannot reflect MOVING objects in real time; plus, you can only use one source of light with cube mapping, and it's clear in the tech demo you have at least a dozen.

Here is a quote you forgot to mention in your post from the source you provided.

Disadvantages

If a new object or new lighting is introduced into the scene, or if some object that is reflected in it is moving or changing in some manner, then the reflection changes and the cube map must be re-rendered. When the cube map is affixed to an object that moves through the scene, then the cube map must also be re-rendered from that new position.

If they used cube mapping in the tech demo, then the Wii U is a beast in the rendering department and it is more capable by your logic.

Real time ray tracing isn't plausible.

At most, it's a screen space reflection.

Here is a video of the real time ray tracing capabilities of a Kepler GPU...
https://www.youtube.com/watch?v=h5mRRElXy-w

Real-time ray tracing is possible even on the PS3 and Xbox 360

http://www.youtube.com/watch?v=eG7UGzNAMXQ

They just do not use it because it is very heavy on the hardware.

The Wii U GPU is more modern and runs ray tracing more than fine in the tech demos Nintendo showed at E3 2011. If you were objective on the matter you would at least acknowledge that in the Zelda tech demo the light sources from the torches and windows are reflected in real time on the floor and on Link's model, and that the fairy (a MOVING source of light) is always reflected on the floor, walls and character model. Also, light diffusion happens on the floor of the temple as light is cast from the windows.

Advantages over other rendering methods

Ray tracing's popularity stems from its basis in a realistic simulation of lighting over other rendering methods (such as scanline rendering or ray casting). Effects such as reflections and shadows, which are difficult to simulate using other algorithms, are a natural result of the ray tracing algorithm. Relatively simple to implement yet yielding impressive visual results, ray tracing often represents a first foray into graphics programming. The computational independence of each ray makes ray tracing amenable to parallelization.[4]

But nice try representing ray tracing as something that needs a Kepler GPU to be implemented in games.
 
JordaN, how you confuse real ray tracing with cube mapping is beyond me. Cube mapping cannot reflect MOVING objects in real time; plus, you can only use one source of light with cube mapping, and it's clear in the tech demo you have at least a dozen.

Here is a quote you forgot to mention in your post from the source you provided.

If they used cube mapping in the tech demo, then the Wii U is a beast in the rendering department and it is more capable by your logic.

Real-time ray tracing is possible even on the PS3 and Xbox 360

http://www.youtube.com/watch?v=eG7UGzNAMXQ

They just do not use it because it is very heavy on the hardware.

The Wii U GPU is more modern and runs ray tracing more than fine in the tech demos Nintendo showed at E3 2011. If you were objective on the matter you would at least acknowledge that in the Zelda tech demo the light sources from the torches and windows are reflected in real time on the floor and on Link's model, and that the fairy (a MOVING source of light) is always reflected on the floor, walls and character model. Also, light diffusion happens on the floor of the temple as light is cast from the windows.

Advantages over other rendering methods

But nice try representing ray tracing as something that needs a Kepler GPU to be implemented in games.

Dude... You're saying I'm not objective? I don't think you understand the processing requirements of ray tracing at that high a resolution.
Lol. Keep thinking that that was real time ray tracing.

EDIT: The PS3 was capable of real-time ray tracing when 2 other PS3s were connected... Each Cell has around 180 GFLOPS of SPE performance (SIMD-based). That is 540 GFLOPS total of some of the best SIMD processing we've seen in years.
http://www.youtube.com/watch?v=oLte5f34ya8
 
JordaN, how you confuse real ray tracing with cube mapping is beyond me.
I actually didn't say it was ray tracing. I grouped the two together because that's what I commonly associate the term reflection mapping with (I even linked to just the first wiki page), not because of that weakness.

But I edited it out anyway.
 
JordaN, how you confuse real ray tracing with cube mapping is beyond me. Cube mapping cannot reflect MOVING objects in real time; plus, you can only use one source of light with cube mapping, and it's clear in the tech demo you have at least a dozen.

Here is a quote you forgot to mention in your post from the source you provided.

If they used cube mapping in the tech demo, then the Wii U is a beast in the rendering department and it is more capable by your logic.

Real-time ray tracing is possible even on the PS3 and Xbox 360

http://www.youtube.com/watch?v=eG7UGzNAMXQ

They just do not use it because it is very heavy on the hardware.

The Wii U GPU is more modern and runs ray tracing more than fine in the tech demos Nintendo showed at E3 2011. If you were objective on the matter you would at least acknowledge that in the Zelda tech demo the light sources from the torches and windows are reflected in real time on the floor and on Link's model, and that the fairy (a MOVING source of light) is always reflected on the floor, walls and character model. Also, light diffusion happens on the floor of the temple as light is cast from the windows.

Advantages over other rendering methods

But nice try representing ray tracing as something that needs a Kepler GPU to be implemented in games.

Ray casting is being used all the time. Ray tracing is still a ways off in gaming situations. The more complex a scene the more costly ray tracing is. phosphor112 is probably right. It's more than likely SSR.

That's not to write off the future for actual ray tracing, but it isn't there yet. Especially as we ramp up asset model complexity and asset diversity. There's a reason MU is the first Pixar movie to use it instead of just approximating it. Because it's a giant resource hog.
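To put the cost argument in concrete terms, here's a bare-bones skeleton of the Whitted-style recursion (scene.intersect and hit.shade are placeholder stubs, not a real renderer): every pixel needs at least one primary ray, and every reflective hit traces the whole scene all over again.

def trace(ray_origin, ray_dir, scene, depth=0, max_depth=3):
    # Give up after a few bounces, or the recursion never ends on shiny scenes.
    if depth > max_depth:
        return (0.0, 0.0, 0.0)
    hit = scene.intersect(ray_origin, ray_dir)  # the expensive part: test against all geometry
    if hit is None:
        return scene.background
    color = hit.shade()  # direct lighting at the hit point
    if hit.reflectivity > 0.0:
        # Recurse: the reflected ray is intersected against the *whole* scene again.
        bounced = trace(hit.point, hit.reflected_dir, scene, depth + 1, max_depth)
        color = tuple(c + hit.reflectivity * b for c, b in zip(color, bounced))
    return color

Every extra object in the scene makes each of those intersect calls more expensive, which is why asset complexity and ray tracing cost scale so badly together.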
 
Yeah Screen Space Reflection seems to be the most plausible in this case. It doesn't look as rough as realtime cubemapping that usually leverages a Render To Texture feature...

The quality looks too high for SSR. SSR would have more artefacts around the screen edges and where objects get out of view. It's probably just dynamic cubemaps.
 
Yeah Screen Space Reflection seems to be the most plausible in this case. It doesn't look as rough as realtime cubemapping that usually leverages a Render To Texture feature...

Screen space reflection is for objects, not light sources. Here is a perfect example I did not know about: Pikmin 3 uses ray tracing.

[image: zlCfzRDV-5UNFquzpa]

See how the sunset light hits the water and how the source is ray traced onto the Onion.

Let's see how you can make that with screen space reflection.
 
Screen space reflection is for objects, not light sources. Here is a perfect example I did not know about: Pikmin 3 uses ray tracing.

[image: zlCfzRDV-5UNFquzpa]

See how the sunset light hits the water and how the source is ray traced onto the Onion.

Let's see how you can make that with screen space reflection.

What the fuck are you talking about? You can screen space light sources just fine.

http://youtu.be/JWvgETOo5ek?t=1m52s

See that? That's NOT ray tracing. That's an SSR.
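For anyone unsure what SSR actually does differently: instead of intersecting rays against scene geometry, you march the reflected ray through the depth buffer you already rendered and reuse whatever colour is already on screen where it lands. A rough sketch (the buffers are plain nested lists here, not any real engine's API), which also shows where the edge-of-screen artefacts come from:

def ssr_sample(pixel_pos, view_depth, reflect_dir_ss, depth_buffer, color_buffer,
               steps=32, step_size=1.0):
    # reflect_dir_ss is the reflection direction already projected into
    # screen space: (dx pixels, dy pixels, dz view depth) per step.
    x, y = pixel_pos
    z = view_depth
    for _ in range(steps):
        x += reflect_dir_ss[0] * step_size
        y += reflect_dir_ss[1] * step_size
        z += reflect_dir_ss[2] * step_size
        if not (0 <= int(x) < len(depth_buffer[0]) and 0 <= int(y) < len(depth_buffer)):
            return None  # ray left the screen: no data to reflect (the classic SSR artefact)
        if depth_buffer[int(y)][int(x)] <= z:
            return color_buffer[int(y)][int(x)]  # ray passed behind something on screen: treat as a hit
    return None

It can only ever reflect what's already visible in the frame, which is why off-screen objects and screen edges break it.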
 
Screen space reflection is for objects, not light sources. Here is a perfect example I did not know about: Pikmin 3 uses ray tracing.

[image: zlCfzRDV-5UNFquzpa]

See how the sunset light hits the water and how the source is ray traced onto the Onion.

Let's see how you can make that with screen space reflection.

That's just a specular map.
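"Just a specular map" in practice means a plain Blinn-Phong style highlight scaled by a per-texel specular value, with no rays involved. A minimal sketch (vectors are tuples, the numbers are made up):

import math

def blinn_phong_specular(normal, light_dir, view_dir, spec_map_value, shininess=32.0):
    # Half vector between the light and view directions, normalized.
    half = tuple(l + v for l, v in zip(light_dir, view_dir))
    length = math.sqrt(sum(h * h for h in half)) or 1.0
    half = tuple(h / length for h in half)
    n_dot_h = max(0.0, sum(n * h for n, h in zip(normal, half)))
    # The texture's specular value just scales the highlight up or down per texel.
    return spec_map_value * (n_dot_h ** shininess)

# Low sun over water: the highlight peaks where the half vector lines up with the normal.
print(blinn_phong_specular((0, 1, 0), (0, 0.3, -1), (0, 0.3, 1), spec_map_value=1.0))

That kind of bright streak where the sun, the surface and the camera line up is cheap enough that even fixed-function hardware could fake it.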
 
Screen space reflection is for objects, not light sources. Here is a perfect example I did not know about: Pikmin 3 uses ray tracing.

[image: zlCfzRDV-5UNFquzpa]

See how the sunset light hits the water and how the source is ray traced onto the Onion.

Let's see how you can make that with screen space reflection.

Looks just like the stuff I was doing in Photoshop 5 in high school... it's basic light reflections in that scene. My Windows 98 SE AMD laptop handled the calculations on the CPU.
 
What the fuck are you talking about? You can screen space light sources just fine.

the more you know

http://wiki.unity3d.com/index.php/SurfaceReflection

Usage

Prerequisites: This technique requires Unity Pro, version 2 or newer.

1. Create a material that uses the shader below (FX/Surface Reflection).
2. Apply the material to a plane-like (i.e. flat) object.
3. Attach the SurfaceReflection.cs script to the object.
4. Set the object's layer to 'Water'. (optional but recommended)

Notes:

- The reflection happens along the object's 'up' direction by default (the green axis in the scene view), e.g. the built-in plane object is suitable for use as a mirror. If you experience weird reflections, try enabling the script's "Normals From Mesh" option; if the problem persists, try correcting the surface's orientation/axis direction manually.
- If you use this on multiple surfaces, you have to create a separate material for each one, otherwise their reflections will be disrupted when both are in view.
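Conceptually, what a planar-reflection script like that does (this is the general idea, not the actual Unity code) is mirror the camera about the surface's plane, render the scene from the mirrored camera into a texture, and then have the surface shader project that texture back onto the plane:

def reflect_point(point, plane_point, plane_normal):
    # Mirror a 3D point across the plane defined by plane_point and a unit plane_normal.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    return tuple(p - 2.0 * d * n for p, n in zip(point, plane_normal))

def render_reflection(render_scene, camera_pos, camera_target, plane_point, plane_normal):
    mirrored_pos = reflect_point(camera_pos, plane_point, plane_normal)
    mirrored_target = reflect_point(camera_target, plane_point, plane_normal)
    # render_scene stands in for "render into an off-screen texture from this viewpoint".
    return render_scene(mirrored_pos, mirrored_target)

That's also why each reflective surface needs its own material and pass: every plane means rendering the scene one more time.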

Looks just like the stuff I was doing in Photoshop 5 in high school... it's basic light reflections in that scene. My Windows 98 SE AMD laptop handled the calculations on the CPU.

Yes, you're right, Pikmin 3 can be done with Photoshop now... haha, good one.
 
So... judging by the other thread, this is.... bad?

But like... objectively poor, or just "oh internet, you hyperbolic drama queen, you" bad?
I asked the internets and it says just code to the metal and all will be well.
 
We've actually gone full circle on this ray tracing thing: some wrongly believed it was feasible when we thought the hardware was a 640 ALU part, and now we're back again with it down to 160.

I mean... if any game was going to use it, it'd be one with minimal asset diversity and more meager beginnings like Pikmin 3, but if Nintendo actually was achieving it at a consistent 30fps and full 720p resolution? They'd be the talk of realtime rendering everywhere.

You're not just wrong, Jack, but so wrong that even my patience is tested.
 
He's probably underestimating the actual cost of ray tracing. We won't see any of that with X1 or PS4 either.
I'm sure it'd be possible. But the question becomes: do you want your game to have the asset diversity and model complexity of a PS2 game to do it? And at that point, is there much of a benefit?

I better shut up before someone begins to think WWHD is using it.
 
I'm sure it'd be possible. But the question becomes: do you want your game to have the asset diversity and model complexity of a PS2 game to do it? And at that point, is there much of a benefit?

I better shut up before someone begins to think WWHD is using it.

Yeah, probably. Quake 3 or something was running ray tracing like 5 years ago.. on some hefty hardware. PS4 might be able to do that?... maybe?....
 
I'm sure it'd be possible. But the question becomes: do you want your game to have the asset diversity and model complexity of a PS2 game to do it? And at that point, is there much of a benefit?

I better shut up before someone begins to think WWHD is using it.

I don't think we can achieve PS2 poly counts and full high quality raytracing.
 
In regards to games like Bayonetta 2 being evidence of the Wii U's vastly improved superiority over the PS3: you have to factor in how well the developer tapped the PS3's hardware, how familiar and experienced they were with developing such a game at the time, how big their budget was, how long the development cycle went for, how developing it for both Xbox 360 and PS3 affected resource and time allocation, and how many resources they had at their disposal for developing the game.

There's no doubt in my mind that had Bayonetta 2 been in development for PS3 as an exclusive, it would offer improved visual and graphical enhancements over the original game too. The developer would be more experienced with the PS3's architecture, have more time to improve their engine, could focus wholly on the PS3 alone and not worry about porting to other consoles, could apply all the learning and experience they gained making the first game, and could invest more resources and time into optimization and enhancements.

So really, how much of Bay 2's improvements are down to the Wii U being a superior platform vs. the developer having more resources at their disposal, a longer development cycle, increased experience, developing the game on a single platform vs. a multi-platform port, direct support from the hardware vendor themselves, and time to optimize and improve their engine and code from the first game?

It's really making shit up to say Bay 2 is proof of how much superior the Wii U is to the Xbox 360 and PS3.
 
We've actually gone full circle on this ray tracing thing: some wrongly believed it was feasible when we thought the hardware was a 640 ALU part, and now we're back again with it down to 160.

I mean... if any game was going to use it, it'd be one with minimal asset diversity and more meager beginnings like Pikmin 3, but if Nintendo actually was achieving it at a consistent 30fps and full 720p resolution? They'd be the talk of realtime rendering everywhere.

You're not just wrong, Jack, but so wrong that even my patience is tested.

Where did you gather and confirm this information about that number? We know nothing about how many ALUs the GPU has. Or do you use this kind of imaginary information as fact because it suits your assumptions?

That's not a ray tracing technique. Read this shit you post before you try to claim ownage.

I wish you'd follow your own advice; that was information about screen space reflection from the Unity Engine wiki. Ray tracing is about the reflection of a light source and the images that light source creates when reflected. The ray tracing the sunlight is doing on the water in the Pikmin image is diffusion on objects.

[image: 402px-PathOfRays.svg.png]
 
Yeah, probably. Quake 3 or something was running ray tracing like 5 years ago.. on some hefty hardware. PS4 might be able to do that?... maybe?....
what for though?

Anything can pull off ray tracing; the thing is, graphics will look basic compared to what you can pull off through other means, because unlike a movie or a render it's supposed to be realtime, so... something's gotta give.

It's not the game changer it's supposed to be, it's just a resource hog. Even some render plugins for 3D software and the like are moving away from it; hell, there are people licensing UE3 for movies, precisely because if you want quick results, relatively light on the hardware, so you can render lots of frames with less muscle and get 95% of the result, ray tracing is not it.
 
Yeah, probably. Quake 3 or something was running ray tracing like 5 years ago.. on some hefty hardware. PS4 might be able to do that?... maybe?....

I don't think we can achieve PS2 poly counts and full high quality raytracing.
Well again, it really depends on what you'd be willing to sacrifice. No implementation of it with higher than Quake 3 polygonal complexity has run at a consistent framerate in any of the tests I've seen.

And none of them are what I'd call exactly accurate ray tracing: they trace the minimum necessary, and they aren't rendered at a high resolution or at a consistent framerate.

There's very little in the way of true technical excellence at play in any of Nintendo's Wii U software, Jack. Just impeccable art direction and a serviceable understanding of modern hardware.
 
what for though?

Anything can pull off ray casting; the thing is, graphics will look basic compared to what you can pull off through other means.

I know. That's the whole point of my argument. This guy is saying it's ray tracing, when there are MANY other means of achieving full dynamic reflections without the ENORMOUS costs of ray tracing.

Hell, Forza 3 and Forza 4 feature full real-time reflections when using the hood cam, reflecting other cars with real-time cube maps.

Forza 5 has super high quality reflections on the hood, but that is strictly limited to the hood because it uses part of the buffer (it even reflects the UI elements)
 
Well again, it really depends on what you'd be willing to sacrifice. No implementation of it with higher than Quake 3 polygonal complexity has run at a consistent framerate in any of the tests I've seen.

And none of them are what I'd call exactly accurate ray tracing: they trace the minimum necessary, and they aren't rendered at a high resolution or at a consistent framerate.

There's very little in the way of true technical excellence at play in any of Nintendo's Wii U software, Jack. Just impeccable art direction and a serviceable understanding of modern hardware.

I never said that the Wii U is a powerhouse on a technical specs level, but it is a very efficient machine that can produce excellent IQ and uses all the "next gen" shader languages that all game engines are using; nothing more, nothing less.

On the other hand, you represent the Wii U as a turd in shiny foil; well, I totally disagree with that opinion when, eight months in, I see some very good examples in games like Trine 2, Nano Assault and Pikmin of what the machine's GPU is capable of doing if developers tap its power.

Some people here have an agenda, and I can totally understand that from the misinformation they are spreading, even about a simple ray-trace shader in Pikmin.
 
I'm curious why Nintendo of all companies would use ray tracing.

This was the same company that said they were still struggling with HD, by the way. In fact, I think Miyamoto said they had to get "outside help" for Pikmin 3, so what other developer has knowledge of ray tracing?
 
I know. That's the whole point of my argument. This guy is saying it's ray tracing
He made a mistake or didn't know what he was saying and I cringed, yes.

Since we were talking PS3: at some point Sony pulled off ray tracing on PS2s, with the GSCube (Graphics Synthesizer Cube). It's something that'll always resurface down the line as the knight in shining armor that could change everything, but in reality it's the Leeroy Jenkins of knights.
 
I never said that the Wii U is a powerhouse on a technical specs level, but it is a very efficient machine that can produce excellent IQ and uses all the "next gen" shader languages that all game engines are using; nothing more, nothing less.

On the other hand, you represent the Wii U as a turd in shiny foil; well, I totally disagree with that opinion when, eight months in, I see some very good examples in games like Trine 2, Nano Assault and Pikmin of what the machine's GPU is capable of doing if developers tap its power.

Some people here have an agenda, and I can totally understand that from the misinformation they are spreading, even about a simple ray-trace shader in Pikmin.

Lmfao.

He made a mistake or didn't know what he was saying and I cringed, yes.

Since we were talking PS3: at some point Sony pulled off ray tracing on PS2s, with the GSCube (Graphics Synthesizer Cube). It's something that'll always resurface down the line as the knight in shining armor that could change everything, but in reality it's the Leeroy Jenkins of knights.

Holy shit at those specs.

  • 16 × Emotion Engine CPUs clocked at 294.912 MHz
  • 2 GB of DRDRAM Rambus main memory (16 × 128 MB)
  • (128 MB was a common memory allocation on devkits vs. the 32 MB on shipping units)
  • Memory Bus Bandwidth 50.3 GB/s (3.1 GB/s × 16)
  • Floating Point Performance 97.5 GFLOPS (6.1 GFLOPS × 16)
  • 16 × Graphics Synthesizer "I-32" graphics processors clocked at 147.456 MHz
  • 512 MB of eDRAM Video Memory (16 × 32 MB)
  • (The "I-32" Graphics Synthesizer was a custom variant that contained 32 MB of eDRAM instead of the typical 4 MB)
  • eDRAM Bandwidth 755 GB/s (47.2 GB/s × 16)
  • Pixel fill rate 37.7 Gpixels/s (2.36 Gpixels/s × 16)
  • Maximum Polygon Drawing Rate 1.2 Gpolygons/s (73.7 Mpolygons/s × 16)
 
Where did you gather and confirm this information about that number? We know nothing about how many ALUs the GPU has. Or do you use this kind of imaginary information as fact because it suits your assumptions?
It wouldn't matter if it was either. Ray tracing is literally a way to simulate the effects of light in a rendered image. Now, if it was only a single ray being cast, it would not be a computational challenge for realtime rendering; the issue is how many rays have to be cast for the effect to be optimal. A relatively simple piece of arithmetic becomes a huge resource hog as one ray bounces from one material to the next, all in tandem with potentially thousands if not millions of others.

Because as asset diversity and complexity rise, the need for more rays becomes abundant: more rays to bounce and illuminate the next material.


Right now it's just much cheaper to prebake your lighting, which is no doubt what is being done with all of Nintendo's Wii U software. Because no matter whether it's a 640 ALU part or 160, it does not even have the power to do the less computationally intensive sparse voxel octree global illumination, let alone ray-traced lighting.
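Some back-of-the-envelope numbers, with made-up but modest per-pixel budgets, show why even "full 720p" ray tracing blows up:

width, height = 1280, 720     # "full 720p"
samples_per_pixel = 4         # a very modest anti-aliasing budget
bounces = 3                   # reflections and shadows multiply rays per hit
fps = 30

rays_per_frame = width * height * samples_per_pixel * bounces
rays_per_second = rays_per_frame * fps
print(f"{rays_per_frame:,} rays per frame, {rays_per_second:,} rays per second")
# ~11 million rays per frame, ~330 million per second, and every single one
# is a scene-intersection query against the geometry.

And that's before you add more bounces, soft shadows or any real sampling quality.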

I hope you don't get mad about this because I don't mean to insult, but truly read some of this stuff before you start sounding off again. Ignorance can be changed, but being earnest about something you don't seem to fully grasp, while completely ignoring those trying to set you on a better understanding, is not the way to make friends.

I hope you read and have a pleasant time posting. I promise I'm not always this stodgy but drinking Scotch always makes me feel like a gentleman.
 
He made a mistake or didn't know what he was saying and I cringed, yes.

Since we were talking PS3: at some point Sony pulled off ray tracing on PS2s, with the GSCube (Graphics Synthesizer Cube). It's something that'll always resurface down the line as the knight in shining armor that could change everything, but in reality it's the Leeroy Jenkins of knights.

Can you please correct me if I am wrong about what I said about ray tracing: what other shader code can produce this kind of light diffusion effect on surfaces? Real question.
 
Some people here have an agenda, and I can totally understand that from the misinformation they are spreading, even about a simple ray-trace shader in Pikmin.

I fully understand that you won't be convinced by this, but whatever.

I don't even see any reflections in this image at all:

[image: 04haEB2.png]


The red bit is a glossy specular highlight, this stuff is common all over and doesn't involve raytracing.

Green parts are stupid lens flare cluttering up the scene, plus some bloom.

Blue parts are weirdly lit areas that don't make any sense given the direction of the light inferred from the specularity in the red circle.

Yellow part looks like geo but could potentially be misconstrued as a reflection.

Maybe there are cube maps in there but they are very subtle. Either way much of the scene looks "off" given the weird inconsistently lit stuff.
 
Can you please correct me if I am wrong about what I said about ray tracing: what other shader code can produce this kind of light diffusion effect on surfaces? Real question.

When not even machines that are 5-6x more powerful than the Wii U are pulling off real-time ray tracing, you can be pretty sure the Wii U isn't.
 
When not even machines that are 5-6x more powerful than the Wii U are pulling off real-time ray tracing, you can be pretty sure the Wii U isn't.
Hell, they can't even do SVOGI at an acceptable framerate either, and that's quite a few steps down from ray tracing. I mean, in theory it should be a good enough approximation when we can achieve it, but it's still not anywhere near the same complexity.
 