WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Did you guys see the Mario and Sonic Olympics game for Wii U? Terrible graphics, and still a low framerate. This is so depressing.

It's Call of Duty/Pokemon syndrome. Why put the effort in for graphics when the gameplay is so addictive to your audience, they're going to buy it regardless?

How that game looks has nothing to do with the capability of the console, but everything to do with the choice of the developer.
 
It's Call of Duty/Pokemon syndrome. Why put the effort in for graphics when the gameplay is so addictive to your audience, they're going to buy it regardless?

How that game looks has nothing to do with the capability of the console, but everything to do with the choice of the developer.

I am sorry to call you out on this, but there seems to be an excuse for everything. Maybe there is still time to optimize before release, but if Wii U were as powerful as others are saying, that game should be running upwards of 60fps. Framerate directly affects gameplay and should not require huge effort if the chip is so powerful.
 
If I remember correctly, an interview with a Crytek developer said they could get more out of the Wii U (granted, what they already had was great, I think 720p or 1080p at a solid 30fps) if they could turn the GamePad off, but there wasn't any way to do it.

I played the Resident Evil demo with the Pro Controller and the GamePad shut off completely. Maybe it's now an option since the update.
 
I am sorry to call you out on this, but there seems to be an excuse for everything. Maybe there is still time to optimize before release, but if Wii U were as powerful as others are saying, that game should be running upwards of 60fps. Framerate directly affects gameplay and should not require huge effort if the chip is so powerful.

And there's always an excuse (edit: "excuse" isn't the right word, can't think of the right one though). Why do we have to argue over a poorly performing game when there are far better showcases at hand? It's ridiculous.
Edit: why don't we wait for a non-stream trailer, and maybe the show floor? In the meantime, look at something else, or even the new Pikmin shots. Tangent: is it just me, or could those have looked amazing with a bit of tessellation?
 
I am sorry to call you out on this, but there seems to be an excuse for everything.

Yup. There was a guy complaining about how the Wii was shortchanged, and how we never saw its full potential because every developer except one [that never released a Wii game] misunderstood it. The Wii!

Unconquerable hope is a really good trait to have, so god bless 'em.
 
Yeah, the normal map theory really doesn't make sense when you analyze it, because I was sure normal maps were usually made using CG.
I don't even understand what you're saying. Normal maps usually made using CG?

Normal maps are usually one of the last steps: you generate a high-resolution model with Maya/3ds Max/etc., maybe even using tools like ZBrush. You have a high-poly model. Then you generate normal maps that compensate for lower-poly models. These are what get used in-game, though your 3D modeling/animation software can render them too.

As for what you see in modeling/animation software, they could be using any version of the model, probably the lower poly ones if they are doing anything that requires responsiveness (like animation).

Though anything around 200k or lower is likely already using normal maps; a high-poly source model will likely have well over a million, and probably multiple millions of polys depending on the artist. Whether that 130k model is "gameplay" is another matter; often the highest LOD will be reserved for non-interactive scenes.
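
For anyone who'd rather see the idea in code than prose, here's a minimal sketch of what the baked result buys you at runtime (pure Python with made-up texel values; a real engine does this per-pixel in a shader, in tangent space with proper TBN matrices):

```python
# Minimal sketch of why normal maps let a low-poly model fake high-poly
# detail: lighting is computed from a per-pixel normal fetched from a
# texture instead of the interpolated vertex normal. Pure Python, no deps.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def decode_normal(rgb):
    # Tangent-space normal maps store XYZ remapped from [-1, 1] to [0, 255].
    return normalize(tuple(c / 127.5 - 1.0 for c in rgb))

def lambert(normal, light_dir):
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.3, 0.8, 0.5)

# Flat low-poly surface: every pixel gets the same interpolated normal.
flat_normal = (0.0, 0.0, 1.0)

# The same surface with a baked normal-map texel (hypothetical values)
# encoding a bump the low-poly geometry doesn't actually have.
bumped_texel = (180, 110, 230)

print("flat shading :", round(lambert(flat_normal, light), 3))
print("normal-mapped:", round(lambert(decode_normal(bumped_texel), light), 3))
```

The second result picks up lighting variation that the flat geometry alone could never produce, which is the whole point of baking from the high-poly source.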
 
What exactly are these "better shaders", when did it have them, and was this before the changes to the development hardware? Do we even know when those changes were made to the development hardware?

Do you have anything to back this up?

I love (not really) the way that you're quick to diss everyone else's ideas and ask for information to back them up, but when pressed to give your theories and engage in a discussion, you completely duck the question by saying stuff like "It's silly to discuss this when we don't have the information". Why bother coming into this thread at all in that case?
 
I apologize if I have been a bit touchy lately in this thread. GPU speculation is nothing more than a fun hobby, after all. You could even call it a somewhat bizarre (nerdy) offshoot of gaming in general. Admittedly, it is slightly frustrating that the possibility of a 160 shader Latte is dismissed so hastily (without proposing some type of massive architectural overhaul). I, personally, would love for there to be more going on there, but the deeper I have dug, the less likely it seems.

The bottom line is that all of us are in way over our heads with this technical banter. None of us really know how these game engines run and where exactly the bottlenecks are. I am taking an educated guess, based on what I have read from developers, that RAM has been a major bottleneck this generation. The number one request that devs have made to the console manufacturers is more RAM, and that's been true for a long time. RAM makes all the difference. Let's look back at some of function's posts on beyond3D:

http://forum.beyond3d.com/showthread.php?t=60501&page=190

As he states, in the CoD4 comparisons (1024x768 and highest settings, btw), the GDDR5 version of the 6450 gets 40%+ better performance than the DDR3 version despite being only 20% faster in GPU core clock. In Hawx, there is a ridiculously eye-opening 76% difference. If anything goes to show that shader count does not necessarily an efficient GPU make, this is it!

No, unfortunately there is no perfect test, because there is no true retail configuration matching Latte at its precise speed along with the proposed core config and mixture of eDRAM + DDR3, but that does not mean we cannot make the general observation that it might be, nay, probably is possible for a 160:8:8 graphics chip to pull off the results we see in the wild.
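
For anyone who wants to sanity-check the argument, here's a quick back-of-the-envelope script. The 6450 bus/transfer figures are from public spec sheets, and the 550 MHz / 64-bit Latte numbers are the usual rumored values, so treat all of this as assumptions rather than confirmed hardware data:

```python
# Back-of-the-envelope numbers for the bandwidth-vs-shader-count argument.
# All figures are assumptions from public specs and die-photo speculation.

def gflops(shaders, clock_ghz):
    # VLIW5-era AMD ALUs: one fused multiply-add = 2 FLOPs per clock.
    return shaders * 2 * clock_ghz

def bandwidth_gbs(bus_bits, effective_mtps):
    # bus width (bits) * transfer rate (MT/s) / 8 bits-per-byte / 1000
    return bus_bits * effective_mtps / 8 / 1000

# Radeon HD 6450 variants from the CoD4/Hawx comparison (public specs):
print("6450 DDR3 :", bandwidth_gbs(64, 1600), "GB/s")   # ~12.8
print("6450 GDDR5:", bandwidth_gbs(64, 3600), "GB/s")   # ~28.8

# Proposed 160:8:8 Latte at the rumored 550 MHz, 64-bit DDR3:
print("Latte     :", gflops(160, 0.55), "GFLOPS +",
      bandwidth_gbs(64, 1600), "GB/s DDR3, plus the eDRAM on top")
```

The eDRAM is exactly what muddies any direct comparison: the 6450 has nothing like it, so the DDR3 figure alone understates what a 160:8:8 Latte could feed its shaders.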

I have a few more observations about the chip layout and bg's proposed scenario to write up shortly, but I just needed to get that off my chest.
 
I am sorry to call you out on this, but there seems to be an excuse for everything. Maybe there is still time to optimize before release, but if Wii U were as powerful as others are saying, that game should be running upwards of 60fps. Framerate directly affects gameplay and should not require huge effort if the chip is so powerful.

Call me out? Okie dokie...

Are you suggesting that the game should look better or have a higher frame rate but the hardware doesn't allow for it?

Because that is absurd considering even the piddling mess we've already had released on the console.

Just like it was with the Wii games, they aren't going for any sort of presentation high marks with Sonic and Mario. The games will look nice enough, and that's about it. Might they decide to make it look better before release? Sure. But none of it has to do with how capable or incapable the console is.

Maybe I'm misunderstanding where you are coming from and you can clarify.
 
Call me out? Okie dokie...

Are you suggesting that the game should look better or have a higher frame rate but the hardware doesn't allow for it?

Because that is absurd considering even the piddling mess we've already had released on the console.

Just like it was with the Wii games, they aren't going for any sort of presentation high marks with Sonic and Mario. The games will look nice enough, and that's about it. Might they decide to make it look better before release? Sure. But none of it has to do with how capable or incapable the console is.

Maybe I'm misunderstanding where you are coming from and you can clarify.

Sure, I'll clarify. Even if the code is not optimized or the devs decide to completely mail it in, if Latte were a more capable processor, it theoretically should not take much effort to have that game running much better than we saw. I'm willing to give the devs the benefit of the doubt since the game is unfinished, but I also don't think that they are that incompetent that they're getting those framerates out of a GPU that is supposedly a good 1.5x Xenos.
 
I love (not really) the way that you're quick to diss everyone else's ideas and ask for information to back them up, but when pressed to give your theories and engage in a discussion, you completely duck the question by saying stuff like "It's silly to discuss this when we don't have the information". Why bother coming into this thread at all in that case?

Wow, nice strawman.

I ask for information because it's clear he doesn't know what he is talking about. The statements he is making just do not make sense. His argument for everything is "no way 160 shaders can do X", based on nothing. Here's a random statement that this game looks better, so again, no way a 160 shader part could do that without magic not seen on this planet, or whatever nonsense.

I have laid out the most likely explanation and have clearly shown my work. I have dev statements, benchmarks, power consumption and the die itself, all of which I have gone into detail about.
 
Sure, I'll clarify. Even if the code is not optimized or the devs decide to completely mail it in, if Latte were a more capable processor, it theoretically should not take much effort to have that game running much better than we saw. I'm willing to give the devs the benefit of the doubt since the game is unfinished, but I also don't think that they are that incompetent that they're getting those framerates out of a GPU that is supposedly a good 1.5x Xenos.

I think that even if for the sake of argument we accept a premise of '60fps is more aesthetically pleasing, even to the people who don't consciously realize the difference', we have to separate that from the reality that the majority of the Sonic and Mario gaming crowd don't care much about such a difference even if they notice, and thus that it's not a priority for Sega with this game.

They could probably release this game at 480p again and still sell millions. But just being HD will be a buzz point for that audience. People will see this new game and say 'Oh HD! Cool! I have one of those TV's now, so I should get this!'

I'm just saying that 60fps is nowhere near as important to most gamers as it is to people like us, who talk about games on the internet and would probably only ever buy this game to play it with people who can't tell or don't care about the difference between 60 and 30 fps anyway. If you are Sega, you get the game up and running at a solid 30fps, then you worry about the gameplay. If for whatever reason, in the end, you want it at 60fps, then you go there. The idea of one of these Sonic and Mario developers someday saying, "Yeah, we wanted to go 60fps for this game, but it would have been too much work", when The Wonderful 101 is at 60fps, just doesn't ring likely to me.
 
I am sorry to call you out on this, but there seems to be an excuse for everything. Maybe there is still time to optimize before release, but if Wii U were as powerful as others are saying, that game should be running upwards of 60fps. Framerate directly affects gameplay and should not require huge effort if the chip is so powerful.

If it were that simple then every PSN/XBLA/Eshop game port from older consoles would be running at Full HD 1080p 60 FPS.

Ports don't magically upgrade themselves. It costs money to port them and it costs more to upgrade them. Show me one single dev that has gone all in on a port to a Nintendo console. Unless the code is augmented, it will run exactly as it did on the other hardware.

Also, you are crossing over a point I've brought up before. Games don't make themselves. Then there is the fact that no matter how much you do at a higher resolution/frame rate, you can do more at a lower one.

You are dismissing the factors of developer skill, effort, budget, knowledge of hardware and time constraints.
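
To put rough numbers on the resolution/frame rate trade-off (simple arithmetic, nothing console-specific assumed):

```python
# Rough per-pixel budget comparison: dropping resolution and frame rate
# multiplies the GPU time available for each pixel.

def pixels_per_second(width, height, fps):
    return width * height * fps

targets = {
    "1080p60": pixels_per_second(1920, 1080, 60),
    "720p60":  pixels_per_second(1280, 720, 60),
    "720p30":  pixels_per_second(1280, 720, 30),
}

baseline = targets["1080p60"]
for name, pps in targets.items():
    print(f"{name}: {pps / 1e6:6.1f} Mpixels/s "
          f"({baseline / pps:.1f}x per-pixel budget vs 1080p60)")
```

720p30 gives you 4.5x the per-pixel budget of 1080p60 on the same hardware, which is why "can do more at a lower one" isn't just hand-waving.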
 
To be fair, the environments in Frostbite-engine games do have a tendency to be filled with large amounts of nothing, at least from an interactivity standpoint.

Yeah, well, when you have 32 people all flying jets or driving tanks around crashing all over the place and blowing up buildings in enormous multiplayer maps sometimes you have to sacrifice a little detail.
 
If it were that simple then every PSN/XBLA/Eshop game port from older consoles would be running at Full HD 1080p 60 FPS.

Ports don't magically upgrade themselves. It costs money to port them and it costs more to upgrade them. Show me one single dev that has gone all in on a port to a Nintendo console.

Also, you are crossing over a point I've brought up before. Games don't make themselves.

Now you're comparing multiplat games likely running over a huge abstraction layer, some using hand drawn art and some made by a team of 1-2 people to an exclusive collaboration between Sega and Nintendo made specifically for Nintendo's flagship console?

Actually, I think the graphics look quite nice, but I doubt that Sega simply "wouldn't gun for 60fps" if the power were there. All their arcade titles ran at 60fps going back to the glory days of Daytona USA. It's important for more than just aesthetics and completely changes the feel of gameplay.
 
I think that even if for the sake of argument we accept a premise of '60fps is more aesthetically pleasing, even to the people who don't consciously realize the difference', we have to separate that from the reality that the majority of the Sonic and Mario gaming crowd don't care much about such a difference even if they notice, and thus that it's not a priority for Sega with this game.

They could probably release this game at 480p again and still sell millions. But just being HD will be a buzz point for that audience. People will see this new game and say 'Oh HD! Cool! I have one of those TV's now, so I should get this!'

I'm just saying that 60fps is nowhere near as important to most gamers as it is to people like us, who talk about games on the internet and would probably only ever buy this game to play it with people who can't tell or don't care about the difference between 60 and 30 fps anyway. If you are Sega, you get the game up and running at a solid 30fps, then you worry about the gameplay. If for whatever reason, in the end, you want it at 60fps, then you go there. The idea of one of these Sonic and Mario developers someday saying, "Yeah, we wanted to go 60fps for this game, but it would have been too much work", when The Wonderful 101 is at 60fps, just doesn't ring likely to me.

This isn't 2007, bro.
 
Now you're comparing multiplat games likely running over a huge abstraction layer, some using hand drawn art and some made by a team of 1-2 people to an exclusive collaboration between Sega and Nintendo made specifically for Nintendo's flagship console?

Actually, I think the graphics look quite nice, but I doubt that Sega simply "wouldn't gun for 60fps" if the power were there. All their arcade titles ran at 60fps going back to the glory days of Daytona USA. It's important for more than just aesthetics and completely changes the feel of gameplay.

??? Where did I say anything about SEGA or an exclusive? I was just speaking in response to your statement about what "should" be done if the Wii U is stronger. Not every developer is going to go all in, especially given the history of third party game receptions on Nintendo hardware.

When I said ports, I was referring to things like the FF7 and Mario Kart 64 type ports primarily. You can't blame the hardware for a game not showing all of the upgrades that are possible in every game ported to the system. There is too much more to it.
 
??? Where did I say anything about SEGA or an exclusive? I was just speaking in response to your statement about what "should" be done if the Wii U is stronger. Not every developer is going to go all in, especially given the history of third party game receptions on Nintendo hardware.

When I said ports, I was referring to things like the FF7 and Mario Kart 64 type ports.

I was making a point regarding a specific title in Sonic and Mario, which you countered with something that has no relevance to that game?
 
I was making a point regarding a specific title in Sonic and Mario, which you countered with something that has no relevance to that game?

I was responding to your comment to the other person who was talking about COD/Pokemon. You dismissed his point, but I thought it was quite valid.

As far as the Sonic game is concerned, I think it would suffice to say Nintendo Land and Sega All-Stars Racing were doing far more than what that game was showing, and they didn't have the frame hiccups. The game is likely still in development much like Monolith's X or Bayonetta 2 was, so I won't jump to conclusions. Nintendo probably just showed what footage was ready to be shown, just as there was absolutely no footage of Sonic Lost World, but simply a screenshot. There are likely not enough functioning game segments to make a decent video for it. It was likely beta footage of Sonic and the Olympics.

Right now Nintendo seems to be trying to show people that there are major games for the Wii U and trying to get their names out there. I won't judge the games till they release.
 
I was responding to your comment to the other person who was talking about COD/Pokemon. You dismissed his point, but I thought it was quite valid.

My response to the thing you mentioned, about the person who was talking about COD/Pokemon:
[image]
 
Admittedly, it is slightly frustrating that the possibility of a 160 shader Latte is dismissed so hastily (without proposing some type of massive architectural overhaul).
Don't be frustrated; it's only being dismissed by a few. I personally think it's a high possibility (160 shaders). As with all things Nintendo, it's always safest to err on the conservative side, if it would at all reduce costs and could be made up somewhat with optimization.

It'd be great to be proven wrong though; hopefully we can get some further leaks or a breakthrough soon :)
 
Don't be frustrated; it's only being dismissed by a few. I personally think it's a high possibility (160 shaders). As with all things Nintendo, it's always safest to err on the conservative side, if it would at all reduce costs and could be made up somewhat with optimization.

It'd be great to be proven wrong though; hopefully we can get some further leaks or a breakthrough soon :)

Precisely. I have dropped the logic that "X way of doing things has been shown to be demonstrably better than Y, thus Nintendo must have gone with X in Latte!" This applies to things like hardware interpolation and the VLIW5 architecture in general. It might not make sense to us as technology enthusiasts, but it's not unfathomable that it made sense to Nintendo at some point in time, for whatever Nintendo reasons they have (price, simplicity, "good enough" mentality).
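
For anyone unfamiliar with why VLIW5 is such a mixed bag, here's a toy packing model (purely illustrative; a real shader compiler is vastly more sophisticated). Dependent instruction chains leave most of the 5 slots empty, while independent per-channel work fills them:

```python
# Toy model of why VLIW5 utilization depends on the shader code: the
# compiler must find 5 independent scalar ops per bundle, and dependent
# chains leave slots empty. Purely illustrative, not a real compiler.

def pack_vliw5(ops):
    """ops: list of (name, set_of_dependency_names). Greedy list scheduling."""
    done, bundles, pending = set(), [], list(ops)
    while pending:
        bundle = []
        for op in list(pending):
            name, deps = op
            # Ready only if all deps completed in an EARLIER bundle.
            if deps <= done and len(bundle) < 5:
                bundle.append(op)
        if not bundle:  # circular deps would stall forever; bail out
            raise ValueError("unschedulable ops")
        for op in bundle:
            pending.remove(op)
        done |= {name for name, _ in bundle}
        bundles.append([name for name, _ in bundle])
    return bundles

# A dependent chain (e.g. iterative math) packs terribly:
chain = [("a", set()), ("b", {"a"}), ("c", {"b"}), ("d", {"c"}), ("e", {"d"})]
# Independent per-channel work (e.g. a color multiply) packs perfectly:
vector = [(n, set()) for n in "rgbax"]

for label, ops in (("chain", chain), ("vector", vector)):
    bundles = pack_vliw5(ops)
    used = sum(len(b) for b in bundles)
    print(label, bundles, f"-> {used}/{len(bundles) * 5} slots used")
```

The chain uses 5 of 25 slots; the vector work uses 5 of 5. So the same 160 ALUs can look like anywhere from 32 to 160 "effective" shaders depending on the code, which is part of why raw shader counts tell you so little.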
 
I know, bro. But many of the gamers we consider casual are relatively new to HD, and/or have played SD games on HD TVs, i.e. much of the Sonic and Mario crowd.

The majority of casual gamers have also been playing casual games on their 300dpi phones and 1080p+ PC monitors for a great long time now.
 
I apologize if I have been a bit touchy lately in this thread. GPU speculation is nothing more than a fun hobby, after all. You could even call it a somewhat bizarre (nerdy) offshoot of gaming in general. Admittedly, it is slightly frustrating that the possibility of a 160 shader Latte is dismissed so hastily (without proposing some type of massive architectural overhaul). I, personally, would love for there to be more going on there, but the deeper I have dug, the less likely it seems.

So, what is supposed to be the explanation behind the improved draw distance, lighting engine, high-res textures etc. in NFS:MW? I know the RAM bonus goes a long way (especially for textures), but I have a hard time believing a small team going back to a fast port to spruce it up in a month or two, on hardware with bad documentation, on which they've had zero experience, can do a better job than was done on the lead development platform, on which they've had 7 years of experience. If you ask people to be realistic... I have to ask: are you? I've tried to stay out of these threads for a while, because it keeps going back and forth between trolls and fanboys, and the only poster whose word I'd take as fact (Blu) is not showing the back of his tongue (it seems).
I know shaders aren't everything, but I have a hard time believing the GPU would actually be ill-featured compared to Xenos with ports such as ME3, AC3, NFS:MW... which are all high-profile games not looking much better or worse than on hardware the main dev teams have had years of practice on.

I am sorry to call you out on this, but there seems to be an excuse for everything. Maybe there is still time to optimize before release, but if Wii U were as powerful as others are saying, that game should be running upwards of 60fps. Framerate directly affects gameplay and should not require huge effort if the chip is so powerful.

Wii Music 5000 fps confirmed.
 
I'm getting this same vibe. Apparently, any game made should automatically max out the console's power, and if it doesn't, then it's because the console isn't that strong.

Straw man alert! I'm talking about one arcade style sports/racing game where 60fps has undisputed benefits to gameplay and somehow this becomes about Wii Music?

So, what is supposed to be the explanation behind the improved draw distance, lighting engine, high-res textures etc. in NFS:MW? I know the RAM bonus goes a long way (especially for textures), but I have a hard time believing a small team going back to a fast port to spruce it up in a month or two, on hardware with bad documentation, on which they've had zero experience, can do a better job than was done on the lead development platform, on which they've had 7 years of experience. If you ask people to be realistic... I have to ask: are you? I've tried to stay out of these threads for a while, because it keeps going back and forth between trolls and fanboys, and the only poster whose word I'd take as fact (Blu) is not showing the back of his tongue (it seems).
I know shaders aren't everything, but I have a hard time believing the GPU would actually be ill-featured compared to Xenos with ports such as ME3, AC3, NFS:MW... which are all high-profile games not looking much better or worse than on hardware the main dev teams have had years of practice on.

It's a somewhat complicated answer that I don't believe I'm qualified to give. However, I have been reading over some of the improvements DirectX10-level shaders made over their predecessors and will post them in a bit. Of the things you mentioned, Criterion cited improved optimization of their lighting engine rather than utilizing extra shading power. High-res textures are obviously the result of more RAM. Draw distance could be RAM related as well, or the overall increased speed of the card, but I'd have to research that a bit more before making a bolder statement.

I don't think I'm being unrealistic as all my posts have provided reason to back up my assertions. I have yet to read a satisfactory explanation as to why it would be "impossible" for us to see this level of performance out of a 160 shader Latte.

Realistically speaking, it seems highly unlikely that Nintendo/Renesas have achieved greater shader block density than Llano, which is a 32nm part! Sure, there might be some additional logic in Llano's shaders to support DirectX11, but the same could be said for whatever DirectX10.1+ features Nintendo have come up with in addition to the logic which runs the abstraction layer for TEV code.
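
The density comparison itself is easy to sketch, though I want to stress that the block areas below are placeholder guesses, not actual die-photo measurements; plug in real numbers before concluding anything:

```python
# Quick density sanity-check behind the "can't beat Llano" point: take a
# measured shader-block area, divide by ALU count, and compare across
# process nodes. The block areas below are HYPOTHETICAL placeholders --
# substitute real die-photo measurements before drawing conclusions.

def alus_per_mm2(alu_count, block_area_mm2):
    return alu_count / block_area_mm2

# Hypothetical: 160 ALUs in some area of Latte's 40 nm die vs
# 400 ALUs in some area of Llano's 32 nm die.
latte = alus_per_mm2(160, 12.0)   # placeholder area, mm^2
llano = alus_per_mm2(400, 18.0)   # placeholder area, mm^2

# Naive area scaling from 32 nm to 40 nm: (40/32)^2 ~ 1.56x bigger.
scale = (40 / 32) ** 2
print(f"Latte : {latte:.1f} ALUs/mm2 (40 nm)")
print(f"Llano : {llano:.1f} ALUs/mm2 (32 nm), "
      f"~{llano / scale:.1f} ALUs/mm2 if naively ported to 40 nm")
```

If the measured Latte blocks came out denser than a node-scaled Llano, you'd need to explain where the extra density came from; if they come out at or below it, a 160 shader count looks like the conservative fit.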
 
Straw man alert! I'm talking about one arcade style sports/racing game where 60fps has undisputed benefits to gameplay and somehow this becomes about Wii Music?

How is that a strawman? I never said "you" had stated that. I said that is the vibe I'm getting.

I'm not talking about any style, because it makes no difference. A game will only demonstrate as much capability as the dev is willing to put effort into, and it takes time. Any glitch, flaw or inconsistency can bottleneck the performance of something.

Like I said in my previous post, what we were likely seeing was beta gameplay that was shown just to show off the game and nothing more. You are judging the system's potential based on a gameplay video for a single, technically unimpressive game.

Do you think the game will run like that off the shelf?

Perfect example would be Epic Mickey 2. The Wii U version has frame rate and slow down issues everywhere, and they happen for odd reasons. The Wii version, which is made on much weaker hardware, has none. Please, explain how that works to me. Is the Wii stronger than the Wii U?

http://www.youtube.com/watch?v=vvsKPtEhuy8
http://www.youtube.com/watch?v=Qqq7Ff9Zjeo
 
Straw man alert! I'm talking about one arcade style sports/racing game where 60fps has undisputed benefits to gameplay and somehow this becomes about Wii Music?
Just let it go, man. They will defend everything Nintendo does, even if this game ran at 5 fps (it's very close to that anyway). "That's a design choice dude, it's about le gameplay, not about teh grafix and fps"
 
Not sure what the technical definition of 'crap' is though.

How is this relevant to the GPU discussion? You seem to hinge very heavily on the most negative comments you can find about the Wii U. Did that dev give any technical data to back his statement? What makes his statement more valid than anyone else's? How does this help or contribute to anything? It's like this has turned into a dumping ground for fan bias.

This has become far less about the facts and more about ego. Can we please, get back on topic? Why is there so much hostility?
 
How is this relevant to the GPU discussion? You seem to hinge very heavily on the most negative comments you can find about the Wii U. Did that dev give any technical data to back his statement? What makes his statement more valid than anyone else's?

A senior software engineer and game architect vs dilettante forum posters? I need to explain this, srsly?

How does this help or contribute to anything?

A senior software engineer and game architect comments on the 'power' (whatever that is) of the Wii U in comparison to the 360, and people here have been debating Wii U vs Xbox as an indirect way of ascertaining the most likely setup of the GPU...

It's like this has become a dumping ground for fan bias.

This has become far less about the facts and more about ego. Can we please, get back on topic?

umm, ok.
 
How is that a strawman? I never said "you" had stated that. I said that is the vibe I'm getting.

I'm not talking about any style, because it makes no difference. A game will only demonstrate as much capability as the dev is willing to put effort into, and it takes time. Any glitch, flaw or inconsistency can bottleneck the performance of something.

Like I said in my previous post, what we were likely seeing was beta gameplay that was shown just to show off the game and nothing more. You are judging the system's potential based on a gameplay video for a single, technically unimpressive game.

Do you think the game will run like that off the shelf?

Perfect example would be Epic Mickey 2. The Wii U version has frame rate and slow down issues everywhere, and they happen for odd reasons. The Wii version, which is made on much weaker hardware, has none. Please, explain how that works to me. Is the Wii stronger than the Wii U?

http://www.youtube.com/watch?v=vvsKPtEhuy8
http://www.youtube.com/watch?v=Qqq7Ff9Zjeo
Maybe the Wii U can't handle Wii code that well when the games are upscaled to HD. Maybe the Mario and Sonic game was also developed on the Wii, and they ported the engine without optimization to the Wii U and increased the resolution, polycount and textures. Maybe this GPGPU thing isn't just an option, but mandatory if you want your games to perform well. Just a theory.
 
Maybe the Wii U can't handle Wii code that well when the games are upscaled to HD. Maybe the Mario and Sonic game was also developed on the Wii, and they ported the engine without optimization to the Wii U and increased the resolution, polycount and textures. Maybe this GPGPU thing isn't just an option, but mandatory if you want your games to perform well. Just a theory.

Based on what? Please explain how this theory makes sense. I am lost.

If that is the case, then Pikmin 3 should have demonstrated those problems, but it never did. Pikmin 3 runs at 60 FPS as well, if I'm not mistaken.
 
Based on what? Please explain how this theory makes sense. I am lost.

If that is the case, then Pikmin 3 should have demonstrated those problems, but it never did. Pikmin 3 runs at 60 FPS as well, if I'm not mistaken.
Based on Mickey Mouse and Mario/Sonic. Also, this could be the reason why they can't just dump all the Wii Virtual Console games onto the Wii U at once: they might need optimization. I experienced some terrible framerate drops in that Kirby game, for example. How the hell does this happen? It's an NES game.

Edit: Also, wasn't there a trailer analysis for Pikmin 3, and it turned out it was locked at 30fps?
 
Based on Mickey Mouse and Mario/Sonic. Also, this could be the reason why they can't just dump all the Wii Virtual Console games onto the Wii U at once: they might need optimization. I experienced some terrible framerate drops in that Kirby game, for example. How the hell does this happen? It's an NES game.

Edit: Also, wasn't there a trailer analysis for Pikmin 3, and it turned out it was locked at 30fps?

My understanding is that the slowdown is there in the original NES version, and was retained for the sake of authenticity. I'm sure it's no reflection on the capabilities of the Wii U.
 
My understanding is that the slowdown is there in the original NES version, and was retained for the sake of authenticity. I'm sure it's no reflection on the capabilities of the Wii U.

This is the exact point I've been making. A port will have all of the problems that it had originally unless they are specifically corrected, and, in many cases, it will also have problems from a lack of optimization on different architecture.

I don't see why I'm bothering now, though. It seems this thread has been overrun with ego, bias and closed-minded presumptions. It's stopped being about facts and rational assessments. We were progressing so much yesterday, too.
 
Not sure what the technical definition of 'crap' is though.
There's no technical definition of crap. The guy apparently conveys the narrative that goes around at his employer, though. Apparently EA are in a feedback loop re: how 'unworthy' the Wii U is. Well, best of luck to them.
 
There's no technical definition of crap. The guy apparently conveys the narrative that goes around at his employer, though. Apparently EA are in a feedback loop re: how 'unworthy' the Wii U is. Well, best of luck to them.

Still trying to figure out what makes the tablet "weird" myself.
 