What was it?
I'm not sure what you are asking.
What was it?
That is true, though technically, if the game does not continually store the procedurally generated data, then you could run an entire game like that in less than 200 MB of memory. You would just have to clear the cache as you proceed.
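For illustration, here is a minimal sketch (in C++, with made-up names like ChunkCache and generate) of what "clear the cache as you proceed" could mean in practice: chunks are regenerated deterministically from a seed whenever they are needed, and anything far from the player is thrown away, so resident memory stays bounded regardless of world size.

```cpp
// Hypothetical sketch: regenerate world chunks on demand from a seed and
// evict anything far from the player, so resident memory stays bounded.
#include <cstdint>
#include <cstdlib>
#include <unordered_map>
#include <vector>

struct Chunk {
    std::vector<float> heights;  // generated data, e.g. a 64x64 heightmap
};

class ChunkCache {
public:
    explicit ChunkCache(uint64_t world_seed) : seed_(world_seed) {}

    // Return the chunk at (cx, cz), generating it if it is not resident.
    const Chunk& get(int cx, int cz) {
        uint64_t key = pack(cx, cz);
        auto it = cache_.find(key);
        if (it == cache_.end())
            it = cache_.emplace(key, generate(cx, cz)).first;
        return it->second;
    }

    // "Clear the cache as you proceed": drop chunks outside the view radius.
    void evict_far(int player_cx, int player_cz, int radius) {
        for (auto it = cache_.begin(); it != cache_.end();) {
            int cx = static_cast<int32_t>(it->first >> 32);
            int cz = static_cast<int32_t>(it->first & 0xffffffffu);
            if (std::abs(cx - player_cx) > radius || std::abs(cz - player_cz) > radius)
                it = cache_.erase(it);
            else
                ++it;
        }
    }

private:
    static uint64_t pack(int cx, int cz) {
        return (static_cast<uint64_t>(static_cast<uint32_t>(cx)) << 32) |
               static_cast<uint32_t>(cz);
    }

    // Deterministic generation: the same seed and coordinates always
    // reproduce the same chunk, so discarding it loses nothing.
    Chunk generate(int cx, int cz) const {
        Chunk c;
        c.heights.resize(64 * 64);
        uint64_t h = seed_ ^ (pack(cx, cz) * 0x9E3779B97F4A7C15ull);
        for (float& v : c.heights) {
            h ^= h << 13; h ^= h >> 7; h ^= h << 17;   // xorshift64
            v = static_cast<float>(h % 1000) / 1000.0f;
        }
        return c;
    }

    uint64_t seed_;
    std::unordered_map<uint64_t, Chunk> cache_;
};
```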
Generally, the only way you will "need" more than 1GB is if all of the textures in the game are the highest resolution possible and the code isn't clean (i.e. it doesn't overwrite data for things that aren't on screen anymore).
A dev would have to make an intentional effort to make a game require more than 2GB of RAM.
Maybe when we reach the point where we can game in 4K, then I could see us absolutely needing more than 2GB.
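And a rough sketch of the "overwrite data for things that aren't on screen anymore" idea from above, assuming a simple least-recently-visible residency budget (TextureBudget and its methods are purely illustrative, not any real engine's API):

```cpp
// Hypothetical sketch: keep visible textures under a fixed memory budget
// and unload the ones that haven't been on screen for the longest time.
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class TextureBudget {
public:
    explicit TextureBudget(std::size_t budget_bytes) : budget_(budget_bytes) {}

    // Called each frame for every texture that is actually visible.
    void touch(const std::string& name, std::size_t size_bytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {            // already resident: refresh it
            used_ -= it->second->second;
            lru_.erase(it->second);
        }
        used_ += size_bytes;                 // (re)load at the front
        lru_.push_front({name, size_bytes});
        index_[name] = lru_.begin();

        // Over budget: overwrite (unload) the textures seen least recently,
        // i.e. the ones that haven't been on screen for the longest time.
        while (used_ > budget_ && !lru_.empty()) {
            used_ -= lru_.back().second;
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
    }

    std::size_t used_bytes() const { return used_; }

private:
    using Entry = std::pair<std::string, std::size_t>;
    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};
```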
It's also considered bad code by today's standards. Assembly has always been my favorite means of programming.
It is the only way to truly get the most out of hardware.
Of course they do, or did. I wonder if Shin'en programs some of their game components in assembly.
Stop, just stop. They get such a high-quality model with so little data put into use. Also, on another note, the Wii hardware had some really nice occlusion capability.
Source: http://jettrocket.wordpress.com/2010/06/21/part-three-from-vision-to-wii/
"In our proprietary Game editor we place all of the geometry, bake ambient occlusion, setup lighting conditions and so on."
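For anyone curious what "bake ambient occlusion" in an editor roughly amounts to, here is a hedged sketch of per-vertex AO baking: shoot random hemisphere rays from each vertex and record how many escape. The scene query is abstracted as a caller-supplied function; this is a generic illustration, not Shin'en's actual pipeline.

```cpp
// Offline per-vertex ambient occlusion baking (illustrative sketch).
// `occluded` is a caller-supplied scene query (e.g. a BVH ray cast).
#include <cmath>
#include <cstddef>
#include <cstdlib>
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Random direction in the hemisphere around `normal`
// (rejection sampling on the unit sphere, flipped onto the hemisphere).
static Vec3 random_hemisphere_dir(Vec3 normal) {
    Vec3 d;
    do {
        d = {std::rand() / (float)RAND_MAX * 2.0f - 1.0f,
             std::rand() / (float)RAND_MAX * 2.0f - 1.0f,
             std::rand() / (float)RAND_MAX * 2.0f - 1.0f};
    } while (dot(d, d) > 1.0f || dot(d, d) < 1e-6f);
    d = normalize(d);
    if (dot(d, normal) < 0.0f) d = {-d.x, -d.y, -d.z};   // flip into hemisphere
    return d;
}

// Bake one AO value per vertex: the fraction of hemisphere rays that escape.
// The result (0 = fully occluded, 1 = fully open) would be stored in vertex
// colors or a lightmap and simply multiplied into the lighting at runtime.
std::vector<float> bake_vertex_ao(
    const std::vector<Vec3>& positions,
    const std::vector<Vec3>& normals,
    const std::function<bool(Vec3 origin, Vec3 dir)>& occluded,
    int rays_per_vertex = 64) {
    std::vector<float> ao(positions.size(), 1.0f);
    for (std::size_t i = 0; i < positions.size(); ++i) {
        int open = 0;
        for (int r = 0; r < rays_per_vertex; ++r) {
            Vec3 dir = random_hemisphere_dir(normals[i]);
            if (!occluded(positions[i], dir)) ++open;
        }
        ao[i] = (float)open / (float)rays_per_vertex;
    }
    return ao;
}
```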
Froblins runs at 22fps on an HD 4890, by the way. Not really sure how that's helping Wii U, especially since R700 is still outdated and that's a much more powerful card (almost Xbox One level?).
Also, that's a tech demo. Everyone knows games play differently. There was never a game made with it that I know of.
Speaking of ambient occlusion, I couldn't help noticing it in Wind Waker HD. It's presumably screen-space, but it's still no mean feat considering it's running at 1080p/30. Having low geometric detail probably helps, though.
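For reference, the screen-space idea boils down to something like the following simplified CPU sketch (purely to illustrate the concept, not whatever Wind Waker HD actually ships): for each pixel, sample nearby depth values and darken it in proportion to how many samples sit closer to the camera.

```cpp
// Toy screen-space ambient occlusion over a linear depth buffer.
// Real SSAO runs on the GPU with view-space positions and normals.
#include <algorithm>
#include <vector>

std::vector<float> ssao_pass(const std::vector<float>& depth,   // linear depth per pixel
                             int width, int height,
                             int radius = 4, float bias = 0.01f) {
    std::vector<float> ao(depth.size(), 1.0f);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float center = depth[y * width + x];
            int samples = 0, occluders = 0;
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = std::clamp(x + dx, 0, width - 1);
                    int sy = std::clamp(y + dy, 0, height - 1);
                    ++samples;
                    // A sample noticeably closer than the centre occludes it.
                    if (depth[sy * width + sx] < center - bias) ++occluders;
                }
            }
            ao[y * width + x] = 1.0f - (float)occluders / (float)samples;
        }
    }
    return ao;   // multiply into the lighting to darken creases and corners
}
```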
Not really. Any game can fit onto a DVD if you compress the assets enough and make cuts to the resolution and fidelity of those assets -- just like any game can run comfortably with 2GB of memory, if the same things hold true. That is an extreme exaggeration of my point and a far cry from what I was trying to say.
I'm not raging, I'm disagreeing with you. Though, why are all of you raging about this? It's just my opinion. If you don't like it, you don't have to believe it. Goodness.
Playing Tomb Raider at high spec pushed me past 2GB of VRAM usage, and that's a game that was made for and targeting current-gen systems. As the generation wears on, a low amount of memory will become a larger and larger issue -- just like it has with current-gen consoles. Generally, the only way you will "need" more than 1GB is if all of the textures in the game are the highest resolution possible and the code isn't clean (i.e. it doesn't overwrite data for things that aren't on screen anymore).
A dev would have to make an intentional effort to make a game require more than 2GB of RAM.
Maybe when we reach the point where we can game in 4K, then I could see us realistically needing more than 2GB.
Well put; took the words right out of my mouth. Different developers have different methodologies and different requirements and objectives. Shin'en, coming from the demo scene, is focused on technical brilliance and optimization. That's great for them, but they're not a ruler to measure other teams against. And you've gotta quit comparing small teams that use proprietary tools on a limited scope, and whose heritage is the demoscene, to big studios. Shin'en deserves lots of praise, yes, but they're also only capable of doing it because they're small and their games are small in scope (they say as much; being small is an option, and there's reason to be so). Their source code tree structures are probably pretty complex due to them actually going assembly and the like, but still manageable because they wrote it and there aren't 100 dudes messing with the code or adding to it. It's a commodity.
One game claimed to use it; the "one game claimed to use it, it must be in the spec AND FUN FACT: IT'S REALLY GOOD AT DOING IT" mentality has to stop; it clearly wasn't.
First of all, you're looking at the model passes; not all passes must be done by the hardware; one can pre-bake ambient occlusion. If you had read the same articles you're drawing your conclusions from, you'd know as much:
Source: http://jettrocket.wordpress.com/2010/06/21/part-three-from-vision-to-wii/
This plays to the Wii's strength, which is texturing; real-time ambient occlusion is a completely different implementation that would be way too taxing and would require manipulation via software to achieve (in fact, perhaps even this one required as much, as I'm imagining it was done via normal mapping, which wasn't on spec; but what I'm saying is a real-time one would be like suicide), jeez. That, though, means it's not showing off any "really nice occlusion capability".
Stop... jumping... to... conclusions!
And you've gotta quit comparing small teams that use proprietary tools on a limited scope, and whose heritage is the demoscene, to big studios. Shin'en deserves lots of praise, yes, but they're also only capable of doing it because they're small and their games are small in scope (they say as much; being small is an option, and there's reason to be so). Their source code tree structures are probably pretty complex due to them actually going assembly and the like, but still manageable because they wrote it and there aren't 100 dudes messing with the code or adding to it. It's a commodity.
Also, note that, as I said at the beginning of the post, assembly is considered bad code by today's standards; that's because if you go assembly you'll be making ports a nightmare. Shin'en isn't tackling huge scope, nor are they doing multiplatform, and that's the reason. Of course a "big studio" can't go there; they would have to be bollocks to, or at least have a very big incentive (huge userbase on said platform) to do so.
Thankfully though, they don't have to, and you're just comparing apples to oranges.
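To make the portability point concrete, here is a generic example (not Shin'en's code) of the same trivial loop written portably and then hand-tuned with x86 SSE intrinsics; the tuned version does four floats per instruction but is welded to one ISA (on Wii-era PowerPC it would be paired singles instead), which is exactly why going low-level makes ports painful:

```cpp
// Portable vs. ISA-specific: the tradeoff described above, in miniature.
#include <cstddef>
#include <xmmintrin.h>   // SSE intrinsics, x86 only

// Portable version: any compiler, any CPU.
void scale_portable(float* data, std::size_t n, float s) {
    for (std::size_t i = 0; i < n; ++i) data[i] *= s;
}

// Low-level version: 4 floats per instruction, but x86-specific and it
// assumes n is a multiple of 4 and `data` is 16-byte aligned.
void scale_sse(float* data, std::size_t n, float s) {
    __m128 factor = _mm_set1_ps(s);
    for (std::size_t i = 0; i < n; i += 4) {
        __m128 v = _mm_load_ps(data + i);
        _mm_store_ps(data + i, _mm_mul_ps(v, factor));
    }
}
```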
Converting assets is not so hard, provided the basis of the game can run. Of course there are sweet spots, and compressing too much is never optimal (cough Zelda TP cough), but it can be done; lots of developers did so this generation on console conversions via batch conversion, and although that's not a professional way to do things, no one will notice unless they had a more optimized version in hand and therefore knew it was possible. Playing Tomb Raider at high spec pushed me past 2GB of VRAM usage, and that's a game that was made for and targeting current-gen systems. As the generation wears on, a low amount of memory will become a larger and larger issue -- just like it has with current-gen consoles.
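As an aside, "batch conversion" here can be as blunt as running every texture through a blanket downscale. A toy sketch, assuming raw RGBA8 pixel data (real pipelines would use proper image libraries and per-asset judgement):

```cpp
// Halve an RGBA8 texture with a 2x2 box filter; a batch tool would simply
// loop this over every texture in the asset list. Illustrative only.
#include <cstddef>
#include <cstdint>
#include <vector>

std::vector<uint8_t> downscale_half_rgba8(const std::vector<uint8_t>& src,
                                          int width, int height) {
    int out_w = width / 2, out_h = height / 2;
    std::vector<uint8_t> dst(static_cast<std::size_t>(out_w) * out_h * 4);
    for (int y = 0; y < out_h; ++y) {
        for (int x = 0; x < out_w; ++x) {
            for (int c = 0; c < 4; ++c) {            // R, G, B, A
                int sum = 0;
                for (int dy = 0; dy < 2; ++dy)       // average the 2x2 block
                    for (int dx = 0; dx < 2; ++dx)
                        sum += src[((y * 2 + dy) * width + (x * 2 + dx)) * 4 + c];
                dst[(y * out_w + x) * 4 + c] = static_cast<uint8_t>(sum / 4);
            }
        }
    }
    return dst;
}
```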
You're just confusing the fact that it could do 8 hardware lights for free with whatever you want to conclude (occlusion, is it?). Or perhaps you think the rim shading/fairy shader in Mario Galaxy is occlusion? I wasn't basing that statement solely on Jett Rocket. It just reminded me. Wii games often had global illumination, which is one of the most common uses of ambient occlusion. Games like Mario Galaxy and Metroid Prime 3 seemed to use it extensively.
I won't even try to elaborate further, I made my point; please drop it.
Playing Tomb Raider at high spec pushed me past 2GB of VRAM usage, and that's a game that was made for and targeting current-gen systems. As the generation wears on, a low amount of memory will become a larger and larger issue -- just like it has with current-gen consoles.
You don't need 4k resolutions to blow through VRAM.
I only did because you made me. Just drop it? You're the one who made a big issue out of it. Though, whatever.
I only did because you made me.
You try to pose theory after theory you pulled out of nowhere as fact.
And you never backtrack on them, or admit to being wrong. Like I said before, it's damaging to the thread because we have to waste our time debunking them for someone that might just bring them up as facts two pages later. You seem to believe things become true through repetition, your repetition.
Hell, I'll shut up, I was trying to not say anything, but it was too much.
Can I laugh as an answer? Otherwise we'll just get into quote wars you'll dance your way around, as always. Where did I ever promote any of that as fact?
That was just pure theory. You took it as fact apparently, but I made no such claim.
I said it had good occlusion capability (as in potential), nothing more.
You mean I never admit to doing something that I didn't do. As I just said, my exact word was "capability". The only way that would be wrong is if it were impossible to do occlusion on the Wii at all.
Is it impossible? If not, then you are the only one who is in the wrong and refuses to admit it here. Don't throw dirt and run when you get called out on your mistake. That is so immature.
Not promoted as a fact? LOL. Also, on another note, the Wii hardware had some really nice occlusion capability.
I'm not sure what you're saying here, that the game can't or doesn't go above 2GB VRAM usage because your card ran it? Because it certainly can; as I said, it did for me. I don't have my exact settings on hand, but I cranked up everything as high as I could except, IIRC, shadows and AA due to performance issues before new drivers had hit. I was above 2GB VRAM used. I played through all of Tomb Raider (that game was such a disappointment, by the way) on my PC with the settings at the highest possible. Tessellation and all. My 560 Ti does not even have 2 GB of RAM. I ran into no major issues. It ran pretty fluidly.
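If anyone wants to check the numbers themselves rather than argue from memory, VRAM usage can be sampled while the game runs. A small sketch assuming an NVIDIA card (like the 560 Ti mentioned) and the NVML library (link against -lnvidia-ml); other vendors need their own tools:

```cpp
// Query current GPU memory usage via NVML (NVIDIA-only sketch).
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::puts("NVML init failed");
        return 1;
    }
    nvmlDevice_t device;
    if (nvmlDeviceGetHandleByIndex(0, &device) == NVML_SUCCESS) {
        nvmlMemory_t mem;
        if (nvmlDeviceGetMemoryInfo(device, &mem) == NVML_SUCCESS) {
            // Sample this while the game is running to see real VRAM pressure.
            std::printf("VRAM used: %llu MB of %llu MB\n",
                        mem.used / (1024ull * 1024ull),
                        mem.total / (1024ull * 1024ull));
        }
    }
    nvmlShutdown();
    return 0;
}
```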
Again, I could say the same thing of storage media. Just because developers can do without it doesn't mean they should. Games are all the better for pushing technological boundaries, not making do with what we (almost always incorrectly) assume is the limit we can feasibly use. I can think of no circumstance where not having more than 1GB would absolutely prevent a game from being able to run effectively at the moment. The days of memory/storage capacity bottlenecks are behind us.
Can I laugh as an answer? Otherwise we'll just get into quote wars you'll dance your way around, as always.
You should try politics.
Anyway, this is off-topic on my part; I'll drop it since you won't.
What mistake? Seriously, I must be talking to a wall.
You're the one saving face, you're just not doing it very well.
I don't know anything about occlusion, but the spat is at this stage:
Jett Rocket's occlusion got defeated by lostinblu showing it was baked. krizzx then said global illumination is an example of it and used in Galaxy and Metroid Prime 3 (no evidence, but I don't know what it is/how obvious it was or known), so I guess that needs discussion/refuting/recategorising.
Also, you said good capability, not just capability.
I said it was used in Metroid Prime 3 and Mario Galaxy "from what I can tell".
So... based on you saying "on another note" in the edit, you first of all didn't provide the examples/proof there, but then cite later on something you aren't sure of? (Note this is meant to show 'good capability' by your claim.) Note again: I know nothing of these things, just picking you up on bad critical writing.
This thread turns into a brawl every 10 posts.
This thread turns into a brawl every 10 posts.
I'm still confused by what you were saying in the post I quoted above, regarding Tomb Raider. Could you clarify for me?
I'm not sure what you're saying here, that the game can't or doesn't go above 2GB VRAM usage because your card ran it? Because it certainly can; as I said, it did for me. I don't have my exact settings on hand, but I cranked up everything as high as I could except, IIRC, shadows and AA due to performance issues before new drivers had hit. I was above 2GB VRAM used.
Again, I could say the same thing of storage media. Just because developers can do without it doesn't mean they should. Games are all the better for pushing technological boundaries, not making do with what we (almost always incorrectly) assume is the limit we can feasibly use.
Actually, this was not confirmed. As for Wii U, we know it fished above R700 on stuff like the Eyefinity implementation.
And I agree, as I never said it was necessary. I used it as an example specifically because it can reach that amount, and it's a current gen game targeting 7-8 year old hardware. If a game optimized for current gen hardware can reach 2GB of VRAM usage, what does that illustrate going forward as assets increase in quality/resolution and size? That's my point: 2GB may seem like a lot today, even if there are games that get there already, but it won't be out of reach for long as everything trends upwards in size, as it inevitably and always does. I was saying that even 2 GB was not necessary to run that game at its highest settings.
Consoles aren't PCs that need to dedicate memory to system tasks and other things all of the time. They have pure access.
A lot of people claimed that the Wii U having only 1 GB of accessible memory (currently) while the Xbox One and PS4 have 4-5 GB will prevent games from being portable within reason, but I disagree.
Actually, this was not confirmed.
Multiple display support (that was in the press release) doesn't have to imply Eyefinity (which is just being able to hook up to 3-6 displays). You can still output to more than one monitor on older GPU cards.
Regardless though, I don't think it means much considering Wii U only needs to output to one screen (and was never explicitly stated to do more).
The Wii U can output to 3 screens.
Edit: Was referring to the GamePad. But lo and behold, Nintendo was only pushing for one display (off-TV), so I'm still kinda right!
And I agree, as I never said it was necessary. I used it as an example specifically because it can reach that amount, and it's a current gen game targeting 7-8 year old hardware. If a game optimized for current gen hardware can reach 2GB of VRAM usage, what does that illustrate going forward as assets increase in quality/resolution and size? That's my point: 2GB may seem like a lot today, even if there are games that get there already, but it won't be out of reach for long as everything trends upwards in size, as it inevitably and always does.
We're talking about VRAM, however, so I'm not sure why you're discussing system tasks and other running applications, since that affects RAM more than VRAM (if at all).
Because, back to Tomb Raider, it's using up to 2GB+ of VRAM alone, in addition to whatever it holds in regular RAM. The WiiU has only 1GB, and it has to act as the pool for both. Thus, my point.
It's actually two. Nintendo wanted to "remove the TV", or whatever the reason was that they promoted the GamePad. But we're still only seeing one GamePad being used for the time being (even though it supports 2). No, you're still wrong. Support for two GamePads was confirmed at E3 2012.
TV + GamePad 1 + GamePad 2 = three screens.
No, you're still wrong. Support for two GamePads was confirmed at E3 2012.
TV + GamePad 1 + GamePad 2 = three screens.
It's actually two. Nintendo wanted to "remove the TV", or whatever the reason was that they promoted the GamePad. But we're still only seeing one GamePad being used for the time being (even though it supports 2).
I really don't quite understand what you've posted.
I said it didn't matter in the end how many displays the Wii U supports. In totality, it only outputs to the TV and the GamePad so far, but Nintendo's philosophy for this thing also pushed for no TV, making it one GamePad. I really don't quite understand what you've posted.
No, you're still wrong. Support for two GamePads was confirmed at E3 2012.
TV + GamePad 1 + GamePad 2 = three screens.
I think he's saying it supports two gamepads by not using a TV at all. Because Nintendo's original vision was to cut the TV out of the equation entirely..? I think? Correct me if I'm wrong!
Yes, devs have access to 1GB currently; thus, why I said 1GB. I was excluding the 32MB eDRAM because I'm speaking of total volume, not speed or bandwidth. Regardless, neither point addresses what my concern is. That is incorrect. The Wii U has 1 GB currently available to devs (we don't know all that the other 1GB does, but some could be freed for devs in the future) and 32 MB of extremely fast, low-latency eDRAM.
By Sony's own admission, eDRAM pushes the performance WAY beyond what it would be at a glance. http://www.gamechup.com/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gbs/
Yes, devs have access to 1GB currently; thus, why I said 1GB. I was excluding the 32MB eDRAM because I'm speaking of total volume, not speed or bandwidth.
Until we know the WiiU's real-world performance of its eDRAM, saying that it pushes performance way beyond is speculative.
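Which is the point: effective bandwidth is something you measure, not infer. On real hardware you would run something like this trivial streaming-copy benchmark (generic host code, nothing Wii U-specific, numbers purely illustrative):

```cpp
// Measure effective copy bandwidth with a repeated large memcpy.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    const std::size_t bytes = 256ull * 1024 * 1024;        // 256 MB buffers
    std::vector<char> src(bytes, 1), dst(bytes, 0);
    const int iterations = 10;

    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i)
        std::memcpy(dst.data(), src.data(), bytes);
    auto end = std::chrono::steady_clock::now();

    double seconds = std::chrono::duration<double>(end - start).count();
    // Each memcpy reads `bytes` and writes `bytes`, so count both.
    double gb = 2.0 * bytes * iterations / (1024.0 * 1024.0 * 1024.0);
    std::printf("~%.1f GB/s effective copy bandwidth\n", gb / seconds);
    return 0;
}
```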
That's not how it works. Procedural generation leads to small file sizes. That's pretty much it. It still requires tons of raw performance and RAM. Ember in particular is extremely demanding in both regards, even though the executable is only 1kB. And this is why I scoff at statements that try to make issues out of meager RAM capacity differences over 1GB. No game "needs" more than 2GB of RAM to make a high quality game. Having more than that just allows for more laziness in development.
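To spell that out: the data a procedural game ships can be a handful of bytes, but expanding it still costs real RAM and CPU time. A tiny illustration (made-up numbers, nothing to do with Ember specifically):

```cpp
// A single 8-byte seed "ships"; expanding it still fills hundreds of MB.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const uint64_t seed = 0xC0FFEEull;                 // all the game "ships"
    const std::size_t cells = 64ull * 1024 * 1024;     // expanded world data
    std::vector<float> world(cells);

    uint64_t h = seed;
    for (std::size_t i = 0; i < cells; ++i) {          // deterministic expansion
        h ^= h << 13; h ^= h >> 7; h ^= h << 17;       // xorshift64
        world[i] = static_cast<float>(h % 1000) / 1000.0f;
    }
    std::printf("Source: 8 bytes; resident data: %zu MB\n",
                cells * sizeof(float) / (1024 * 1024));
    return 0;
}
```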
I said it didn't matter in the end how many displays the Wii U supports. In totality, it only outputs to the TV and the GamePad so far, but Nintendo's philosophy for this thing also pushed for no TV, making it one GamePad.
And I have seen a screenshot somewhere from Wii U Unity where you select between the TV, GamePad 1 and GamePad 2 as the display.
That's not how it works. Procedural generation leads to small file sizes. That's pretty much it. It still requires tons of raw performance and RAM. Ember in particular is extremely demanding in both regards, even though the executable is only 1kB.
That says that it's performing its duty, acting as a high-speed place to move data back and forth; it does not give any measure of how fast it's doing that. It certainly does not imply 1TB/s speeds. That's an inference. Shin'en have attested to the huge speed boost that the eDRAM allows.
So because no game currently supports two GamePads and the system supports off-TV play, you're reducing the screen output to one...
...and what is your point exactly with that?
Unity demo, I think it was GDC?
That it didn't matter if the Wii U was displaying 6 screens. You were never going to see that many. So because no game currently supports two GamePads and the system supports off-TV play, you're reducing the screen output to one...
...and what is your point exactly with that?
That it didn't matter if the Wii U was displaying 6 screens. You were never going to see that many.
Lostinblue brought up Eyefinity... Was somebody talking about six screens earlier? Otherwise I don't get where you're getting six from now...
That it didn't matter if the Wii U was displaying 6 screens. You were never going to see that many.