
Metro 2033 |OT| Fear the Future

Peterthumpa

Member
brain_stew said:
Yes it runs on the 360 but it runs with what are the equivalent of low/medium settings. If that level of graphics is fine for you then your "$650 videocard" will deliver you something like 200fps assuming your CPU is up to it.

Thankfully 4A isn't filled with fucktards that believe gimping graphics = optimisation.
The game delivers a level of graphics for the performance it gives you that is very competitive with other games. If you want it to look like a console game, you can have it look like a console game and run it on a 3-year-old GPU just fine, but thankfully they've offered options above and beyond that. Remember that word, options: it means you don't have to use them. They don't make the game look any worse on lower settings, they just make it look better if you have the hardware to manage them.

Crysis was the perfect example of how to do it right. At "high" settings it obliterated anything else on the market (and still does), and those settings were perfectly achievable on a $200 videocard (8800GT). There was the option to push the graphics further than that, and hardware released after its launch took full advantage of that fact; it's a game that keeps on looking better as time goes by, but its "high" settings were still more impressive than the "max" settings of any other game released at the time. Yet clueless people like you came out and said it was a sluggish unoptimised piece of junk, without having a single fucking clue what the hell they were talking about. It was pathetic frankly, and I will call out anyone who spouts the same shit with regards to Metro.
Honestly, do you have a job there?

You already made your point, read my last post.

Now, don't hide the fact that half the employees of your beloved 4A split off from GSC Game World, which is known for the buggy and unoptimized mess that was STALKER when it launched.
 
felipepl said:
Never said that they can't push things forward with new technologies, they just have to do it RIGHT. I mean, why the eff does my card support tessellation when, in the end, turning the shit on makes the game barely playable?

But you do have a point, I lose.

Sorry, but it just hurts my heart looking at the box of my 5970 and just by my side my TV screen with the FRAPS counter below "30" :lol

Ask ATI, if their hardware is not capable of powering that feature at playable framerates then it appears to be a hardware issue not a software issue. You're also trying to run two of the fastest GPUs available with a pathetic 1GB framebuffer which barely beats the standard set by cards released some 4 years ago. It was a horrible purchase and anyone doing their research would see what a ridiculously huge potential bottleneck it was. Fermi will deliver around 4x the theoretical tessellation performance and comes with more memory as standard, so if games that push tessellation and consume lots of memory are what you're after, it seems that you just simply didn't do your research tbh.

Fwiw, turning off DX11 depth of field and switching the AA solution to AAA should probably clear up your framerate issues, I suspect that the (comparatively) tiny 1GB framebuffer is wreaking havoc and that should ease the issue.
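To put a rough number on why the 1GB framebuffer is the first suspect: multisampled render targets scale with resolution and sample count, while AAA is a shader-based filter that needs no extra sample storage. A back-of-the-envelope sketch in Python (the 8 bytes/pixel and 4-buffer figures are illustrative assumptions, not 4A's actual render-target layout):

```python
# Rough estimate of VRAM used by multisampled render targets.
# The bytes-per-pixel and buffer-count figures are assumptions for
# illustration, not the engine's real layout.

def render_target_mb(width, height, samples, buffers=4, bytes_per_pixel=8):
    """Approximate megabytes consumed by 'buffers' render targets."""
    return width * height * bytes_per_pixel * buffers * samples / (1024 ** 2)

for samples, label in [(1, "no MSAA (AAA is a post-process, ~no extra storage)"),
                       (4, "4x MSAA")]:
    print(f"{label}: ~{render_target_mb(1920, 1200, samples):.0f} MB")

# Roughly 70 MB vs 280 MB at 1920x1200; textures, geometry and everything
# else still have to fit in the same 1024 MB pool per GPU.
```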
 
felipepl said:
Honestly, do you have a job there?

You already made your point, read my last post.

Now, don't hide the fact that half the employees of your beloved 4A split off from GSC Game World, which is known for the buggy and unoptimized mess that was STALKER when it launched.

Oh, we're playing that card, are we? I'm simply addressing each and every issue you raise, and in a comprehensive enough style that my argument is fully succinct. This is less about Metro and more about the ridiculous and clueless attitude of some PC gamers in general. It's holdover rage from all those who spread ridiculous amounts of FUD about Crysis, if anything; I'm sure as shit not letting that situation crop up here again, if I can help it.

I'm not making any excuses for STALKER's performance; STALKER's engine lead himself said how out of date it was and how horribly it scales to new hardware. Everything I've seen of Metro thus far seems to go against that.
 

Slo

Member
Can't wait until my 5850 gets here so I can waffle back and forth between DX10 @ playable rates and DX11 at 10 fps. Knowing me, I'll probably be in DX11 more often than not. :lol
 

derFeef

Member
brain_stew said:
Ask ATI, if their hardware is not capable of powering that feature at playable framerates then it appears to be a hardware issue not a software issue. You're also trying to run two of the fastest GPUs available with a pathetic 1GB framebuffer which barely beats the standard set by cards released some 4 years ago. It was a horrible purchase and anyone doing their research would see what a ridiculously huge potential bottleneck it was. Fermi will deliver around 4x the theoretical tessellation performance and comes with more memory as standard, so if games that push tessellation and consume lots of memory are what you're after, it seems that you just simply didn't do your research tbh.

Fwiw, turning off DX11 depth of field and switching the AA solution to AAA should probably clear up your framerate issues, I suspect that the (comparatively) tiny 1GB framebuffer is wreaking havoc and that should ease the issue.
I facepalmed back then when ATI announced their cards. Waiting for 2GB non-reference cards for sure. But I honestly think performance could improve with new drivers/hotfixes. Why the devs did not work with ATI, as they have the only DX11 hardware available right now, is beyond my imagination though. *cough* green money *cough*
 

panda21

Member
felipepl said:
Sorry, but it just hurts my heart looking at the box of my 5970 and just by my side my TV screen with the FRAPS counter below "30" :lol

i'm guessing it looks pretty good though, right? i went back to my PS3 after a long break of PC gaming on a fairly average machine, and oh boy, even the best PS3 stuff looks bad in comparison to recent-ish PC games :lol
 

Peterthumpa

Member
brain_stew said:
Ask ATI, if their hardware is not capable of powering that feature at playable framerates then it appears to be a hardware issue not a software issue.
Not when other "tessellation-ready" games perform just fine (DIRT2, AvP)

brain_stew said:
You're also trying to run two of the fastest GPUs available with a pathetic 1GB framebuffer which barely beats the standard set by cards released some 4 years ago. It was a horrible purchase and anyone doing their research would see what a ridiculously huge potential bottleneck it was. Fermi will deliver around 4x the theoretical tessellation performance and comes with more memory as standard, so if games that push tessellation and consume lots of memory are what you're after, it seems that you just simply didn't do your research tbh.
Let's talk about this when the cards are out, but recent benches using the Unigine benchmark are disappointing, according to Guru3D, HardOCP, etc.

brain_stew said:
Fwiw, turning off DX11 depth of field and switching the AA solution to AAA should probably clear up your framerate issues, I suspect that the (comparatively) tiny 1GB framebuffer is wreaking havoc and that should ease the issue.
Will try that out.
 
REMEMBER CITADEL said:
Actually, somewhere between medium and high is what they said.

So, no Xbox 360 impressions yet?

No they didn't. Their specific comments were that it was comparable to a "mid range PC", whatever the hell that means; they made no direct comparison to the in-game PC graphics options. Everything from the minimum requirements to the screenshots posted in this very thread seems to line up with the fact that it's somewhere around low/med. I'll be able to make a pretty decent decision either way when I get the game on Friday, anyway.
 
felipepl said:
Not when other "tessellation-ready" games perform just fine (DIRT2, AvP)


Let's talk about this when the cards are out, but recent benches using the Unigine benchmark are disappointing, according to Guru3D, HardOCP, etc.

Actually, the Unigine benchmark is the one situation where Fermi completely destroys the performance of the 5870, which is not surprising really considering Nvidia's tessellation solution is much more advanced than ATI's. It also pulls ahead when the 5870 runs out of memory, and considering those max DX11 settings in Metro are probably stressing those two areas more than any other game, it's really not surprising that ATI's cards are struggling.

The tessellation in Metro appears to be more extensive than in any other game thus far, and if you stress a weak area of the hardware it becomes a bottleneck for performance. That very well may be the case here. I still stand by the fact that a 1GB framebuffer was never going to be a sufficient amount of memory for the 5970; if it can significantly bottleneck a modest 5850 then it sure as hell can bottleneck a card more than twice as fast.
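The "stress a weak area and it becomes the bottleneck" point is easy to see with a toy frame-time model: per-frame GPU time is roughly the sum of its stages, so once one stage balloons, it dominates the total no matter how fast the rest of the chip is. The millisecond figures here are invented purely to show the shape of the argument:

```python
# Toy frame-time model: invented per-stage timings (ms), for illustration only.

def fps(stages):
    total_ms = sum(stages.values())   # stages run back to back within a frame
    return 1000.0 / total_ms

light = {"tessellation": 2.0, "shading": 8.0, "post/AA": 3.0}
heavy = {"tessellation": 18.0, "shading": 8.0, "post/AA": 3.0}  # weak tessellator, heavy load

print(f"light tessellation load: {fps(light):.0f} fps")   # ~77 fps
print(f"heavy tessellation load: {fps(heavy):.0f} fps")   # ~34 fps
# Halving the shading time in the heavy case only buys back ~5 fps,
# because almost all the frame time now sits in the tessellation stage.
```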
 

Feindflug

Member
brain_stew said:
No they didn't. Their specific comments were that it was comparable to a "mid range PC", whatever the hell that means; they made no direct comparison to the in-game PC graphics options. Everything from the minimum requirements to the screenshots posted in this very thread seems to line up with the fact that it's somewhere around low/med. I'll be able to make a pretty decent decision either way when I get the game on Friday, anyway.

http://games.on.net/article/7836/Metro_2033_-_Technical_QA

games.on.net: This game is a cutting edge PC game. Was it hard to squeeze these visuals on to the Xbox 360 as well?

Oles: Yes and no. Xbox is a fixed platform, you have all the access to the hardware and you can do a lot of cool stuff way, way cheaper than on PC. So the Xbox’s graphics quality is not of a low quality PC, it’s actually better than a middle-range PC.
 

SuperÑ

OptionN, ShiftN
Just some comments about the X360 version: textures take some milliseconds to load, and it'd need a little more AA to look nice to my eyes, but Metro 2033 definitely has one of the most impressive lighting engines ever. Faces are a bit ugly though.
 

FrankT

Member
SuperÑ said:
Just some comments about the X360 version: textures take some milliseconds to load, and it'd need a little more AA to look nice to my eyes, but Metro 2033 definitely has one of the most impressive lighting engines ever. Faces are a bit ugly though.

That is pretty much my judgement without even playing it yet, which is why I think the bar this sets on the platform falls a bit flat. AW has incredible lighting, 720p and 4xAA. That is not to say this game isn't impressive, however.
 
I'm inclined to agree about the tessellation... this game makes as much use of tessellation as the Heaven DX11 benchmark, and the performance similarities between the two are rather striking.

As for video memory, this is really the only game aside from Crysis that seems to be hitting the wall of the 1GB framebuffer.

Just about any other game out there I can play fine in DX11 at 6064x1080 in Eyefinity, but certainly not this one.

I hit the 1GB wall in AvP if I try to enable AA in DX11 mode at 6064x1080, but not at the lower resolution of 1920x1080.

I suppose the only explanation for the performance in Metro is the heavy use of tessellation compared to other games that only use it in certain places.
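For what it's worth, the arithmetic backs that up: the Eyefinity resolution above pushes a bit over three times the pixels of a single 1080p screen, so every per-pixel render target roughly triples in size, which is exactly the kind of jump that tips a fixed 1GB framebuffer over the edge. A quick check of the ratio:

```python
# Pixel-count ratio between the Eyefinity setup mentioned above and plain 1080p;
# per-pixel buffers (colour, depth, MSAA samples) scale by the same factor.
eyefinity_pixels = 6064 * 1080
single_1080p = 1920 * 1080
print(f"{eyefinity_pixels / single_1080p:.2f}x the pixels")   # ~3.16x
```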
 
brain_stew said:
No they didn't. Their specific comments were that it was comparable to a "mid range PC", whatever the hell that means; they made no direct comparison to the in-game PC graphics options.

I'm pretty sure they did, it was either in a video interview or in a recent issue of Edge. Unfortunately, I can't check now because I won't be at home until at least tomorrow.

EDIT: Or it could be what Feindflug quoted. Like I said, I can't check now (nor do I find it all that important).
 
Jtyettis said:
Technically that says better.

Well yeah, but when we're talking about all PCs that includes stuff with integrated graphics, so yeah, he's probably right on with that because Xenos is still better than what ships with most PCs from Dell and HP etc. It's a great little GPU considering the time the 360 launched. The point is though that he doesn't once mention actual in-game PC graphics settings, and considering the minimum requirements are far in excess of the 360 (there's not a single multiplatform game I know of where the 360 has outperformed an 8800GT when the PC is not CPU bottlenecked) it's probably wrong to expect better than medium settings.

It's semantics anyway; the point is that it makes very efficient use of the hardware and that's what matters really. In comparison to other 360 games it looks fantastic, but of course it's not going to hold up to a PC with a GPU 10x as capable. It's the same argument that I'm making against those that lament the fact they can't max the game. So long as the game makes efficient use of your hardware, it really shouldn't ever matter that others can make the game look prettier; it should have no impact on your enjoyment of the game and it doesn't make your version look any worse.
 

Marc :D

Neo Member
Dear god,

Preordered! Got my free copy of red faction and am preloading.

My AMD 940 X4 and my 4870 1GB might perish in the coming weeks. Please save a place for them. They have done well.

:(
 
Good to hear that the 'performance issues' are basically limited to max settings and DX11 stuff. Thanks for the information. I'll definitely get this one sometime, sounds and looks great.
 
brain_stew said:
Well yeah, but when we're talking about all PCs that includes stuff with integrated graphics, so yeah, he's probably right on with that because Xenos is still better than what ships with most PCs from Dell and HP etc.

Something doesn't add up. Going by Feindflug's quote, he says that "Xbox’s graphics quality is not of a low quality PC", which implies that by "low quality PC" he actually means a low quality configuration able to run Metro 2033, not a low quality configuration in general. Why would he be comparing it to shitty hardware that can't even run the game?
 

Feindflug

Member
brain_stew said:
Yeah, that's exactly what I said. They said it's equivalent to a "mid range" PC; they didn't say anything about what the equivalent PC settings were, though.

That's exactly what you said? So better = comparable & equivalent? Well, that's interesting.
 
REMEMBER CITADEL said:
Something doesn't add up. Going by Feindflug's quote, he says that "Xbox’s graphics quality is not of a low quality PC", which implies that by "low quality PC" he actually means a low quality configuration able to run Metro 2033, not a low quality configuration in general. Why would he be comparing it to shitty hardware that can't even run the game?

No, it doesn't at all, you're jumping to conclusions. Why do that? Pretty simple really: if he said the 360 version was equivalent to low/medium settings it'd piss a lot of people off and negatively affect the sales of his game, so what he says just reads like good PR to me. Like I say, I've seen plenty of media from the 360 version now, so I'll be able to form a good opinion either way come Friday. I'll try and construct some shots with settings as close to the 360 version as possible and I'll let you guys know where it lines up. It's not a big deal or anything, but it's cool to know, and it'll be good to see if the engine scales well to more modest PCs as well.
 

dark10x

Digital Foundry pixel pusher
Anyone want to try 720p in DX11?

Tessellation actually doesn't look all that hot to me in this game. It seems to produce results similar to STALKER-CoP in that it simply makes characters appear bloated. On paper, the idea is great, but the results I've seen thus far just don't impress.
 
dark10x said:
Anyone want to try 720p in DX11?

Tessellation actually doesn't look all that hot to me in this game. It seems to produce results similar to STALKER-CoP in that it simply makes characters appear bloated. On paper, the idea is great, but the results I've seen thus far just don't impress.

Yeah, the tradeoff doesn't seem worth it judging by most of the shots I've seen. Some areas benefit a lot (like the helmets), but even without tessellation the models seem pretty damn detailed, so it may not be worth the performance hit. AvP's tessellation was subtle but looked pretty damn nice and didn't hit the framerate too much; that's probably the best implementation thus far.
 
AgentOtaku said:
lol

this is a PC game release thread IE: more technical talk than game talk =/

Tweaking a new engine is the best bit! :D


I'm kind of serious as well, I really enjoy it, love seeing how different effects affect a scene and performance
 
brain_stew said:
Tweaking a new engine is the best bit! :D


I'm kind of serious as well, I really enjoy it, love seeing how different effects affect a scene and performance

oh so do I, but I can't help but feel a bit guilty ya know =P

I dunno, I just have the attitude that if it runs well on my typical system config (pretty much 2xAA/4xAA, 1280x800), then that's all I care about. I wasn't one of those guys who spent long nights playing with Crysis config files...
 
Cat in the Hat said:
It bugs the fucking shit out of me. Every PC thread it comes to this.

not so much that it comes to this as how it starts off. Eventually the kids calm down and start discussing the game =P
 

dark10x

Digital Foundry pixel pusher
brain_stew said:
Yeah, the tradeoff doesn't seem worth it judging by most of the shots I've seen. Some areas benefit a lot (like the helmets), but even without tessellation the models seem pretty damn detailed, so it may not be worth the performance hit. AvP's tessellation was subtle but looked pretty damn nice and didn't hit the framerate too much; that's probably the best implementation thus far.
AvP just hates my PC.

I can play in DX9 mode on both the GTX and the ATI, but DX10 and DX11 mode do not run properly.

Before I installed the rest of my new hardware, the DX10/11 mode would start up but trying to actually get into the game would result in a crash if you tried to play the Marine campaign (it worked with the others).

Now, after re-installing Windows, the DX11 mode simply won't start at all while DX9 plays without any problems.

No idea what's up with that.
 
AgentOtaku said:
not so much that it comes to this as how it starts off. Eventually the kids calm down and start discussing the game =P
Usually by then I finish the game, then return to the thread to see game talk in the later pages, so that part is nice. I'm just mad because I'm unable to play this 'til Saturday at the earliest (out of town) and no game talk :(
 
brain_stew said:
No, it doesn't at all, you're jumping to conclusions. Why do that? Pretty simple really: if he said the 360 version was equivalent to low/medium settings it'd piss a lot of people off and negatively affect the sales of his game, so what he says just reads like good PR to me. Like I say, I've seen plenty of media from the 360 version now, so I'll be able to form a good opinion either way come Friday. I'll try and construct some shots with settings as close to the 360 version as possible and I'll let you guys know where it lines up. It's not a big deal or anything, but it's cool to know, and it'll be good to see if the engine scales well to more modest PCs as well.

How is that jumping to conclusions, and what you just said isn't? Sorry, but you come off as someone ardently trying to defend the superiority of the PC platform even though no one is trying to disprove it. Yes, in terms of features and pure computing power even today's mid-range PC hardware leaves both the Xbox 360 and PS3 in the dust. However, you can never optimize for all possible PC configurations the way you can for a fixed platform, and that's why you often end up with games looking and performing better on consoles than on comparable PC hardware. Even the 4A Games guys confirm as much:

Digital Foundry: How would you characterise the combination of Xenos and Xenon compared to the traditional x86/GPU combo on PC? Surely on the face of it, Xbox 360 is lacking a lot of power compared to today's entry-level "enthusiast" PC hardware?

Oles Shishkovstov: You can calculate it like this: each 360 CPU core is approximately a quarter of the same-frequency Nehalem (i7) core. Add in approximately 1.5 times better performance because of the second, shared thread for 360 and around 1.3 times for Nehalem, multiply by three cores and you get around 70 to 85 per cent of a single modern CPU core on generic (but multi-threaded) code.

Bear in mind though that the above calculation will not work in the case where the code is properly vectorised. In that case 360 can actually exceed PC on a per-thread per-clock basis. So, is it enough? Nope, there is no CPU in the world that is enough for games!

The 360 GPU is a different beast. Compared to today's high-end hardware it is 5-10 times slower depending on what you do. But performance of hardware is only one side of equation. Because we as programmers can optimise for the specific GPU we can reach nearly 100 per cent utilisation of all the sub-units. That's just not possible on a PC.

In addition to this we can do dirty MSAA tricks, like treating some surfaces as multi-sampled (for example hi-stencil masking the light-influence does that), or rendering multi-sampled shadow maps, and then sampling correct sub-pixel values because we know exactly what pattern and what positions sub-samples have, etc. So, it's not directly comparable.
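His CPU estimate is easy to sanity-check; taking the quoted factors at face value (they are his approximations, not measurements), the arithmetic lands right around the top of the 70 to 85 per cent range he gives:

```python
# Reproducing Oles Shishkovstov's back-of-the-envelope estimate with the
# factors quoted above (one plausible reading of his numbers).
xenon_core_vs_nehalem_core = 0.25   # each 360 core ~ 1/4 of a same-clock Nehalem core
xenon_smt_gain = 1.5                # second hardware thread on the 360
nehalem_smt_gain = 1.3              # Hyper-Threading gain on Nehalem
xenon_cores = 3

xenon_total = xenon_core_vs_nehalem_core * xenon_smt_gain * xenon_cores  # 1.125
ratio = xenon_total / nehalem_smt_gain                                   # vs one SMT-enabled i7 core
print(f"~{ratio:.0%} of a single modern CPU core")   # ~87%, near the top of his 70-85% range
```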
 

saunderez

Member
AgentOtaku said:
lol

this is a PC game release thread IE: more technical talk than game talk =/
I was going to talk about how much I'm enjoying the 360 version and how it's my first GOTY candidate but it just seemed out of place.....
 

saunderez

Member
AgentOtaku said:
you're fine =)

Why was it so great for you?
Well I'm only about 4 hours in so far so it could be all downhill from here for all I know but so far it's been a fantastic ride.

Most of that comes down to the atmosphere and setting but for me a lot of the enjoyment is coming from how brutally punishing it can be. You rush into a situation and you will die. Often. You take a bit more time to plan your assault (often resorting to stealth) and you can easily overcome what looked impossible moments earlier.

One thing though: today the 360 version was patched, which made the game much better for me. Adjustable gamma is an important one, but the main change is a complete overhaul of the weapon selection. Instead of using the d-pad to change weapons like the default config, you can choose a circular weapon selection option that brings up a radial menu on holding Y. An added bonus is that pressing Y now switches to your melee weapon. This change has made the game much less frustrating for me, though I'm still trying to unlearn the original controls.
 
saunderez said:
Well I'm only about 4 hours in so far so it could be all downhill from here for all I know but so far it's been a fantastic ride.

Most of that comes down to the atmosphere and setting but for me a lot of the enjoyment is coming from how brutally punishing it can be. You rush into a situation and you will die. Often. You take a bit more time to plan your assault (often resorting to stealth) and you can easily overcome what looked impossible moments earlier.

One thing though: today the 360 version was patched, which made the game much better for me. Adjustable gamma is an important one, but the main change is a complete overhaul of the weapon selection. Instead of using the d-pad to change weapons like the default config, you can choose a circular weapon selection option that brings up a radial menu on holding Y. An added bonus is that pressing Y now switches to your melee weapon. This change has made the game much less frustrating for me, though I'm still trying to unlearn the original controls.

Glad to see they've patched/improved the game so quickly. I guess they did some extra testing and tweaking after it had gone gold.
 
Shot from the PC version on LOW settings:



Looks to hold up really well, so those with lesser hardware need not fear. As expected it's pretty similar to the 360 version (better if anything).

More comparisons of the various settings here:

http://translate.googleusercontent....om&usg=ALkJrhg0cKi22q8neRMIvjMzaAEknZe8Sg#top


And a comparison to the 360 version (sadly the 360's gamma seems to be slightly borked):

http://translate.googleusercontent....om&usg=ALkJrhhi_yNgWLsuSX3crYbGzH92tWPAOw#top

These are with "max" settings at 720p (not sure on the AA or DX level). There's a huge difference as expected, but the 360 holds up pretty well and the texture detail is rather good; maybe that would be different at higher resolutions, but at 720p the texture definition is closer than you might think.
 

Peterthumpa

Member
brain_stew said:
Fwiw, turning off DX11 depth of field and switching the AA solution to AAA should probably clear up your framerate issues, I suspect that the (comparatively) tiny 1GB framebuffer is wreaking havoc and that should ease the issue.
OK, the DX11 DOF was the enemy here. FPS is much better now.
Any way to enable VSYNC?
 

Caspel

Business & Marketing Manager @ GungHo
Both copies of Metro 2033 that I am giving away (Xbox 360 version) have yet to have any winners.

Also, my review of the title has gone live after the embargo was lifted.
 

dionysus

Yaldog
On the linearity scale, where does it fall? HL2 being linear, Crysis being in the middle, and Far Cry 2/Stalker being non-linear.
 
Odious Tea said:
Is DX10/DX9 DOF still functional, or is it an all or nothing thing?

It's only a quality thing. There's still depth of field in DX10 mode; just check out any one of a number of DX10 shots posted.
 