Evil Within System & Hard Disk Requirements (PC/Consoles) - strongly suggests 4GB VRAM

If the PC version is intended to deliver a next-gen experience, then people like myself with a rig more capable than a PS4 should have a good experience, despite having only 3GB of VRAM, right? I certainly hope so.

The wording implies that 3GB won't get you parity with the consoles in this game. There is "more capable", and then there is having enough VRAM.
 
Whelp... it's idTech 5...

meaning as shit as the PS3 version might look, it's still gonna be 60 FPS :p
 
Glad I grabbed a 290. I figured 4GB would be the sweet spot. Still, I think (hope) it scales well for others that have less than 4GB VRAM. Would be kind of weird if it did not.
 
Whelp... it's idTech 5...

meaning as shit as the PS3 version might look, it's still gonna be 60 FPS :p

Hehe

People seem to think the words "should have" mean "need to have".

And what does "the team said you can give it a go, but they don't recommend it" mean? Does it mean "it's all good guys, go for it, it'll look great and run very well, just no ultra settings"?

I'm perfectly fine with Shadow of Mordor recommending 6GB of VRAM for Ultra Textures, even though I only have 2, and I don't plan on canceling that pre-order and getting it on a console, so I definitely know the difference between "recommended" and "required".
 
Well "givin it a go" doesn't sound like playable to me. You could also "give it a go" ,for example, to run Lords of Shadow 2 on a gt 740m but the game will stutter and lag even on the lowest settings available that it won't be playable at all.

Weird... I played the demo of that on a GT650m (which only seems to be marginally better in benchmarks) and it ran fine.
 
I wonder why they are not revealing the graphics options menu for this game. I guess we won't find out any details until the game is released and people brave enough to download it see whether it runs on non-recommended rigs. I already pre-ordered it, and I meet the recommended requirements with the exception of having an i5 4690K, but I'm not worried about that.
 
It's 30 FPS.


They modified the engine with all kinds of lighting effects and shit, so it will run at 30fps on all consoles. Need a PC for that silky smooth.

I'll be playing the PS3 version anyway, so probably not the best person to complain about frame rates :p
 
If the PC version is intended to deliver a next-gen experience, then people like myself with a rig more capable than a PS4 should have a good experience, despite having only 3GB of VRAM, right? I certainly hope so.

i5 3570K - 7970 3GB reporting for duty.

I'm running i5 3570k and 770 4GB... I'll post impressions, check for VRAM usage, etc.
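If anyone wants to log VRAM usage from a script instead of eyeballing Afterburner, here's a minimal sketch using NVIDIA's NVML library (assumes an NVIDIA card and the NVML SDK installed; just an illustration, not something tested against this game):

```cpp
// Minimal NVML VRAM query - assumes an NVIDIA GPU and the NVML SDK.
// Build (Linux): g++ vram.cpp -lnvidia-ml
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    nvmlMemory_t mem;
    // Query the first GPU; loop over nvmlDeviceGetCount() for multi-GPU rigs.
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
        nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS) {
        std::printf("VRAM: %llu / %llu MiB used\n",
                    mem.used / (1024 * 1024), mem.total / (1024 * 1024));
    }
    nvmlShutdown();
    return 0;
}
```

Run it in a loop while playing and you get a VRAM log you can line up against texture settings.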
 
They modified the engine with all kinds of lighting effects and shit, so it will run at 30fps on all consoles. Need a PC for that silky smooth.

You mean they modified it to actually have lighting as expected of a modern engine?

idTech5 has almost zero real-time lighting and shadowing. For the most part, the shadows are precalculated fixed projections or baked into the megatexture.

Them adding features from modern engine design should not somehow make this game 30fps on consoles. Otherwise it would be justifiable to say that Metro Redux should be 30fps on consoles because it uses "all kinds of lighting effects." We need to know more, and see closer looks at what this game is doing, before we say its performance is justified on any or all platforms.
 
You mean they modified it to actually have lighting as expected of a modern engine?

idTech5 has almost zero real-time lighting and shadowing. For the most part, the shadows are precalculated fixed projections or baked into the megatexture.

Them adding features from modern engine design should not somehow make this game 30fps on consoles. Otherwise it would be justifiable to say that Metro Redux should be 30fps on consoles because it uses "all kinds of lighting effects." We need to know more, and see closer looks at what this game is doing, before we say its performance is justified on any or all platforms.

I bet it will have dynamic lighting, because Sebastian has a lamp.

 
I bet it will have dynamic lighting, because Sebastian has a lamp.

Of course it will, but that in no way says anything about its performance profile. Games have had real-time lighting and shadowing for years, and generic statements about that fact say nothing about a game's performance, especially when it is so ubiquitous and not necessarily even performance-intensive (Doom 3, Riddick, and that shitty Shrek game on the OG Xbox all have fully real-time lighting and shadowing).
 
The Shadow of Mordor ultra textures thread is awash with people panicking that their cards are not up to it, with tongue-in-cheek (I think) comments about 970/980 cards having had a good run. But this game seems a touch more concerning if they are saying you really should have 4GB of VRAM.

That definitely rules out me and a fair number of other people. I am more than okay with turning settings down, but I want the game to be playable.

I am still staying strong and not upgrading my card yet.
 
You mean they modified it to actually have lighting as expected of a modern engine?

idTech5 has almost zero real-time lighting and shadowing. For the most part, the shadows are precalculated fixed projections or baked into the megatexture.

Them adding features from modern engine design should not somehow make this game 30fps on consoles. Otherwise it would be justifiable to say that Metro Redux should be 30fps on consoles because it uses "all kinds of lighting effects." We need to know more, and see closer looks at what this game is doing, before we say its performance is justified on any or all platforms.

As far as I remember, idTech5 always had real time lighting support. If you remember the earliest demos of it, before RAGE was announced for consoles, it was definitely showing dynamic shadows, etc.

What I remember is that they cut the lighting from RAGE because they were prioritizing 60 fps on the consoles, and didn't want to have to maintain and test two completely different lighting models just for people on PC who could handle it.

I remember them saying Doom 4 would likely be 30 fps on consoles (way back when there was the belief it might launch on PS3 and Xbox 360 in 2012) but that it would feature more dynamic lighting as a result.

All as far as I remember, and my memory is fallible, but that's my understanding: idTech5 can do dynamic lighting, just not at 60 fps on 360 and PS3, and id didn't want to have to light everything twice, so they cut the feature from the PC version too.
 
You mean they modified it to actually have lighting as expected of a modern engine?
Yeah, basically. :P

The only real confirmed reason we have for TEW being 30fps on consoles is "because it's a horror game," in a tweet from a dev. The rest is left up to speculation.
 
Weird... I played the demo of that on a GT650m (which only seems to be marginally better in benchmarks) and it ran fine.

OT, but yes, that was truly weird. I finished the main game on the desktop, then wanted to play Revelations on the laptop, only to find out that it was unplayable at 1024x768.
 
As far as I remember, idTech5 always had real time lighting support. If you remember the earliest demos of it, before RAGE was announced for consoles, it was definitely showing dynamic shadows, etc.

What I remember is that they cut the lighting from RAGE because they were prioritizing 60 fps on the consoles, and didn't want to have to maintain and test two completely different lighting models just for people on PC who could handle it.

I remember them saying Doom 4 would likely be 30 fps on consoles (way back when there was the belief it might launch on PS3 and Xbox 360 in 2012) but that it would feature more dynamic lighting as a result.

All as far as I remember, and my memory is fallible, but that's my understanding: idTech5 can do dynamic lighting, just not at 60 fps on 360 and PS3, and id didn't want to have to light everything twice, so they cut the feature from the PC version too.

Yeah, the first demo video of idTech 5 (which, btw, still looks pretty good in some areas) had all real-time lighting and shadowing. The engine supports it (it is horribly broken in the RAGE mod SDK, just so you know).

Just pointing out that a game having real-time direct lighting/shadowing on next-gen consoles (not 360/PS3) does not say much about whether one should expect 30 or 60. We are beyond that tech hurdle :D. For example, though, an early video of this game had volumetric lighting (with indirect off-screen shafts and whatnot); something like that being in the game could perhaps justify its fps.
 
4GB VRAM?

This seems really high for this game.

I bet it won't use it. (Maybe they're talking 4K?)

I hope, however, that it's not like COD: Ghosts, where it checks for dumb crap like that and doesn't even let you install the damn game if you don't meet it.
 
Yeah, the first demo video of idTech 5 (which, btw, still looks pretty good in some areas) had all real-time lighting and shadowing. The engine supports it (it is horribly broken in the RAGE mod SDK, just so you know).

Just pointing out that a game having real-time direct lighting/shadowing on next-gen consoles (not 360/PS3) does not say much about whether one should expect 30 or 60. We are beyond that tech hurdle :D. For example, though, an early video of this game had volumetric lighting (with indirect off-screen shafts and whatnot); something like that being in the game could perhaps justify its fps.

Well, I'm basing it off quotes about how Doom 4 was expected to run on 360 and PS3 with more dynamic lighting. Of course, that only explains the framerates on the last-gen console versions. I'm optimistic my PC will comfortably sail past 60 fps, but who can say at this point.
 
Well, I'm basing it off quotes about how Doom 4 was expected to run on 360 and PS3 with more dynamic lighting. Of course, that only explains the framerates on the last-gen console versions. I'm optimistic my PC will comfortably sail past 60 fps, but who can say at this point.

Such a shame that idTech5 cannot use SLI at all, really.

A while back, some tech review (I forget which one) mentioned how they lifted the 60 fps limit in Wolfenstein. Anyone have any idea how they did that?
 
Such a shame that idTech5 cannot use SLI at all, really.

A while back, some tech review (I forget which one) mentioned how they lifted the 60 fps limit in Wolfenstein. Anyone have any idea how they did that?

I can't wait until we get out of these 60Hz dark ages and people start caring about unlocking 60fps games, not just the 30fps-locked ones.

But seriously, that's relevant to my interests. Wolfenstein is at the top of my Steam wishlist for the next sale, after I manage to find a 970.
 
I played it at Gamescom, and the textures look way better in The Evil Within than they did in Wolfenstein: TNO, which was already quite hungry for VRAM - if you upped the resolution on that one to 4K, it ate about 5-6 GB of VRAM instantly.

Lighting and shadows are also improved, so I guess this should come as no surprise, keeping in mind this is idTech5.
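For scale, a quick back-of-envelope (my own arithmetic, nothing measured from either game): raw render targets at 4K are tiny next to those figures, so multi-gigabyte usage is presumably dominated by the megatexture's transcoded page cache and streaming pools, which grow with resolution, rather than the framebuffer.

```cpp
// Back-of-envelope only - illustrative numbers, not measurements.
#include <cstdio>

int main() {
    // One 32-bit RGBA color target at 4K:
    const double bytes = 3840.0 * 2160.0 * 4.0;
    std::printf("4K RGBA8 target: %.1f MiB\n", bytes / (1024.0 * 1024.0)); // ~31.6
    // Even a dozen render targets of that size is under 400 MiB, so
    // usage in the 5-6 GB range points at texture caches and streaming
    // pools, not the resolution of the framebuffer itself.
    return 0;
}
```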

Such a shame that idTech5 cannot use SLI at all, really.

A while back, some tech review (I forget which one) mentioned how they lifted the 60 fps limit in Wolfenstein. Anyone have any idea how they did that?


Yeah, just add "cvaradd com_synctotime -1" in the console. But it fucks up the pacing of the game. Whenever you go over 60 fps, everything goes turbo-mode.

I can't wait until we get out of these 60Hz dark ages and people start caring about unlocking 60fps games, not just the 30fps-locked ones.

But seriously, that's relevant to my interests. Wolfenstein is at the top of my Steam wishlist for the next sale, after I manage to find a 970.

No CPU (even overclocked) can handle a whole lot more than 60 fps in Wolfenstein, as in quite a few other games as well - so we'd need much more powerful CPUs, not just more powerful GPUs.
 
Yeah, just add "cvaradd com_synctotime -1" in the console. But it fucks up the pacing of the game. Whenever you go over 60 fps, everything goes turbo-mode.

I find it hard to believe that devs still code their engines with locked frame stepping like this. What on earth are they thinking?!
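For anyone wondering why going over 60 fps makes everything go "turbo": here's a toy sketch (my own illustration, not id's actual code) of frame-coupled simulation versus delta-time stepping:

```cpp
// Toy illustration, not id's actual code: why a frame-coupled game speeds
// up when you lift the fps cap. Position moves at 1 unit per second.
#include <cstdio>

struct World {
    double pos = 0.0;
    void advance(double dt) { pos += dt; }
};

int main() {
    const double TICK = 1.0 / 60.0;   // the engine's assumed frame time
    World coupled, correct;
    // Render one real second at 120 fps, i.e. 120 frames:
    for (int frame = 0; frame < 120; ++frame) {
        coupled.advance(TICK);        // one fixed tick per frame: 2x speed
        correct.advance(1.0 / 120.0); // advance by the real frame delta
    }
    std::printf("frame-coupled: %.2f units, delta-time: %.2f units\n",
                coupled.pos, correct.pos); // prints 2.00 vs 1.00
    return 0;
}
```

The usual fix is a fixed-timestep accumulator: run zero or more 60Hz simulation ticks per rendered frame and interpolate the render state between ticks, so the frame rate and the game clock stay decoupled.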
 
The main reason I wanted to go PC was to get rid of the black bars and get 60fps.

I know it's about devs having the time to optimize at the last minute, but the latter makes it feel like a brand-new console isn't up to snuff.
 
Did Wolfenstein ever get any post-launch patches for optimization and such? Just wondering if the dev team can possibly do so here, at least for the PC version.
 
I will check with Afterburner too. I have a 770 2GB. Wouldn't be surprised if it runs just fine... NeoGAF really loves to panic. I can also take screenshots at different texture resolutions for comparison.

This panic was self-created by them not giving minimum requirements and giving non-answers to questions that could have calmed it.

It'd be nice if someone got a PC review copy and tested it on a card with less than 4GB of VRAM before release. I was going to pre-order it or buy it on release day, but now I will wait and see how it runs on below-requirement machines.
 
I will check with Afterburner too. I have a 770 2GB. Wouldn't be surprised if it runs just fine... NeoGAF really loves to panic. I can also take screenshots at different texture resolutions for comparison.

If it's anything like Wolfenstein, some settings won't appear unless you have the right amount of video memory.
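Something like this, presumably - a hypothetical sketch of how a menu could gate options on detected VRAM (the option names and thresholds here are made up for illustration, not the actual engine logic):

```cpp
// Hypothetical sketch - not the actual engine code. Shows how a settings
// menu could hide texture options that exceed the detected VRAM budget.
#include <cstdint>
#include <cstdio>
#include <vector>

struct TextureOption { const char* name; uint64_t minVramMiB; };

std::vector<TextureOption> visibleOptions(uint64_t detectedVramMiB) {
    // Thresholds are invented for this example.
    static const TextureOption all[] = {
        {"Low", 1024}, {"Medium", 2048}, {"High", 3072}, {"Ultra", 4096},
    };
    std::vector<TextureOption> shown;
    for (const auto& opt : all)
        if (opt.minVramMiB <= detectedVramMiB) shown.push_back(opt);
    return shown;
}

int main() {
    for (const auto& opt : visibleOptions(2048)) // a 2GB card...
        std::printf("%s\n", opt.name);           // ...only ever sees Low, Medium
    return 0;
}
```

Purely illustrative - the real gating logic and thresholds aren't public.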
 
I am still up in the air on this.

Please, Bethesda, release a demo or something that lets people with 2GB cards like a 770 know how this will play. I have to decide whether to cancel my pre-order, if I even can.

The way you have worded all your information has not been helpful and has left too much doubt.
 
And what does "the team said you can give it a go, but they don't recommend it" mean? Does it mean "it's all good guys, go for it, it'll look great and run very well, just no ultra settings"?

I'd say that means the game will run with less than 4GB of VRAM.
 
So apparently, after reading this and the Mordor specs...

The recommended specs are the minimum specs nowadays...
For a small group of enthusiasts who were used to the only question about requirements being how far they could downsample last-gen console games on their two-year-old GPU, but still don't want to buy a new one - pretty much.
 
For a small group of enthusiasts who were used to the only question about requirements being how far they could downsample last-gen console games on their two-year-old GPU, but still don't want to buy a new one - pretty much.

It is a bit ridiculous, not to mention expensive, to have a card like a 770 become outdated so fast. All of this leaves a bad taste in my mouth.
 