The Evil Within System & Hard Disk Requirements (PC/Consoles): 4GB VRAM strongly suggested

I honestly think 2GB cards will be fine.

New reply from gstaff today:

So, sir, if I buy an MSI Twin Frozr Gaming OC GTX 760 with 4GB VRAM, the game will run smoothly as intended, as opposed to an MSI Twin Frozr Gaming OC GTX 770 with 2GB VRAM, even though it's the faster card, for sure?

Gstaff: Having 4GB VRAM remains the recommended way to go. If we have any updates, we'll let everyone know.


Hopefully they're looking into an update for people.
 
This may be the most improperly handled release of system specifications I've ever seen, and the game comes out in two weeks. No minimum specifications, no elaboration on why it needs 4 gigs of VRAM (are there no graphical presets? Is the entire game, top to bottom, just that demanding?), just a bunch of cryptic-ass information from Bethesda.

And because of that, I won't be anywhere near this game at launch until I hear more, which is a bit unfortunate since I had fully planned to support Mikami at release.
 
I don't understand...

They're saying we need a graphics card with 4GB VRAM to enjoy the game properly.

1.64% of Steam users have that.

:/

http://store.steampowered.com/hwsurvey/

[Chart: Steam Hardware Survey VRAM distribution]


Not many people even have 3GB atm.
 
So it's saying even if your specs are below the recommended requirements, you should have 4GB of VRAM regardless? So, a worse processor and worse graphics, but still 4GB of VRAM? What does optimal performance mean in this context? Does it just mean running on max settings?
 
So it's saying even if your specs are below the recommended requirements, you should have 4GB of VRAM regardless? So, a worse processor and worse graphics, but still 4GB of VRAM? What does optimal performance mean in this context? Does it just mean running on max settings?

That's the problem here: they're telling everyone with under 4GB of VRAM to "give it a go" with no idea of performance, settings, or whether the game will even function, and they're recommending 4GB of VRAM for the "recommended" experience with equally little information.

They haven't really said shit, it's annoying.
 
Time to go console again... no way I'm getting into an arms race with developers who don't value optimization and responsible use of resources. The PC market is going to crash if things keep going like this...
 
Time to go console again... no way I'm getting into an arms race with developers who don't value optimization and responsible use of resources. The PC market is going to crash if things keep going like this...

Quite right, and the "GDDR5 gap" is only going to grow. Indeed, there are some very alarming trends emerging here:

Consider that in early 2013, Nvidia's latest and greatest flagship card debuted with 6GB of VRAM. By late 2014, however, Nvidia's latest flagship cards could only manage a paltry 4GB of VRAM.

Now, if this data-rich and highly predictive pattern continues, and there's absolutely no remotely conceivable reason whatsoever to think that it won't, then in 2015 Nvidia's newest flagship card will sport no more than 2GB of VRAM, a small fraction of the memory available on the current-generation consoles.
 
Did anybody else purchase a 970 or 980, only to read the recently released VRAM requirements of upcoming games and wish you'd waited for an 8GB model?

*raises hand*

Personally I have zero problem with games becoming this power-hungry, but they have to have something to show for it, and the hardware makers need to stop being stingy with memory, specifically Nvidia; AMD typically puts a bit more memory on competing cards.
 
Did anybody else purchase a 970 or 980, only to read the recently released VRAM requirements of upcoming games and wish you'd waited for an 8GB model?

*raises hand*

Personally I have zero problem with games becoming this power-hungry, but they have to have something to show for it, and the hardware makers need to stop being stingy with memory, specifically Nvidia; AMD typically puts a bit more memory on competing cards.

I got a 4GB 770. I'm waiting for a 1070 or higher (so 20nm) with 8GB before I upgrade.
 
I got a 4GB 770. I'm waiting for a 1070 or higher (so 20nm) with 8GB before I upgrade.

I'm starting to think the same.

This is exactly why I have such a hard time talking to my non-PC-gaming friends about how viable PC gaming is. They just walk out, buy a game, and it works.

Meanwhile, we seem to be in this really uncertain and confusing period of PC gaming where devs are demanding increasingly better hardware, but I'm not seeing anything out there right now that will future-proof me to any extent.

If there were a 6GB or 8GB 900 series, I'd be set. But there isn't, and I can't help but feel Nvidia is doing it deliberately to screw us over and get us all to double-dip on the latest range.

I would be pretty pissed right now if I'd splashed out on a 4GB 900-series card, as it seems inevitable that higher-VRAM versions will be upon us at some point.
 
Consider that in early 2013, Nvidia's latest and greatest flagship card debuted with 6GB of VRAM. By late 2014, however, Nvidia's latest flagship cards could only manage a paltry 4GB of VRAM.

Now, if this data-rich and highly predictive pattern continues, and there's absolutely no remotely conceivable reason whatsoever to think that it won't, then in 2015 Nvidia's newest flagship card will sport no more than 2GB of VRAM, a small fraction of the memory available on the current-generation consoles.
Those seem to be two different product lines, rather than one flagship.

GK110, GM210 (hypothetical): 780, Titan, 780 Ti, Titan Black
GK104, GM204: 680, 770, 970, 980
 
Is shared VRAM stable/fast?

It's just system RAM that can be used to store graphics data. How fast it is depends somewhat on how heavily it's being used. Performance can be OK provided the game isn't having to constantly access more data than fits in dedicated VRAM. If, however, the amount of data needed for the scene displayed each frame exceeds the VRAM, performance is likely to take a turn towards woeful.
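
If you're curious what your own card reports, here's a minimal C++ sketch (assuming a Windows toolchain with the DXGI headers; treat it as untested) that prints what DXGI reports as dedicated VRAM versus shared system memory for each adapter. The sum of the two appears to be roughly the "Approx. Total Memory" figure people are seeing in DxDiag:

// Prints each adapter's dedicated VRAM and the system RAM the
// driver may borrow ("shared"), as reported by DXGI.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        std::printf("Adapter %u: dedicated %llu MB, shared %llu MB\n", i,
                    (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)),
                    (unsigned long long)(desc.SharedSystemMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}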
 
To posit a frame of reference for current GPU hardware: the 8800GTX launched back in 2006, around the same time as the PS3 and a year after the Xbox 360, with 768MB of VRAM (50% more than either console's total RAM). The 8800GTX was a $600-650 monster of its day that trashed both consoles in performance. In comparison, today's 780/290/970-class hardware holds at least that much of an advantage over the latest consoles, if not more. However, those cards do not have more VRAM than either system has in total; not even the same amount, and potentially not even the same amount actually usable for GPU-centric assets. Given the strength of current GPUs, it is not unprecedented that games would utilize more VRAM than we've generally been given. The 780/Ti in particular is far more than 4x as powerful as the old 8800GTX, yet it has only 4x the VRAM. So where's the problem here?

The problem is that GPU manufacturers (especially Nvidia) have not been scaling VRAM up properly alongside increases in GPU power. They've deliberately restricted VRAM to bare minimums outside their (ridiculous) premium Titan line, and as arguably the world's leading GPU manufacturer they should have known requirements would rise, yet they've elected to do nothing about it, or perhaps have even actively restricted VRAM in the hope people would have to upgrade arbitrarily over it.

Assigning blame between developers and hardware manufacturers is a tricky line to draw, but the end result is the same. Nvidia should have (and probably did) foresee this, yet took no preventative measures and has been far too cheap with VRAM. People with less can make do, certainly, but this will keep high-end hardware from rendering the high-resolution textures it's fully capable of, and in SLI especially it might even impose significant AA or resolution limitations. The 780 having only 3GB, and likewise the 770 with 2GB, is actually rather disgusting. Regrettably, I sold my 780 for a sidegrade to a 970 because I was already slamming into my 780's VRAM wall in certain scenarios (modded Skyrim, Space Engine, Watch Dogs, certain games downsampled from 4K, etc.).

It's a disgusting realization, but the blame rests squarely on the hardware manufacturers here, regardless of whether developers are using too much or not. It is Nvidia's (and AMD's) job to build balanced hardware capable of rendering current and upcoming software, and to make smart engineering decisions for their products; Nvidia has dropped the ball miserably because it's a lucrative business opportunity. No excuses.
 
Oh dear God.

In some ways, he's right. Sometimes PC gaming can cause an insane amount of buyer's remorse. Some of the folks here spent 500 bucks on a GPU less than six months ago, only to find out it's less than optimal for upcoming games. That's not including the cost of other components. I don't agree with him, but I see where he's coming from.

I just got an R9 280X. It sucks sometimes, but I'm rolling with the punches.
 
Shared VRAM? Well, I've got 11GB of that, so sayeth DirectX, but that's a really weird thing to have as a requirement.
Is that in the DxDiag Display tab?

For me it says "Approx. total memory: 4071 MB". But I only have an 8800GT 512MB.

I do have an i5 with 8GB RAM, though; I just didn't get a new card, so I'm using the old one until I decide to upgrade.
 
Quite right, and the "GDDR5 gap" is only going to grow. Indeed, there are some very alarming trends emerging here:

Consider that in early 2013, Nvidia's latest and greatest flagship card debuted with 6GB of VRAM. By late 2014, however, Nvidia's latest flagship cards could only manage a paltry 4GB of VRAM.

Now, if this data-rich and highly predictive pattern continues, and there's absolutely no remotely conceivable reason whatsoever to think that it won't, then in 2015 Nvidia's newest flagship card will sport no more than 2GB of VRAM, a small fraction of the memory available on the current-generation consoles.

Haha, you almost got me...

Found this on the Steam forums (sorry if already posted!) Make of it what you will.

Nice find!

Shared VRAM? Well, I've got 11GB of that, so sayeth DirectX, but that's a really weird thing to have as a requirement.

Yeah, that is a bizarre way to word it. I'm sitting at about 11GB too, according to DirectX. I have a 3GB 7950, so I'm thinking I'll be all good.

Is that in the DxDiag Display tab?

For me it says "Approx. total memory: 4071 MB". But I only have an 8800GT 512MB.

I do have an i5 with 8GB RAM, though; I just didn't get a new card, so I'm using the old one until I decide to upgrade.

Wow, you're still using that card, eh? Three years ago I found I had to turn the graphics options down enough on games that they were looking pretty ugly; I shudder to think what a modern game would look like.

Also surprised it's still running, considering those things died in droves, at least with Nvidia's stock cooling. I manually configured my 8800GT to automatically crank up the fan when playing games; otherwise it would overheat and crash.
 
It'd be really interesting if a manufacturer took advantage of this moment in time, where many enthusiast PC gamers are uneasy about how much future-proofing their rigs can currently provide. This rests almost entirely on VRAM requirements, as just about every other spec can be future-proofed by buying higher-end components. We can buy CPUs that won't break a sweat in the most demanding games, GPUs that can be deemed "overpowered" for 1080p rendering (minus the new VRAM debacle, of course), a quantity and bandwidth of system RAM that's more than enough, and ridiculously fast storage.

If someone like ASUS decided to be really aggressive and released 6 or 8GB versions of the new Maxwell cards, I feel that would satisfy a lot of us who built systems with longevity in mind. It'd probably sell like crazy.
 
Wow, you're still using that card, eh? Three years ago I found I had to turn the graphics options down enough on games that they were looking pretty ugly; I shudder to think what a modern game would look like.

Also surprised it's still running, considering those things died in droves, at least with Nvidia's stock cooling. I manually configured my 8800GT to automatically crank up the fan when playing games; otherwise it would overheat and crash.
Well, I still mostly play last-gen titles like Dead Space 2, Portal 2, Dark Souls, Sonic & All-Stars Racing Transformed and the like, but the card performs well in fairly new games like GRID Autosport. All the games I mentioned play at a stable 60fps with everything maxed, except Autosport, which runs at 40-ish.

Also, when I bought the card, I replaced the cooling system with a better one within the first week. It's never gone above 65°C since.
 
This stuff about shared VRAM makes more sense. People with cards like the 3GB Nvidia GTX 780 are likely to have more than the recommended 4GB of system RAM (probably 8-16GB), so hopefully the game will use that system memory to swap textures in and out as it needs them.

I am only speculating, but it does feel like Bethesda are doing a poor job explaining it.
 
[Image: screenshot of a Bethesda support reply on the Steam Community]


From the Steam Community: a guy asked Bethesda, and they replied that the 4GB refers to "Total Shared Memory"; see the picture above.

For example, my 770 has 2GB of dedicated VRAM and 2GB of shared VRAM. Together that's 4GB. It's a weird way of posting requirements... but well, I thought I should post it.
 
Thanks. Now I can finally breathe a huge sigh of relief.
I wouldn't rest easy just yet. This is such a weird way of wording everything, and I wouldn't be surprised if the support staffer who replied wasn't 100% sure what they were saying or what the person was asking.
 
Thanks. Now I can finally breathe a huge sigh of relief.

Keep in mind, though, that you'll be able to tell when the memory on the video card is full and the spillover into the appropriated system RAM happens. Whatever game you're playing will start stuttering for a few moments as data is served from the slower system RAM rather than the much faster dedicated VRAM built onto the video card itself.
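
For a sense of the scale involved (my own back-of-the-envelope spec-sheet numbers, not measurements), here's a tiny C++ sketch comparing a GTX 770's dedicated GDDR5 bandwidth against the PCIe 3.0 x16 link that any shared-system-RAM traffic has to cross:

// Rough bandwidth comparison: on-card GDDR5 vs. the PCIe link
// that shared system RAM sits behind. Approximate figures.
#include <cstdio>

int main() {
    const double vramGBs = (256.0 / 8.0) * 7.0; // GTX 770: 256-bit bus, 7.0 GT/s effective -> ~224 GB/s
    const double pcieGBs = 16 * 0.985;          // PCIe 3.0: ~0.985 GB/s usable per lane, x16 -> ~15.8 GB/s

    std::printf("dedicated GDDR5 : ~%.0f GB/s\n", vramGBs);
    std::printf("PCIe 3.0 x16    : ~%.0f GB/s\n", pcieGBs);
    std::printf("the bus is ~%.0fx slower\n", vramGBs / pcieGBs);
    return 0;
}

That order-of-magnitude gap is why spillover shows up as stutter rather than a gentle slowdown.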
 
So now they're saying you don't need 4GB dedicated VRAM?

I have a feeling Bethesda are just idiots and this'll run fine on lesser hardware; the lack of communication and their community managers' vague-ass comments are extremely irritating, though.

I don't know why it is so difficult for them to be open about what is required. They have lost any chance of me pre-ordering the game at this time.

I am fine with turning settings down as long as I don't have a stuttering low frame rate mess.

Recent tests with games such as NFS: Rivals and Watch Dogs produced bizarre results: I could drop a load of settings and the framerate stayed almost the same as on 'ultra' (only a 2-5FPS difference). I don't want a game where I turn the settings down and still end up with poor performance.
 
The sad thing is that this is a cross-gen looking game. I can only imagine what games will require as we get deeper into the life cycle of this generation.
 
The sad thing is that this is a cross-gen looking game. I can only imagine what games will require as we get deeper into the life cycle of this generation.

The VRAM recommendation is down to the MegaTexture tech. Wolfenstein: The New Order (2014) is much the same: sure, it's playable on just about any remotely recent GPU, but the game forcibly disables the highest texture-cache setting if it detects a GPU with less than 3GB of VRAM. The only reason Rage fares better is that its textures were authored with the lowest common denominator in mind (in this case, the storage capacity of two X360 DVDs), an approach Carmack later lamented due to the woefully prolonged development period (when the game entered production, the X360 was comparable to a high-end PC aside from total RAM; by the time Rage shipped, the console was half a decade old).
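
As a rough illustration of why a virtual-texturing engine would gate texture quality on VRAM, here's a small C++ sketch computing how many texture pages fit in caches of various sizes. The 128x128 page size and uncompressed RGBA8 format are assumptions for the sketch, not id Tech 5's actual parameters:

// Illustrative page-cache sizing for a virtual-texturing scheme.
// Page size and texel format are assumed, not id Tech 5's real values.
#include <cstdio>

int main() {
    const long long pageDim       = 128; // texels per page side (a common choice)
    const long long bytesPerTexel = 4;   // uncompressed RGBA8, for simplicity
    const long long pageBytes     = pageDim * pageDim * bytesPerTexel; // 64 KB per page

    const long long cachesMB[] = {512, 1024, 2048, 3072};
    for (long long cacheMB : cachesMB) {
        const long long pages = (cacheMB * 1024 * 1024) / pageBytes;
        std::printf("%4lld MB cache -> %6lld resident pages\n", cacheMB, pages);
    }
    return 0;
}

Double the cache and you double the pages that can stay resident, which would explain a hard VRAM cutoff for the highest texture-cache setting.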
 
I'm totally expecting this game to come out and run fine on video cards with 2 and 3 gigs. Maybe I'm wrong, but this may just be another "look at me, I'm so amazing I need maximum hardware!" claim that turns out not to be the case.
 
I feel bad for all these infant GPUs that are pretty much obsolete by the time they're even installed. Kind of a waste of resources. GPUs ain't got no job security.
 
I'm at a loss as well. I know that certain GPUs can use system memory to augment their small pool of dedicated memory, but I assumed that most gaming-grade GPUs in desktop PCs didn't do that.

Yeah, I think only the integrated GPUs in laptops do that. But I assume people with laptops also play games; who knew? :P
 