Nvidia GTX 980/970 3DMark Scores Leaked (from Videocardz.com)

2GB GTX 770s simply don't have enough VRAM. You should be comparing 4GB GTX 770s, not the 2GB models.

Not with 2GB of VRAM he won't. It's already a bottleneck here and now, and it will only prove more crippling as time goes by. If you want to get through this generation comfortably at 1080p, then I feel the GTX 970 is the first midrange card that will let you do that.

Will 4GB VRAM be enough for all of current-gen? And if I SLI two 970s in the future, would that double my VRAM to 8GB, or is that not how it works?
 
Will 4GB VRAM be enough for all of current-gen? And if I SLI two 970s in the future, would that double my VRAM to 8GB, or is that not how it works?

VRAM in SLI is mirrored, so it doesn't actually add; you'd still only be able to use 4GB. I can't imagine what we'd be seeing in something like Skyrim if people could stack four Titans / 780s for 24GB of VRAM.
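As a minimal, hypothetical sketch of that point (plain Python, illustrative numbers only): under mirrored SLI the usable pool is the per-card amount, however many cards you stack.

```python
# Illustrative only: classic SLI mirrors resources across GPUs,
# so usable VRAM is the per-card amount, not the sum.
def usable_vram_gb(num_cards: int, vram_per_card_gb: float) -> float:
    """Usable VRAM under mirrored SLI; num_cards deliberately has no effect."""
    return vram_per_card_gb

print(usable_vram_gb(2, 4.0))  # two GTX 970s -> 4.0 GB usable, not 8
print(usable_vram_gb(4, 6.0))  # four Titans  -> 6.0 GB usable, not 24
```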
 
Will 4GB VRAM be enough for all of current-gen? And if I SLI two 970s in the future, would that double my VRAM to 8GB, or is that not how it works?

4GB will be enough for the foreseeable future if you are playing at 1080p. By the time it isn't enough, it'll probably be time to upgrade anyway.
 
I want to get the iPhone 6 Plus, but I also want to get this graphics card. By the time I can afford it, the 980 is going to be sold out.
 
Will 4GB VRAM be enough for all of current-gen? And if I SLI two 970s in the future, would that double my VRAM to 8GB, or is that not how it works?

No, 4GB will not be enough. Not even for 1080p, the whole gen. Last year people claimed 2GB was enough in the 780/770 rumor thread.
 
- Massive size, weight and desk space
- Higher power draw
- Audible whine (though very slight)
- No multi-monitor support
- Lower resolutions
- 4:3 aspect ratio
- Drift over time
- Analog signal (VGA)


While CRTs have superior color reproduction and lower input lag, there is no reason to own one in this day and age. It's outdated technology.

And this is coming from someone who has owned nearly every monitor in existence and used the holy grail of CRTs, the Sony FW900. I would take my 32" 4K Dell or my Asus ROG over the Sony FW900 any day.
CRTs are not limited to low resolutions or 4:3 (if you had an FW900 you'd know that).
A CRT does not produce any audible noise; if it does, it is broken.

CRTs used VGA because, at the time, digital connections like DVI were in their infancy and only had enough bandwidth for a pitiful 1600x1200 @ 60Hz.

I was typing out a whole reply explaining how LCD tech is fundamentally flawed and how telling it is that you think your TN panel is superior to a CRT. The biggest feature of your monitor is brightness-destroying backlight strobing, which requires high refresh rates (and therefore a powerful PC) to approach CRT motion clarity.
Whatever hardware you have in your PC, it's not enough for 1440p @ 144Hz (a resolution and refresh rate CRT monitors reached 12 years before your monitor was released, by the way; welcome to last decade), so enjoy your upscaling blur at 1080p/120Hz, or enjoy your awful motion resolution at 1440p/60Hz.

But you can research this shit in detail for yourself.

The basic gist is that backlit pixels + sample-and-hold + slow LCD pixel response make for a fundamentally flawed technology that was a huge step back from CRT tech.
Ten years ago you didn't need an unholy monster PC to enjoy pixel-perfect sharp motion at higher resolutions, because CRT tech didn't suck and didn't need asinine band-aids like backlight strobing to produce an acceptable image in motion.
You could also enjoy pixel-perfect contrast, black blacks, no black crush and superior color accuracy (at all viewing angles).

Sorry man, but if you think a TN panel is worth a shit, you are wrong. I understand it in light of having to justify spending 650 euros on one, though.

One last thing: input lag is not inherent to LCD tech; it's the image processing the monitor does before sending the signal to the panel that causes it. So you couldn't even get the one thing you said in favor of CRT monitors right.
 
No, 4GB will not be enough. Not even for 1080p, the whole gen. Last year people claimed 2gb was enough in the 780/770 rumor thread.
I think there will be games released that will require more than 4GB to use the highest texture settings without issues, but there will not be any games released that require 4GB to run at lower texture settings. They can't do that without alienating a huge swath of the PC market.
 
I think there will be games released that will require more than 4GB to use the highest texture settings without issues, but there will not be any games released that require 4GB to run at lower texture settings. They can't do that without alienating a huge swath of the PC market.

Well, the assumption is that someone buying a 980-class card isn't asking how far they can pare back settings to fit into 4GB. PC ports are going to continue adding enhancements over the console versions, and the consoles will almost assuredly reduce their OS footprints, affording developers 6+GB to work with. You're not going to port those mega-budget AAA games that take full advantage of that, plus PC-exclusive enhancements, onto a 4GB card. I have serious doubts you'll even get console-par experiences 2 or 3 years down the line, which is the bare minimum of what the guy was asking. This isn't even getting into the fact that 4K monitors are likely going to take off like a rocket in the next year or two.
 
No, 4GB will not be enough. Not even for 1080p, the whole gen. Last year people claimed 2gb was enough in the 780/770 rumor thread.

That's only true if you have to use the highest texture setting. There will most likely be games that require more than 4GB if you want to play the game maxed out in this generation. But 4GB will last if you don't have to use the best textures and MSAA.

Heck, my friend played BF3 with a 512MB 9800GTX. Everything is possible if you are willing to make sacrifices.
 
So wait, did I get this right: a 780Ti is technically superior to the new lineup of cards?

I'm in the process of building my new PC (I waited for the new X99 platform) and I've been holding out on buying the 780Ti because the new cards are right around the corner.
So in terms of raw specs it seems pointless to wait for the new cards?
 
So wait, did I get this right: a 780Ti is technically superior to the new lineup of cards?

I'm in the process of building my new PC (I waited for the new X99 platform) and I've been holding out on buying the 780Ti because the new cards are right around the corner.
So in terms of raw specs it seems pointless to wait for the new cards?

Wait for reviews before you make any decisions; it's just 5 days away now anyway. But as it looks right now, it will not be a big step up from the 780Ti. It will have about the same performance at a much lower TDP, but that's not really what most people care about when buying high-end video cards. :P
 
So wait, did I get this right: a 780Ti is technically superior to the new lineup of cards?

I'm in the process of building my new PC (I waited for the new X99 platform) and I've been holding out on buying the 780Ti because the new cards are right around the corner.
So in terms of raw specs it seems pointless to wait for the new cards?

The 980 should trade blows with the 780Ti at 1080p, and be about 25% faster than a 780, while the 970 should be around 25% faster than a 770.
If I were you I'd wait until actual benchmarks start showing up.
 
Well, the assumption is that someone buying a 980-class card isn't asking how far they can pare back settings to fit into 4GB. PC ports are going to continue adding enhancements over the console versions, and the consoles will almost assuredly reduce their OS footprints, affording developers 6+GB to work with. You're not going to port those mega-budget AAA games that take full advantage of that, plus PC-exclusive enhancements, onto a 4GB card. I have serious doubts you'll even get console-par experiences 2 or 3 years down the line, which is the bare minimum of what the guy was asking. This isn't even getting into the fact that 4K monitors are likely going to take off like a rocket in the next year or two.
I actually think you could, but there will always be quick and dirty ports that don't have proper optimization. That 6GB won't all be used as video memory.
 
Well, the assumption is that someone buying a 980-class card isn't asking how far they can pare back settings to fit into 4GB. PC ports are going to continue adding enhancements over the console versions, and the consoles will almost assuredly reduce their OS footprints, affording developers 6+GB to work with. You're not going to port those mega-budget AAA games that take full advantage of that, plus PC-exclusive enhancements, onto a 4GB card. I have serious doubts you'll even get console-par experiences 2 or 3 years down the line, which is the bare minimum of what the guy was asking. This isn't even getting into the fact that 4K monitors are likely going to take off like a rocket in the next year or two.
Wasn't an 8800GT with 512MB pretty much good for the whole generation so long as you were just playing at 720p?
 
The PS3 had 512MB of RAM (256MB of which was VRAM) vs. the 512MB 8800GT. The PS4 has 8GB of RAM vs. a 4GB GTX 980.

I'm not super techy, but I do know that just because a console is using 6GB of VRAM, it doesn't mean that a PC would need the exact same amount of VRAM to run the same game; PCs have two memory pools.

Worst-case scenario is that every game gets coded like Watch Dogs, but that's not happening.
 
I'm not super techy, but I do know that just because a console is using 6GB of VRAM, it doesn't mean that a PC would need the exact same amount of VRAM; PCs have two memory pools.

Worst-case scenario is that every game gets coded like Watch Dogs, but that's not happening.

I'll be the first to admit I have no idea what PC games use general DDR3 system RAM for. I've asked several times, but have never received anything approaching a technical or detailed response. Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles, and a 4GB GTX 980, which is half that of current-gen consoles. Would a 256MB 8800GT be running late-gen ports? Probably not.
 
So wait, did I get this right: a 780Ti is technically superior to the new lineup of cards?

I'm in the process of building my new PC (I waited for the new X99 platform) and I've been holding out on buying the 780Ti because the new cards are right around the corner.
So in terms of raw specs it seems pointless to wait for the new cards?

They're going to be revealed next week, by all accounts. At the very least you should get more memory and lower power draw with similar or better performance for a lower price. Nothing revolutionary, but if you've waited this long, you might as well hold off a few more days.
 
Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles, and a 4GB GTX 980, which is half that of current-gen consoles. Would a 256MB 8800GT be running late-gen ports? Probably not.

*facepalm*
 
I'll be the first to admit I have no idea what PC games use general DDR3 system RAM for. I've asked several times, but have never received anything approaching a technical or detailed response. Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles, and a 4GB GTX 980, which is half that of current-gen consoles. Would a 256MB 8800GT be running late-gen ports? Probably not.

At 720p or below? Probably.

Also, what the heck does this have to do with these cards?
 
I'll be the first to admit I have no idea what PC games use general DDR3 system RAM for. I've asked several times, but have never received anything approaching a technical or detailed response. Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles, and a 4GB GTX 980, which is half that of current-gen consoles. Would a 256MB 8800GT be running late-gen ports? Probably not.

Again, that 8GB of RAM in the consoles is shared for EVERYTHING; it's not just VRAM. System RAM is where the game is loaded so it does not have to constantly read off the HDD. This makes it run faster; that's why the more RAM you have, the faster your computer can go (up to a point), because it can load everything into system RAM.

Consoles also need to load the game into RAM, so that 8GB is now even less, because they need to load data there so the game will run faster. So if you want to add up the shared pool in the consoles, then I'm going to add it up on my PC: 8GB + 3GB from my video card = 11GB! I win!
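A back-of-envelope version of that tally, assuming the figures quoted in this thread (8GB of system RAM, a 3GB card, the PS4's 8GB unified pool); real usable amounts are lower on both sides once OS reservations are subtracted.

```python
# Illustrative totals only; both machines reserve part of this for the OS.
pc_system_ram_gb = 8.0   # typical gaming PC
pc_vram_gb       = 3.0   # the poster's card
ps4_unified_gb   = 8.0   # single shared GDDR5 pool

print(f"PC total addressable:  {pc_system_ram_gb + pc_vram_gb:.0f} GB")  # 11 GB
print(f"PS4 total addressable: {ps4_unified_gb:.0f} GB")
```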
 
The PS3 had 512MB of RAM (256MB of which was VRAM) vs. the 512MB 8800GT. The PS4 has 8GB of RAM vs. a 4GB GTX 980.

I like how you add up the split RAM pool of the PS3, then fail to do the same for the split RAM pool of a PC.

Most modern midrange/high-end gaming PCs will have 2-4GB of VRAM (with twice the bandwidth of the RAM in the PS4, but shhhhhh, don't mention that) + 8GB of DDR3 RAM.

In the end you have way more bandwidth as well as more memory to work with on a PC, even with only 2GB of VRAM.

Again, that 8GB of RAM in the consoles is shared for EVERYTHING; it's not just VRAM. System RAM is where the game is loaded so it does not have to constantly read off the HDD. This makes it run faster; that's why the more RAM you have, the faster your computer can go (up to a point), because it can load everything into system RAM.

Consoles also need to load the game into RAM, so that 8GB is now even less, because they need to load data there so the game will run faster.
Yep, and after reserving 3.5GB of that 8GB of shared RAM for the OS on the PS4 (always conveniently left out of these asinine 8GB GDDR5 arguments), and after using a sizeable amount as a cache for stuff it will need later and for data needed by the CPU (just like what you described happens on a PC), there is likely not even going to be 2GB worth of memory left for the GPU to work with when rendering and buffering the next frame. And again, VRAM on a PC has much higher bandwidth in high-end GPUs, and it doesn't need to share that bandwidth with the CPU.
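Roughly, as a sketch: using the 3.5GB OS reservation cited above and an assumed (purely illustrative) share for game code, CPU data and the streamed-asset cache, the leftover for rendering works out to around the 2GB figure mentioned.

```python
# Rough budget; the OS figure is the one cited in the post above,
# the cache share is an assumption for illustration only.
ps4_total_gb   = 8.0
os_reserved_gb = 3.5   # reservation cited in the post
cpu_cache_gb   = 2.5   # assumed: game code, CPU-side data, streamed-asset cache

left_for_gpu = ps4_total_gb - os_reserved_gb - cpu_cache_gb
print(f"~{left_for_gpu:.1f} GB left for render targets, buffers and textures")
```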
 
It's entirely possible that a PS4 game could call for what would essentially require 5+GB of VRAM in a PC environment, right?
 
Again, that 8GB of RAM in the consoles is shared for EVERYTHING; it's not just VRAM. System RAM is where the game is loaded so it does not have to constantly read off the HDD. This makes it run faster; that's why the more RAM you have, the faster your computer can go (up to a point), because it can load everything into system RAM.

Consoles also need to load the game into RAM, so that 8GB is now even less, because they need to load data there so the game will run faster.

I like how you add up the split RAM pool of the PS3, then fail to do the same for the split RAM pool of a PC.

Most modern midrange/high-end gaming PCs will have 2-4GB of VRAM (with twice the bandwidth of the RAM in the PS4, but shhhhhh, don't mention that) + 8GB of DDR3 RAM.

In the end you have way more bandwidth as well as more memory to work with on a PC, even with only 2GB of VRAM.

Can you guys not read two or three posts above for context? That was a response to Seanspeed's post likening the 8800GT vs. PS3 situation to a proposed 4GB GTX 980 vs. PS4 situation. I was simply pointing out it's not the same thing. You're attempting to compare a card with VRAM matching the PS3's total system RAM to a card whose VRAM is only half of the PS4's total system RAM. It's not the same situation.
 
It's entirely possible that a PS4 game could call for what would essentially require 5+GB of VRAM in a PC environment, right?

Not if it actually wants to run the game. Plus, the video card is not all that great, so dumping more textures into VRAM is just going to bog it down.
 
Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles
Why do you assume that the 8800GT matches last-gen consoles? From my experience it's WAY superior to last-gen consoles. I played most multiplatform games on it at 60fps with 4x AA during that generation.

Also, a big portion of that 8GB of RAM on the PS4 is reserved for the OS, AI, game logic, etc., while the video memory on PC cards is reserved only for graphics.
 
It would be a pretty terrible port if that were to happen.

I should clarify. When I say "require" I mean "to pull off at least 100% as well as on the PS4". Obviously a PC port would have options for lower-than-console fidelity in this case.
 
You can't just compare the 8GB the PS4 has with video memory alone. You also have to take the system RAM into account. A PC today has 8GB of system RAM + 2GB just for graphics. That's 10GB total, or 12 if you have a 4GB card.
 
VRAM is mostly used right now for multi-monitor setups. So unless you're running a bunch of 4K screens at once, 3-4GB is overkill for 1080p.
 
I still use my 512MB 8800GT in my new PC until I get a new card. I can play some games like Grid Autosport with everything on ultra (including textures) at around 40fps.

That's a 512MB card. I haven't even experienced a 2GB card yet, and some people say that 4GB isn't enough? Dem high standards!
 
I still use my 512MB 8800GT in my new PC until I get a new card. I can play some games like Grid Autosport with everything on ultra (including textures) at around 40fps.

That's a 512MB card. I haven't even experienced a 2GB card yet, and some people say that 4GB isn't enough? Dem high standards!

It's this weird way of thinking that unless you're running everything maxed at 4K at 120fps, you're not really PC gaming at all, or that it isn't worth the investment.
 
I should clarify. When I say "require" I mean "to pull off at least 100% as well as on the PS4". Obviously a PC port would have options for lower-than-console fidelity in this case.

The PS4 GPU can't push and shade anywhere near enough pixels to use that much VRAM.
Again, it's not 5 or 8, it's 4.5GB max, as the rest is reserved by the OS.
You still need a large part of that 4.5GB to run the game from. You can't run it off the HDD; loading screens are the game moving the data it needs from the HDD into RAM, and as you know, just loading a few GB from the HDD into RAM takes forever (hundreds to thousands of times longer than the time you have to render a frame), so a large amount of that 4.5GB is going to that.
That 176GB/sec of memory bandwidth is disproportionate to the amount of memory if the GPU were to use it all (it doesn't use it all, so the bandwidth is fine).

It's like those 3GB DDR3 GT 620 GPUs they put in Packard Bell PCs: they are capable of a lot less than a 1GB GTX 650 Ti, because the RAM is too slow and the GPU is too weak to do anything with all of it.

Just like the PS4 GPU isn't powerful enough to saturate that 4.5GB of RAM in a way that results in better graphics or more pixels on screen.
Your 8GB GDDR5 PS4 struggles to run BF4 at 900p, while a 3GB AMD R9 290 will not break a sweat at 1440p with 2x MSAA, thanks to its whopping 320GB/sec of memory bandwidth and because the GPU can render way more pixels per second and is much faster at texturing.

The PS4 has so much memory because it needs it for the OS (recording gameplay, basic multitasking, a better browser and all that stuff that was missing on the PS3) and because it is one shared pool.
A PS4 with a better GPU, 4GB of DDR3 RAM and 2GB of faster VRAM would have been better at rendering games than it is now.

A PC also needs to do all that (multitasking, OS, recording/streaming etc.), so PCs also have tons of memory (8GB of system RAM or more).
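To put some hedged numbers on the bandwidth point: with the figures quoted above (176GB/sec for the PS4's shared pool, 320GB/sec for an R9 290) and assumed frame rates, here's how much data each can even move per frame in the best case.

```python
# Bandwidth figures are the ones quoted above; frame rates are assumed.
# Real budgets are tighter: the PS4 shares this bandwidth with the CPU,
# and most data is read or written more than once per frame.
def gb_per_frame(bandwidth_gb_s: float, fps: int) -> float:
    return bandwidth_gb_s / fps

for name, bw in [("PS4 unified GDDR5 (176 GB/s)", 176.0),
                 ("R9 290 dedicated VRAM (320 GB/s)", 320.0)]:
    print(f"{name}: {gb_per_frame(bw, 30):.1f} GB/frame @30fps, "
          f"{gb_per_frame(bw, 60):.1f} GB/frame @60fps")
```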
 
Yep, and after reserving 3.5GB of that 8GB of shared RAM for the OS on the PS4 (always conveniently left out of these asinine 8GB GDDR5 arguments), and after using a sizeable amount as a cache for stuff it will need later and for data needed by the CPU (just like what you described happens on a PC), there is likely not even going to be 2GB worth of memory left for the GPU to work with when rendering and buffering the next frame. And again, VRAM on a PC has much higher bandwidth in high-end GPUs, and it doesn't need to share that bandwidth with the CPU.

We know that even launch titles were dedicating 3GB+ to graphics alone (Shadow Fall), and it's not unreasonable to expect that figure to rise as more memory is unlocked as the generation progresses.

DDR3 only provides additive bandwidth and memory if developers code their games to work that way and spend time optimising around a split memory pool. We already have Watch Dogs and Titanfall as examples of games that don't do this effectively, and it's naive to think the situation is going to improve before it gets worse.

I've said all along that 4GB is the minimum you want; I'd personally feel a lot more comfortable with 6-8GB, but that's not going to be cost-effective anytime soon.
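As a hypothetical sketch of that split-pool point (not any real engine's API): system RAM only adds to the graphics budget when the engine actually streams assets across the two pools; otherwise the card's VRAM is the hard ceiling.

```python
# Hypothetical model of a split memory pool; not a real engine API.
def effective_graphics_budget_gb(vram_gb: float, system_ram_gb: float,
                                 engine_streams_assets: bool) -> float:
    """VRAM is the hard ceiling unless the engine streams from system RAM."""
    if engine_streams_assets:
        # Assumed: only a fraction of system RAM is useful as an asset cache.
        return vram_gb + 0.5 * system_ram_gb
    return vram_gb

print(effective_graphics_budget_gb(4.0, 8.0, True))   # well-optimised port
print(effective_graphics_budget_gb(4.0, 8.0, False))  # Watch Dogs-style port
```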
 
You're attempting to compare a card with VRAM matching the PS3's total system RAM to a card whose VRAM is only half of the PS4's total system RAM. It's not the same situation.
Maybe. But today's top cards also have much more raw horsepower compared to the consoles. Overall memory bandwidth is quite high. There may be a few situations where you can't turn up the textures as high as you'd like, but I can't imagine the entire experience is going to be below console levels, especially in 2-3 years, given all the other performance advantages modern PCs will have.

Also, remember the XB1 does not have 8GB of GDDR5.
 
Memory doesn't grow on trees: going from 4GB to 8GB would increase the price of the GPU, and GPU companies don't have a golden parachute of paid online subscriptions to take the hit.

So I'd rather have a 4GB standard GPU, with 8GB variants for people who want them, than mandatory 8 gigs on every card.
 
We know that even launch titles were dedicating 3GB+ to graphics alone (Shadow Fall), and it's not unreasonable to expect that figure to rise as more memory is unlocked as the generation progresses.

DDR3 only provides additive bandwidth and memory if developers code their games to work that way and spend time optimising around a split memory pool. We already have Watch Dogs and Titanfall as examples of games that don't do this effectively, and it's naive to think the situation is going to improve before it gets worse.

I've said all along that 4GB is the minimum you want; I'd personally feel a lot more comfortable with 6-8GB, but that's not going to be cost-effective anytime soon.

Why, because two shitty ports have no texture streaming in the PC version and apparently shitty texture compression? There are dozens of games with much better textures than either of those that make do with 1-1.5GB at 1080p.
Those two are the exception, and it's certainly not because they are pushing graphics in any way.
 
and it's certainly not because they are pushing graphics in any way.
Yeah... it might be best not to spoil developers with unlimited resources or something. It's like the more you give them to work with, the less optimization they do, resulting in standard-looking games that would normally run on systems with half the resources.
 