So 980/970 release by the end of September
and 960 mid October?
Hope that's true, I really need a new card and want the 980.
2GB GTX 770s simply don't have enough VRAM. You should be comparing 4GB GTX 770s, not the 2GB models.
Not with 2GB of VRAM he won't. It's already a bottleneck in the here and now, and it will only prove crippling as time goes by. If you want to get through this generation comfortably at 1080p, then I feel the GTX 970 is the first midrange card that will let you do that.
Will 4GB VRAM be enough for all of current-gen? And if I SLI two 970s in the future, would that double my VRAM to 8GB, or is that not how it works?
-Massive size, weight, and desk space
-Higher power draw
-Audible whine (though very slight)
-No multi-monitor
-Lower resolutions
-4:3
-CRT drift over time
-Analog signal (VGA)
While CRTs have superior color reproduction and lower input lag, there is no reason to own one in today's day and age. It's outdated technology.
And this is coming from someone who has owned nearly every monitor in existence and used the holy grail of CRTs, the Sony FW900. I would take my Dell 32" 4K or Asus ROG over the Sony FW900 any day.
CRTs are not limited to low resolutions or 4:3 (if you had an FW900 you'd know that).
No, 4GB will not be enough. Not even for 1080p.
Why not?
It'll depend on what you want to do.
No, 4GB will not be enough. Not even for 1080p, the whole gen. Last year people claimed 2GB was enough in the 780/770 rumor thread.
You have no idea what you're talking about.
I think there will be games released that will require more than 4GB to use the highest texture settings without issues, but there will not be any games released that require 4GB to run at lower texture settings. They can't do that without alienating a huge swath of the PC market.
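That last point tracks with simple texture math. A rough sketch, assuming block-compressed textures at about 1 byte per texel with a full mip chain; the 500-texture count is made up purely for illustration:

```python
# Back-of-the-envelope VRAM math for texture quality settings.
# Assumes block compression (~1 byte per texel) and a full mip chain
# (~4/3 of the base level). All figures are illustrative, not measured.

def texture_mb(size_px, bytes_per_texel=1.0, mip_factor=4 / 3):
    """Approximate memory for one square texture, mips included."""
    return size_px * size_px * bytes_per_texel * mip_factor / 2**20

for label, size in [("Ultra (4096px)", 4096),
                    ("High (2048px)", 2048),
                    ("Medium (1024px)", 1024)]:
    per_tex = texture_mb(size)
    # 500 unique textures is an assumed figure for a big-budget game.
    print(f"{label}: {per_tex:5.1f} MB each, ~{per_tex * 500 / 1024:4.1f} GB for 500 textures")
```

Each step down halves the resolution in both dimensions, so memory drops to a quarter: roughly 10 GB of textures at ultra becomes ~2.6 GB at high. That's exactly why "ultra textures need more than 4GB" and "the game still runs fine on 4GB" can both be true.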
So wait, did I get this right: a 780Ti is technically superior to the new lineup of cards?
I'm in the process of building my new PC (I waited for the new X99 platform) and I've been holding out on buying the 780Ti because the new cards are right around the corner.
So in terms of raw specs it seems pointless to wait for the new cards?
Well, the assumption is someone buying an 880-class card isn't asking how far they can pare back settings to fit into 4GB. PC ports are going to continue adding enhancements over console versions, and the consoles will almost assuredly reduce their OS footprints, affording developers 6+ GB to work with. You're not going to port those mega-budget AAA games that take full advantage of that, plus PC-exclusive enhancements, onto a 4GB card. I have serious doubts you'll even get console-par experiences 2 or 3 years down the line, which is at bare minimum what the guy was asking. This isn't even getting into the fact that 4K monitors are likely going to take off like a rocket in the next year or two.
I actually think you could, but there will always be quick and dirty ports that don't have proper optimization. That 6GB won't all be used on video memory.
Wasn't an 8800GT with 512MB pretty much good for the whole generation, so long as you were just playing at 720p?
The PS3 had 512MB of RAM (256MB of which was VRAM) Vs. the 512MB 8800GT. The PS4 has 8GB of RAM Vs. a 4GB GTX980.
8GB of unified memory, not necessarily dedicated VRAM.
I'm not super techy, but I do know that just because a console is using 6GB of VRAM, it doesn't mean a PC would need the exact same amount of VRAM; PCs have two memory pools.
Worst case scenario is every game is gonna get coded like Watch Dogs, but that's not happening.
Still, there is no correlation between a 512MB 8800GT, which matched last-gen consoles, and a 4GB GTX980, which has half the memory of current-gen consoles. Would a 256MB 8800GT be running late-gen ports? Probably not.
I'll be the first to admit I have no idea what PC games use general DDR3 system RAM for. I've asked several times, but have never received anything approaching a technical or detailed response.
Again, that 8GB of RAM in the consoles is shared for EVERYTHING; it's not just VRAM. System RAM is where the game is loaded so it does not have to constantly read off the HDD. That makes it run faster; it's why more RAM makes your computer faster, up to a point, because it can load everything into system RAM.
Consoles also need to load the game into RAM, so that 8GB is effectively even less, because the game is loaded there so it will run faster.
Yep, and after reserving 3.5GB of that 8GB of shared RAM for the OS on PS4 (always conveniently left out of these asinine "8GB GDDR5" arguments), and using a sizeable amount as a cache for stuff it will need later or data needed by the CPU (just like what you described happens on PC), there is likely not even going to be 2GB worth of memory left for the GPU to work on rendering and buffering the next frame. And again, VRAM on PC has much higher bandwidth in high-end GPUs, and it doesn't need to share that bandwidth with the CPU.
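For what it's worth, the subtraction above works out like this (a quick sketch using the post's own figures; the CPU/streaming-cache share is an assumed number, picked only to show the shape of the math):

```python
# Worked version of the PS4 memory-budget argument above. The OS
# reserve comes from the post; the CPU/cache share is an assumption.

total_ram_gb  = 8.0    # PS4 unified GDDR5 pool
os_reserve_gb = 3.5    # OS reservation cited in the post
cpu_data_gb   = 2.5    # assumed: game logic, audio, streaming cache

game_budget_gb = total_ram_gb - os_reserve_gb   # memory the game can touch
gpu_share_gb   = game_budget_gb - cpu_data_gb   # left for textures/render targets

print(f"Game-visible budget: {game_budget_gb:.1f} GB")  # 4.5 GB
print(f"Left for GPU work:  ~{gpu_share_gb:.1f} GB")    # ~2.0 GB, the post's estimate
```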
I like how you add up the split RAM pool of the PS3, then fail to do the same for the split RAM pool of a PC.
Most modern midrange/high-end gaming PCs will have 2-4GB of VRAM (with twice the bandwidth of the RAM in the PS4, but shhhhhh, don't mention that) + 8GB of DDR3 RAM.
In the end you have way more bandwidth, as well as more memory to work with, on a PC, even with only 2GB of VRAM.
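To put rough numbers on that (a sketch with approximate 2014 spec-sheet figures; the GTX 770 is just one example card, and bandwidth varies with memory clocks):

```python
# Rough comparison of the two memory layouts being argued about.
# Bandwidths are approximate spec-sheet numbers: PS4 GDDR5 ~176 GB/s,
# GTX 770 GDDR5 ~224 GB/s, dual-channel DDR3-1600 ~25.6 GB/s.

systems = {
    "PS4 (one shared pool)": [("GDDR5, CPU+GPU shared", 8, 176)],
    "PC (split pools, e.g. GTX 770 2GB)": [
        ("GDDR5, GPU only", 2, 224),
        ("DDR3-1600, CPU only", 8, 25.6),
    ],
}

for name, pools in systems.items():
    total_gb = sum(gb for _, gb, _ in pools)
    detail = "; ".join(f"{gb} GB {kind} @ ~{bw:g} GB/s" for kind, gb, bw in pools)
    # Caveat: the pools aren't interchangeable -- DDR3 can't simply
    # stand in for VRAM, which is the counterargument made later on.
    print(f"{name}: {detail} (total {total_gb} GB)")
```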
It's entirely possible that a PS4 game could call for what would essentially require 5+ GB of VRAM in a PC environment, right?
I'm still back on a GTX 580. Might be time to finally upgrade next year.
Why do you assume that the 8800GT matches last-gen consoles? From my experience it's WAY superior to last-gen consoles. I played most multiplatform games on it at 60fps with 4x AA during that generation.
It would be a pretty terrible port if that were to happen.
IIRC, Titanfall pretty much dumps all the textures into video memory on the PC. No streaming. But it's less than 5GB.
I don't own that game; is there no option to change that? That kind of sucks, then. No wonder I kept reading about that game running worse than expected.
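Rough illustration of why "dump everything into VRAM" ports are heavier than streaming engines (every number here is hypothetical, just to show the trade-off):

```python
# Hypothetical: resident texture memory under two porting strategies.
# A streaming engine keeps only what's near the player in VRAM; a
# preload-everything port pins the whole level's textures at once.

level_textures_mb = 3000   # assumed total texture data for a level
visible_share     = 0.30   # assumed fraction needed at any one moment

preload_mb   = level_textures_mb                  # all of it, all the time
streaming_mb = level_textures_mb * visible_share  # only the working set

print(f"Preload-everything: {preload_mb} MB resident")
print(f"Streaming:          {streaming_mb:.0f} MB resident")
```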
I still use my 8800GT 512MB on my new PC, until I get a new card. I can play some games, like Grid Autosport, with everything on ultra (including textures) at around 40fps.
That's a 512MB card. I haven't even experienced a 2GB card yet. And some people say that 4GB isn't enough? Dem high standards!
I should clarify. When I say "require" I mean "to pull off at least 100% as well as on the PS4". Obviously a PC port would have options for lower-than-console fidelity in this case.
You're attempting to compare a card with VRAM matching the PS3's total system RAM to a card whose VRAM is half of a PS4's total system RAM. It's not the same situation.
We know that even launch titles were dedicating 3GB+ to graphics alone (Shadow Fall), and it's not unreasonable to expect that figure to rise as more memory is unlocked as the generation progresses.
DDR3 is only additive bandwidth and memory if developers code their games to work that way and spend time optimising around a split memory pool. We already have Watch Dogs and Titanfall as examples of games that don't do this effectively, and it's naive to think the situation is going to improve before it gets worse.
I've said all along that 4GB is the minimum that you want. I'd personally feel a lot more comfortable with 6-8GB, but that's not going to be cost-effective anytime soon.
Maybe. But today's top cards also have much more raw horsepower compared to the consoles, and overall memory bandwidth is quite high. There may be a few situations where you can't turn up the textures as high as you'd like, but I can't imagine the entire experience is going to be below console levels, especially in 2-3 years, given all the other performance advantages modern PCs will have.
Yeah... it might be best not to spoil developers with unlimited resources or something. It's like the more you give them to work with, the less optimization they do, resulting in standard-looking games that would normally run on systems with half the resources. And it's certainly not because they're pushing graphics in any way.