This is all a marketing tactic from NVIDIA to sell more cards down the line; they know that their current GPU offerings are much better than the consoles, and if they released cards with 8GB VRAM they would most likely last the entire generation.
So they continue to release cards with 3, 4, and 6GB VRAM so you're forced to upgrade as the generation continues. Big examples supporting this are games like The Evil Within and Shadow of Mordor, with requirements exceeding 4GB VRAM.
I was almost ready to pull the trigger on the 970/980, but no thanks NVIDIA, gonna wait for the 8GB cards.
I'm not at all convinced The Evil Within will actually need 4GB to match consoles.

It makes sense to me in a way: PC users expect double the framerate of consoles from an id Tech 5 game with extra hacked-in dynamic lighting by Tango.

This has all the signs of a mess of a PC release. A true horror, conveniently.
Will my P6X58D be able to handle the 970?
What are some examples?
Get your ROG Swift yet?
Looking at these 2 models. Can someone advise which one? No overclocking will be done by me:
ASUS STRIX GTX 970
Or the MSI GTX 970 Twin Frozr
The MSI is $30 more here in Canada than the Asus. I'm leaning towards the ASUS at this point.
I'm all for shitting on Nvidia's VRAM choices in past generations, but if you don't plan to use SLI and plan to use 1080p as your resolution of choice, then I don't think you should have any reservations about buying a 4GB 970.
2GB cards were always a DOA product but 4GB seems to be a decent compromise for now.
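For a rough sense of scale, here's a back-of-envelope sketch of why 1080p isn't the thing that fills 4GB; the buffer count and texture budget below are pure assumptions for illustration, not numbers from any real game:

    # Back-of-envelope VRAM estimate at 1080p (illustrative assumptions only).
    width, height = 1920, 1080
    bytes_per_pixel = 4                      # 32-bit colour
    num_targets = 6                          # assumed G-buffer + back buffers

    render_targets_mb = width * height * bytes_per_pixel * num_targets / 1024**2
    assumed_texture_budget_mb = 2500         # hypothetical streaming/texture pool
    total_mb = render_targets_mb + assumed_texture_budget_mb

    print(f"render targets: ~{render_targets_mb:.0f} MB")
    print(f"rough total:    ~{total_mb:.0f} MB of 4096 MB")

Even with a generous texture pool, the 1080p render targets themselves are only a few tens of MB, which is why the "4GB is fine at 1080p, tighter for SLI/4K" advice keeps coming up.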
I'm almost 100% sure that The Evil Within (and other games) won't need more than 2GB to MATCH PS4/XBone graphics.

Don't forget that PS4 runs at 30fps, and previews showed lots of framerate dips on that console. Brace yourself.
It's nice when you're into your career and have money to play with.
Did that for the first set of years. Now the more money I gain, the less I spend on things like expensive PC components. I was finally convinced to buy a 970 at its price/performance point.
How many times have I seen product launches of cards with high VRAM only for the benchmarks to show no improvement? Often. Then you have the concern of bus speed (rough bandwidth numbers below), any other architectural bottlenecks, and of course diminishing returns in very few games. Oh, and most important, price. If I had seen 8GB versions close to $450, I would NOT have gotten a 970 and would have waited. Seems like a different business tactic you aren't seeing.
The reason I don't comment with too much gusto on this topic is because if I had tried to convince myself years ago to just "chill out on it," I would not have listened. I would have lit my torch. Now, I have to say I'm wiser. An extra ~$100 right now for 8GB probably won't see dividends for quite some time. If it does, with the amount of post-processing options available in PC gaming, you are looking at two things: a) placebo effect, b) bragging rights.
The Witcher 2 destroys what I have seen so far from current gen. However, with my current settings I notice shimmering textures, slight aliasing, questionable shadows, etc. What I'm not complaining about is the texture quality in what is arguably one of the best-looking games out. I may put some extra work into correcting some of these issues where possible, but I'm waiting more on developers to take advantage of what they have "now," not a new VRAM ceiling. And if devs continue to evolve in their methods, I'll pick up an 8GB card eventually; step-up is a maybe, but otherwise in a few years.
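To put the bus-speed point in numbers: peak memory bandwidth comes from bus width times effective memory clock, and more VRAM on the same bus doesn't change it. The figures below are the 970's advertised specs (256-bit, 7 Gbps GDDR5); an 8GB variant on the same bus would hit the same ceiling:

    # Peak memory bandwidth = bus width (in bytes) * effective memory clock.
    bus_width_bits = 256          # GTX 970 memory bus
    effective_clock_gbps = 7.0    # 7 Gbps effective GDDR5

    bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_gbps
    print(f"peak bandwidth: {bandwidth_gb_s:.0f} GB/s")   # ~224 GB/s

That's the core of the "capacity alone won't show up in benchmarks" argument.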
It does that when it's idle.
Seems a little low seeing as I can hit higher with my 780. It's likely the 920 that's making you score lower since 3dMark really likes to use the CPU in the test.
From what I've gathered in this thread, a GTX 970 and an i7 2600K or i5 2500K should be OC'd to 4.5GHz or thereabouts for there to be no bottleneck? Does that mean a game like Far Cry 4 will run well? I was planning on getting a Devil's Canyon upgrade (mobo, RAM, etc.) but will it make any difference? I game at 1080p.
For FC4 we won't know until it releases.
Yes though any 2500K or 2600K should be OC'd. That's the whole point of getting a K series CPU, to OC it.
Devil's Canyon would be a pointless upgrade if you have one of those.
Edit: wait, there are 8GB versions coming out soon, or am I going crazy?
8GB flavours are currently rumoured for a November release. I hope they land earlier in the month (and at least one 8GB 970 can be had for no more than USD$399) as I'd like to upgrade in time for AssCreed Unity. If I have to settle for 4GB, I will, but I wouldn't mind paying a bit of a premium for peace of mind.
I have an i7 2600K @ 4.2GHz so will stick with it for now, cheers.
Try and knock it on a bit more, most can get to 4.5 easily enough.

Yeah, I'm on air cooling but I think it will push to 4.5 through the CPU multiplier in my BIOS. Time to tweak!
From what I've gathered in this thread, a GTX 970 and an i7 2600K or i5 2500K should be OC'd to 4.5GHz or thereabouts for there to be no bottleneck?

The 2500K/2600K should be overclocked to 4.0GHz+ because you *can*, not because it's the only way of avoiding a bottleneck. It gives you some extra headroom, so in CPU-limited situations your ceiling before hitting any bottleneck is higher.
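The multiplier tweak in the BIOS is just base clock times multiplier; Sandy Bridge K-series chips sit at a fixed 100MHz base clock, so a multiplier of 45 gets you 4.5GHz:

    # Sandy Bridge core clock = BCLK * multiplier.
    bclk_mhz = 100        # base clock, essentially fixed on a 2500K/2600K
    multiplier = 45       # set in the BIOS on a K-series chip

    core_clock_ghz = bclk_mhz * multiplier / 1000
    print(f"{core_clock_ghz:.1f} GHz")   # 4.5 GHz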
My ROG Swift keeps going to 60Hz after each scene in 3DMark, capping my frames at 60. Any idea how to stop that? I have to manually switch it back to 144Hz during each load screen.
Try setting 'Preferred Refresh Rate' in the Nvidia Control Panel to 'Highest Available' if it is not already.
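If you want to confirm whether the panel is actually dropping to 60Hz between scenes, here's a rough way to watch the current mode from another window; this is just a sketch assuming Windows with the pywin32 package installed, not anything official:

    # Poll the primary display's current refresh rate (requires pywin32).
    import time
    import win32api
    import win32con

    last = None
    while True:
        mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
        if mode.DisplayFrequency != last:
            print(f"refresh rate is now {mode.DisplayFrequency} Hz")
            last = mode.DisplayFrequency
        time.sleep(1)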
thanks will try.
Edit: it worked!
Yay!
Now a more serious question: I was getting a little over 15k before I updated my drivers, now just over 13k. What's up with that?
It could be the driver. You could always roll back to the old driver and test again. I think there was a beta release right after the official 970/980 drivers.
I personally don't care about those benchmarks as long as my games run great.
I'm a stats kinda guy, I love benchmarking. But imma roll back to the previous driver and I'll let ya know what happens.
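For the roll-back test, it's easy to put a number on the drop and see whether it's outside normal run-to-run noise. The scores below are made up just to show the calculation; swap in your own runs:

    # Compare average 3DMark scores before/after a driver change (hypothetical numbers).
    old_driver_runs = [15100, 15050, 15200]   # scores on the previous driver
    new_driver_runs = [13100, 13300, 13050]   # scores on the new driver

    old_avg = sum(old_driver_runs) / len(old_driver_runs)
    new_avg = sum(new_driver_runs) / len(new_driver_runs)
    change_pct = (new_avg - old_avg) / old_avg * 100

    print(f"old: {old_avg:.0f}  new: {new_avg:.0f}  change: {change_pct:+.1f}%")

A swing of well over a couple of percent across several runs is probably the driver (or a G-Sync/refresh quirk), not noise.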
So did anyone here upgrade from a 680 to either the 970 or 980 and feel it was worth it? I keep waffling on whether or not I want to upgrade.

You could sell your 680 and almost cover half the price. I'd go for it.
I just like to do a bench when I get a new card to make sure it's running properly, then it's time to game.

It's all good. Do what you enjoy.
Let me know what you come up with.
Make sure you disable G-Sync when you are benching. I think that will cause a low score. Plus it really doesn't do anything past 144fps.
If you're still on the hunt for one, I recommend watching these links:
http://www.nowinstock.net/computers/videocards/nvidia/gtx980/
http://www.nowinstock.net/computers/videocards/nvidia/gtx970/
Is this just an unlucky guy or is there basis to this?
https://forums.geforce.com/default/...y-lower-voltage-than-the-other-driver-bug-/1/
Hopefully it is a driver bug, as that would be horrible to limit voltage while in SLI.

It could be an MSI Afterburner issue that requires patching, or maybe a new BIOS is incoming.
So by this logic you are still playing games on your 8800GT?
I cannot wait until the 8GB 970s are released and we get to see benchmarks side by side. I would be utterly shocked if it's anything more than a subtle performance improvement in the majority of games, especially at 1080p.

There will be zero performance increase unless they clock them higher (which they probably will for most) or in VRAM-limited situations.