Ugh!
That is lame. Why even bother releasing a new line if it isn't a real upgrade?
Will this new line of cards at least be released at a cheaper price point to make up for the negligible upgrade?
Probably not.
That's likely why the game is a crashfest.
Yeah, Nvidia users have found that rolling back to 340.43 fixes the frequent crashing (not sure about AMD GPUs since I have two 670s myself).
So how much are we thinking this 970 will be? I've seen most sites saying $399. I was looking at the 770 for the new PC I just built; if this is $399, is it worth the extra $120 over what I can get a 770 for?
I'll just stick with my 760 till I see performance gains I like.
The problem with frame pacing is that, from what I've seen, it works almost perfectly for those games which are generally used to test frame pacing, moderately well for most other mainstream games, and totally breaks apart on niche titles.

The exaggerations of microstutter in this thread are waaaaaay overblown.
Take a look at SLI reviews from PCPer and TechReport. They use frame time data, and the variance between single cards and dual cards is extremely marginal. These are the two websites that are responsible for bringing the hard data on uneven frame pacing (microstutter) to light.
As I said, frame pacing has been a huge focus of both AMD and NVIDIA over the last few years. Drivers have matured very nicely on this aspect.
This is, of course, assuming there are SLI/Crossfire profiles for the game in question.
For the folks saying they see it all the time, they are most likely confusing it with something else (tearing, perhaps) or have hardware that predates Kepler and GCN.
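For anyone wondering what that "hard data" looks like: it's just per-frame timestamps boiled down to a few statistics. Here's a minimal sketch in Python, with made-up frame times that mimic the classic alternating short/long multi-GPU pattern (the review sites capture real timestamps with tools like FRAPS):

```python
# Minimal microstutter check: reduce per-frame timestamps (ms) to stats.
def frame_time_stats(timestamps_ms):
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    # Microstutter = big swings between consecutive frames, even when the
    # average frame rate looks fine.
    swings = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    jitter = sum(swings) / len(swings)
    mean = sum(frame_times) / len(frame_times)
    p99 = sorted(frame_times)[int(0.99 * (len(frame_times) - 1))]
    return mean, p99, jitter

# Fabricated capture: frames alternate 8 ms / 25.4 ms, i.e. ~60 FPS average.
ts, t = [], 0.0
for i in range(240):
    t += 8.0 if i % 2 == 0 else 25.4
    ts.append(t)
mean, p99, jitter = frame_time_stats(ts)
print(f"mean {mean:.1f} ms | 99th pct {p99:.1f} ms | jitter {jitter:.1f} ms")
# A well-paced 60 FPS capture would show jitter near 0; this one shows ~17 ms.
```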
I find this is a nice way of keeping up with the high end without breaking the bank. Also, capitalizing on people who do this . . .

I wonder if I should sell my 780 and just get a 970, since it would literally cost me $100 after you factor in the money I paid originally and the used price I can get. Might be worth it for the lower power draw and heat.
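For the curious, that math looks something like this. A sketch with an assumed resale value and the rumored $399 price, plus the power-draw angle using the official 250 W / 145 W TDPs; every other figure is a placeholder:

```python
# Back-of-the-envelope 780 -> 970 step-over cost. Prices are assumptions.
new_970 = 399.0           # rumored launch price
used_780_resale = 300.0   # assumed used-market value of a 780
net_cost = new_970 - used_780_resale

tdp_780, tdp_970 = 250, 145       # W, official TDPs
hours_per_year = 20 * 52          # assumed 20 h/week under load
kwh_saved = (tdp_780 - tdp_970) * hours_per_year / 1000
dollars_saved = kwh_saved * 0.12  # assumed $0.12/kWh
print(f"net cost ~${net_cost:.0f}, saving ~{kwh_saved:.0f} kWh/yr (~${dollars_saved:.0f})")
```

So the cash cost is right around $100, while the electricity savings are real but modest.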
Have there been talks about improving the fan noise?

If you're not vendor-loyal, the R9 290 is still the best bang for high-end buck.
I'm using a GTX 260, so...

Nvidia needs to look to the owners of 670/680s or 760/770s for upgrades, and it's doubtful those people will be willing to pay more than $500.
I'm still using 4:3 and, last year, discovered how good CRT actually is (I always had suspicions about LCD, but looking at the data confirmed them).

In an ideal situation I would wait for the Rift CV1 to come out before buying a new GPU, but my lovely OC'd 1GB 560 Ti is starting to show its age even when only gaming at 1080p. I don't need 60 FPS in all the games I play, but once newer games require you to tweak graphical settings to get a stable 30, an upgrade becomes inevitable.
If the price of a 970 gets to around 300 euro at release or soon after, I'll probably bite. Near-Titan performance is more than enough for someone who doesn't give a crap about extreme resolutions (yet), so it should last me about two years at the very least.
That's the thing...PS4 or gaming card. If I don't get a new card before Witcher 3, then it'll have to be for Killing Floor 2.

http://www.techpowerup.com/gpudb/2621/geforce-gtx-980.html
http://www.techpowerup.com/gpudb/2620/geforce-gtx-970.html

If these numbers are remotely true, the 970 should be a decent upgrade over my 670. But I may wait until Witcher 3 hits.
WHAT!? GOD DAMN IT, WHY DIDN'T I KEEP PAYING ATTENTION AFTER THE 290's RELEASE!?

You missed out. When everyone was dumping 290s due to Litecoin mining inefficiency, you could nab barely used ones for like $170. It was insane. You can still find them on sale (new) for pretty cheap, as low as ~$300.
For what it's worth, the RELATIVE performance gain might be pretty low coming from, let's say, a 760 or 280X. However, what you get is a card that can run 98% of games at 1080p/120Hz. You'll probably want to be on the lookout for AMD dropping prices, or perhaps an eventual 960 Ti.
If Alien: Isolation is good...sure. For those only interested in Total War...wait and see if they won't make another Broken Piece of Shit.

I think AC Unity and GTA5 are going to push things. And if not, Witcher 3 will scream. Also, the announcement of the next Total War title should flex the muscles of the upgraders.
So far, the only PC game that is making that resolution a big deal is The Evil Within. Still, that seems like the first monitor I've been truly interested in and actually excited about...just probably not at that price (and with the desk I'm using).

That is why I want this: https://www.youtube.com/watch?v=KnrxNfxRK_4
You get two 27'' screens in one with no bezel and only a 20% performance penalty over 2560x1440. I like the cost of this compared to going dual-screen 4K 27/30''. And the bezel... grrr...
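The pixel math roughly checks out, assuming the panel in that video is a 3440x1440 ultrawide (the 20% figure is the poster's; the GPU workload scales with pixel count):

```python
# Pixel-count comparison: 21:9 ultrawide vs 16:9 1440p vs 4K (assumed panels).
ultrawide = 3440 * 1440
qhd = 2560 * 1440
uhd = 3840 * 2160
print(f"3440x1440 is {ultrawide / qhd - 1:.0%} more pixels than 2560x1440")
print(f"3840x2160 is {uhd / qhd - 1:.0%} more pixels than 2560x1440")
# ~34% more pixels, vs +125% for 4K, so a penalty in the 20-30% range is plausible.
```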
I don't think it will be graphene, and not for another 15-20 years. We will get to 7nm, fighting and screaming, but we will get there. Whether it's worth the cost is the thing foundries will need to figure out.
TSMC claims it will start volume production of 16nm (20nm w/FinFET) in Q1 2015, which means late next year or early 2016 we may finally get a video card upgrade. That depends on whether NVIDIA is ready to implement stacked DRAM, and what AMD's plans are.
Yeah, hence my profile caveat. TechReport does a decent job of throwing in a random game or two to keep them on their toes.
Most niche titles shouldn't have an issue on a single high-end card with SLI/Crossfire disabled. This goes back to the standard GAF advice on the subject, which is that it should always be avoided unless you are trying to achieve performance that would otherwise be impossible on a single GPU.
So. . .
Looking to sell your 780, eh?
This is a real upgrade for the people who didn't get a GK110 chip (780 and up). This is the upgrade for people with a 680/770 and below, since those were the mid-range chips of the Kepler line, just like these are the mid-range chips for Maxwell.
For the real upgrade to the 780 you'll have to wait for the GTX 1080 (or whatever they will call it), which will most likely be made on 16nm instead of 28nm.
And even if the performance gains of the 980 over the 780 Ti won't be huge (roughly 10%, in benchmarks at least), it's still pretty impressive that they can achieve that performance at a much lower TDP. This will benefit laptops more than desktop PCs, though.
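To put that in numbers, here's a quick perf-per-watt estimate using the official TDPs (250 W for the 780 Ti, 165 W for the 980) and taking the ~10% figure above at face value:

```python
# Rough perf/W gain: Maxwell GTX 980 vs Kepler GTX 780 Ti.
perf_ratio = 1.10              # ~10% faster in benchmarks (claim above)
tdp_780ti, tdp_980 = 250, 165  # W, official board TDPs
perf_per_watt_gain = perf_ratio * tdp_780ti / tdp_980
print(f"~{perf_per_watt_gain:.2f}x perf per watt")  # ~1.67x
```

A ~1.7x efficiency jump on the same 28nm process is exactly the sort of thing that matters far more in a thermally constrained laptop than in a desktop.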
What's better for someone who doesn't have a GPU yet: invest in the 970/980, or buy a cheaper card that will do the job for now and then upgrade when a real leap occurs?

Wait for the 970/980 to come out and see how their prices stack up against older cards that might get price cuts.
I'm assuming my 500W power supply would be fine with a 980 if it's alright with a factory-OC'd 670?
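Almost certainly: the 980's official TDP (165 W) is actually a hair below the 670's (170 W). A rough budget sketch, with ballpark figures assumed for the rest of the system:

```python
# Ballpark PSU headroom check on a 500 W unit. CPU/rest are assumptions.
tdp = {"GTX 670": 170, "GTX 980": 165}  # W, official TDPs
cpu, rest = 100, 75  # W, rough allowances for CPU and board/drives/fans
for gpu, watts in tdp.items():
    total = watts + cpu + rest
    print(f"{gpu}: ~{total} W peak, {total / 500:.0%} of a 500 W budget")
```

A factory OC adds a bit on top of the official TDP, but either way you're sitting around 70% of the budget with headroom to spare.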
The problem with frame pacing is that, from what I've seen, it works almost perfectly for those games which are generally used to test frame pacing, moderately well for most other mainstream games, and totally breaks apart on niche titles.
Like multi-GPU rendering of applications which aren't multi-GPU aware themselves, really. It's a hack.
Well, they don't make a 6GB 780 Ti, so you won't be able to game at a higher res than 1080p reliably.
For 1080p, a 780 Ti will last you a while.
Do you want to play now, or wait another year for the "real leap"? I think that even the 28nm first-gen Maxwells will handle games for a couple of years without much problem. Why wait?
Waiting with a cheaper GPU is what I asked about, though. I'd imagine a cheaper card like a 670 or 770 would handle games for the next couple of years without much problem. Am I wrong?
I also thought about getting a 970/980 now and then adding a second for SLI when they really start to show their age in 3-4 years. Is that a good strategy? Will it keep a PC going for the entire generation? And by going I mean running everything on high at 1080p/60.
I'm still running current-gen games on some ancient 560 Tis just fine.
It all depends on what you are willing to sacrifice. I always run at 1080p/60 at a minimum, and generally have to tone down some of the spare effects. You should be able to coast through the entire generation on a 770.
If you are planning on going high-end, wait until the next generation of cards. That should be the big jump.
So far, the only PC game that is making that resolution a big deal is The Evil Within. Still, that seems like the first monitor I've been truly interested in and actually excited about...just probably not at that price (and with the desk I'm using).
Also, isn't AMD making super-synced monitors as well? I mean, I lost interest in G-Sync since I was looking super hard into the recent R series of cards, and it would defeat the purpose of getting one. Combined with my willingness to keep using this 17-year-old monitor... man, I've been thinking for 3 years, and yet it really does seem like next year will finally be the time for me to implement the last upgrade that I should have decided on in 2012.
wait... you are still using a monitor from 1997?
:O
As someone who may end up trucking 2-3 weeks at a time starting next year or late this year, the idea of a beefy laptop has become more appealing to me.
I'll be waiting for the 970M, at least, before making a purchase. These laptop GPU performance jumps are impressive. I don't need Ultra everything, so if I can keep temps down while maintaining a mix of Medium-High settings at 60+ FPS, I'll be content.
I need to be able to play, record and render without heat or other issues, and I've been extremely hesitant in the past to go hard on a laptop purchase. These 900-series mobile GPUs and a change in lifestyle are changing my tune, though.
Good CRT is better than any LCD.
In certain ways, yes. But in this day and age the negatives of a huge-ass CRT outweigh the positives.
What negatives are you talking about? That it's big? Why? Do you have a slave holding up the monitor for you while you play, and you want to spare his back? Or do you have a desk or table like a normal person?
There is nothing an LCD monitor does better than a good CRT, and even a bad CRT does almost everything better than a "good" LCD.
I'd have to be a fucking idiot to do photo editing on a CRT. Too unwieldy for dual monitor use too.
On TechPowerUp the 780 Ti is listed as having 5 TFLOPS of compute while the 980 has only 4, yet the benchmark scores are really similar. Any reason for this, or is it due to errors in estimation on TechPowerUp's website?
The 2 cards:
980: http://www.techpowerup.com/gpudb/2621/geforce-gtx-980.html
780 Ti: http://www.techpowerup.com/gpudb/2512/geforce-gtx-780-ti.html
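Those database numbers are just theoretical peaks: 2 FLOPs per shader per clock (one fused multiply-add). Recomputing from the listed shader counts and base clocks:

```python
# Theoretical single-precision peak: 2 FLOPs (one FMA) per shader per clock.
def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(f"780 Ti: {tflops(2880, 876):.2f} TFLOPS")   # 2880 shaders @ 876 MHz base
print(f"980:    {tflops(2048, 1126):.2f} TFLOPS")  # 2048 shaders @ 1126 MHz base
# Maxwell gets more game performance out of each theoretical FLOP than Kepler,
# which is why the benchmark scores end up so close despite the gap.
```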
The 980 should be at least on par with the 780 Ti, so that's not right.

What negatives are you talking about? That it's big? Why? Do you have a slave holding up the monitor for you while you play, and you want to spare his back? Or do you have a desk or table like a normal person?
There is nothing an LCD monitor does better than a good CRT, and even a bad CRT does almost everything better than a "good" LCD.
There is one reason, and that is awesome scaling.

-Massive size, weight and desk space
-Higher power draw
-Audible whine (though very small)
-No multi monitor
-Lower resolutions
-4:3
-CRT Drift over time
-Analog signal (VGA)
While CRTs have superior color reproduction and lower input lag, there is no reason to own one in today's day and age. It's outdated technology.
And this is coming from someone who has owned nearly every monitor in existence and used the holy grail of CRTs, the Sony FW900. I would take my Dell 32'' 4K or Asus ROG over the Sony FW900 any day.