The point is your current GPU would still have value, so you can sell it to subsidize the cost of a new one. If you wait two gens, that strategy is much less beneficial. Even if waiting a gen yielded a similar cost analysis, you're still in a net-negative situation, because you're going without the newest GPU at each release, unlike the other strategy. My .02
I don't see the point of upgrading each gen, unless you're going from low end to high. Skipping a gen is usually the best in terms of cost and performance gains you'll actually notice.
Nvidia lowering prices. LOL
The day that happens, pigs will fly.
Still rocking two 3080s myself.
Can I get ~4090 perf in a card with the size and power draw of a 3080 FE at <$1,000? Hmmm; maybe it’s time…
5070TI will definitely have 16 GB, at least according to most leaks. Seems to be very similar to 5080.
Hoping for 16GB VRAM in a 5070ti.
NVIDIA GeForce RTX 5070 Ti Gets 16 GB GDDR7 Memory, GB203-300 GPU, 350W TBP
5090 - 33% more cores, 33% more power yet it will be 60-70% more performant?
He is smoking crack unless we have the biggest IPC increase in decades. Same is true about the 5080 being faster than the 4090; it doesn't make any sense.
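Quick sanity check on what that 60-70% figure would require; both core counts below are leaked/rumored numbers, not confirmed specs:

```python
# Back-of-the-envelope: what per-core uplift would a 60-70% gain require?
# Core counts here are leaked/rumored figures, not confirmed specs.
cores_4090 = 16384   # RTX 4090 CUDA cores
cores_5090 = 21760   # rumored RTX 5090 CUDA cores (~33% more)

core_scaling = cores_5090 / cores_4090   # ~1.33x from cores alone
for claimed_gain in (1.60, 1.70):
    per_core = claimed_gain / core_scaling
    print(f"+{claimed_gain - 1:.0%} overall needs roughly +{per_core - 1:.0%} "
          "per core (clocks, IPC, bandwidth)")
```

So the claim only works if each core also gets ~20-28% more throughput on top of the extra cores, which is why it reads as optimistic.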
It's likely 4080 vs 5080, a per-tier comparison like NV does: 4060 vs 5060, 4070 vs 5070, etc.
Same is true about 5080 being faster than 4090
Yeah, 1 TB/s vs 1.8 TB/s, plus clock gains from the GPU and memory.
The 4090 is extremely bandwidth starved
The 4090 is extremely bandwidth starved. Like seriously. You could get 30% faster without changing anything else if it had GDDR7.
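For what it's worth, here's where those numbers come from; the 28 Gbps GDDR7 speed and the 512-bit bus are assumptions pulled from the leaks, not confirmed:

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * data rate in Gbps
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

gddr6x_4090  = bandwidth_gbs(384, 21)   # shipping 4090: 1008 GB/s (~1 TB/s)
gddr7_384bit = bandwidth_gbs(384, 28)   # same card with assumed 28 Gbps GDDR7: 1344 GB/s
gddr7_512bit = bandwidth_gbs(512, 28)   # rumored 5090 config: 1792 GB/s (~1.8 TB/s)

print(f"GDDR7 on the same bus: +{gddr7_384bit / gddr6x_4090 - 1:.0%} bandwidth")
# -> +33%, roughly where the "30% faster" figure comes from,
#    assuming the workload is purely bandwidth-bound
```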
Depends on the game? In some games you will need more compute than memory bandwidth. Look at 4070 vs. 3080 - about the same compute power (4070 has fewer cores but a far higher clock), but the 3080 has ~50% more memory BW. Results? At 1080p and 1440p both cards are about the same; the 3080 starts to win at 4k, but it's far from a 50% difference.
This memory BW difference will make the most difference for people aiming to play at 8k.
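If anyone wants to check those 4070 vs 3080 numbers, a rough calc with the published specs (from memory, clocks rounded, so treat the outputs as approximate) lines up with the comparison above:

```python
# Rough FP32 throughput (TFLOPS) = cores * 2 ops/clock * boost clock (GHz) / 1000
def tflops(cores, boost_ghz):
    return cores * 2 * boost_ghz / 1000

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

# Published specs from memory; boost clocks rounded
rtx_4070 = (tflops(5888, 2.48), bandwidth_gbs(192, 21))   # ~29 TFLOPS, 504 GB/s
rtx_3080 = (tflops(8704, 1.71), bandwidth_gbs(320, 19))   # ~30 TFLOPS, 760 GB/s

print(f"compute:   3080 / 4070 = {rtx_3080[0] / rtx_4070[0]:.2f}x")   # ~1.02x, basically a wash
print(f"bandwidth: 3080 / 4070 = {rtx_3080[1] / rtx_4070[1]:.2f}x")   # ~1.51x, ~50% more on the 3080
```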
Just get a new CC to pay off the other.
I spent some of my RTX 5090 savings fund on Christmas. I don't want to have to use a credit card to buy a graphics card. Come on Nvidia surprise us with some nice low prices this round.
60-70% more performant is bollocks. It's going to be more like 30%.
NVIDIA GeForce RTX 5070 Ti Gets 16 GB GDDR7 Memory, GB203-300 GPU, 350W TBP
Wait until we see nearly all the performance gains being attributed to the higher power budget. I remember when new generations of hardware were magnitudes more powerful and more efficient too.
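That concern is easy to put numbers on: perf/W only improves if performance outgrows the power budget. The ~33% TBP bump below is from the rumors, not a confirmed figure:

```python
# Efficiency check: perf/W only improves if performance grows faster than power
power_increase = 1.33   # rumored ~33% higher TBP -- an assumption, not a confirmed spec
for perf_gain in (1.33, 1.50, 1.70):
    perf_per_watt = perf_gain / power_increase
    print(f"+{perf_gain - 1:.0%} perf at +33% power -> {perf_per_watt - 1:+.0%} perf/W")
```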
3080 was magnitudes faster than 2080 (40% at 4k)
Excited to see these cards.
But def skipping the generation.
RTX60s should bring real advancement for xx70 class cards.
I'm too poor to afford an xx90.
Magnitudes?
When was that?
Also consider 3090 vs 4090 is one of the biggest upgrades we've seen in, like, what, 20 years?
Man you are hugely downplaying the performance differences here
3080 was magnitudes faster than 2080 (40% at 4k)
1080 Ti was magnitudes faster than 980 Ti (46% at 4k)
4090 is 40% faster than 3090 in 4k
It's always the case, except the 2080 Ti.
Hoping for 16GB VRAM in a 5070ti.
Off-topic, but them Morgott lines hit strong lol.
put these foolish ambitions to rest. thou'rt but a fool!
it's a CGI trailer. even when we get gameplay i still won't expect the game to look like that. remember Witcher 3 went through a "downgrade" phase.
but then Cyberpunk actually ended up looking a lot better than it did in the gameplay trailers.
i know but i don't have any brains or sense.
i only play at 1440p but i did upgrade to a 360hz monitor. i only run it at 240 but still i wish i had a bit more power.
i had a 1070 then got a 2080 and now a 4080. so i guess i'll wait for a 6080 or something. the xx90 cards are just way too expensive
Not real time. Already confirmed by cdpr. They used ue5 as a render engine and let the frames take as long as they needed. A 4090 would have produced identical image quality, just taken longer.
It was still made on the 5 series.
I can't remember if the 3000 series had any features? 2000 introduced DLSS and RT. 4000 brought framegen. 3000...nothing? So it's not guaranteed we'll get a new feature.
If the next DLSS is exclusive to the 50 series I'm never buying Nvidia again
lol cope
We got the reveal last night though.
Not to be dramatic or confrontational, but why won’t you buy Nvidia cards if that’s the case? You’re not going to see these newer next-gen features in any other cards unless you’re fine with waiting 3-4 years for AMD & Intel to catch up, by which point Nvidia would’ve already done more DLSS stuff.
If the next DLSS is exclusive to the 50 series I'm never buying Nvidia again
Then you're probably never buying Nvidia again is my guess.
If the next DLSS is exclusive to the 50 series I'm never buying Nvidia again
If the next DLSS is exclusive to the 50 series I'm never buying Nvidia again
Don't cheap out on the motherboard. Get one with good VRMs, and you'll be reading about other people having stability issues while you think, "huh? runs fine in my setup."
Do you guys think I should get a PCIe 5.0 mobo for the 5090, or save some bucks and get a cheaper mobo if the GPU would perform the same on both?
It's not FOMO, it's real. Neither AMD nor Intel is going to come up with technology that rivals DLSS.
Lol.
Jensen knows the almighty, all-consuming power of FOMO amongst PC gamers is enough to overcome such moral indignation.
600 watts is truly crazy. I cannot imagine how anyone can justify 600 watts for the GPU just to play games. If you have a 14900K plus an OLED monitor, you could be at almost 1 kW to play games. That’s just wild, man.
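Rough math on how that adds up at the wall; every number below is a ballpark guess, not a measurement:

```python
# Rough wall-power tally; every figure below is a ballpark estimate, not a measurement
gpu_w     = 600    # rumored next-gen flagship TBP
cpu_w     = 150    # 14900K under a typical gaming load (can spike much higher)
rest_w    = 75     # motherboard, RAM, SSDs, fans
psu_eff   = 0.90   # assumed PSU efficiency at this load
monitor_w = 100    # large OLED, plugged in separately from the PSU

system_wall = (gpu_w + cpu_w + rest_w) / psu_eff
print(f"system ~{system_wall:.0f} W at the wall + ~{monitor_w} W monitor "
      f"= ~{system_wall + monitor_w:.0f} W total")
```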