Nvidia RTX 6000 Series Expected to Launch in Early 2027

8GB VRAM RTX 6060 for $999 and you will buy it.

Expecting +10% performance for +33% power usage, but hey, a small 1.2x boost across all ray tracing features and 16x the AI cores compared to the 5000 series.
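If those rumored numbers hold, perf-per-watt actually goes backwards. Quick back-of-envelope math (leak figures, not confirmed specs):

```python
# Perf-per-watt from the rumored figures: +10% performance for +33% power.
# These are leak numbers, not confirmed specs.
perf = 1.10    # relative performance vs. the 5000 series
power = 1.33   # relative power draw vs. the 5000 series

perf_per_watt = perf / power
print(f"perf/W vs last gen: {perf_per_watt:.3f}")  # ~0.827, i.e. roughly 17% WORSE efficiency
```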

You can't hate AI enough:

[image: aiCQflg.png]
 
Only positive of this series is that Nvidia saw 8gb cards sell like complete garbage in the last year, so at the very least mid-tier is going to be 12gb+.

Still doubt they'll offer better value than AMD in the mid-tier though, especially as AMD closes the RT + AI gap.
 
Their mid tier products are 12 GB right now.

I think the 6060 will be 9.
Not all of them.

The RTX 5060ti has 8gb models, and the laptop 5070 still has 8gb.

I just think it's weird they exist this way when my gtx 1070 back in the day had 8gb of vram, and the Intel B580 has 12gb of vram at a $250 price point.
 
Only positive of this series is that Nvidia saw 8gb cards sell like complete garbage in the last year, so at the very least mid-tier is going to be 12gb+.

Still doubt they'll offer better value than AMD in the mid-tier though, especially as AMD closes the RT + AI gap.
Yeah, it's completely ridiculous on Nvidia's part to still use 8GB even for the low end.

At minimum it should be 12GB. The default 3060 being 12GB was fair, and a default 5060 should have been 16GB.

I do think the 6000 series will be a decent jump, since it'll be coming around the next-gen consoles in 2027, which is usually when the jump gets somewhat decent.

The absolute lowest, like a 6050 Ti, should be 12GB, and even the normal 6060 should be 16GB at that point.
 
That's part of the shrinkflation. IO doesn't scale much, if at all, now, plus fewer chips. That makes it that much more expensive. Memory chips were 1 GB back then; they're only 2 GB now.

It doesn't seem like the market cares though, because the variants of those same cards that offer 16gb dramatically outsell them, and the price gap between them now is so marginal that the 8gb offerings are worthless imo. Like on Newegg there are 16gb 5060 Ti models that are only $20-30 more expensive than the 8gb variants.
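For anyone wondering why the chip count matters: each GDDR chip hangs off a 32-bit slice of the memory bus, so capacity basically falls out of bus width and chip density. A rough sketch (illustrative numbers, not official board specs):

```python
# Rough GDDR capacity math: one memory chip per 32-bit channel of the bus.
# Bus widths and densities below are illustrative, not official specs.
def vram_config(bus_width_bits: int, chip_density_gb: int) -> tuple[int, int]:
    chips = bus_width_bits // 32               # chips needed to populate the bus
    return chips, chips * chip_density_gb      # (chip count, total VRAM in GB)

print(vram_config(256, 1))  # GTX 1070 era: 256-bit bus, 1 GB chips -> (8, 8)
print(vram_config(128, 2))  # 5060 Ti class: 128-bit bus, 2 GB chips -> (4, 8)

# Clamshell mode: two chips share each channel, doubling capacity on the same bus.
chips, gb = vram_config(128, 2)
print(chips * 2, "chips,", gb * 2, "GB")  # how the 16 GB variant gets there
```

So the 1070's 8GB took eight chips on a wide bus, while today's 8GB cards use half the chips on half the bus, which is exactly the shrinkflation being described.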
 
You're so rich, let me suck you dry :messenger_savoring:


The 5090 needs DLSS to reach 60+ FPS when path tracing is at max settings (Alan Wake 2).
But is that reason enough for a new generation?

I know they won't fucking do it, but what I really want is a card that draws less energy, so we're back to manageable power draw. That was one of the main reasons I went for a 5080. If the 6090 draws the same or more than the 5090, then I'm looking at the lower cards again. My room gets enough heat as it is.
 
8GB VRAM RTX 6060 for $999 and you will buy it.



You can't hate AI enough:

[image: aiCQflg.png]
That seems egregious, but this is the way forward for Nvidia: using AI to generate far better real-time graphics than what we have now, presumably. Neural Rendering will be a BIG feature for the 60 series for sure.
 
Another Nvidia card that only exists on gaming forums and not in stores.
Oh look, another 5070 sitting on the shelf that no one wants. Soon it will be part of some Sora AI Disney short on YouTube.
YouTube keeps putting dozens of make-fun-of-Disney videos in my feed.
There has been no better use of AI yet.
 
When RTX 3000 came out I stocked up on RTX 2000 cards
When RTX 6000 releases I'm buying a few RTX 4000 cards

Not playing Nvidia's games; I create my own generations.
 
I'm not buying a 16GB VRAM GPU... I'm also not paying a bazillion dollars for more.
XX80 better start at 24GB.
 
Is using DLSS at 4k with a 5090 a war crime?

To run a few titles (e.g. Alan Wake 2 or Cyberpunk) with full RT at good framerates, it is actually necessary.

Also, since the new transformer model is out, it is very hard for me to see a difference between 4K/DLAA and 4K/DLSS-Q. So I use DLSS to save 200-250W of power.
 
Idk. I feel like the gains for gen on gen for these cards are ever diminishing. I'm still cool with my 4090 and even my old 2080 Super that I put in my kid's PC still runs most things fine at 1080 for her.
 
To run a few titles (e.g. Alan Wake 2 or Cyberpunk) with full RT at good framerates, it is actually necessary.

Also, since the new transformer model is out, it is very hard for me to see a difference between 4K/DLAA and 4K/DLSS-Q. So I use DLSS to save 200-250W of power.
Also Indiana Jones. That game makes me wonder if we'll need 24GB of VRAM next gen.
 
I'm going to just get a 5060 Ti 16GB whilst it's £340. That card will do me well, as has my current 12GB 3060, which I can get £150+ for.
 