What an interesting looking case.
Thermals must be nice.
But looking at the latest rumors I don't think I'll need to worry about the Evolv X.
I ain't getting no 4090.
Except for the giant piece of glass in the front blocking those fans from working.
DRAM Cartel said:
I'm still wondering why we don't have video RAM in the 64GB range or even higher.
I think that alone solves most people's problems for certain things.
That would make the cards cost much more money, and it would make them more expensive to produce.
The A6000 is 48GB and costs about 5000 dollars.
I paid $2k for my rtx 3090, and that's with only 24gb vram. Would you pay $4000 msrp?
Plus, nvidia wants to make as much profit as they can, so they want to give you as little as they can to get the job done.
Like the 24GB in the 3090, it would largely just be a waste.
Quadros have Quadro Tax applied; I'd think a 48GB RTX 40x0 should cost less than 3000 dollars (realistically it should be 2000 dollars).
Considering people were/are willing to spend 2000 dollars on a GPU that's marginally faster than the RTX 3080, I don't see why they wouldn't for a 48+GB VRAM card.
If the RTX 4080 ends up being as powerful as two 3080s, is it still a good idea to go for a 1440p monitor?
I mean 4K is nice, but I think 4K 60-100 fps with the RTX 4080 will not be possible (for new games). Maybe in 2 years with the RTX 5080.
Yeah, all the rumors point towards the xx80 and xx70 getting VRAM bumps but with a reduced memory interface. The new caches are massive though, so they should make up for the dropped memory bandwidth in gaming scenarios, which hopefully also means the cards aren't that good for mining considering the bumped-up TDPs... so we are hopefully good.
I wouldn't be surprised if Nvidia doesn't bump the vram in these next cards. Casuals are attracted to bigger numbers, and amd did well with that.
"If the RTX 4080 ends up being as powerful as two 3080s, is it still a good idea to go for a 1440p monitor?"
What? Absolutely no chance. Where did you hear that, an Nvidia press release?
Hmm, I guess with the RTX 4080 we can hope for about a 40-50% increase over the RTX 3080?
Whilst this is exciting for most, I'm much more intrigued to see the performance per watt and how it compares to AMD, and Apple.
It will be intriguing to see for sure. Nvidia just gonna brute force the powah! I personally thought AMD was gonna outright win this time around with RDNA3, but I've backed down from that claim, lol.
"Hmm, I guess with the RTX 4080 we can hope for about a 40-50% increase over the RTX 3080?"
Yeah, something like that.
If they make MSRP 1000 dollars we are fucked.
Also we know teraflop count isn't really relevant when comparing.
I'm desperate to get a 4080 (for under a grand anyway), however I'm expecting a significant but not massive jump from the 3000 series.
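As a rough sketch of why paper teraflops don't settle this: FP32 throughput is just 2 × shader count × clock, so it's easy to put a number on, but that number says little about frame rates across architectures. The 3080 figures below are its public specs; the "next-gen" configuration is made up purely for illustration.

```python
# Paper FP32 throughput: 2 FLOPs (one FMA) per shader per clock.
def fp32_tflops(shaders: int, boost_clock_ghz: float) -> float:
    return 2 * shaders * boost_clock_ghz / 1000.0

# RTX 3080 public specs: 8704 CUDA cores at ~1.71 GHz boost.
print(f"RTX 3080:          {fp32_tflops(8704, 1.71):.1f} TFLOPS")   # ~29.8

# Hypothetical next-gen config (made-up numbers, not a leak): more shaders, higher clock.
print(f"Hypothetical 4080: {fp32_tflops(9728, 2.50):.1f} TFLOPS")   # ~48.6

# The ratio of these paper numbers says little about real frame rates: IPC, cache,
# memory bandwidth and drivers all differ between architectures (Ampere already
# "doubled" FP32 on paper without doubling game performance).
```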
Leaks have consistently pointed to at least 2 times over Ampere for more than a year. This is from yesterday. Of course it's more than possible. This isn't even the full chip. We're getting a massive node jump, massive caches, other architectural improvements. Jumps of double or more in performance have happened multiple times in history.
A jump of 40% over the 3080 is literally impossible. A 3090 Ti is right NOW 25% over the 3080. That would mean a 4080 is only about 12% faster? Not even Turing was that bad, it was 30% over the 1080 Ti. The node jump alone plus the extra power draw would give more than 40%, discounting everything else. Relax, this launch is gonna be one of the biggest jumps since GPUs have existed.
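For what it's worth, here is the arithmetic behind that point, using the percentages from the post above (3080 normalised to 1.0):

```python
# Normalise to RTX 3080 = 1.00, using the percentages from the post above.
rtx_3080 = 1.00
rtx_3090_ti = 1.25        # "25% over 3080"
hypothetical_4080 = 1.40  # the "only 40% over 3080" scenario being argued against

# How far ahead of a 3090 Ti would that 4080 actually be?
gain_over_3090_ti = hypothetical_4080 / rtx_3090_ti - 1
print(f"{gain_over_3090_ti:.0%} over a 3090 Ti")   # ~12%, a historically small gen-on-gen gap
```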
Maxing out sliders.That’s going against the trend of the industry where SSD + IO management is drip feeding exactly what is needed for the scene and has barely any assets waiting in VRAM..
Movie studios have special cards for their needs, so can you clarify on « most peoples’ problems » ?
Maxing out sliders.
You hit the vram limit pretty fast in some games like Resident evil 2 remake.
The menu is wrong.Maxing out sliders.
You hit the vram limit pretty fast in some games like Resident evil 2 remake.
Assuming they don't goof and have low stock, prices should be close to MSRP not too long after launch.
If the whole July paper launch rumors are true, then by the end of the year I should have saved enough to get a 4080... assuming it's enough of a jump over the 3080, cuz you know people are gonna offload those in droves. I could live with a 3080 as I "only" play at 3440x1440p.
I play at 3440x1440 as well, but any 3080 would have to be the 12GB version, at way under £649, even if the 4080 sees a price bump.
Ok, what does that have to do with wanting max settings?
I didn't include resolution because everyone is using something different. Be it 1080p to 4k or whatever.
For me, I want to see all sliders to the right running excellent performance.
I get fairly annoyed when I hit vram limits, which happens more often, esp in newer games.
But whatever.
The way to solve this is to slowly upgrade your gpu. (by slowly I mean grab whatever is newest)
What hits the limit today will be the minimum requirement tomorrow.
Guess my reasoning on why I prefer the whole performance-per-watt angle vs brute power is because I feel eventually more people are going to want smaller, more efficient devices; perhaps the Steam Deck could lead the path. Big tower, high-power PC gaming has been on a decline for a while, and for it to succeed surely performance per watt is the way to go?
The size of the cooler compared to the size of that hand... I'm just imagining the power requirements and heat.
Can't wait to see those thicc 4 slot EVGAs.
F, are they really gonna try to sell the 4070 as a 4080 and the 4060 as a 4070? If true then it's a really greedy move, I'd expect a 4080 Ti to come out the door very soon after.
The xx60 is supposedly going to be on a 128-bit interface and likely only have 8GB of VRAM.
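For context on what those bus widths imply, peak bandwidth is just bus width × per-pin data rate, and capacity scales with the number of 32-bit channels. The 18 Gbps data rate and 2 GB chip density below are typical GDDR6 assumptions, not confirmed specs for any of these cards.

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
def peak_bandwidth_gbps(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

# GDDR6 chips each sit on a 32-bit channel; common densities are 1 GB or 2 GB per chip.
def capacity_gb(bus_bits: int, gb_per_chip: int = 2) -> int:
    return (bus_bits // 32) * gb_per_chip

for bus in (128, 192, 256, 320, 384):
    # 18 Gbps is an assumed, typical GDDR6 speed, not a confirmed spec.
    print(f"{bus:3d}-bit: {peak_bandwidth_gbps(bus, 18):5.0f} GB/s, "
          f"{capacity_gb(bus):2d} GB with 2 GB chips")
```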
That I will disassemble as soon as I can and put an EK water block on it.
x80 series have historically been the xx104 chip.
And xx80 has also been historically the very top xx102 chip. In the end it's just the naming.
NVIDIA GeForce RTX 2080: TU104, 1710 MHz, 2944 cores, 184 TMUs, 64 ROPs, 8 GB GDDR6 @ 1750 MHz, 256-bit
NVIDIA GeForce GTX 1080: GP104, 1733 MHz, 2560 cores, 160 TMUs, 64 ROPs, 8 GB GDDR5X @ 1251 MHz, 256-bit
NVIDIA GeForce GTX 980: GM204, 1216 MHz, 2048 cores, 128 TMUs, 64 ROPs, 4 GB GDDR5 @ 1753 MHz, 256-bit
NVIDIA GeForce GTX 680: GK104, 1058 MHz, 1536 cores, 128 TMUs, 32 ROPs, 2 GB GDDR5 @ 1502 MHz, 256-bit
Specs via www.techpowerup.com
The 3070ti is actually the real 3080.
eh, got me one of those glass cubes (disabled basically all RGB).
5x120mm in
5x120mm out
wish it was all 140mm though
Thermals must be bad since you need that many fans. But it seems like there are basically no gaps to suck air into the case, so I guess you need that many fans. You get negative pressure inside the case, which should be fine I suppose for certain setups.
"And xx80 has also been historically the very top xx102 chip. In the end it's just the naming."
You mean 80 Ti. That's also ignoring Titan and workstation cards.
Everything points to the 4080 being another step down in tier by Nvidia. Just like what they did with the GTX 680, which was supposed to be the GTX 670 and the real 680 suddenly became the 780. Watch the real 4080, the 4080 Ti, suddenly have a 320-bit setup just like the 3080.
But just because AMD could not compete that year, greedia decided to move up a tier, and so a mid-range x70 card with a 256-bit bus was suddenly sold as an x80 card. The GTX 580 and others before it were 384-bit and the RTX 3080 was 320-bit, but now the 4080 is again a mid-range 256-bit configuration moved up and sold as a higher-tier card.
Wait, I just noticed the 4070 with a 192-bit config now? What a joke. Man, I still remember the day when a second-tier card, the GTX 570, was a $330 card! Now the second-tier RTX 3080 or RTX 3080 Ti is ~$1000, roughly 3x the price. Greed really has no limits.
You're right, I'm more concerned about gaming cards and exclude those workstation ones. But simply by looking at the leaked specs (if they're accurate) you can tell the perf gap is so historically ginormous that clearly a card is missing between them.
Smaller bus is allegedly made up for with cache, like AMD did with RDNA2. We will see I guess.
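A minimal sketch of that "big cache makes up for a narrow bus" idea, in the spirit of AMD's Infinity Cache: only cache misses have to go out to DRAM, so the request rate the GPU can sustain is amplified by the hit rate. The hit rates and the on-die bandwidth figure here are illustrative assumptions, not measured numbers.

```python
# Only cache misses consume DRAM bandwidth, so a hit rate of h lets the GPU sustain
# roughly dram_bw / (1 - h) of request traffic, capped by the cache's own bandwidth.
def effective_bandwidth(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    if hit_rate >= 1.0:
        return cache_gbps
    return min(dram_gbps / (1.0 - hit_rate), cache_gbps)

narrow_bus = 576.0   # e.g. 256-bit GDDR6 at 18 Gbps (see the earlier calculation)
wide_bus = 864.0     # e.g. 384-bit at the same speed, with no big cache
on_die = 2000.0      # assumed on-die cache bandwidth, illustrative only

for hit_rate in (0.0, 0.3, 0.5):
    eff = effective_bandwidth(narrow_bus, on_die, hit_rate)
    print(f"hit rate {hit_rate:.0%}: {eff:6.0f} GB/s effective "
          f"vs {wide_bus:.0f} GB/s raw on the wide bus")
```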
As a reminder it was said by Tim Sweeney that a console with roughly 40TF performance should be able to provide photorealistic visuals. It will be a few gens before this type of card sits in a console box.
"Guess my reasoning on why I prefer the whole performance-per-watt angle vs brute power..."
I'm kinda there with you. My 6800 XT draws less power than a 3080 and beats/matches it at my resolution (1440p). If I do go AMD for my next GPU, I hope they keep their performance-per-watt approach, honestly.
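Taking that claim at face value (rough performance parity at 1440p) and the two cards' reference board powers, the efficiency gap is easy to put a number on; the parity input is the poster's claim, not a benchmark result.

```python
# Encode the post's claim: rough performance parity at 1440p, reference board powers.
RTX_3080_POWER_W = 320    # Nvidia reference TGP
RX_6800XT_POWER_W = 300   # AMD reference TBP
RELATIVE_PERF = 1.0       # "beats/matches it at my resolution" taken as parity

perf_per_watt_3080 = RELATIVE_PERF / RTX_3080_POWER_W
perf_per_watt_6800xt = RELATIVE_PERF / RX_6800XT_POWER_W

advantage = perf_per_watt_6800xt / perf_per_watt_3080 - 1
print(f"6800 XT efficiency advantage at parity: {advantage:.0%}")   # ~7%
```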