Why do you sound so agitated?

I tend to react to hypocrisy.
Would anyone recommend selling my 5700 XT right now and focusing on the new cards instead? What do you say?
I have been thinking about GPU pricing (that /r/hardware post regarding Turing pushed me even further toward skipping this gen).
I don't like the 10GB idea, and it seems like the 3080 is basically a 2080 Ti. Honestly, my overclocked 2070 is doing just fine at 3440x1440@100Hz for World of Warcraft and Civ VI at the end of the day.
The PS5 coming with 16GB, with 14GB for games, at $499 is a better deal for me this year alone. I might think about this more, but it sounds like I'm going with the PS5 this fall, especially with the crazy games I want to play.
I'll be watching this thread, but this gen seems like SUPER bait for later in the year or the next one.
We've got games already hitting close to that NOW; what the hell is going to happen with newer games that push even more???
Time was, you could get the 80-series card and be set for a generation; now you pay more for less? You can't even do 4K without running out of VRAM.
Isn't the 3080 a 4K GPU? I mean, if it's 30% more powerful than a 2080 Ti, the horsepower is there... it's not like the 3090 is the only 4K-capable GPU...
- For the next few years, 10GB is enough for 1440p.
- Good 4K monitors are expensive. Nvidia knows their audience, and they can milk 4K gamers for more money with the 3090. Those people have no alternative anyway.
- Having just 10GB is cheaper for Nvidia. VRAM is one of the most expensive components for them.
- A 10GB card puts people on a more frequent upgrade interval, because they sooner realize that the VRAM is a potential bottleneck.
- People look at performance first in reviews and then at the price. The price would go up with more VRAM, but the performance wouldn't.
- GPU reviews are a reflection of the current PC gaming landscape, with titles from the last few years; the 10GB 3080 will be enough for those, so reviewers won't run into VRAM shortcomings.
- Based on that, people are going to see that 10GB is currently enough for them. What the future holds, nobody knows.
- The 10GB 3080 gives a very clear distinction from the 24GB 3090, and it makes the extremely expensive $1,500 RTX 3090 look like a reasonable buy because it flaunts so much VRAM. Basic consumer psychology.
- It leaves more room for future cards that may slot between the 3080 and 3090 to be impressive. Nvidia can go with anything between 10 and 20GB then, and people will be impressed.
Games are going to be more demanding as well. When the first Titan came out, people screamed from the rooftops that this greedy $1,000 product was the first 4K GPU, and today I wouldn't even use that piece of junk for 1080p.
Running native 4K is so extremely taxing that Nvidia saw the need to bring in DLSS, and they also see the need to release an absolute 350W monster of a GPU.
In the end it is a little more complex, because many GPUs can easily run 4K if you know your way around graphical settings, and yes, the RTX 3080 will be great at 4K.
But there is this unwritten law that in order to qualify as a 4K GPU you need to run native 4K with all settings maxed out and then be able to hold 60fps. (This is a bit of banter, but there is also truth to it.)
I have a question: when you use DLSS, is the VRAM used in the same quantity as at native 4K? (I already suspect the answer, but better to ask anyway.)
I used DLSS a couple of times, but I'm not really a guy who plays with Afterburner stats on the side of the screen...
DLSS 4K would use less VRAM than native 4K.
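A quick back-of-envelope sketch of that intuition: DLSS renders most of the frame at a lower internal resolution (roughly 2560x1440 internally for 4K output in Quality mode), so resolution-dependent render targets shrink, while textures and the final output target stay full size. The bytes-per-pixel figure and the buffer split below are invented purely for illustration:

```python
# Back-of-envelope sketch: most render targets scale with the INTERNAL
# resolution, not the output resolution. The 48 bytes/pixel figure and
# the "25% stays full-res" split are made up for illustration only.
BYTES_PER_PIXEL = 48

def buffers_mib(width: int, height: int) -> float:
    """Approximate size of full-screen render targets, in MiB."""
    return width * height * BYTES_PER_PIXEL / 1024**2

# Native 4K: everything is allocated at 3840x2160.
native = buffers_mib(3840, 2160)

# DLSS Quality at 4K renders internally at about 2560x1440 and upscales;
# assume only the final color/history targets stay at full 3840x2160.
dlss = buffers_mib(2560, 1440) + buffers_mib(3840, 2160) * 0.25

print(f"native 4K: ~{native:.0f} MiB of render targets")
print(f"DLSS 4K  : ~{dlss:.0f} MiB of render targets")
# Textures and geometry are resolution-independent, so the total VRAM
# saving is smaller than this ratio suggests.
```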
Good, let's just hope that every game is gonna support this technology at launch in the near future; this is gonna be the most important thing in PC gaming, more than RTX.
I hope so too. What I have seen from DLSS 2.0 in Death Stranding, Control and F1 2020 is impressive and makes me even more excited for Cyberpunk 2077.
Saying "but DLSS" is basically saying "but fancy upscaling".

If the fancy upscaling can reconstruct detail from the native-resolution image, to the point where you don't see a difference between them, then it is super valuable.
I only tried the previous version on Control and it was not that impressive. I never tried the 2.0 version, but people are ripping their balls out over it, so I guess it's noticeably better.
The easiest way to find a VRAM bottleneck is to monitor the "bus interface load".
Well, first, games are using more VRAM than they actually need. What monitoring tools show is not indicative of being "close to hitting the VRAM bottleneck".
My 8GB card outperforms the 11GB 1080 Ti in every 4K game I have seen so far.
Faster VRAM helps a lot and the jump for the 30 series is extremely big in that regard.
The time of the x80-series cards being the flagship is long gone, and Nvidia has always been very stingy with VRAM, especially in comparison to AMD:
The 1.5GB 580
The 2GB 680
The 3GB 780
Even the 3GB 780 Ti
And once the reviews come out, the 10GB RTX 3080 will completely outperform the 24GB Titan RTX in probably every single 4K game. But then again, I think the 3080 is the 1440p card for Nvidia and the 3090 is the card aimed at 4K gaming.
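On the "games use more VRAM than they need" point: the number overlays like Afterburner show is the driver's allocated memory, not what the game actually touches every frame. Here is a minimal sketch that reads the same allocation figures via NVML, assuming an NVIDIA GPU and the nvidia-ml-py3 package (`pip install nvidia-ml-py3`):

```python
# Query the driver-reported VRAM figures via NVML (this is essentially
# what monitoring overlays display).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

# "used" is memory the driver has ALLOCATED, not the game's working set;
# engines happily over-allocate caches when VRAM is plentiful.
print(f"total: {mem.total / 1024**2:.0f} MiB")
print(f"used : {mem.used / 1024**2:.0f} MiB (allocated, not necessarily needed)")
print(f"free : {mem.free / 1024**2:.0f} MiB")

pynvml.nvmlShutdown()
```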
But that's just admitting that people are paying MORE now and not getting what they used to. In what world is the 3080 meant to be a 1440p card when the 2080/Super/Ti was a base 4K card? And even then you've got the next-gen consoles hitting higher. If I want 1440p I can just grab a PS4 Pro for like $250, not spend $1,200+ on a GPU.
Okay. Calm down and then please explain to me what exactly you mean, because I feel like you are misinterpreting what I said or confusing it with something someone else said.
If you think there is any fanboyism going on on my end, then please reread my posts in here, because I shit on both Nvidia and AMD where they deserve it. I buy the products I think are best; my last card was an R9 290X, which I owned for many years and was extremely impressed by. I have also owned Nvidia GPUs.
Sorry, I can't figure out your angle or what makes you upset.
That depends: what monitor/resolution do you have, and what do you plan on playing in the future?
I think the cards we are about to hear about are going to be more expensive than the 5700 XT, unless they are releasing an RTX 3060 as well; if that is the case, I'd expect the second-hand price of the 5700 XT to drop a bit, but I am reaching here. However, the RTX 3060 might be further down the line.
The 5700 XT is a great card, but I think it will be dwarfed by the upcoming RTX 30 series and also the Big Navi products (however many there are), not only from a performance standpoint but probably from a feature-set standpoint as well.
4K or 1440p, but more regularly 4K now.
That will be all of them then.
Well, 4K is a demanding beast for the 5700 XT, and then it depends on whether you are happy with lower framerates and/or turning down settings.
It is a difficult situation right now, with upscaling technologies like DLSS and FidelityFX and the question of which games support them.
I would expect the upcoming cards (that includes the 3080, the 3090 and the big RDNA2-based card from AMD that is set to be revealed this year) to perform significantly better at 4K, but all of those are going to be expensive.
If I were in your situation and had my gaming focus at 4K, I would probably sit on it for another year or two and upgrade then, but I think it is reasonable to sell it and buy a more potent product if you are already unhappy and are looking forward to graphically demanding games like Cyberpunk 2077.
I agree the situation is not easy. For 1440p it would be different, but with 4K it depends more on each person's requirements and expectations. I expect many 4K people (those that play modern AAA PC games) to sell their current cards and upgrade to an RTX 3090.
I'm going the 3090 route if the price is right; I want to put myself in position for the high-tier 4K monitor market in a year's time or less. I have a 2080S right now and play just about everything on MAX/HIGH settings at 1440p; ULTRA gets finicky depending on the game, but it's mostly OK. What would you recommend for 1440p situations?
There are rumors that there will be two memory configs: https://www.tomshardware.com/news/geforce-rtx-3090-rtx-3080-rtx-3070-specifications-leaked
This is seemingly contradicted by the leak showing Zotac only has three models, though they could have 3x2, or only offer the double memory on the high-end cards.
It's not contradicted by the leak; that's just for launch. We've known about the double-VRAM variants for a long time, only we don't know when they will arrive (if at all) and how much extra they will cost. Both are crucial questions.
Guys, I'm planning on buying a laptop with a good Nvidia card in it. Would you recommend searching for a deal on one now (2070 Super-ish), or waiting for a laptop with a 3070 in it? I plan on using it with VR (I want an HP Reverb G2). Also, a 10th-gen i7 and 16GB RAM is fine, I guess? Thanks.

The Reverb G2 requires a lot of horsepower. If you want a laptop for it, I hope you are rich.
I have a feeling it only has that much RAM because they want to be sure that AMD doesn't beat them for the performance crown. Like you say, the massive jump in VRAM doesn't make a lot of sense otherwise.
HDMI 2.1 removes the 4K/60 FPS cap on my LG C9, so I might upgrade to a 3080 from an RTX 2080 Ti just for that. I really want the 3090, but it's beginning to look like a 700W PSU won't cut it.
What could you possibly need that power for? A 2080 Ti is more than enough to handle any game out today.
Seems a weird jump from 10 to 24GB; maybe Nvidia will push an 18GB card later on?
To try and push 4K/120...
Great. I think you wouldn't mind if I add $420 to that, because that is what the PS+ online subscription comes to over the PS5's lifetime (roughly $60 a year over a seven-year generation)?
PS+ also gives me a ton of games, so I guess you then also have to subtract the value of those, which would, in the end, mean you actually have to SUBTRACT from the PC build budget.
I could add all the free Epic Store games I am claiming each month, but that would not really be fair to the PS+ side, because we would go further into adding extra budget to the PC side of things.
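To put numbers on the $420: a trivial sketch of the console side, with the game-library value left out for the reasons above (the $60/year and seven-year figures are assumptions):

```python
# Rough console total cost over an assumed 7-year generation.
PS5_PRICE = 499
PS_PLUS_PER_YEAR = 60   # assumed yearly price of the online subscription
YEARS = 7

console_total = PS5_PRICE + PS_PLUS_PER_YEAR * YEARS
print(f"PS5 + PS+ over {YEARS} years: ${console_total}")  # $919
# A fair comparison would also subtract the value of the monthly PS+
# games (and the free Epic Store games on the PC side) -- both hard
# to price, as argued above.
```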
Are you playing in slow motion? You're probably hitting 4K/120 in all competitive games already. Is there a need to play at 120fps in games that don't require laser focus?
Alright, so let's focus on the hardware then, and ignore these unknowns. Let's say I don't want to play online (I barely do).
So the 3080 has the same number of cores as the 2080 Ti and 1GB less memory, but higher bandwidth. I am really curious how much faster than the 2080 Ti it is going to be, and what its price is going to be.
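On the bandwidth point, plugging in the leaked figures (19 Gbps GDDR6X on a 320-bit bus for the 3080, versus the 2080 Ti's 14 Gbps GDDR6 on a 352-bit bus) gives a rough idea of the jump:

```python
# Memory bandwidth from the rumored specs:
# bandwidth (GB/s) = data rate per pin (Gbit/s) * bus width (bits) / 8
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

rtx_2080_ti = bandwidth_gbs(14.0, 352)  # GDDR6  -> 616 GB/s
rtx_3080 = bandwidth_gbs(19.0, 320)     # GDDR6X -> 760 GB/s (rumored)

print(f"2080 Ti: {rtx_2080_ti:.0f} GB/s")
print(f"3080   : {rtx_3080:.0f} GB/s (+{rtx_3080 / rtx_2080_ti - 1:.0%})")
```

If the leak is accurate, that works out to roughly 760 GB/s versus 616 GB/s, about a 23% jump from the memory alone.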
Current cards can't do 4K/120 at all on OLEDs, because they only have HDMI 2.0.
Yes, you can if you set it to 4:2:0, which sucks.
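The rough arithmetic behind that, ignoring blanking overhead (which makes the real requirements even higher):

```python
# Why 4K/120 RGB needs HDMI 2.1, while 4:2:0 squeezes into HDMI 2.0.
def video_gbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
    return width * height * fps * bits_per_px / 1e9

HDMI_2_0_EFFECTIVE = 14.4  # Gbit/s of video data (18.0 raw, 8b/10b coding)
HDMI_2_1_EFFECTIVE = 42.6  # Gbit/s of video data (48.0 raw, 16b/18b FRL)

rgb = video_gbps(3840, 2160, 120, 24)     # 8-bit RGB:   ~23.9 Gbit/s
yuv420 = video_gbps(3840, 2160, 120, 12)  # 8-bit 4:2:0: ~11.9 Gbit/s

print(f"4K/120 RGB 8-bit  : {rgb:.1f} Gbit/s -> needs HDMI 2.1")
print(f"4K/120 4:2:0 8-bit: {yuv420:.1f} Gbit/s -> fits in HDMI 2.0")
```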
I've heard a rumor that DLSS 3.0 is gonna be game-agnostic? That would be an amazing achievement. DLSS 2.0 has been amazing, but with the onus on the devs of sending the full game to Nvidia for their AI algorithms, few games use it. The ones that do are amazing examples of the tech, but more games need it. Hopefully 3.0 really can be done without a lot of dev input.
The catch to DLSS 2.0, however, is that this still requires game developer integration, and in a much different fashion. Because DLSS 2.0 relies on motion vectors to re-project the prior frame and best compute what the output image should look like, developers now need to provide those vectors to DLSS. As many developers are already doing some form of temporal AA in their games, this information is often available within the engine, and merely needs to be exposed to DLSS. None the less, it means that DLSS 2.0 still needs to be integrated on a per-game basis, even if the per-game training is gone. It is not a pure, end-of-chain post-processing solution like FXAA or combining image sharpening with upscaling.
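To make the motion-vector part concrete, here is a toy, nearest-neighbor version of the reprojection step. Real DLSS 2.0 uses sub-pixel motion vectors, proper filtering and a trained network to combine the warped history with the new frame; everything below is purely illustrative:

```python
import numpy as np

def reproject(prev_frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Warp the previous frame toward the current one using per-pixel
    motion vectors (in pixels). prev_frame: (H, W, 3); motion: (H, W, 2)."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Each current pixel fetches the spot it came FROM in the last frame.
    src_x = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

# Toy usage: a camera pan of 2 px to the right means every pixel
# originated 2 px to the left in the previous frame.
prev = np.random.rand(270, 480, 3).astype(np.float32)
motion = np.full((270, 480, 2), [2.0, 0.0], dtype=np.float32)
warped = reproject(prev, motion)

# A temporal upscaler then merges the warped history with the new
# low-res frame to reconstruct detail; DLSS 2.0 does that merge with
# a trained network rather than a fixed blend.
```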