
Nvidia RTX 30XX |OT|

Agent_4Seven

Tears of Nintendo
How much can I realistically get for my 1080ti at the moment? I am deciding if I should sell and upgrade to the 30xx series or ride my 1080ti all the way to the finish line.
I can sell mine for like $380 easily right at this moment, but I've decided to wait for reviews and 1440p benchmarks on Monday first, just to make sure it's really worth it.
 
Last edited:

BluRayHiDef

Banned
I can sell mine for like $380 easily right at this moment, but I've decided to wait for reviews and 1440p benchmarks on Monday.

I'm going to keep my 1080Ti, just as I've kept my HD 5870 and HD 7970. I had a 980Ti, but I traded that in at Microcenter for store credit, which I used to get my 1080Ti.
 

jigglet

Banned
Question guys: I read the Nvidia spec sheet for the 3070. It recommends at least a 450w PSU. Is that irrespective of what CPU I have?
 

nemiroff

Gold Member

I thought your pic was oddly out of place, brother, until I clicked "Show ignored content". Pretty amusing turn of context.
 

jigglet

Banned
No. The PSU powers the whole system, not just the GPU. :p

I know, but my point was: if I had a crazy powerful CPU, could that end up taking capacity from the GPU and thus make a 450W PSU insufficient? (I don't have a crazy powerful CPU, I'm just curious.)
 
I know, but my point was: if I had a crazy powerful CPU, could that end up taking capacity from the GPU and thus make a 450W PSU insufficient? (I don't have a crazy powerful CPU, I'm just curious.)
You need a PSU that can handle the total load of everything in your PC. If your CPU/RAM/mobo pull a lot of power because of overclocks etc., then yes, a 450W PSU might not be enough.
 

jigglet

Banned
You need a PSU that can handle the total load of everything in your PC. If your CPU/RAM/mobo pull a lot of power because of overclocks etc., then yes, a 450W PSU might not be enough.

Ok. Looks like I'll need to recalc my whole setup to double-check the power draw.
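That recalc is just a back-of-the-envelope sum. A minimal sketch in Python, where every component wattage is an illustrative assumption, not a measured figure (substitute your actual parts' TDP/TGP ratings):

```python
# Rough PSU headroom estimate. All wattages below are example numbers only.
parts_watts = {
    "GPU (e.g. RTX 3070 TGP)": 220,
    "CPU (105W TDP, transient spikes)": 150,
    "Motherboard + RAM": 60,
    "Drives + fans + USB devices": 40,
}

total = sum(parts_watts.values())  # estimated peak system draw
psu = 450
headroom = psu - total

print(f"Estimated peak draw: {total} W")
print(f"{psu} W PSU headroom: {headroom} W")
```

With these example numbers the 450W unit comes up 20W short, which is exactly the scenario being asked about; a hungrier CPU only pushes the total further over.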
 

jigglet

Banned
Who actually expects Big Navi to be competitive? I'm sure it will be priced nicely, but performance-wise?

I don't care. I'm not an Nvidia fanboy; I'll dump this shit as soon as something better comes along. But the last AMD cards went through six months' worth of bugs before a decent patch came along. A $100 saving is not enough for six months of pain. Noooo way.
 

Rikkori

Member
Who actually expects Big Navi to be competitive? I'm sure it will be priced nicely, but performance-wise?
Who doesn't? At the silicon level it's very clear how well it can do, depending on how hard they go in with the memory config. AMD's problem now is more the software stack and the various features where it gets outclassed by Nvidia (DLSS, the various tensor-core add-ons like Voice and Broadcast, the NVENC encoder, AV1 support, superior Shadowplay recording, etc.). An 80-CU, 384-bit Navi would have absolutely no trouble besting a 3080 just in terms of rasterisation, which is still the most important thing.

Problem is, people are suffering from recency bias and aren't knowledgeable about GPU history in general. Pascal was a miracle breakthrough in performance, but it wasn't just AMD that suffered for it; Nvidia themselves have had trouble really blowing past what they achieved back then. Radeon vs GeForce has had a tumultuous history, but it's always swung back and forth. It's only taken longer now because AMD were forced to stick with GCN (for various reasons), so the comeback isn't instantaneous, but RDNA can absolutely fight back. It's honestly not too different from their CPUs being stuck with Bulldozer for so long: once you put things into play it takes years to change course; that's just the nature of tech manufacturing. And you see Intel having the same problem with their monolithic designs.
 
Who actually expects Big Navi to be competitive? I'm sure it will be priced nicely, but performance-wise?

No one really knows until it releases 🤷‍♂️

Based on the XSX as evidence (it seems to reach roughly 2080 levels of performance with 52 CUs and modest clocks/power draw, on a single console APU), I think they might be competitive.

With increased die size, larger CU count, higher clocks, higher power draw, and perf/watt and IPC architecture improvements over RDNA1, I think it is possible they could be competitive in performance. Obviously we won't know anything for sure until we see benchmarks, but I think they can probably compete; at least I hope they can, for the sake of competition in the GPU space.

There seem to be a lot of rumours going around that indicate they can compete. Granted, take them all with a grain of salt, as they could just be wishful thinking, but it seems the Radeon group has gotten its act together somewhat now that it has proper R&D funding, so they could surprise everyone.

Similarly, Nvidia themselves seem to think AMD could be competitive, given the aggressive pricing and the holding back of larger memory sizes for Ti-class cards to compete with AMD. Nvidia likely know/suspect a lot more about what AMD are cooking up than the rest of us, so by their own actions they seem to indicate that they think AMD can compete.

If they don't compete and they mess up, then that's disappointing, but so be it; it sucks for competition in the GPU space, but it is what it is.
 

Orta

Banned
If AMD can give us the equivalent of a 3070 for about €350 I'll be as happy as a pig wallowing in shit.
 
If I'm someone who doesn't mind connecting their PC to a TV, does it make any sense to spend $800+ on a 4K 120Hz gaming monitor when I can just get an $800 4K 120Hz TV at a much larger size?

I'm not an input-lag snob, so if the TV is unnoticeably slower in response, that means nothing to me.
 
If I'm someone who doesn't mind connecting their PC to a TV, does it make any sense to spend $800+ on a 4K 120Hz gaming monitor when I can just get an $800 4K 120Hz TV at a much larger size?

I'm not an input-lag snob, so if the TV is unnoticeably slower in response, that means nothing to me.

The only reason I'm not getting a 4K TV is that I'm aiming for an ultrawide for racing and flight sims. To be honest, there isn't a big difference if you aren't looking for any special high specs.
 
I could probably figure this out easily enough on my own, but I'm feeling lazy. Do we need new PSUs for the 30 series? Thought I heard something about additional plug/connector requirements or somesuch.
 

ZywyPL

Banned
Based on the XSX as evidence (it seems to reach roughly 2080 levels of performance with 52 CUs and modest clocks/power draw, on a single console APU), I think they might be competitive.

Honestly, no matter how you slice it, Big Navi will be a 20-22TF card, so slightly below the 3070 and slightly above the rumored 3060 Ti. Unless AMD is holding back some sort of holy grail of DLSS caliber that will let the card hit way above its on-paper specs. It's simple math, really - the 2080 has about 12TF and so does the XSX, which indicates that RDNA2 TFLOPs will finally be as effective as Nvidia's. That's a really good sign for pretty much every gamer out there, but it also means that if we can finally translate performance/TFLOPs 1:1, the AMD flagship GPU will be fighting against Nvidia's $400-500 cards... Unless, like I said, there's something we don't know that hasn't leaked yet.
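For reference, the paper-spec arithmetic behind those TF figures is just shaders × 2 FMA ops per clock × clock speed. A quick sketch; the 2.0 GHz on the 80-CU part is an assumed round number for illustration, not a leaked spec:

```python
# FP32 TFLOPs from paper specs: CUs * 64 shaders/CU * 2 ops/clock * GHz / 1000.
def tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

xsx = tflops(52, 1.825)     # Xbox Series X: 52 CUs at 1.825 GHz -> ~12.1 TF
big_navi = tflops(80, 2.0)  # hypothetical 80-CU card at an assumed 2.0 GHz -> ~20.5 TF

print(f"XSX:   {xsx:.1f} TF")
print(f"80 CU: {big_navi:.1f} TF")
```

Which lands a full 80-CU part right in the 20-22TF ballpark quoted above.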
 

Rikkori

Member
If I'm someone who doesn't mind connecting their PC to a TV, does it make any sense to spend $800+ on a 4K 120Hz gaming monitor when I can just get an $800 4K 120Hz TV at a much larger size?

I'm not an input-lag snob, so if the TV is unnoticeably slower in response, that means nothing to me.
Been playing on a TV connected to my PC for years; imo the monitor market is FUCKED! Super expensive & inferior in terms of picture quality. If you're not playing esports titles 24/7 and don't hate a big screen, then it makes no sense to go for a monitor.
Input lag is actually very good on TVs nowadays, and even better if you put it into 120Hz mode. Even as an ex-competitive player I have no issue with it on an otherwise middling Sony. Now that I play single-player titles only, nothing compares to the picture quality & sheer size of a TV, especially as I sit close. It's honestly the best upgrade you can make for your gaming experience.

I could probably figure this out easy enough on my own, but I'm feeling lazy. Do we need new PSU's for the 30 series? Thought I heard something about additional plug/connector requirements or somesuch.
No, only the FE has the 12-pin, and that comes with an adapter anyway.

Honestly, no matter how you slice it, Big Navi will be a 20-22TF card, so slightly below the 3070 and slightly above the rumored 3060 Ti. Unless AMD is holding back some sort of holy grail of DLSS caliber that will let the card hit way above its on-paper specs. It's simple math, really - the 2080 has about 12TF and so does the XSX, which indicates that RDNA2 TFLOPs will finally be as effective as Nvidia's. That's a really good sign for pretty much every gamer out there, but it also means that if we can finally translate performance/TFLOPs 1:1, the AMD flagship GPU will be fighting against Nvidia's $400-500 cards... Unless, like I said, there's something we don't know that hasn't leaked yet.
Wrong calculations. RDNA is already equal to Turing on a TFLOP basis (see tests by either ComputerBase or PCGamesHardware, can't remember which). Watch Buildzoid's video for a more detailed breakdown. Remember, we already have a leak from some time ago that put Big Navi at 20% faster than the 2080 Ti (the VR one).

 
Honestly, no matter how you slice it, Big Navi will be a 20-22TF card, so slightly below the 3070 and slightly above the rumored 3060 Ti. Unless AMD is holding back some sort of holy grail of DLSS caliber that will let the card hit way above its on-paper specs. It's simple math, really - the 2080 has about 12TF and so does the XSX, which indicates that RDNA2 TFLOPs will finally be as effective as Nvidia's. That's a really good sign for pretty much every gamer out there, but it also means that if we can finally translate performance/TFLOPs 1:1, the AMD flagship GPU will be fighting against Nvidia's $400-500 cards... Unless, like I said, there's something we don't know that hasn't leaked yet.

TFLOPs don't really mean anything for performance as they are not strictly speaking a gaming performance metric. They are even more useless when trying to compare across completely different architectures.

I think we all discussed this in that other thread about Ampere TFLOPs count so I don't really want to go into all of that again, you should check out that thread, it is a good read.

To give an example, a 2080 Ti is rated at 13.45 TFLOPs. The 3070 is supposed to roughly match a 2080 Ti in performance but has 20+ TFLOPs. TFLOPs are mostly useless for comparing gaming performance.

Putting that whole discussion aside: if you honestly don't believe AMD can at minimum match 2080 Ti-level performance (3070) with 80 CUs on a discrete PC GPU with high power draw/clocks, when the XSX hits roughly 2080-level performance with only 52 CUs clocked conservatively on a small APU die with low power draw, then I have some magic beans to sell you!
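To put rough numbers on that 2080 Ti vs 3070 example, here's a sketch; the relative-performance index is an assumption (treating the two cards as roughly equal, per Nvidia's own claim), not benchmark data:

```python
# Paper FP32 TFLOPs vs. an assumed relative gaming-performance index
# (2080 Ti = 1.0). Illustrates why cross-architecture TFLOP comparisons
# mislead: Ampere's doubled FP32 throughput doesn't translate 1:1 to frames.
cards = {
    "RTX 2080 Ti (Turing)": (13.45, 1.00),
    "RTX 3070 (Ampere)":    (20.31, 1.00),  # ~2080 Ti performance, per Nvidia
}

perf_per_tflop = {name: perf / tf for name, (tf, perf) in cards.items()}
for name, ratio in perf_per_tflop.items():
    print(f"{name}: {ratio:.3f} relative perf per paper TFLOP")
```

If TFLOPs mapped to frames across architectures, those two ratios would match; instead they differ by roughly 50%.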
 

Kenpachii

Member
TFLOPs don't really mean anything for performance as they are not strictly speaking a gaming performance metric. They are even more useless when trying to compare across completely different architectures.

I think we all discussed this in that other thread about Ampere TFLOPs count so I don't really want to go into all of that again, you should check out that thread, it is a good read.

To give an example, a 2080 Ti is rated at 13.45 TFLOPs. The 3070 is supposed to roughly match a 2080 Ti in performance but has 20+ TFLOPs. TFLOPs are mostly useless for comparing gaming performance.

Putting that whole discussion aside: if you honestly don't believe AMD can at minimum match 2080 Ti-level performance (3070) with 80 CUs on a discrete PC GPU with high power draw/clocks, when the XSX hits roughly 2080-level performance with only 52 CUs clocked conservatively on a small APU die with low power draw, then I have some magic beans to sell you!

That whole thread was embarrassing, because of the simple fact that you can't compare TFLOPs across different architectures.

Should probably stop watching those armchair YouTube experts.

TFLOPs do what they do. Bigger number = more performance, it's as simple as that. But it has to be on the same architecture.
 
You can't compare TFLOPs across different architectures.

While it is true that TFLOPs can be a reasonable rough metric of the "power" of different cards with the same architecture, even then they do not scale linearly; they do, though, represent a very rough baseline for comparison within the same architecture.

TFLOPs in and of themselves are just a measure of floating-point throughput, which can be quite useful for compute workloads and scientific calculations, but they don't represent any specific gaming performance. That's why I say that, as a metric, they don't represent gaming performance the way that, for example, MPH/KPH accurately represents a car's performance.

I agree though that comparing TFLOPs across different architectures is completely pointless and means nothing for gaming performance.
 

GHG

Member
I have a 2060 Super and wondering if it's still a good jump to 3080. I'm just getting into the PC gaming space seriously for the first time.

It always depends on what your needs are at the time. If you are playing something and would benefit from being able to increase your framerate/resolution/settings by stepping up to a newer card (provided you can afford it, of course), then there's no reason not to. It will also give you peace of mind that you're in a better position to play newer games at higher quality than you otherwise would be over the next few years.
 

sackings

Member

Not nearly as powerful as Nvidia's marketing would have you believe, assuming this is accurate.
 

J3nga

Member
Some of you are really bad at numbers. SOTR without DLSS at 4K is around a 75% increase over the 2080 according to that leaked chart, and Far Cry New Dawn around 55% at 4K. When Nvidia says the 3080 doubles the performance of the 2080, they mean the best-case scenario; that number will vary depending on the game and even on the scene/location. I don't care about synthetics, because I won't be buying the card to run synthetic tests, but game benchmarks are solid IMO.
 

Not nearly as powerful as Nvidia's marketing would have you believe, assuming this is accurate.
That’s why we always wait for real-world benchmarks from reputable reviewers like Gamers Nexus. Why are we talking about something so obvious???
 

BluRayHiDef

Banned

Not nearly as powerful as Nvidia's marketing would have you believe, assuming this is accurate.

Maybe the difference will increase with new drivers? Is there any chance of this? Also, why isn't there any info on the 3090?
 

Rikkori

Member
I have a 2060 Super and wondering if it's still a good jump to 3080. I'm just getting into the PC gaming space seriously for the first time.
Yes, of course, it's a considerable jump, especially at 4K. That being said, you already have a 2060 Super which is a capable card & has RT+DLSS. I wouldn't necessarily rush into it because a lot more options will open up next month, from both Nvidia and AMD. Worst case scenario you wait a month.
 

saintjules

Gold Member
It always depends on what your needs are at the time. If you are playing something and would benefit from being able to increase your framerate/resolution/settings by stepping up to a newer card (provided you can afford it, of course), then there's no reason not to. It will also give you peace of mind that you're in a better position to play newer games at higher quality than you otherwise would be over the next few years.

Yes, of course, it's a considerable jump, especially at 4K. That being said, you already have a 2060 Super which is a capable card & has RT+DLSS. I wouldn't necessarily rush into it because a lot more options will open up next month, from both Nvidia and AMD. Worst case scenario you wait a month.

Thanks, guys, for the thoughts. It seems making the jump would mainly be for future-proofing. Will keep watching to see what else comes about.
 

Not nearly as powerful as Nvidia's marketing would have you believe, assuming this is accurate.
It's twice as fast as my 1080 Ti. I'm in.
 

CrustyBritches

Gold Member
*edit* missed the VCZ post on 3060ti...

On the topic of upgrading from 2060 Super: that's what I have and with Cyberpunk supporting RTX/DLSS and Ampere having better than expected pricing, I think something like a 3070 16GB/3080 10GB would be a solid upgrade.
 

Rikkori

Member
New RDNA2 "leak" - take it with a big fistful of salt, I normally don't link to this guy but ehhh, the thirst is real.



*60% performance per watt???
*No HBM2
*Not using 512bit bus. Lower Bus width
*Infinity Cache on the GPU 128MB which helps make up for the lack of memory bandwidth of GDDR6
*Clock frequency similar to/around the PS5's
*80CU
*6700, 6800, 6900 skus
*6700 will equal 3070
*No word if they will undercut Nvidia pricing.
 
New RDNA2 "leak" - take it with a big fistful of salt, I normally don't link to this guy but ehhh, the thirst is real.



*60% performance per watt???
*No HBM2
*Not using 512bit bus. Lower Bus width
*Infinity Cache on the GPU 128MB which helps make up for the lack of memory bandwidth of GDDR6
*Clock frequency similar to/around the PS5's
*80CU
*6700, 6800, 6900 skus
*6700 will equal 3070
*No word if they will undercut Nvidia pricing.


This was out days ago... also, there's no guarantee the higher SKUs ship at the same time. You really want to wait an extra 4-5 months on the off chance that the highest-end Navi card *maybe* beats the 3080, when we know it's still going to lose in any game that has a decent implementation of DLSS? Not to even mention ray tracing. I'm just not buying any of it. And for AMD to just say 'hold on, we promise it'll be worth the wait THIS TIME' instead of actually showing something is not a great indication of where they're at.

You don't do that if you're so far in second place. At least not if you want to be successful.
 

TaySan

Banned
Canceled the TUF. Not confident I'm getting that one at launch, and the price was a bit too high for a TUF. Going to take my chances and look for an EVGA or MSI card.
 

For anyone looking for an FE at launch, don't go to MC.


That's a super big bummer. I wonder if any Best Buys will have them in store... I have a buddy who works there but hasn't seen anything yet. If he's privy to that info, I'd imagine it wouldn't be till next week when they start rolling in on the trucks, to keep leaks to a minimum (feel like we would've heard of it by now).
 

martino

Member
New RDNA2 "leak" - take it with a big fistful of salt, I normally don't link to this guy but ehhh, the thirst is real.



*60% performance per watt???
*No HBM2
*Not using 512bit bus. Lower Bus width
*Infinity Cache on the GPU 128MB which helps make up for the lack of memory bandwidth of GDDR6
*Clock frequency similar to/around the PS5's
*80CU
*6700, 6800, 6900 skus
*6700 will equal 3070
*No word if they will undercut Nvidia pricing.


I won't decide before this is confirmed true or false.
 
Canceled the TUF. Not confident I'm getting that one at launch, and the price was a bit too high for a TUF. Going to take my chances and look for an EVGA or MSI card.

I agree that I'm not confident they'll ship it, especially since the $800 price doesn't even exist on Asus's site. There is an overclock mode/switch on the base card, so I don't really get the reason for a separate SKU on B&H unless they just got bad info. And if they're selling better-binned silicon for $20 more, then that's just shitty as well. Idk.

Anyway, I'm keeping it for now as a backup. Can always cancel before it ships. Wouldn't pass up a better card from MSI or the like for cheaper, but that's far from a guarantee.
 

Malakhov

Banned
How much can I realistically get for my 1080ti at the moment? I am deciding if I should sell and upgrade to the 30xx series or ride my 1080ti all the way to the finish line.
Resell value is out of this world here right now. I sold an RX 470 a few weeks ago for $150. A used 1660 Super is $250 when it's $300 new. You should be able to get something decent. Mind that these are CAD prices.

Think I'll sell the 1660 Super and grab a 3070.
 