
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

DenchDeckard

Moderated wildly
From reddit:


[reddit chart: projected 7900 XTX vs 4090 benchmarks]


So much for "mid range".

I really hope this ends up being true...would be amazing.
 

DJ12

Member
From reddit:


[reddit chart: projected 7900 XTX vs 4090 benchmarks]


So much for "mid range".
All the AMD results said "with FSR enabled"; did these tests have DLSS enabled for the 4090?

No way I'm paying more than £1000 for a card (again), so I'm not saying this as an Nvidia defender, just curious.
 

kiphalfton

Member
AMD has had more VRAM than Nvidia (for equivalent cards) over the years, but it never made a big enough difference. BUT if on top of that AMD's offerings are cheaper AND can slap Nvidia around, maybe it will finally push Nvidia to use 16GB minimum across the board.

AMD seems to have price, size, amount of VRAM, and performance dialed in. This could be it.

Inb4 Nvidia drops the RTX Titan for the same price as the RTX 4090.
 

Brigandier

Member
Can't wait for the 7950 XTX next year and the Tis from Nvidia 🤤

Very interested in how these RDNA3 cards OC as well!!!
 

Azzurri

Member
This might be a nice stopgap until the 5090, especially at $1k.

Want to see the performance increase over my 3090.
 

Buggy Loop

Member
AMD "MSRP", i hope you like amd.com lottery or get fucked in the ass by the AIBs ridiculous prices (way worse than Nvidia AIB).
 

Gaiff

SBI’s Resident Gaslighter
From reddit:


[reddit chart: projected 7900 XTX vs 4090 benchmarks]


So much for "mid range".
Except there's one fatal flaw with this: Techpowerup used a 5800X in their test setup. Even the 13900K, which is much faster for gaming, bottlenecks the 4090.

Here's what Hardware Unboxed got testing 13 games with a 5800X3D.

[Hardware Unboxed chart: 4090 vs 6950 XT across 13 games at 4K, 5800X3D test bench]


64% faster with a 5800X vs 73% faster with a 5800X3D at 4K. With a 1.7x multiplier, the 7900 XTX would end up neck-and-neck with the 4090 in raster and get stomped in RT. We'll also see whether the 70% turns out to be a best-case scenario or an average.

Whatever the case, the raster performance is excellent but the RT performance leaves a lot to be desired. Though at $1000, it's hard to complain. That RT uplift from the 4090 certainly isn't worth $600 which is the price of a high-end CPU or of a PS5+.
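
For concreteness, here's that scaling arithmetic as a quick Python sketch, using only the uplift figures quoted in this post; the 1.7x is AMD's "up to" claim, so treat the output as a ceiling, not a prediction.

```python
# Gaiff's point in numbers: the measured 4090 uplift over the 6950 XT depends
# on the test CPU, so AMD's claimed multiplier lands differently against each
# baseline. Figures are the ones quoted above; nothing here is a new benchmark.
uplifts_4090 = {
    "5800X (Techpowerup)": 1.64,          # 4090 measured 64% faster than 6950 XT
    "5800X3D (Hardware Unboxed)": 1.73,   # 73% faster
}
amd_claim = 1.70  # AMD's "up to 1.7x" over the 6950 XT, best case

for setup, uplift in uplifts_4090.items():
    rel = amd_claim / uplift - 1
    print(f"vs 4090 measured with {setup}: projected 7900 XTX is {rel:+.1%}")
# -> +3.7% against the 5800X numbers, -1.7% against the 5800X3D numbers:
#    neck-and-neck in raster, exactly as argued above.
```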
 

nikos

Member
From reddit:


[reddit chart: projected 7900 XTX vs 4090 benchmarks]


So much for "mid range".

If I've learned anything about reddit, it's that people are hive-minded, delusional and generally have no fucking clue what they're talking about because (in this example) they construct charts based on press conference claims with absolutely no real world data. Then it gets passed around to the rest of the apes who believe it because they're still running 10 year old GPUs and have no basis for comparison.

EDIT: Now it's happening in this very thread. Wait for proper benchmarks.
 

GymWolf

Gold Member
Just gimme a fucking preorder date for the 7900xtx.

Do we know European prices?
 

tusharngf

Member
They nailed the pricing this time. I think for the first time AMD is competitive. If this XTX gets at least within 10% of the 4090, that's good enough for most consumers. Imagine pro consoles with these features. I can't wait for actual benchmark numbers now.
 

Buggy Loop

Member
If I've learned anything about reddit, it's that people are hive-minded, delusional and generally have no fucking clue what they're talking about because (in this example) they construct charts based on press conference claims with absolutely no real world data. Then it gets passed around to the rest of the apes who believe it because they're still running 10 year old GPUs and have no basis for comparison.

EDIT: Now it's happening in this very thread. Wait for proper benchmarks.

Yea, should have stopped at « up to ». It’s worthless.

There’s a reason they didn’t include the 4090 directly. It’s out, they have it. Last time they put the 3090 numbers. What are they hiding?
 

DenchDeckard

Moderated wildly
Genuinely tempted to get this to send a middle finger to Nvidia. Hope the review embargo is up soon... though if a 4090 FE becomes available at 1699 for me, I'm gonna struggle not to buy it.

Spending a grand on this would leave me 700 quid to get my 13900KS in early 2023.
 

Haggard

Banned
COD MW2 4K

[COD MW2 4K Ultra benchmark chart]


4090 = 139 fps
7900 XTX (1.5 x 6950 XT) = 133 fps
6950 XT = 89 FPS

Watch Dog Legion 4K
4090 = 105 FPS
7900 XTX = 96 FPS

Cyberpunk 2077 4K
4090 = 71.2 FPS
7900 XTX = 66.3 FPS
Are you seriously taking the numbers from a PR PowerPoint presentation at face value?
...
 

DenchDeckard

Moderated wildly
Would this card offer better raytracing performance than an RTX 3090 Ti?

Think I can live with that and raster close to a 4090.
 
I'm confused. Why didn't AMD label the cards as 7900XT and 7900?
And I do think the 7900XT should have been $200 cheaper than the XTX.
 

OZ9000

Banned
Are you seriously taking the numbers from a PR PowerPoint presentation at face value?
...
It's all we have until the official benchmarks.

IF the performance matches the above figures then I'll be very impressed with the 7900.
 

LiquidMetal14

hide your water-based mammals
I agree with what 1000100100111001010100100010100101001001001001000010100100101010 said.

It looks good for the price but there is marketing gymnastics at work and not enough transparency.
 

YCoCg

Member
meh...
  • no real response to DLSS
They announced both FSR 2.2, which is an improved iteration of FSR 2.1 with less ghosting, fewer artifacts, and better performance, AND FSR 3.0 with Fluid Motion Frames, which IS their response to DLSS 3: AI-generated frames inserted between rendered frames to put out a higher frame rate. They also announced HYPR-RX, which is their response to Nvidia Reflex. Did you not pay attention?
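
For intuition, here's a toy model of what frame generation does to the numbers. This is only the arithmetic of inserting frames, not AMD's actual FSR 3 pipeline (implementation details weren't public at announcement), and the function is purely illustrative.

```python
# Toy model of frame generation as described above: one AI-interpolated frame
# presented between every two rendered frames. Smoothness roughly doubles,
# but the game still simulates and samples input at the rendered rate.
def frame_generation(rendered_fps: float, inserted_per_rendered: int = 1) -> dict:
    presented_fps = rendered_fps * (1 + inserted_per_rendered)
    return {
        "rendered fps": rendered_fps,            # game simulation / input rate
        "presented fps": presented_fps,          # what the fps counter shows
        "render frametime (ms)": round(1000 / rendered_fps, 1),
        "presented frametime (ms)": round(1000 / presented_fps, 1),
    }

print(frame_generation(60.0))
# -> 60 rendered fps is presented as ~120 fps; latency still tracks the
#    60 fps render cadence.
```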
 

OZ9000

Banned
They announced both FSR 2.2, which is an improved iteration of FSR 2.1 with less ghosting, fewer artifacts, and better performance, AND FSR 3.0 with Fluid Motion Frames, which IS their response to DLSS 3: AI-generated frames inserted between rendered frames to put out a higher frame rate. They also announced HYPR-RX, which is their response to Nvidia Reflex. Did you not pay attention?
I haven't really used FSR at all.

How does the IQ/AA compare with DLSS?

4K DLSS Quality = zero jaggies whatsoever. The image looks better than 4K. Hell even 1440p DLSS Quality looks immaculate.

My biggest disappointment with FSR was the lack of machine learning similar to DLSS.
 

Buggy Loop

Member
These stats indicate raster performance that is generally 5-10% behind the 4090, and RT performance that matches or exceeds the 3090 Ti.

AC Valhalla
4090 - 106 FPS

God Of War
4090 - 130 FPS

RDR2
4090 - 104 FPS

RE Village 4K RT
4090 - 175 fps
3090 Ti - 102 FPS

Doom Eternal 4K RT
3090 Ti - 128 FPS

Source: Techpowerup

Same Techpowerup that had a 5800X choking the 4090?

 

YCoCg

Member
I haven't really used FSR at all.

How does the IQ/AA compare with DLSS?

4K DLSS Quality = zero jaggies whatsoever. The image looks better than 4K. Hell even 1440p DLSS Quality looks immaculate.

My biggest disappointment with FSR was the lack of machine learning similar to DLSS.
As of FSR 2.0, DLSS 2.4.12 was still better; with FSR 2.1 it got a lot closer, but DLSS still had the performance edge because of the dedicated cores. These cards, however, seem to have dedicated hardware for things like FSR, so that should narrow the performance gap by a wide margin. We'll see how FSR 2.2 looks when Forza Horizon 5 is updated, as that's the first game with it.
 
It's obvious AMD wanted maximum efficiency with these cards, not maximum performance (which also seems pretty good).

I am not sure to what extent NVIDIA RTX cards are 'efficient' in terms of their microarchitecture, but because they are mostly a GPU manufacturer/supplier, they will brute-force the fuck out of their products (similar to Intel, which brute-forces the fuck out of its CPUs with huge power draws).

I think these cards are a great starting point; it's like they threw a huge fishnet trying to capture everyone (gamers, content creators, etc.) while addressing all the setbacks and issues from their previous cards.

This is mere speculation, but I really don't think these cards are the cream of the crop.

These cards are appetizers, a proxy for near-future (Q1 2023) iterations of what's to come, with guaranteed further price reductions on the 7900 series and RDNA 2 cards once the new iterations are released. I am hoping this will further drive sales and let more people afford, enter, and enjoy PC gaming.

It's a tricky balance between performance, efficiency, and cost, and I think they nailed it and accomplished their goals and vision.
 

Marlenus

Member
Alright, how does this look against the 4090? Or even the 4080?

Some guesstimates using TechSpot chart.

[TechSpot chart: 4K average FPS]


The NV presentation had the 4080 16GB about 20% ahead of the 3090Ti in raster.

AMD are claiming 1.54x perf/W, but that was vs a 6900XT with both cards running at 300W, which is sneakier than they usually are. Still, the reference 6950XT has similar perf/W to the 6900XT, so if we do 1.54x the 6950XT I think that will be pretty close.

For raster, that gives us the following approximate stack:

GPU | FPS Average | Price
4090 | 144 | $1,600
7900XTX | 131 | $999
7900XT | 115 | $899
4080 16GB | 110 | $1,200
4080 12GB / 4070 | 81 | $899 (maybe less as a 4070?)

In RT, though, it does not look so good for AMD. In the Techspot 4090 review, the average RT performance for the 4090 was 0.46x its raster performance. For the 6950XT it was 0.31x, and it looks like RT performance for RDNA3 is only going up as much as raster, so the drop-off should be about the same as RDNA2.

That gives the following

GPU | RT FPS Average | Price
4090 | 66 | $1,600
4080 16GB | 51 | $1,200
7900XTX | 41 | $999
4080 12GB / 4070 | 37 | $899 (maybe less as a 4070?)
7900XT | 36 | $899

So perf/$ for RT is about equal, give or take, but perf/$ for raster is advantage AMD. Ultimately I think it just depends on what you want your card to be good at.
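
For anyone who wants to poke at these guesstimates, here's a small Python sketch that re-derives both tables. The ~85 fps 6950 XT baseline is an assumption inferred from the 131 fps XTX figure above (not read off the TechSpot chart), and the RT/raster ratios are the Techspot-derived ones already cited.

```python
# Re-deriving the raster and RT estimate tables from the stated assumptions.
cards = {
    # name: (estimated 4K raster fps, price in USD)
    "4090":             (144, 1600),
    "7900XTX":          (85 * 1.54, 999),   # 6950 XT baseline x AMD's 1.54x claim ~= 131
    "7900XT":           (115, 899),
    "4080 16GB":        (110, 1200),
    "4080 12GB / 4070": (81, 899),
}
rt_ratio = {"NVIDIA": 0.46, "AMD": 0.31}  # avg RT fps as a fraction of raster fps

for name, (raster, price) in cards.items():
    vendor = "AMD" if name.startswith("7900") else "NVIDIA"
    rt = raster * rt_ratio[vendor]
    print(f"{name:>16}: raster {raster:5.0f} fps ({1000 * raster / price:5.1f} fps per $1000), "
          f"RT {rt:4.0f} fps ({1000 * rt / price:4.1f} fps per $1000)")
# -> raster fps per dollar clearly favors the 7900 cards, while RT fps per
#    dollar comes out roughly equal across the stack, matching the conclusion.
```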
 

Gaiff

SBI’s Resident Gaslighter

GOW has no built-in benchmark, but the 4090 maintains 120fps+ at 4K/Ultra according to computerbase and TestingGames. This favors NVIDIA, though it needs an asterisk because they could have tested totally different sections.

AC Valhalla according to HU and Guru3d manages 116fps in the built-in benchmark at 4K Ultra. That game heavily favors AMD.

DOOM Eternal on computerbase at max settings+RT manages 198fps but it could also be a CPU bottleneck.

COD Modern Warfare 2, which can favor AMD GPUs by up to 40%, averages exactly 139fps at Ultra/4K according to Hardware Unboxed.

Kitguru has RDR2 at 123fps. Tomshardware, 137fps.

Unless the title heavily favors AMD, it doesn't look like the 7900 XTX will compete in rasterization at all with the 4090.
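
To put numbers on how much the baseline matters, here's a quick sketch of the spread in the 4090 results cited in this thread; the figures are the ones quoted above, nothing new.

```python
# Spread in the "same" 4090 result across outlets, using only numbers cited in
# this thread; most of the gap is the CPU on the test bench, not the GPU.
results = {
    "RDR2 4K": {"Techpowerup (5800X)": 104, "Kitguru (12900K)": 123,
                "Tomshardware (12900K)": 137},
    "AC Valhalla 4K": {"Techpowerup (5800X)": 106,
                       "Hardware Unboxed (5800X3D)": 116},
}

for title, outlets in results.items():
    lo, hi = min(outlets.values()), max(outlets.values())
    print(f"{title}: {lo}-{hi} fps across outlets ({hi / lo - 1:.0%} spread)")
# -> RDR2 varies by ~32% depending on the bench; any "7900 XTX vs 4090" chart
#    is only as good as the baseline it scales from.
```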
 

Gaiff

SBI’s Resident Gaslighter
These stats indicate raster performance that is generally 5-10% behind the 4090, and RT performance that matches or exceeds the 3090 Ti.

AC Valhalla
4090 - 106 FPS

God Of War
4090 - 130 FPS

RDR2
4090 - 104 FPS

RE Village 4K RT
4090 - 175 fps
3090 Ti - 102 FPS

Doom Eternal 4K RT
3090 Ti - 128 FPS

Source: Techpowerup
Techpowerup are idiots and used a 5800X, non-3D. Kitguru and Tomshardware used a 12900K and got 123 and 137fps in RDR2. Hardware Unboxed with a 5800X3D got 116fps in Valhalla; so did Guru3d.
 

YCoCg

Member
What does all this have to do with RT support?
Nvidia and Intel use dedicated hardware units, such as RT cores, to calculate ray tracing. Last-gen AMD cards brute-forced it with raster power; you still got ray tracing, but it took a hit because it had to share resources with rasterization. These new cards have dedicated silicon for RT, which frees up the pure raster power and lets RT do its own thing; as they said, they've completely reworked their RT pipeline for these.
 

phinious

Member
Can anyone help me figure out if I could swap out my GTX 1080 for one of these new AMD cards on this prebuilt PC? Or what would I need to do?

Processor Intel Core i7-8700 3.20 GHz
Processor Main Features 64 bit 6-Core Processor
Cache Per Processor 12 MB L3 Cache
Memory 16 GB DDR4 + 16 GB Optane Memory
Storage 2 TB SATA III 7200 RPM HDD
Optical Drive 24x DVD+-R/+-RW DUAL LAYER DRIVE
Graphics NVIDIA GeForce GTX 1080 8 GB GDDR5X
Ethernet Gigabit Ethernet
Power Supply 600W 80+
Case CYBERPOWERPC W210 Black Windowed Gaming Case
 

Gaiff

SBI’s Resident Gaslighter
Can anyone help me figure out if I could swap out my GTX 1080 for one of these new AMD cards on this prebuilt PC? Or what would I need to do?

Processor Intel Core i7-8700 3.20 GHz
Processor Main Features 64 bit 6-Core Processor
Cache Per Processor 12 MB L3 Cache
Memory 16 GB DDR4 + 16 GB Optane Memory
Storage 2 TB SATA III 7200 RPM HDD
Optical Drive 24x DVD+-R/+-RW DUAL LAYER DRIVE
Graphics NVIDIA GeForce GTX 1080 8 GB GDDR5X
Ethernet Gigabit Ethernet
Power Supply 600W 80+
Case CYBERPOWERPC W210 Black Windowed Gaming Case
That CPU will hold back these GPUs a lot.
 

Buggy Loop

Member
Did not know that. Though is it true the CPU does not matter as much for 4K gaming?

Techpowerup got called on it (of course) and did a comparison with the 5800X3D.


Depending on the title, the gap can go from 0% to 33.7%, so it really depends on which game you land on. Even a 13900K can still bottleneck a 4090 at 4K. Nobody really knows how it will flex with upcoming CPUs; the 7000-series with 3D cache will probably be the ones to look for.
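
A rough mental model of why the uplift varies so much per title: delivered fps is just the minimum of what the CPU and GPU can each sustain. All the numbers in this sketch are hypothetical, purely for illustration.

```python
# Toy bottleneck model: delivered fps is capped by whichever of GPU or CPU is
# slower in a given title. Every number below is hypothetical.
def delivered_fps(gpu_fps: float, cpu_fps: float) -> float:
    return min(gpu_fps, cpu_fps)

GPU_CEILING = 160.0  # hypothetical: what a 4090 could render at 4K, uncapped
for cpu, cpu_ceiling in (("5800X", 120.0), ("5800X3D", 145.0), ("13900K", 170.0)):
    print(f"{cpu}: {delivered_fps(GPU_CEILING, cpu_ceiling):.0f} fps delivered")
# Only the fastest CPU lets this hypothetical title run GPU-bound; with the
# slower CPUs the 4090's extra headroom never shows up in the chart.
```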
 

CLW

Member
Sooooo, as a console gamer who has held off building his own gaming PC (still using a 1070 Alienware), is there any consensus on where these will fall performance-wise?

I'm assuming <4090 but >4080?
 