
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

Pretty good videos:




 

Celcius

°Temp. member
If 7900XT < RTX 4080 < 7900XTX, then that just shows how crazy the pricing of the 4080 is.
Ampere level ray tracing doesn't surprise me and I consider that level of ray tracing performance to be great overall (just not the best anymore).
 
I think video game developers need to utilize ray tracing better through software and hardware implementation. There are several ways to do ray tracing, and it doesn't need to be NVIDIA-driven via their RT cores (hardware). They can:

a) Utilize CPU cores
b) Create approximations of real-time ray tracing effects rather than a full-blown simulation, which tanks performance.
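As a toy illustration of (a), the innermost loop of any CPU-side ray tracer boils down to ray/primitive intersection tests like the one below (illustrative Python only, not code from AMD, NVIDIA, or any real engine):

```python
# Toy CPU ray tracing primitive: ray/sphere intersection.
# Everything here is a made-up example, not production code.
from dataclasses import dataclass
import math

@dataclass
class Ray:
    origin: tuple     # (x, y, z)
    direction: tuple  # normalized (x, y, z)

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def hit_sphere(center, radius, ray):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    oc = tuple(o - c for o, c in zip(ray.origin, center))
    a = dot(ray.direction, ray.direction)
    b = 2.0 * dot(oc, ray.direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest of the two intersections
    return t if t > 0 else None

# One ray fired down -Z at a unit sphere centred 5 units away -> hit at t = 4.0
print(hit_sphere((0, 0, -5), 1.0, Ray((0, 0, 0), (0, 0, -1))))
```

A full frame is just millions of these tests against the scene's geometry, which is why running it on CPU cores or approximating the result is attractive when dedicated RT hardware isn't available.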

I think ray tracing is kind of like the early stages of DLSS and FSR; it will be implemented better over time as developers find better, more optimized, more efficient solutions. I don't think gamers care if it's real-time or not, as long as it looks real and contributes to the visual flair of the game.
 

hlm666

Member
Some leaked Time Spy and Fire Strike scores. I expected the 7900 XTX to be better in raster vs the 4080. Reviews coming on the 12th.

 

So somehow, some way, AMD has made the RTX 4090 the best deal in performance per dollar.

As I said, Ampere was already too advanced for RDNA 2. RDNA 2's main advantage was only clock speed, which is gone now, and Nvidia now gets the real performance of its doubled FP32, which wasn't possible on the Samsung dies.
 

Xyphie

Member
Yeah, surprisingly bad performance from the 7900 series if that lines up with what we see next week. Time Spy has been pretty neutral in terms of favouring one vendor or the other, while Fire Strike has always had AMD overperforming on a relative basis.

If you actually look at the silicon that goes into the 7900XTX, only matching the 4080 would be a terrible showing. Of course, a GPU is only as good as it's priced, and the 4080 does terribly there.
 

twilo99

Gold Member
They must have final drivers by now? No way they haven’t shipped those to reviewers, etc. to test the cards with.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Damn, TimeSpy 1440p ain't lookin' good. 10% behind the 4080, hopefully actual gaming benchmarks are not that bad vs. the 4080 at 1440p. Was hoping this card would be good enough to force Nvidia to drop prices of 4080.
 
Last edited:

Crayon

Member
Oh shit if they come up short of the 4080 in actual games. Even if they do great in the midrange, losing raster to the 4080 would not cast the nicest glow on the lineup.
 

DonkeyPunchJr

World’s Biggest Weeb
Oh shit if they come up short of the 4080 in actual games. Even if they do great in the midrange, losing raster to the 4080 would not cast the nicest glow on the lineup.
Yeah this would be terrible news. I think it at least needs to beat 4080 by a decent margin in rasterized games to have a chance at success. Otherwise IMO anybody willing to spend $1000 on a GPU would just as soon spend the extra $200 on a 4080.

Wouldn’t that be a great irony, if the 7900 was what finally caused the 4080s to start selling.
 

hououinkyouma00

Gold Member
Why bother buying, or even offering, the XT? A $100 price difference when you're spending $900+ is not substantial.
I’m looking at the XT because I don’t want to have to buy another power supply. At that point it’s $200 or more over the XT.

Plus I use a Formd T1 and any extra space is nice.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I’m looking at the XT because I don’t want to have to buy another power supply. At that point it’s $200 or more over the XT.

Plus I use a Formd T1 and any extra space is nice.
The XT and XTX are both limited to 375W.
They both use 2x8pin.
Assuming reference.

What would your power supply manage to do with an XT that it couldn't do with an XTX?

If you are going reference then the size difference is absolutely negligible.
If you are going AIB then they will be using the exact same coolers.
And both now use 3x 8-pins.

So what's the real hold-up with the XT instead of the XTX?

[photos: Radeon 7900 XT/XTX reference cards]

And this is a 4090 FE in your case.
You can gauge how much more space you would have, assuming you are going reference.
[photo: an RTX 4090 FE installed in a Formd T1]
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
Damn, TimeSpy 1440p ain't lookin' good. 10% behind the 4080, hopefully actual gaming benchmarks are not that bad vs. the 4080 at 1440p. Was hoping this card would be good enough to force Nvidia to drop prices of 4080.
This is the eternal problem for AMD. Too many people just want them to drive prices down on Nvidia cards so that they can then buy Nvidia.
 

GreatnessRD

Member
I’m looking at the XT because I don’t want to have to buy another power supply. At that point it’s $200 or more over the XT.

Plus I use a Formd T1 and any extra space is nice.
If you're going to buy the 7900 series, please, don't waste money on the XT and buy the XTX.

With that said, Dec 12th can't come fast enough. These early benches and murmurs of hardware/software limitations on RDNA3 are making me sad, lol.
 

hououinkyouma00

Gold Member
The XT and XTX are both limited to 375W.
They both use 2x8pin.
Assuming reference.

What would your power supply manage to do with an XT that it couldn't do with an XTX?

If you are going reference then the size difference is absolutely negligible.
If you are going AIB then they will be using the exact same coolers.
And both now use 3x 8-pins.

So what's the real hold-up with the XT instead of the XTX?

[photos: Radeon 7900 XT/XTX reference cards]

And this is a 4090 FE in your case.
You can gauge how much more space you would have, assuming you are going reference.
[photo: an RTX 4090 FE installed in a Formd T1]
Damn I didn’t realize you could even get a 4090 in there. They must have moved the motherboard tray over.

I believe I saw the XTX suggests an 800W PSU, and I'm actually still on a 650W, so neither of them may work. I might just have to bite the bullet, I guess, and if I do, then the XTX is the obvious choice.

This is gonna be an expensive year for me because at some point I’m gonna need a new CPU/Motherboard. Luckily at 4K it’s less of a need and helps my 3600 last a bit longer.
 

Razvedka

Banned
I was kind of expecting this. AMD has stolen the performance-per-dollar crown away from Nvidia at every level that isn't the super-duper enthusiast tier (4090). We need to see more benchmarks, especially where ray tracing is concerned, and how real-world performance shakes out once you factor in FSR and DLSS 3.0...

But as of right now, if you had the option of getting the 7900XTX or a 4080, why would you get the 4080? And that question will probably apply at every performance bracket below it. Unless I'm missing something, this is fantastic news for consumers.
 
Last edited:
New numbers still look okay to me, so long as the XTX can beat the 4080 in real-world raster performance. Just matching the 4080 might not be good enough if Nvidia lowers 4080 pricing by $100 or so. If the XT can hang in there relatively close, it might have an easier go of it, because Nvidia probably wouldn't move pricing down that far. AMD has always been quicker to discount prices than Nvidia, though, so if the XTX and the 4080 are too samey in performance you might see AMD match an Nvidia price cut if it happens. The AMD part will need to be cheaper by enough to make up for the lower RT performance; it's a cakewalk for Nvidia if the prices are the same or within $100 (assuming similar raster performance).

Though it really depends on what the true pricing is. AMD has been closer to MSRP on average, so that gives them a boost as well.
 
Last edited:

Kataploom

Gold Member
The XT and XTX are both limited to 375W.
They both use 2x8pin.
Assuming reference.

What would your power supply manage to do with an XT that it couldn't do with an XTX?

If you are going reference then the size difference is absolutely negligible.
If you are going AIB then they will be using the exact same coolers.
And both now use 3x 8-pins.

So what's the real hold-up with the XT instead of the XTX?

[photos: Radeon 7900 XT/XTX reference cards]

And this is a 4090 FE in your case.
You can gauge how much more space you would have, assuming you are going reference.
[photo: an RTX 4090 FE installed in a Formd T1]
I already ditched the mini-ITX plans I had because of that... Now AMD is giving me hope... Dear AMD...

[gif: Avengers: Endgame]
 

//DEVIL//

Member
Yeah, surprisingly bad performance from the 7900 series if that lines up with what we see next week. Time Spy has been pretty neutral in terms of favouring one vendor or the other, while Fire Strike has always had AMD overperforming on a relative basis.

If you actually look at the silicon that goes into the 7900XTX, only matching the 4080 would be a terrible showing. Of course, a GPU is only as good as it's priced, and the 4080 does terribly there.
Something is wrong with these drivers for sure. Must be early drivers (go figure; AMD's shitty drivers are one of the main reasons I don't buy AMD).
The 7900 XT and XTX performance figures are about the same, which can't be right.
 

Buggy Loop

Member
BIOS issues?

Could be. Igor'sLAB is saying that AIBs are having problems and delays because of bugs. And although it's not unheard of to give AIBs drivers later than reviewers... if reviewers like the above are seeing problems with the review drivers/BIOS... that's gonna be a very bad first impression.

This guy keeps implying that it's hardware-related



If they somehow fumbled the hardware... ouch

But even if it's drivers/BIOS and it's not fixed by review time, that's the last thing AMD would want as a first impression. That's years of effort to change public opinion about their drivers, evaporating into thin air.
 

Crayon

Member
Interesting if it really is the benchmarks saturating the card and running up against the power limit. Interesting in the stupid way, that is. That seems like too big an oversight to be true.
 

twilo99

Gold Member
Not looking good, and it really didn't seem like they had to rush things out the door unless they were aiming for that Xmas crowd...
 

CrustyBritches

Gold Member
So somehow, some way, AMD has made the RTX 4090 the best deal in performance per dollar.
Wouldn't this indicate that the 7900 series has better perf/dollar?
TimeSpy 4K:
4090-> 19398/$1600 = 12.12 points per dollar
4080-> 14005/$1200 = 11.67 points per dollar
7900XTX-> 13729/$1000 = 13.73 points per dollar
7900XT-> 13687/$900 = 15.21 points per dollar

I was surprised when looking at Cost Per Frame and Perf Per Dollar on TPU and TechSpot that for all the shit the 4080 gets, it actually has better perf/dollar than the 4090.
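For what it's worth, the numbers above check out; here's a minimal sketch in Python that just recomputes them from the leaked Time Spy 4K scores and US MSRPs quoted in this post (nothing new, purely illustrative):

```python
# Points-per-dollar from the leaked Time Spy 4K scores and MSRPs quoted above.
cards = {
    "RTX 4090": (19398, 1600),
    "RTX 4080": (14005, 1200),
    "7900 XTX": (13729, 1000),
    "7900 XT":  (13687,  900),
}

for name, (score, price) in cards.items():
    print(f"{name}: {score / price:.2f} points per dollar")
# -> 4090 ≈ 12.12, 4080 ≈ 11.67, 7900 XTX ≈ 13.73, 7900 XT ≈ 15.21
```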
 
Last edited:

hlm666

Member
Do they have optimal drivers? I'd be surprised if that was the case...
The article says reviewers are now working on the reviews for the 12th. So if the drivers are broken, it's going to be a repeat of Intel Arc, and first impressions matter; no one is talking about the fact that the Arc GPU driver has improved for DX9.

"This is based on synthetic data collected from and confirmed it with multiple reviewers."

Something is wrong with these drivers for sure. Must be early drivers (go figure; AMD's shitty drivers are one of the main reasons I don't buy AMD).
The 7900 XT and XTX performance figures are about the same, which can't be right.
The gap between the two being so small could be because the XTX is being power-starved. It's mind-boggling that AMD could end up making the case for the 12-pin power connector by kneecapping their own halo GPU with not enough power. I guess the 3-connector AIB partner cards will prove this right or wrong, though, and possibly save AMD from themselves if power is the issue.
 
If they can get 10% better raster performance than the 4080, then mission accomplished. Besides driver updates, I'm curious to see how the performance looks when you combine:

- Overclock to 3 GHz or higher
- Pair with AMD Zen 4 and SmartShift
- FSR 3.0

Might that make it 20% better than the 4080?
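As a back-of-the-envelope check only: if those gains were independent, they would compound multiplicatively. Both percentages below are hypothetical, not measured.

```python
# Hypothetical compounding of independent gains (illustrative numbers only).
baseline_vs_4080 = 1.10   # assumed 10% raster lead at stock
overclock_gain   = 1.09   # assumed extra ~9% from a ~3 GHz overclock

combined = baseline_vs_4080 * overclock_gain
print(f"Combined lead vs 4080: {(combined - 1) * 100:.1f}%")  # ~19.9%
```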
 

Crayon

Member
If they can get 10% better raster performance than the 4080, then mission accomplished. Besides driver updates, I'm curious to see how the performance looks when you combine:

- Overclock to 3 GHz or higher
- Pair with AMD Zen 4 and SmartShift
- FSR 3.0

Might that make it 20% better than the 4080?

I don't think that three gigahertz thing was necessarily for Navi 31. I think that might have been the RDNA 3 architecture in general. My 6600 XT runs 2750 MHz all day. 3 GHz isn't that far off if it's just for the smaller chips.
 

//DEVIL//

Member
The article says reviewers are now working on the reviews for the 12th. So if the drivers are broken, it's going to be a repeat of Intel Arc, and first impressions matter; no one is talking about the fact that the Arc GPU driver has improved for DX9.

"This is based on synthetic data collected from and confirmed it with multiple reviewers."


The gap between the two being so small could be because the XTX is being power-starved. It's mind-boggling that AMD could end up making the case for the 12-pin power connector by kneecapping their own halo GPU with not enough power. I guess the 3-connector AIB partner cards will prove this right or wrong, though, and possibly save AMD from themselves if power is the issue.
Not gonna happen. The AIB cards won't have much more power headroom than the reference card. AMD always locks this shit down.
 

Kataploom

Gold Member
I don't know if this has been posted, and it's not exactly a reliable source, but here's a benchmark comparing the 7900 XTX to the 4090:

 

Rickyiez

Member
I don't know if this has been posted, and it's not exactly a reliable source, but here's a benchmark comparing the 7900 XTX to the 4090:


There are a lot of fake comparison videos like this; please don't fall for them. Phone camera comparison videos are another type of the same thing.

The rule of thumb for spotting the legit ones is whether the reviewer showcases the hardware in person first.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A total set?
People are assuming AMD SmartShift and AMD SmartAccess are gonna boost the performance of full AMD desktop builds.

They won't.

There are a lot of fake comparison videos like this; please don't fall for them. Phone camera comparison videos are another type of the same thing.

The rule of thumb for spotting the legit ones is whether the reviewer showcases the hardware in person first.
You can also just look at whether they show off the GPU/CPU clock rates and other stats.
These fake comparisons always hide the stats that would give away that it's an underclocked CPU/GPU.
 

GymWolf

Gold Member
People are assuming AMD SmartShift and AMD SmartAccess are gonna boost the performance of full AMD desktop builds.

They won't.


You can also just look at whether they show off the GPU/CPU clock rates and other stats.
These fake comparisons always hide the stats that would give away that it's an underclocked CPU/GPU.
You mean like having the CPU and GPU both from AMD?

Shit, I didn't know anything about this. So in your opinion, even if I get an AMD GPU, I should be OK with an Intel CPU?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You mean like having the CPU and GPU both from AMD?

Shit, I didn't know anything about this. So in your opinion, even if I get an AMD GPU, I should be OK with an Intel CPU?
SmartShift sends power to the device that needs it the most.
E.g. if your GPU needs more power, some of the CPU's power budget will be shifted to the GPU, and vice versa.

Or, and hear me out because this is gonna be a crazy thought:
set your PC to high performance when you need high performance, so no power needs to be shifted and all your components can be fed what they need.

SmartAccess is an analog of ReBAR.



P.S. I don't think AMD SmartShift even works with discrete GPUs.
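For anyone curious what the SmartShift idea amounts to, here's a toy model of a shared CPU+GPU power budget being reallocated toward whichever device is more loaded. Purely illustrative: the wattages, floors, and proportional split below are made up and are not AMD's actual algorithm.

```python
# Toy model of a SmartShift-style shared power budget (illustrative only).
TOTAL_BUDGET_W = 120           # hypothetical shared CPU+GPU budget
CPU_MIN_W, GPU_MIN_W = 15, 30  # hypothetical per-device floors

def split_budget(cpu_load: float, gpu_load: float) -> tuple:
    """Split the spare budget in proportion to load, respecting each floor."""
    spare = TOTAL_BUDGET_W - CPU_MIN_W - GPU_MIN_W
    total_load = (cpu_load + gpu_load) or 1.0   # avoid divide-by-zero at idle
    cpu_w = CPU_MIN_W + spare * (cpu_load / total_load)
    gpu_w = GPU_MIN_W + spare * (gpu_load / total_load)
    return cpu_w, gpu_w

# GPU-bound scene: most of the spare budget shifts to the GPU.
print(split_budget(cpu_load=0.2, gpu_load=0.9))  # -> (~28.6 W, ~91.4 W)
```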
 
Last edited:

OZ9000

Banned
Preliminary benchmarks look shit.
Wonder what actual gaming performance is like.

Might skip a GPU this gen and wait 1-2 years if the XTX fails to consistently beat the 4080 tbh.
 

Irobot82

Member
I don't know if this has been posted, and it's not exactly a reliable source, but here's a benchmark comparing the 7900 XTX to the 4090:


Probably fake, but damn, did you look at the 4090's frametime graph during GOW? Yikes. Also Horizon. Stutter city.
 
Last edited: