
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

SLB1904

Banned
According to the Gamers Nexus video, AMD did some tricky math on the 8K claim; it's misleading because it wasn't true 8K but an ultrawide 2160p resolution. The same thing Nvidia did with the 3090, and they got roasted for it.
The more these companies push for 8K, the better. I see it as a win even if it is bullshit.
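For reference, the pixel math behind the complaint, assuming the "ultrawide 8K" in question is 7680×2160:

$$7680 \times 4320 = 33{,}177{,}600 \text{ px (true 8K)}, \qquad 7680 \times 2160 = 16{,}588{,}800 \text{ px}$$

The marketing "8K" has exactly half the pixels of the real thing.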
 

SlimySnake

Flashless at the Golden Globes
These stats indicate raster performance that is generally 5-10% behind the 4090, and RT performance that matches or exceeds the 3090 Ti.

RE Village 4K RT
4090 - 175 fps
3090 Ti - 102 FPS

Doom Eternal 4K RT
3090 Ti - 128 FPS

Source: Techpowerup


Interesting that Village is so much better than the 3090 Ti whereas Doom is more or less the same. Still, a 4090 is 41% faster than a 3090 Ti overall, so AMD's RT being 41% behind in 2022, FOUR fucking years after RTX launched, is just unacceptable. They matched Nvidia's rasterization performance with the 5000 series, exceeded their perf-per-watt with the 6000 series, and now, two years later, what? The same poor RT performance?
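For the percentage math, taking the numbers above at face value:

$$\frac{175}{102} \approx 1.72 \;\Rightarrow\; \text{the 4090 is about } 72\% \text{ faster than the 3090 Ti in Village RT}$$

and note that "X is 41% faster than Y" is not the same as "Y is 41% behind X": if the 4090 is 1.41x a 3090 Ti, the 3090 Ti is $1 - 1/1.41 \approx 29\%$ behind it.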

What's the point of this chiplet design if you can't chiplet some fucking RT cores on there? EDIT: it seems they now do, but the performance is still lacking? RT is going to be everywhere next gen, or rather whenever this gen finally fucking starts, so these 2022 cards acting like 2020 Nvidia cards is simply disappointing.

Also, this does not bode well for their midrange $500 and $700 7700 and 7800 cards. Are they going to perform like a 3070 and 3080? If so, what the fuck is the point?
 

Buggy Loop

Member
GOW has no built-in benchmark, but the 4090 maintains 120+ fps at 4K/Ultra according to ComputerBase and TestingGames. This favors NVIDIA. It needs an asterisk because they could have tested totally different sections.

AC Valhalla according to HU and Guru3d manages 116fps in the built-in benchmark at 4K Ultra. That game heavily favors AMD.

DOOM Eternal on ComputerBase at max settings + RT manages 198fps, but that could also be a CPU bottleneck.

COD Modern Warfare 2, which can favor AMD GPUs by up to 40%, averages 139fps at Ultra/4K according to Hardware Unboxed.

Kitguru has RDR2 at 123fps. Tomshardware, 137fps.

Unless the title heavily favors AMD, it doesn't look like the 7900 XTX will compete in rasterization at all with the 4090.

It'll have some clear outliers with 1.7x performance and some more in the range of 1.4x, etc. In the end, I'm betting that at review time these will compare to the 4080 in rasterization, but with 3090-level RT (and probably worse in crazy RTGI games coming up, like Cyberpunk 2077 Overdrive), for $200 less than the 4080, assuming Nvidia doesn't budge on price or pull a rabbit out of its hat.

What's shocking, as Alex (Dictator) from DF says, is that the RT uplift is so bad here. It seems it scaled linearly from RDNA 2, with no huge improvements like the "leakers" were suggesting. Like he says, we're at the dawn of waves of games with full RT, without any way to toggle it off, and RT Remix sprinkled on top of that... well, I have to ask the question: what's the point of having so much rasterization performance if you're not a pro gamer playing at 1080p with low settings? This range of card basically scoffs at any pure-rasterization game thrown at it; it's wasted power left on the table, while RT NEEDS all the performance gain it can get.
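To spell out what "scaled linearly" means here: if the RT uplift over the 6950 XT is the same factor as the raster uplift, the RT-to-raster ratio is unchanged,

$$\frac{\mathrm{RT}_{7900\,\mathrm{XTX}}}{\mathrm{Raster}_{7900\,\mathrm{XTX}}} \approx \frac{\mathrm{RT}_{6950\,\mathrm{XT}}}{\mathrm{Raster}_{6950\,\mathrm{XT}}}$$

i.e. RT got faster only because the whole chip got faster, not because the RT hardware improved relative to it.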

Not sure I understand the proposition here. Benchmarks will be a cold shower. It'll make the 4080 look appealing in the end. Surprised Pikachu right there, as everyone was down on it.
 

CuNi

Member
I just recently got a 3080, so I'm not upgrading anytime soon, but if that trend continues, my next upgrade might even end up being AMD.
Already ditched Intel for AMD on the CPU side, and if this trend continues on GPUs too, I'm all in for the ride!
I'd even jump ship earlier if they brought in either a compatibility layer or something equal to CUDA, as I have some programs that need it to not run at a snail's pace.
If they released a compatibility layer that ran no more than 10% slower, I'd switch next gen in an instant.
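For context, a minimal sketch (hypothetical, not from any specific app) of the kind of CUDA code such programs depend on: a toy SAXPY written against the CUDA runtime API. AMD's existing answer, HIP (part of ROCm), mirrors this API nearly 1:1 and its hipify tools can rewrite such code mechanically; the historical pain point has been ROCm's limited consumer-card and Windows support rather than syntax.

```cuda
// Toy SAXPY: representative of code written against the CUDA runtime API.
// HIP equivalents are near-identical: cudaMallocManaged -> hipMallocManaged,
// cudaDeviceSynchronize -> hipDeviceSynchronize, the <<<...>>> launch stays.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory, host + device
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // 256 threads per block
    cudaDeviceSynchronize();

    printf("y[0] = %.1f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```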
 

Buggy Loop

Member
I'd even jump ship earlier if they brought in either a compatibility layer or something equal to CUDA, as I have some programs that need it to not run at a snail's pace.
If they released a compatibility layer that ran no more than 10% slower, I'd switch next gen in an instant.

They've only had... *checks release date*, oh, 15 years to provide a competent CUDA alternative. Should be soon.
 

rodrigolfp

Haptic Gamepads 4 Life
Nvidia and Intel use dedicated units, RT cores, to calculate ray tracing. Last-gen AMD cards brute-forced it with raster power: you still got ray tracing, but it took a hit because it had to share resources with rasterization. These new cards have dedicated chiplets for things, which frees up the pure raster power and allows the RT to do its own thing, as they said they've completely reworked their RT pipeline for these.
So, they now have dedicated RT cores like Nvidia?
 

GHG

Gold Member
If I've learned anything about reddit, it's that people are hive-minded, delusional and generally have no fucking clue what they're talking about because (in this example) they construct charts based on press conference claims with absolutely no real world data. Then it gets passed around to the rest of the apes who believe it because they're still running 10 year old GPUs and have no basis for comparison.

EDIT: Now it's happening in this very thread. Wait for proper benchmarks.

I don't understand what there is to dispute. Of course people will wait for benchmarks, but in the meantime they're trying to decipher the marketing and figure out what kind of ballpark performance figures we should expect. The same thing happened after Nvidia's conference, and it even resulted in them cancelling what would have been a disastrous product once people figured it out.

No harm in it, relax.
 

nikos

Member
I don't understand what there is to dispute. Of course people will wait for benchmarks, but in the meantime they're trying to decipher the marketing and figure out what kind of ballpark performance figures we should expect. The same thing happened after Nvidia's conference, and it even resulted in them cancelling what would have been a disastrous product once people figured it out.

No harm in it, relax.

You're right.

/r/nvidia has driven me mad in the last week or so.

Nvidiots are pretty salty right now, myself included. Just got me two 4090s.

I also got a 4090 and I think we're getting the extra performance we paid for.
 

Amiga

Member
AMD seem to be targeting:

- E-Sport players
- Those who want an easy upgrade within their current rig
- 8K enthusiasts
- Those who have a lower power budget

Combine that with the price, and the possibility that the chiplet design will help put more cards on the market, and AMD should be set to increase their market share of active PC players.
 

Buggy Loop

Member
you realize the xtx is cheaper than a 4080

Who are you talking to?

As benchmarks will indicate, probably as it should be.

There will probably be more MSRP FE 4080s than the whole XTX lineup including expensive AIBs.

I tried scoring an RDNA 2 card on amd.com, btw. It was a unicorn. RDNA 2 MSRP was pure bullshit, to the point where AMD even wanted to cancel their reference cards until they got called on it. Watch AMD AIBs go fucking insane with prices because the card is called Red Dragon or some bullshit. They make Nvidia AIBs look tame in comparison.
Take Asus, who makes TUF cards for both: for the same cooler, the 6800 XT was $970 and the 3080 was $730, when the MSRP of the AMD chip is $50 lower. That was a +$290 delta from where it should have been! INSANE.
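The arithmetic behind that delta, spelled out: at MSRP parity the TUF 6800 XT should have landed $50 under the TUF 3080, i.e. around $680, so

$$\$970 - (\$730 - \$50) = \$290.$$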
 

MikeM

Gold Member
All I care about is raster per dollar. Seems the XTX is the overall winner. Waiting for benchmarks to confirm.
 

Neo_game

Member
Price and rasterization were somewhat expected; in fact, $999 is a bit of a surprise, as they have undercut the 4080. This is good for consumers. I think next year there will be really good midrange cards, which is what most people want anyway.
 
AMD is firing the first warning shots with these prices on the high end. The rest of the lineup is going to prove difficult for Nvidia to position against. Nice to see them not pushing up to ridiculous new highs.
 
What I really want is to see how this AV1 encoder compares to NVENC... I use it a LOT for online streaming, game streaming to my living room, and also to my Quest 2 for PCVR.

I'm glad that AMD is investing in software as well as performance, because that is what was stopping me from buying their cards.
 

lmimmfn

Member
This is impressive from AMD: it should be a raster beast and merely competent in ray tracing, but the price and power consumption are a win for AMD.
 

Rickyiez

Member
Good to see them making the 4080 and its MSRP look like trash, raster-performance-wise.
 

SlimySnake

Flashless at the Golden Globes
Saw this in the JayzTwoCents video. Didn't realize they had a 50% IPC gain for ray tracing performance. The RE8 results don't really indicate that level of performance uplift. In the castle it runs at around 110-120 fps; outdoors, 80 fps. Are they using the outdoor comparisons or the indoor ones?

Definitely need to see what's going on here, but I'm hopeful again. Especially if both Sony and MS use the RDNA3 design for their mid-gen refresh.

 

Buggy Loop

Member
What I really want is to see how this AV1 encoder compares to NVENC... I use it a LOT for online streaming, game streaming to my living room, and also to my Quest 2 for PCVR.

I'm glad that AMD is investing in software as well as performance, because that is what was stopping me from buying their cards.

Wouldn't the Quest 2 have to decode AV1? It can't. Maybe a Quest 3 feature.
 
Wouldn't the Quest 2 have to decode AV1? It can't. Maybe a Quest 3 feature.
No.

A lot of applications can decode what NVENC puts out, since the stream is standard H.264/HEVC. When you use Steam Link or Remote Play Together, you have the option to use NVENC, and it doesn't matter what hardware the receiving client is using.

Virtual Desktop can also use NVENC (I use it with that), and Parsec can use NVENC too; it's hardware agnostic as well.
 

jaysius

Banned
AMD seem to be targeting:

- E-Sport players
- Those who want an easy upgrade within their current rig
- 8K enthusiasts
- Those who have a lower power budget

Combine that with the price, and the possibility that the chiplet design will help put more cards on the market, and AMD should be set to increase their market share of active PC players.
Letting Nvidia take the "luxury" brand people and taking everyone else if you can isn't a bad idea. If you can't compete 1:1, acknowledge the coming world recession and bet on that.
 

phinious

Member
First thing you need to do is get a damn SSD in there, they're cheap af these days.
I do have an SSD in it, I just copied/pasted the specs from my original e-mail ;P
That CPU will hold back these GPUs a lot.
Well, I can't afford everything at the moment. Does this mean I should save up for a whole new PC before getting any upgrades? Sorry for my ignorance, just looking for some legit advice.
 

SlimySnake

Flashless at the Golden Globes
Can anyone explain to me why they went from 7nm to 5nm and REDUCED the clock speeds to 2.3 GHz? The 6900 XT used to regularly hit 2.5 GHz.

Especially when Nvidia went from 2.0 GHz to 3.0 GHz in going from 8nm to 4nm. I was expecting clocks to hit 3.0 GHz and 100 TFLOPS. 61 is impressive, but way below what the rumors were indicating. The total board power is also very conservative. If Nvidia is willing to go to 450-550 watts, why is AMD locking itself to slower clocks and just 350 watts?

I really wonder what happened here. Poor performance at higher clocks? A logic failure? Poor thermals? Even at 2.5 GHz they could've beaten the 4090.
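For reference, the TFLOPS arithmetic, assuming the 61 TFLOPS figure comes from the 7900 XTX's 6144 dual-issue stream processors at the ~2.5 GHz boost clock (the 2.3 GHz figure is the game clock):

$$2\,(\text{FMA}) \times 2\,(\text{dual issue}) \times 6144 \times 2.5\,\text{GHz} \approx 61.4\ \text{TFLOPS}$$

Even at 3.0 GHz the same configuration only reaches ~73.7 TFLOPS, so 100 TFLOPS would have required a wider chip, not just higher clocks.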
 

SmokedMeat

Gamer™
I’m interested in seeing how the 7700XT turns out.

Excited to see some benchmarks, but I want to upgrade my CPU/Mobo/Ram before looking at GPUs.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Saw this in the JayzTwoCents video. Didn't realize they had a 50% IPC gain for ray tracing performance. The RE8 results don't really indicate that level of performance uplift. In the castle it runs at around 110-120 fps; outdoors, 80 fps. Are they using the outdoor comparisons or the indoor ones?

Definitely need to see what's going on here, but I'm hopeful again. Especially if both Sony and MS use the RDNA3 design for their mid-gen refresh.
Don't get your hopes up for the RT. Seems like AMD is falling way short of Nvidia yet again in that area.
 

hlm666

Member
Sigh, guess I'm waiting for reviews, seeing as AMD pulled all sorts of shit here that Nvidia and Intel would be proud of. Like using a 5900X CPU for the 6950 XT numbers but a 7900X for the 7900 XTX.

Then they used their frame generation tech in the RT slide, claiming up to 2x performance, but hid the fact that the 7900 was using frame gen while the 6950 was only using FSR 2.


Especially when Nvidia went from 2.0 GHz to 3.0 GHz in going from 8nm to 4nm. I was expecting clocks to hit 3.0 GHz and 100 TFLOPS. 61 is impressive, but way below what the rumors were indicating. The total board power is also very conservative. If Nvidia is willing to go to 450-550 watts, why is AMD locking itself to slower clocks and just 350 watts?
Nvidia went from Samsung's shit 8nm to TSMC's good 5nm; they got way more out of the process move than AMD because AMD was already on TSMC.
 

LiquidMetal14

hide your water-based mammals
Without getting ahead of myself, I do like what they're saying in terms of their RT performance. Nvidia has been ahead and is staying ahead, but AMD needs to make all the strides it can. And I hope both DLSS 3.0 and FSR 3 continue to make games even more fluid.

I will repeat what I said somewhere else that I can't remember because I'm old: A Plague Tale: Requiem gets me about triple the fps with DLSS frame generation, and I don't notice a damn thing in the way of negative effects while observing the high fps in that game.

If this is how things are early on then we're going to be in a great place as these things continue to iterate.

Sigh, guess I'm waiting for reviews, seeing as AMD pulled all sorts of shit here that Nvidia and Intel would be proud of. Like using a 5900X CPU for the 6950 XT numbers but a 7900X for the 7900 XTX.

Then they used their frame generation tech in the RT slide, claiming up to 2x performance, but hid the fact that the 7900 was using frame gen while the 6950 was only using FSR 2.

Nvidia went from Samsung's shit 8nm to TSMC's good 5nm; they got way more out of the process move than AMD because AMD was already on TSMC.

That's what I was saying earlier. Some gymnastics afoot with the numbers. The product looks good and it's not necessary to mislead.
 

sendit

Member
Can anyone help me figure out if I could swap out my GTX 1080 for one of these new AMD cards in this prebuilt PC? Or what would I need to do?

Processor: Intel Core i7-8700 3.20 GHz
Processor Main Features: 64-bit 6-Core Processor
Cache Per Processor: 12 MB L3 Cache
Memory: 16 GB DDR4 + 16 GB Optane Memory
Storage: 2 TB SATA III 7200 RPM HDD
Optical Drive: 24x DVD+-R/+-RW Dual Layer Drive
Graphics: NVIDIA GeForce GTX 1080 8 GB GDDR5X
Ethernet: Gigabit Ethernet
Power Supply: 600W 80+
Case: CYBERPOWERPC W210 Black Windowed Gaming Case
Forget upgrading your GPU. Why don't you have a Blu-ray drive in this beast?
 

Foilz

Banned
Currently in the planning stages of upgrading my rig, which has a 3700X/AMD Vega 64 GPU. I was going to go with a 5800X3D and a 6800 XT or 6900 XT depending on prices. A new 7800 XT would be even better if it's priced at around $600.
The current 6800 XT/6900 XT should see a decent price drop.
 
I mean sure, the RTX 4090 is still better, but considering the price difference, I think it's crazy to invest in an RTX 4090, because it offers more power than most gamers need right now. Better to save some money now and upgrade your card down the line with what you've saved.
 

Hoddi

Member
Color me impressed. I had no intention of upgrading my 2080 Ti after the 4090 launch but these numbers seem far more palatable.

I also found the comment about decoupled clock rates, and how modern games are front-end bound, interesting. I've noticed that many games have hit a performance wall in recent years even at ultra-low resolutions like 640x360, and it hasn't always been because of the CPU.
 

Bitmap Frogs

Mr. Community
I don't understand why AMD is going back to parallel-issue units when they publicly identified that as one of the core weaknesses of GCN.

The performance uplift percentages do not match the increase in hardware processing units, which means the system is effectively running at less than 100% utilization. That is exactly the problem with dual-instruction parallelism.
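A rough way to quantify that, taking the paper FP32 numbers (roughly 61 vs 23.7 TFLOPS, assuming the doubled per-CU issue rate plus the extra CUs and clocks) against an assumed ~1.6x average uplift:

$$\text{effective use of the paper gain} \approx \frac{1.6}{61 / 23.7} \approx \frac{1.6}{2.57} \approx 62\%$$

The rest is dual-issue slots the compiler and scheduler can't always fill, which was exactly the GCN-era complaint.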
 

hinch7

Member
Currently in the planning stages of upgrading my rig, which has a 3700X/AMD Vega 64 GPU. I was going to go with a 5800X3D and a 6800 XT or 6900 XT depending on prices. A new 7800 XT would be even better if it's priced at around $600.
The current 6800 XT/6900 XT should see a decent price drop.
I'd be down for that. If it offers 3080/Ti-like RT performance and raster approaching the 4080 (don't expect it to beat it here), all while staying under 280W at $649, that should be an enticing buy for a lot of people.

Hopefully it'll come with at least 16 GB of VRAM as well, considering they are offering 24 GB and 20 GB with the 7900 cards at $1,000 and $900 respectively.
 
Relax. It’s ok if AMD releases a product people want. The 4090 is still a very good card and it’s okay if you want one.
I'm all relaxed. It's just funny how everyone has bashed Nvidia since forever (because of what?), and then another company puts out false advertising and people are okay with that. :messenger_grinning_sweat:
Nvidia is using more advanced nodes for Ada Lovelace - TSMC 4N vs TSMC N5 + N6 on RDNA 3

Cost-wise, these cards will be significantly cheaper to produce, and some of that saving is passed on to us, the customers.
I know. I think RDNA3 is comparable only to the 8nm RTX 3000 series. And we will see it shortly.
 