> RX 7900 XTX has ~61 TFLOPS FP32 compute while RTX 4090 has ~82.6 TFLOPS FP32 compute. Modulate your expectations.

And considering everything that comes along with a 4090, if that XTX does well in real-world benchmarks, it looks like I can finally get off Nvidia for a while.
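Those headline TFLOPS figures fall straight out of shader count times clock. A minimal sketch, assuming the commonly listed specs (16384 CUDA cores at ~2.52 GHz boost for the 4090; 6144 dual-issue FP32 SPs, i.e. 12288 ALUs, at ~2.5 GHz for the 7900 XTX — these counts and clocks are assumptions from public spec sheets, not from this thread):

```python
def fp32_tflops(alus: int, boost_ghz: float, ops_per_clock: int = 2) -> float:
    """Theoretical peak FP32 throughput in TFLOPS (2 ops/clock from FMA)."""
    return alus * ops_per_clock * boost_ghz / 1000.0

# RTX 4090: 16384 CUDA cores at ~2.52 GHz boost (assumed spec)
rtx_4090 = fp32_tflops(16384, 2.52)    # ~82.6 TFLOPS

# RX 7900 XTX: 6144 SPs with dual-issue FP32 -> 12288 ALUs at ~2.5 GHz (assumed spec)
rx_7900_xtx = fp32_tflops(12288, 2.5)  # ~61.4 TFLOPS

print(f"RTX 4090:    {rtx_4090:.1f} TFLOPS")
print(f"RX 7900 XTX: {rx_7900_xtx:.1f} TFLOPS")
```

Peak TFLOPS is an upper bound, not delivered game performance, which is why real-world benchmarks matter.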
Why are people comparing a 7900XTX to a 4090?
AMD aren't.
There won't be many major games releasing with RT support as a requirement in the coming years because …..
2. you need to purchase a 1200/1600 dollar card to even use it
> Flagship GPU vs Flagship GPU I guess. Along with AMD stating theirs is the most powerful.

I think they said most powerful under $1000, and I am OK with that.
I doubt that AMD cards will be CPU-limited at 4K in ray tracing, and it looks like the same CPU was used for the rasterization benchmarks. But I agree we should wait for the reviews.
Flagship GPU vs Flagship GPU I guess. Along with AMD stating theirs is the most powerful.
> You need a 1200/1600 card just to use ray tracing?

OK, you caught my bluff.
Come on man, you aren't doing the Red Team any favors by going this far overboard.
They didn't state anything of the sort. In fact they avoided referencing Nvidia at all when it came to performance claims.

They stated "the world's most advanced gaming GPU, powering your gaming rigs for years to come" (at 44:15). So, they did state something of the sort, actually.

Why are people upset comparing two flagship cards?
"Advanced" != "powerful". You originally said powerful.

Sorry if my memory of a YouTube video from 17 hours ago isn't perfect.

"We've created the world's most advanced gaming GPU - but that doesn't mean it's powerful or can run games as well as other GPUs." You're really stretching this, you know.

I mean, technically speaking the XTX is more advanced; AMD just didn't push it to the edge of catching fire to compete with the 4090.
No I'm not. It has features that are yet to appear on other vendors' graphics cards. They are allowed to advertise that.

Yeah, yeah. They use a new DisplayPort connector, so now it's the most advanced GPU. Sure.

Don't you have burning power cables to go and defend, anyway?

You mean that other thread where it turns out I was correct?
As already mentioned, it seems you want to turn this into an us-vs-them argument when it isn't. Nothing is stopping me from putting the AMD card into my 2nd PC.
> I think they said most powerful under $1000 and I am ok with that

Yup, the price is what makes this a good product. More specifically, the fact that they kept their flagship at $1000 while Nvidia is launching the 4080 at a freaking $1200.
> If there's nothing stopping you then there's no need for you to get upset about any of this.

I'm not getting upset, you are. I replied to a question. You called me out as a liar. I pointed you to what I meant, and you decided to turn it into a wordplay argument and bring shit up from other threads I took part in.
> I updated my post with a graphic demonstrating the behavior I'm talking about:

It has nothing to do with AMD optimizations. Cyberpunk is a heavier RT workload than Metro Exodus, and the greater the RT workload, the wider the gap, because in pure RT benchmarks Ampere is twice as fast as RDNA 2.
Just to lean in here with an "actually"...
LOL, it has everything to do with optimizing around the RDNA2 consoles. That's why you see this behavior in UE5 as well.
I was referring to @Mister Wolf regarding future games requiring RT hardware to run properly (a position I was agreeing with). I was just pointing out that because the minimum baseline for these games will be the consoles, any RT or GI required for basic operation will be performant on AMD hardware, as you can see from the example I listed. RTGI is going to be the standard for some upcoming games, but those games are not going to be unplayable on AMD hardware; on the contrary, they will be built first for AMD hardware (the consoles).
Pixel and texture rates are raster graphics.

Pixel Rate                  443.5 GPixel/s    479.6 GPixel/s*
Texture Rate                1,290 GTexel/s    1,395.2 GTexel/s*
FP16 (half) performance     82.58 TFLOPS      83.07 TFLOPS (1:1)*
FP32 (float) performance    82.58 TFLOPS      83.07 TFLOPS*
FP64 (double) performance   1,290 GFLOPS      1,298 GFLOPS (1:64)*

*Gigabyte Gaming OC 4090
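For what it's worth, the reference-card numbers in that list are simple unit math. A sketch assuming the commonly listed 176 ROPs, 512 TMUs, and ~2.52 GHz boost for the reference 4090 (those unit counts are assumptions from public spec listings, not stated in the thread):

```python
# Derive the raster-rate and FP64 figures for the reference RTX 4090.
boost_ghz = 2.52      # assumed boost clock
rops, tmus = 176, 512 # assumed ROP/TMU counts

pixel_rate = rops * boost_ghz    # GPixel/s: each ROP writes one pixel per clock
texture_rate = tmus * boost_ghz  # GTexel/s: each TMU samples one texel per clock

fp32_tflops = 82.58
fp64_gflops = fp32_tflops / 64 * 1000  # FP64 runs at a 1:64 ratio of FP32

print(f"Pixel rate:   {pixel_rate:.1f} GPixel/s")    # ~443.5
print(f"Texture rate: {texture_rate:.1f} GTexel/s")  # ~1290.2
print(f"FP64:         {fp64_gflops:.0f} GFLOPS")     # ~1290
```

The same arithmetic with a slightly higher boost clock produces the asterisked factory-OC column.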
All you've done since the start of this thread is talk about how the 4090 is more powerful. Everyone knows this, it's just that people who are looking to get a 7900 over a 4090 don't care, they are in 2 different price categories.
No, it has fuck-all to do with what you're claiming, and I have no idea where you even get that information. The game is simply a much lighter workload than Cyberpunk 2077, which runs at a whopping 11fps at Ultra on a 3090.

Your claim that the game is better optimized for AMD hardware has no basis. It runs well on AMD because the RT effects simply aren't as heavy as Cyberpunk's. The moment you crank them up more, the gap widens. That's like saying a game is better optimized on weaker hardware because it runs at lower settings.
So basically RDNA2 all over again.
Can't compete with nvidia's top, sits around x80 level, but is a gen behind with RT.
Price is only good because Nvidia is asking way, way too much. It's easily solvable by Nvidia just dropping a 4080 Ti.
> You just explained the optimizations yourself. LOL

The games will work. They certainly won't "hang in there just fine" compared to NVIDIA. AMD will be woefully outclassed.
Because of the consoles, games that require RT/GI by default will need to run on lower-powered RDNA2 parts. The result, just like with F1, Metro Exodus EE, and the UE5 demo, is that the RT won't be as heavy by design, and this narrows the Nvidia advantage. The consoles are the optimization in this scenario, because they limit what the base RT requirements can be.
Nvidia will continue to win big in superfluous modes that Nvidia pays devs to include to sell their GPUs, but when the tech is a required building block for new games, AMD will hang in there just fine.
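The argument above can be sketched as a toy frame-time model: assume, as claimed earlier in the thread, that Nvidia's big advantage (~2x) applies only to the RT portion of the frame. All the numbers here are illustrative assumptions, not benchmarks:

```python
# Toy model: total frame time = raster portion + RT portion.
# The lighter the RT workload, the smaller the overall fps gap,
# even with a fixed 2x advantage in the RT portion alone.

def fps(raster_ms: float, rt_ms: float) -> float:
    """Frames per second given the raster and RT portions of a frame."""
    return 1000 / (raster_ms + rt_ms)

rt_speedup = 2.0  # assumed Nvidia advantage in the RT portion only

for rt_ms in (2.0, 10.0, 30.0):  # console-weight -> heavy RT workload
    amd = fps(raster_ms=10.0, rt_ms=rt_ms)
    nvidia = fps(raster_ms=10.0, rt_ms=rt_ms / rt_speedup)
    print(f"RT cost {rt_ms:4.1f} ms: AMD {amd:5.1f} fps, "
          f"Nvidia {nvidia:5.1f} fps, gap {nvidia / amd:.2f}x")
```

With a 2 ms RT cost the gap is only ~1.1x; at 30 ms it grows to ~1.6x, which is the "console-limited RT narrows the gap" point in miniature.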
> The 4080 is already $200 more than the 7900xtx, a higher tier 4080 model would just make that value proposition worse.

It makes me wonder what Nvidia is thinking right now in regards to the 4080.
The 7900XT probably has a $100 price reduction built into its launch pricing, just in case Nvidia responds by dropping the price of the 4080. The 7900XTX will stay at $999, but dropping the 7900XT to $799 would make more sense... but AMD won't do it until they have to.

That sounds like a plausible strategy.
The 4080 is already $200 more than the 7900xtx, a higher tier 4080 model would just make that value proposition worse.
??
A 4080 Ti is likely to cost a minimum of $1399, probably $1499 MSRP, and is probably still nearly a year away.
That's still a significant price difference that will keep many from going Nvidia.
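Putting the value-proposition argument in numbers (MSRPs as stated in the thread; the $1399 figure is the poster's speculation, not an announced price):

```python
# Price premium over the 7900 XTX at the MSRPs discussed in the thread.
xtx = 999          # 7900 XTX MSRP
rtx_4080 = 1199    # 4080 16GB MSRP ($200 more than the XTX)
rtx_4080ti = 1399  # speculated 4080 Ti floor (poster's guess)

def premium(price: int, baseline: int) -> float:
    """Price premium over the baseline card, as a percentage."""
    return (price - baseline) / baseline * 100

print(f"4080 vs XTX:    +{premium(rtx_4080, xtx):.0f}%")    # ~+20%
print(f"4080 Ti vs XTX: +{premium(rtx_4080ti, xtx):.0f}%")  # ~+40%
```

Even the speculated 4080 Ti floor leaves a roughly 40% premium over the XTX, which is the "significant price difference" being argued.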
> It makes me wonder what Nvidia is thinking right now in regards to the 4080

Their only option appears to be to reduce price at this point to remain competitive. Once they do, I wouldn't be surprised if AMD cuts the price of the XT another $100-150 in response.
The Ti model better have DP 2.1 as well
> Damn, that is a LOT of GameCubes

Too much e-waste; use half as many Wiis.
> Nvidia pricing, they can lower it any time they want.

Umm, not at the price of 5nm wafer nodes and their monolithic die size. There is just no way they can lower the price without taking a huge hit on profits. The current 4000 series is pegged at around $200+ just for the wafer cut, not to mention the added cost of wafer fail coverage and a multitude of other logistical factors beyond R&D costs.
> Nvidia pricing, they can lower it any time they want.

Lowering the 4080 16GB price to $999 would basically be a kill shot aimed at the 7900XTX, but I highly doubt they'd actually do it.
You have any source on this? “Any RT or GI required for basic operation” - what does that even mean? To a skeptic this just sounds like you’re saying “any console-quality RT can also run well on a 7900”. Well duh.
Sounds to me that all this “optimized for AMD consoles” just means lower settings than what a PC is capable of (and the vast majority of PC games will allow for higher RT settings just like all the other settings). Not that there is some AMD specific optimization that won’t run as well on RTX.