> It might have something to do with AMD drivers too - I think they are still rather immature.
AMD's drivers are immature?

> AMD's drivers are immature?
Don't buy a reference board
I don't think the XT will magically be worth the money with some driver updates.
Granted, I'm certain they will boost performance as drivers keep getting updated; that's the lay of the land, the same as with Nvidia's driver updates.
But the gap isn't going to change.
And the hardware issues... are hardware issues.
Coil whine and transient spikes aren't something you can software away.
> It's on par with 4080 right? If so... why would you go for AMD XTX when they are the same price? If you go for the 4080 you get the benefits of much better DLSS and RT.
Not exactly. It's beating it in some current-gen graphics engines and losing in others.
> Don't buy a reference board
LOL, I remember people on GAF doing "math" based on AMD slides from their reveal presentation. Those were the days.
I had both of those cards! Ole' blowers too. Then I got a 7950 Sapphire Pulse.
I haven't done that since the 8800 GTS... no, I'm lying, I had a reference GTX 260 Core 216.
But for people who want 2x 8-pin and reasonably sized cards, the reference boards are their best bet.
That’s a big nail in the coffin. There are many, many upcoming games on that engine.
GN is calling out the 7900 XT as overpriced, like others today:
> The weird thing is that it totally doesn't bother me, because it makes sense to position the full-chip product as the best performance value within the group of cards manufactured with that chip. The cut-down cards should offer slightly cheaper pricing and help fill a price gap, while not being quite as good a value per frame. At least the FPS per $ still improved over the previous generation's launch pricing.
The XT is 80% of the performance at 90% of the price of the XTX.
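Taken at face value, the "80% of the performance at 90% of the price" framing pins down the relative value directly. A quick sketch using the US launch MSRPs ($999 XTX, $899 XT; note 899/999 is roughly 90%) — the absolute FPS number is arbitrary, only the ratios matter:

```python
# FPS per dollar for the 7900 XT vs the 7900 XTX, using the
# "80% of the performance at 90% of the price" figures.
xtx_price, xtx_fps = 999.0, 100.0   # XTX as the baseline (FPS is arbitrary)
xt_price = 0.9 * xtx_price          # ~$899 launch price
xt_fps = 0.8 * xtx_fps              # 80% of the performance

xtx_value = xtx_fps / xtx_price     # FPS per dollar, full chip
xt_value = xt_fps / xt_price        # FPS per dollar, cut-down

# The cut-down card lands at ~89% of the full chip's FPS per dollar.
relative_value = xt_value / xtx_value
```

Run it and `relative_value` comes out at about 0.889, i.e. the XT buyer gets roughly 11% fewer frames per dollar (equivalently, pays about 12.5% more per frame) than the XTX buyer.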
> Excellent performance, especially at 4K resolution, which was the opposite last gen.
> With that said, it's definitely power starved. Undervolting + OC is the way to go until Igor's MorePowerTool becomes compatible. The XTX is going to be insane at 500 W, as it's clear the architecture has much more OC headroom than the 40 series. It makes sense now that AMD was referring to AIBs "unleashing the beast" (higher vBIOS power limits).
> Link
> 3.2 GHz.
> Once MPT becomes compatible we should see some cards regularly hitting close to 3.5 GHz. Lots of OC potential here.
That actually makes OC'ing worthwhile, like in the old days.
> That's really the way the cut-down cards should have always been sold. Sell the full chip the most aggressively; the cut-down is a cheaper option, but you lose a little of the FPS-per-$ value.
> I've always thought it was crazy that the GPU vendors would kneecap the full SKU by making the cut-down the better value; that's how you end up with full chips being used to create the lower-priced SKU. It always seemed like a waste of silicon resources.
Cuz the people most concerned with perceived value are the people lower down the totem pole.
At the upper end, people don't give a shit about value.
Halo products can be priced at whatever you want regardless of how "worth it" they are.
The RTX 3090 and especially the 3090 Ti were joke-level cards from a value perspective, but 3090/Ti customers didn't and don't give a shit, cuz they just want it and can hopefully afford it.
"Average" consumers look at the 3080 and think: I get 90% of a 3090's performance for much less?
I'm in, SOLD.
Halo-product buyers look at the 3090, see 10% better performance for "they can't see this" money, and think... fuck yeah, I need it!!!!
The xx60s and xx70s sell so well partly because their perceived value is high.
The xx70s would always beat the range-topper of the previous generation.
Even the 3070 looked like good value at 500 dollars cuz it could match or beat the 1,100-dollar 2080 Ti. (Can't believe I pissed myself laughing at that price, and now Nvidia's cheapest card is 200 dollars more expensive than it.)
Now if you try to sell me a 4070 at a price equal to or higher than the 3080's and tell me it about matches it... I'm not seeing the value.
That thing can go fuck itself.
Pricing people out of products you are actually hoping to move is bad for business.
You'll get your markup from the halo product.
Everything below the halo product has to prove its worth; otherwise potential customers simply won't buy it.
Being stuck with stock is more costly than simply lowering the price so people actually buy the product.
> Nailed it
Yeah I mean, I don’t want to denigrate AMD’s engineers. They have made some awesome advancements over the years.
I was on that hype train for oh so long with ATI/AMD. It was always going to be fixed with X driver or Y card; they supposedly had a master plan to catch Nvidia with their pants down.
When I went to university and started my electrical engineering degree, including a semiconductor course, and then read all the university research papers Nvidia participates in to develop this tech, it sort of clicked that they know wtf they are doing on another level. They've had no competition for CUDA for like 15 years, they have 80% of the ML market, they're way ahead in RT, and now with this release we see that they're also way ahead in chip design and efficiency.
> The reference 7900 XTX is £1,050 at Overclockers UK. The lower-end 4080s are around £1,150 to £1,200, and the 7900 XTX reference cooler seems a bit shit, so you'll probably want an AIB one. Add at least another £100 and you are looking at... /drumroll... £1,150 for no DLSS and worse RT performance.
> Is this DOA in the UK at least? Maybe 4080 prices will go up if people were waiting to see what AMD offered. Maybe there is a use case where you need more raster performance and more than 16 GB of VRAM, but otherwise the already rip-off 4080 looks an OK deal in comparison.
> If you managed to get a £1,600 4090 FE you hit the jackpot, as completely gross as that sounds.
> Edit: they put the AIB prices up for the 7900 XTX and they are £1,200 to £1,300. AMD don't even have a price advantage in the UK, as you can pick up a 4080 for less than that.
It is DOA in all of Europe, and in all places that have a major price difference compared to America.
> Yeah I mean, I don’t want to denigrate AMD’s engineers. They have made some awesome advancements over the years.
> There’s just something about this hype machine that is especially ridiculous. The fans/media have a way of fixating on some upcoming AMD tech and building it up in their imaginations to be some secret super weapon that’s going to restore them to former glory.
> The fanboys choose to believe it because they have this romanticized Empire-vs-Rebel-Alliance view of the tech industry. Tech sites/YouTube channels run with it because, well, it makes for a hell of a story and it gets them lots of clicks.
> Then when we find out it was mostly hyperbole, instead of learning some healthy skepticism, they kick the can down the road with “oh, just wait for drivers to improve, just wait for the next version, THEN it’s really going to shine.”
> It’s almost like some kind of religious faith in the Second Coming.
> Anyway, TL;DR: I respect AMD and their engineers, but fanboys and clickbait media deserve mockery.
What I think it is, is just the yearning for someone to dethrone Nvidia, so people latch onto anything they can. AMD is the great hope for a lot of people until Intel decides what they're gonna do with Arc. But folks should be thankful to AMD for putting a little pep in Nvidia's step. The 4090 didn't just become god-tier outta nowhere. Just like the rest of the world, Nvidia thought AMD might actually bring some smoke with the 7000 series and brought out the boom stick to make sure the 4090 was still top dog. They saw the 6950 XT creep up on the 3090/Ti just like the rest of us. With that said, Nvidia is still king and will be for the foreseeable future. AMD has some work to do with their chiplet/hardware design.
> I don't know if I can live without DLSS. RTX? Yeah, definitely, but I think if it wasn't for DLSS I'd have upgraded my GPU by now.
> I have a 2080. I wanted to get a 4080 but the price is insane, plus I'd need to factor in the cost of a new PSU.
> Not sure what to do.
Won't you still have to buy a PSU if you go for an RDNA3 GPU? The XT uses about the same power as the 4080, and the XTX uses more.
> Wow. They're already out of stock... Guess everyone wanted them.
They're not even in stock at many retailers. They're listed because they'll be available soon, but out of stock because they ain't got them yet.
> But you should have different chips for the different price brackets.
> Big chip:
> Full chip: best frames per $ for that chip
> Cut down: weaker value for the dollar
> Maybe a further cut-down chip, with a little more frames-per-$ loss
> Medium chip:
> Full chip: best frames per dollar for that chip
> Cut down again, with a little less FPS per $
> And so on down the line.
Let's look at last gen for example.
> I snagged a 7900 XT XFX. What is the difference between the XFX and the XFX Merc?
According to this, the Merc has a third power socket and boosts to 2615 MHz vs 2500 on the non-Merc. I just looked at the summary below; there may be more detailed info if you go to the Merc page.
> The XFX is simply a rebranded AMD reference model, vs a custom cooler/board design for the Merc with a higher-power-limit vBIOS.
> There are quite a few partners selling the reference model rebranded.
So the reference model is worse?

> Yes.
> Not to say the reference model is bad, as once Igor's MorePowerTool becomes compatible you'll be able to raise the max power limit on the reference model, like last gen. However, out of the box, the custom models definitely have an advantage in noise/cooling/OC potential, more so than usual.
I made a mistake. The Merc was 100 euros more, so I didn't order it.
Let's look at last gen for example.
You wanted the 3090 Ti, the full GA102, to be the best bang for buck?
Then the 3080, also GA102, to be much worse bang for buck?
No one will buy the 3080 then, and if the 3090 Ti is 2,000 dollars, then again very few people buy it.
So you are left with a shit-ton of GA102 chips, some of which made yield and some of which didn't, none of which are seen as being of value.
It leads you right back to the same point.
The halo product doesn't need to be the most logical from a value perspective; it simply needs to be the best.
Which is why 3090s and 3090 Tis actually sold.
Everything else needs to make sense to consumers.
Doesn't matter what the underlying chip is, cuz who the fuck cares.
Outside of those halo products, price/performance starts to matter.
And if you can't meet consumer expectations there, you are left with stock on your hands.
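The two pricing philosophies being argued over can be made concrete with made-up numbers (the prices and FPS below are illustrative, not real Ampere figures): scheme A is the quoted "full chip is the best FPS/$" proposal, scheme B is halo pricing as described in the reply, where the full chip's price far outruns its performance lead.

```python
# Two hypothetical cards cut from the same big chip, priced two ways.
# All numbers are invented for illustration.
def value(fps, price):
    """Frames per dollar."""
    return fps / price

# Scheme A: full chip sold aggressively; the cut-down is cheaper but a
# slightly worse value per frame.
a_full = value(100, 1000)   # full chip
a_cut  = value( 85,  900)   # cut-down

# Scheme B: halo pricing; the full chip is "the best", not "the best value",
# and the cut-down is where price/performance actually makes sense.
b_full = value(100, 2000)   # halo full chip
b_cut  = value( 90,  700)   # cut-down
```

With scheme A the full chip wins on FPS/$ and the cut-down has to sell purely on its lower sticker price; with scheme B the value ranking inverts, which is the post's point about why 3090/Ti buyers were never value shoppers.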
> Which the 7900 obviously needs some tuning in, with where the idle usage is.
The 23.1.1 driver should fix a lot of issues with RDNA3, especially efficiency and gaming performance.
Yikes, 7900 XTX perf/watt is abysmal.
> There’s just something about this hype machine that is especially ridiculous. The fans/media have a way of fixating on some upcoming AMD tech and building it up in their imaginations to be some secret super weapon that’s going to restore them to former glory.
Folks love a good underdog story. AMD coming back from the brink with Ryzen was good for the industry, and I think a lot of people are hoping they repeat that success with RDNA3 against Nvidia. Foolish, but to be expected.
> ^ Using a limiter is a weird choice. You aren't really testing for peak perf/watt there, just the way the cards scale power. Which the 7900 obviously needs some tuning in, with where the idle usage is.
> Max it out and compare the FPS per watt to the 4080 or 6950 XT; the way the info is presented there is kind of useless.
If you have a 144 Hz monitor, why wouldn't you limit the FPS to 144? It's not weird to me. I lock the FPS to match the display; otherwise I'm wasting energy and heating up my room unnecessarily.
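The limiter objection is just arithmetic: with both cards capped at the same frame rate, the FPS term is identical by construction, so "perf per watt" degenerates into a pure power-draw comparison at one fixed load. A sketch with invented wattages showing how the ranking can flip once the cap is removed:

```python
# Perf/W under a frame cap vs uncapped. All wattages are invented
# for illustration; nothing here is a measurement.
CAP = 144.0  # fps, matching the 144 Hz display in the discussion

def perf_per_watt(fps, watts):
    return fps / watts

# Capped: both cards render exactly 144 fps, so the comparison only
# ranks their power draw at that partial load.
capped_a = perf_per_watt(CAP, 120.0)   # card A draws 120 W at the cap
capped_b = perf_per_watt(CAP, 180.0)   # card B draws 180 W at the cap

# Uncapped: peak throughput over peak board power. Card B is faster
# enough here that the perf/W ranking flips.
uncapped_a = perf_per_watt(190.0, 320.0)
uncapped_b = perf_per_watt(240.0, 355.0)
```

At the cap, the ratio `capped_a / capped_b` is exactly the inverse of the power ratio, which is the reviewer's point: the capped chart measures power scaling (where the 7900's idle/partial-load draw needs tuning), not peak efficiency.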
Came here to share the same thing. Yikes.
Some techtubers claimed that AMD saw Nvidia's Lovelace as "hot, loud and noisy", MLiD even claimed that Nvidia would struggle with diminishing returns in the future with Blackwell (50 Series) because they're cramming everything into Lovelace.
It's so laughable now.
It might have something to do with AMD drivers too - I think they are still rather immature.
Something tells me that, just like Zen 4, these GPUs are going to sell poorly at launch and AMD will have to lower prices soon after. Especially the 7900 XT.
I wouldn't be surprised if, a month or two from now, we find the 7900 XTX under $900 and the 7900 XT at $750.