HeWhoWalks
So? That's a purely performance difference and it's clearly not enough to justify paying $350 more for Titan X than 980Ti.
You said "only". I was merely correcting your error.
So? That's a purely performance difference and it's clearly not enough to justify paying $350 more for Titan X than 980Ti.
The only "architectural" advantage Titan X has above 980Ti is it's additional 6GB of RAM. Well, that and the price which is bigger as well.
http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/6
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,28.html
I need to actually read more about how these sites test frame timing, but I find it curious that TechReport has insane frame-time spikes while in Guru3D's tests frame timing is pretty much flawless.
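In case it helps anyone reading those charts: the two sites capture frames differently (software logs, or a hardware FCAT rig like the one Guru3D describes further down), but the headline metrics boil down to simple statistics over per-frame render times. A rough Python sketch with made-up frame times, not real Fury X data:

```python
# Made-up frame times (ms) purely for illustration -- NOT real Fury X data.
frame_times_ms = [16.2, 16.5, 16.4, 33.1, 16.3, 16.6, 45.0, 16.4, 16.5, 16.3]

def percentile(values, pct):
    """Simple nearest-rank percentile, no external dependencies."""
    ordered = sorted(values)
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
p99_ms = percentile(frame_times_ms, 99)                          # "99th percentile frame time"
beyond_16_7 = sum(t - 16.7 for t in frame_times_ms if t > 16.7)  # "time spent beyond 16.7 ms"

print(f"average FPS:                 {avg_fps:.1f}")
print(f"99th percentile frame time:  {p99_ms:.1f} ms")
print(f"time spent beyond 16.7 ms:   {beyond_16_7:.1f} ms")
```

The takeaway is that a card can post a perfectly fine average FPS while still having ugly spikes in the 99th-percentile and "time beyond 16.7 ms" numbers, which is why a frame-time chart can look much worse than the FPS bar next to it.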
Huge noise reduction is the reason. Noise is a huge factor for a lot of people. Don't tell me you don't care about noise...
Semi-passive air cards are silent during idle and low load periods. You can't turn off a pump.
1440p vs 4k.
I'd love to think their incompetence is so high they honestly didn't know, and someone at AMD right now is in a meeting going "you showed me the benchmarks! You said it was faster! We look like idiots now!"
1440p vs 4k.
guru3d said: Note: The AMD Radeon Fury X does not have a DVI output. For FCAT at 2560x1440 (WHQD) we need a Dual-link DVI connector, for which we split the signal to the frame-grabber. This is not possible. We converted the HDMI output to DVI, however that's not a dual-link and as such the highest resolution supported is Full HD. So we had a dilemma, not do FCAT at all, or revert to 1920x1080. We figured you guys would love to see FCAT results, hence we compromised for Full HD over WHQD.
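As a rough sanity check on why the missing dual-link DVI forces them down to 1080p: single-link DVI tops out around a 165 MHz pixel clock, and 2560x1440 at 60 Hz needs considerably more than that. A back-of-the-envelope sketch; the blanking-overhead factor is an assumed approximation, and exact figures depend on the timing standard:

```python
# Rough pixel-clock estimate for single-link vs dual-link DVI.
# Illustrative only: the 12% blanking overhead is an assumed reduced-blanking
# figure; real modes (CVT, CVT-RB, CEA) differ slightly.

SINGLE_LINK_MAX_MHZ = 165.0   # single-link TMDS pixel-clock ceiling
DUAL_LINK_MAX_MHZ = 330.0     # dual-link roughly doubles it

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    return width * height * refresh_hz * blanking / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60)]:
    clk = approx_pixel_clock_mhz(w, h, hz)
    if clk <= SINGLE_LINK_MAX_MHZ:
        verdict = "fits on single-link"
    elif clk <= DUAL_LINK_MAX_MHZ:
        verdict = "needs dual-link"
    else:
        verdict = "beyond dual-link DVI"
    print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz -> {verdict}")
```

Which is why an HDMI-to-DVI conversion (single-link) caps the frame grabber out at 1080p.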
Did any of the reviews look at memory use? AMD claimed they found 4GB was enough because they hadn't optimized driver video memory use much before now, so I assume that means they're planning to make it more efficient in new drivers for the Fury and pare down memory use, which could also help with frame times if more of the relevant stuff stays in memory.
ATM there's really no reason to get the Fury X over the 980 Ti unless you prefer cooler temps. AMD dropped the ball IMO.
HardOCP did and they found 4 GB wanting.
AMD didn't include HDMI 2.0 which is the input of choice for the majority of 4K displays.
They do. I thought most 4K monitors used DP.
This isn't a PS4. Shouldn't be an issue.
I searched the thread for anisotropic filtering but got nothing. Did any of the reviews find an issue with it and the Fury X?
This seems like a good card, but AMD really needed great here.
It's actually 1440p vs. 1080p.
This isn't a PS4. Shouldn't be an issue.
Yea, exactly.
Right, though AMD used 0xAF in the majority of the tests for the benchmark it published a week ago, which showed Fury ahead in a lot of games.
I imagine all review sites are using maxed AF though, no one games on PC without AF.
For those just considering jumping into variable framerate gaming, a FuryX + freesync monitor combo would give good performance with quite a monetary saving.
Do you want a GPU in a small enclosure? Check your favorite case for clearance measurements. The radiator for the Fury, while excellent, is far thicker than on your average AIO, so installing the radiator will be a challenge even though the GPU itself is tiny.
Was the Fury X running so hot air-cooled that AMD resorted to using an AIO water cooler? It seems to completely nullify the small dimensions of the card when you have a thick radiator and piping to deal with.
A Fury X and a 1440p+ Freesync monitor does have a lower entry price, but Freesync does not have feature parity with GSync.
The refresh-rate window for FreeSync varies among the monitors that have been announced and released so far, with some monitors having minimum refresh rates as high as 48 Hz, meaning that if the game drops below 48 FPS you can kiss FreeSync goodbye. You even get weird cases where the monitor is capable of running at 120 or 144 Hz but FreeSync only covers a fraction of that range; the recently released ASUS MG279Q ranges from 35-90 Hz. Outside of that range, it's back to VSync or tearing.
G-Sync is 30 Hz minimum across the board and goes up to 144 Hz if the monitor supports that refresh rate. A 980 Ti + G-Sync is well worth the price difference considering how much greener the grass is in NVIDIA land.
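To put the window talk in concrete terms, here is a small illustrative check of how a given frame rate interacts with a variable-refresh window; the 35-90 Hz range is the MG279Q figure mentioned above, while the 48-75 Hz monitor is a made-up example of a panel with a 48 Hz floor:

```python
# Illustrative only: how a given frame rate interacts with a VRR window.
def vrr_state(fps, window_min_hz, window_max_hz):
    if window_min_hz <= fps <= window_max_hz:
        return "inside the window: refresh tracks the frame rate"
    if fps > window_max_hz:
        return "above the window: pinned at max refresh (VSync or tearing, per settings)"
    return "below the window: back to plain VSync (stutter) or tearing"

mg279q = (35, 90)    # ASUS MG279Q FreeSync range from the post above
floor48 = (48, 75)   # hypothetical monitor with a 48 Hz minimum

for fps in (25, 40, 60, 100, 144):
    print(f"{fps:>3} FPS  MG279Q:  {vrr_state(fps, *mg279q)}")
    print(f"{fps:>3} FPS  48-75Hz: {vrr_state(fps, *floor48)}")
```

The G-Sync comparison a few posts down mainly differs in what happens in that last "below the window" branch.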
And this sums up the entire value proposition.
Do you want to play at 4K above 24 FPS? Get nvidia.
AMD didn't include HDMI 2.0, which is the input of choice for the majority of 4K displays. An active adapter won't exist for a few more months, and when released it will set you back $70 at minimum.
Do you want to play at 1440p at 120+ FPS with IPS? You'd better have moved on from those affordable Korean monitors, because a dual-link DVI connector wasn't included either. At least an adapter would only set you back about $30.
Do you want the best possible performance but don't like to overclock? Get nvidia.
Do you want the best performance per dollar after installing a water cooler? Still buy Nvidia as long as you're overclocking.
Do you want a GPU in a small enclosure? Check your favorite case for clearance measurements. The radiator for the Fury, while excellent, is far thicker than on your average AIO, so installing the radiator will be a challenge even though the GPU itself is tiny.
There were too many compromises in the Fury.
The true strikeout for the Fury X isn't HDMI 2.0 (though it's important for some), or how it compares in FPS to the 980 Ti (it does pretty well at 4K) -- it's the frametime charts mkenyon posted earlier in this thread.
Unless you're using a TV, DisplayPort is the ideal input and is what most monitors come with. Why does everything have to be so black and white around here? The 980 Ti seems to have a clear advantage at lower resolutions and a slight to negligible advantage at higher resolutions at the moment, but it very much depends on the game. Overclocking is really unknown until voltage is unlocked. I really doubt the radiator is going to be more of a clearance issue than a 12"+ GPU. The Fury X doesn't knock it out of the park like many hoped it would, but it also doesn't strike out the way you seem to believe it has.
Ah hell, as an owner of a 2GB 760, this isn't going to shake up the landscape like I imagined.
Very true. Those are the real weakness of the Fury. They are even worse than on a 295X2, which is a dual-GPU card that would (theoretically, at least) perform worse than a single-GPU one in that test.
The true strikeout for the Fury X isn't HDMI 2.0 (though it's important for some), or how it compares in FPS to the 980 Ti (it does pretty well at 4K) -- it's the frametime charts mkenyon posted earlier in this thread.
Those need to be rectified, otherwise it's simply not a good buy.
More importantly, sub-30 Hz, G-Sync actually still has a solution rather than just falling back to tearing or VSync. It actually does something across the entire refresh range.
G-Sync treats this below the window scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being drawn one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It's a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
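For what it's worth, that frame-multiplying behaviour can be modelled with a very simple rule. To be clear, this is not NVIDIA's published algorithm; it's just one guess (keep doubling the redraw count until the effective refresh is back above the panel minimum) that happens to reproduce the numbers in the quote:

```python
# Toy model of the G-Sync low-frame-rate trick described in the quote above.
# Assumption (not NVIDIA's documented algorithm): each frame is redrawn
# 1, 2, 4, ... times until the effective refresh is back inside the panel range.

def effective_refresh(fps, panel_min_hz=30, panel_max_hz=144):
    if fps >= panel_min_hz:
        return fps, 1                    # in range: refresh simply follows the frame rate
    mult = 1
    while fps * mult < panel_min_hz and fps * mult * 2 <= panel_max_hz:
        mult *= 2                        # double the redraws, as the quote describes
    return fps * mult, mult

for fps in (60, 29, 25, 14):
    hz, n = effective_refresh(fps)
    print(f"{fps} FPS -> panel refreshes at {hz} Hz (each frame drawn {n}x)")
```

That reproduces the 58/50/56 Hz figures quoted, but take the rule itself as a guess; the point is only that the module keeps the panel refreshing inside its range without reverting to VSync.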
What are the benches for the Non-X variant?