The AMD Radeon R9 Fury Review, Feat. Sapphire & ASUS
by Ryan Smith on July 10, 2015 9:00 AM EST

The Test
On a brief note, since last month’s R9 Fury X review, AMD has reunified their driver base. Catalyst 15.7, released on Wednesday, extends the latest branch of AMD’s drivers to the 200 series and earlier, bringing with it all of the optimizations and features that for the past few weeks have been limited to the R9 Fury series and the 300 series.
As a result we’ve gone back and updated our results for all of the AMD cards featured in this review. Compared to the R9 Fury series launch driver, the performance and behavior of the R9 Fury series has not changed, nor were we expecting it to. Meanwhile AMD’s existing 200/8000/7000 series GCN cards have seen a smattering of performance improvements that are reflected in our results.
CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 Fury X, AMD Radeon R9 290X, AMD Radeon R9 285, AMD Radeon HD 7970, ASUS STRIX R9 Fury, Sapphire Tri-X R9 Fury OC, NVIDIA GeForce GTX 980 Ti, NVIDIA GeForce GTX 980, NVIDIA GeForce GTX 780, NVIDIA GeForce GTX 680, NVIDIA GeForce GTX 580
Video Drivers: NVIDIA Release 352.90 Beta, AMD Catalyst 15.7
OS: Windows 8.1 Pro
288 Comments
Shadow7037932 - Friday, July 10, 2015 - link
Yes! Been waiting for this review for a while.

Drumsticks - Friday, July 10, 2015 - link
Indeed! Good that it came out so early too :D

I'm curious @anandtech in general: given the likely newer state of the Fury/Fury X drivers, do you think the performance deltas between each Fury card and its respective NVIDIA counterpart will swing further into AMD's favor as they solidify their drivers?
Samus - Friday, July 10, 2015 - link
So basically: if you have $500 to spend on a video card, get the Fury; if you have $600, get the 980 Ti. Unless you want something liquid cooled/quiet, in which case the Fury X could be an attractive albeit slower option.

Driver optimizations will only make the Fury better in the long run as well, since the 980 Ti (Maxwell 2) drivers are already well optimized, it being a pretty mature architecture.
I find it astonishing that you can hack off 15% of a card's resources and only lose 6% of its performance. AMD clearly has a very good (but power hungry) architecture here.
witeken - Friday, July 10, 2015 - link
No, not at all. You must look at it the other way around: the Fury X has 15% more resources, but is <<15% faster.

0razor1 - Friday, July 10, 2015 - link
Smart, you :) :D This thing is clearly not balanced; that's all there is to it. I'd say the X, for the water cooling at $100 more, makes prime logic.

thomascheng - Saturday, July 11, 2015 - link
Balance is not very conclusive. There are games that take advantage of the higher resources and blow past the 980 Ti, and there are games that don't and are therefore slower. Most likely that's due to developers not having had access to Fury and its resources before. I would say no games use that many shading units, and you won't see a benefit until games do. The same goes for HBM.

FlushedBubblyJock - Wednesday, July 15, 2015 - link
What a pathetic excuse; apologists for AMD are so sad. AMD got it wrong, and the proof is already evident.
No, NONE OF US can expect anandtech to be honest about that, nor its myriad of AMD fanboys, but we can all be absolutely certain that if it was NVIDIA who had done it, a full 2 pages would be dedicated to their massive mistake.
I've seen it a dozen times here over ten years.
When will you excuse-making lie artists ever face reality and stop insulting everyone else with the AMD marketing wet dreams coming out of your keyboards?
Will you ever ?
redraider89 - Monday, July 20, 2015 - link
And you are not an nvidia fanboy, are you? Hypocrite.

redraider89 - Monday, July 20, 2015 - link
Typical fanboy: ignore the points and go straight to name calling. No, you are the one people should be sad about, delusional that they are not a fanboy when they are.

redraider89 - Monday, July 20, 2015 - link
Proof that intel and nvidia wackos are the worst type of people: arrogant, snide, insulting, childish. You are the poster boy for a sophomoric intel/nvidia fanboy.