The AMD Radeon R9 Fury Review, Feat. Sapphire & ASUS
by Ryan Smith on July 10, 2015 9:00 AM EST

Battlefield 4
Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single player mode, our rule of thumb, based on our experience, is that multiplayer framerates will dip to roughly half of single player framerates, which means a card needs to average at least 60fps here if it’s to hold up in multiplayer.
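To make that rule of thumb concrete, here is a minimal sketch; the 2x single-player-to-multiplayer ratio and the implied 30fps multiplayer floor come from the text above, while the function names and structure are purely illustrative assumptions.

```python
# A rough sketch of the single player -> multiplayer rule of thumb described above.
# Assumption: multiplayer framerates dip to roughly half of single player framerates,
# so a 60fps single player average maps to a ~30fps multiplayer average.

def estimated_multiplayer_fps(single_player_avg_fps: float) -> float:
    """Estimate the multiplayer average from a single player benchmark average."""
    return single_player_avg_fps / 2.0

def holds_up_in_multiplayer(single_player_avg_fps: float, mp_floor_fps: float = 30.0) -> bool:
    """A card 'holds up' if its estimated multiplayer average stays at or above the floor."""
    return estimated_multiplayer_fps(single_player_avg_fps) >= mp_floor_fps

for sp_fps in (45.0, 60.0, 62.0):
    print(f"{sp_fps:.0f}fps single player -> ~{estimated_multiplayer_fps(sp_fps):.0f}fps multiplayer, "
          f"holds up: {holds_up_in_multiplayer(sp_fps)}")
```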
When the R9 Fury X launched, one of the games it struggled with was Battlefield 4, where the GTX 980 Ti took a clear lead. However for the launch of the R9 Fury, things are much more in AMD’s favor. The two R9 Fury cards hold a lead just shy of 10% over the GTX 980, roughly in line with their price tag difference. Because of that difference, AMD needs to win in more or less every game by 10% to justify the R9 Fury’s higher price, and we’re starting things off exactly where AMD needs to be for price/performance parity.
Looking at the absolute numbers, we’re going to see AMD promote the R9 Fury as a 4K card, but Battlefield 4 is a good example of why I feel it’s better suited for high quality 1440p gaming. The only way the R9 Fury can maintain an average framerate over 50fps (and thereby reasonable minimums) at 4K is to drop to a lower quality setting. At 1440p, on the other hand, it averages just over 60fps and is in great shape.
As for the R9 Fury X comparison, it’s interesting how close the R9 Fury gets. The cut-down card is never more than 7% behind the R9 Fury X. Make no mistake, the R9 Fury X is meaningfully faster, but scenarios such as this raise the question of whether it’s worth the extra $100.
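To put those price/performance comparisons in rough numbers, here is a minimal sketch; the prices used ($549 for the R9 Fury, roughly $499 for the GTX 980, and $649 for the R9 Fury X) are approximate prices at the time and, along with the function names, should be treated as illustrative assumptions.

```python
# A rough price/performance parity check using approximate mid-2015 prices
# (assumptions): R9 Fury ~$549, GTX 980 ~$499, R9 Fury X ~$649.

def perf_lead_needed(price_a: float, price_b: float) -> float:
    """Fractional performance lead card A needs over card B to match its price premium."""
    return price_a / price_b - 1.0

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance per dollar, with performance normalized to an arbitrary baseline."""
    return relative_perf / price

# R9 Fury vs. GTX 980: a ~10% price premium, so it needs roughly a 10% performance lead.
print(f"R9 Fury needs ~{perf_lead_needed(549, 499):.0%} over the GTX 980")

# R9 Fury X vs. R9 Fury: a ~18% price premium for roughly 7% more performance.
print(f"R9 Fury X needs ~{perf_lead_needed(649, 549):.0%} over the R9 Fury to break even")
print(f"R9 Fury   perf/$: {perf_per_dollar(1.00, 549):.5f}")
print(f"R9 Fury X perf/$: {perf_per_dollar(1.07, 649):.5f}")
```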
288 Comments
Shadow7037932 - Friday, July 10, 2015 - link
Yes! Been waiting for this review for a while.

Drumsticks - Friday, July 10, 2015 - link
Indeed! Good that it came out so early too :D

I'm curious @anandtech in general: given the likely newer state of the Fury/Fury X's drivers, do you think that the performance deltas between each Fury card and the respective NVIDIA card will swing further into AMD's favor as they solidify their drivers?
Samus - Friday, July 10, 2015 - link
So basically if you have $500 to spend on a video card, get the Fury; if you have $600, get the 980 Ti. Unless you want something liquid cooled/quiet, in which case the Fury X could be an attractive albeit slower option.

Driver optimizations will only make the Fury better in the long run as well, since the 980 Ti (Maxwell 2) drivers are already well optimized, as it is a pretty mature architecture.
I find it astonishing you can hack off 15% of a card's resources and only lose 6% performance. AMD clearly has a very good (but power hungry) architecture here.
witeken - Friday, July 10, 2015 - link
No, not at all. You must look at it the other way around: Fury X has 15% more resources, but is <<15% faster.

0razor1 - Friday, July 10, 2015 - link
Smart, you :) :D This thing is clearly not balanced. That's all there is to it. I'd say the X for the water cooling at $100 more makes prime logic.

thomascheng - Saturday, July 11, 2015 - link
Balance is not very conclusive. There are games that take advantage of the higher resources and blow past the 980 Ti, and there are games that don't and are therefore slower. Most likely that's due to developers not having access to Fury and its resources before. I would say no games use that many shading units yet, and you won't see a benefit until games do. The same goes for HBM.

FlushedBubblyJock - Wednesday, July 15, 2015 - link
What a pathetic excuse; apologists for AMD are so sad. AMD got it wrong, and the proof is already evident.
No, NONE OF US can expect AnandTech to be honest about that, nor its myriad of AMD fanboys,
but we can all be absolutely certain that if it were NVIDIA who had done it, a full 2 pages would be dedicated to their massive mistake.
I've seen it a dozen times here over ten years.
When will you excuse-making lie artists ever face reality and stop insulting everyone else with AMD marketing wet dreams coming out of your keyboards?
Will you ever ?
redraider89 - Monday, July 20, 2015 - link
And you are not an nvidia fanboy, are you? Hypocrite.

redraider89 - Monday, July 20, 2015 - link
Typical fanboy: ignore the points and go straight to name calling. No, you are the one people should be sad about, delusional that they are not a fanboy when they are.

redraider89 - Monday, July 20, 2015 - link
Proof that intel and nvidia wackos are the worst type of people, arrogant, snide, insulting, childish. You are the poster boy for an intel/nvidia sophomoric fanboy.