XFX AMD R9 Fury X 4GB Graphics Card Review





This is what we’ve all been waiting for: the R9 Fury X graphics card is finally here! This particular card has received a great deal of hype in recent weeks and months thanks to rumours and leaked performance benchmarks all pointing towards a ‘Titan X killer’. These rumours shook the entire enthusiast market and prompted NVIDIA to fight back with another high-end graphics card, the GTX 980Ti. With AMD declining to reveal the card at Computex 2015, NVIDIA gained the upper hand by launching the GTX 980Ti first, which meant a huge amount of pressure was on AMD at the PC Gaming Show event at E3. We covered the event and what it had to offer in terms of graphics cards here!

Prior to this, AMD held multiple conference calls and events to slowly unveil what it had planned for the GPU marketplace. The most notable nugget of information was the ‘ace’ up AMD’s sleeve: High Bandwidth Memory (HBM). HBM was conceived by an AMD engineer and has been in the pipeline for around seven years. GPU memory technology up until now has been GDDR memory, with GDDR5 the current standard. HBM aims to supersede GDDR5 entirely; the key is in the name, High Bandwidth. Compared to traditional GDDR5, bandwidth increases from approximately 28GB/s per chip to 128GB/s per stack, for a total of 512GB/s across the Fury X’s four stacks. The key physical difference between HBM and GDDR5 is the placement of the DRAM chips: in GDDR5 they are laid out around the GPU on the PCB, whereas in HBM they are stacked vertically and sit alongside the GPU on a silicon interposer. This massively shortened distance for signals to travel not only increases bandwidth, but also decreases the overall footprint of the PCB. One limitation, however, is that HBM v1 is capped at 4GB, or 1GB per stack; compared to what NVIDIA has to offer, or even the R9 390X, that doesn’t look very appealing.
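As a rough sanity check on those bandwidth figures, peak memory bandwidth is simply bus width multiplied by the per-pin data rate. A minimal sketch in Python; the per-pin rates used here (7Gbps for mature GDDR5, 1Gbps for first-generation HBM) are typical published values, not figures taken from this review:

```python
# Theoretical peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# One 32-bit GDDR5 chip at 7 Gbps per pin
gddr5_chip = peak_bandwidth_gbs(32, 7.0)      # 28.0 GB/s

# One 1024-bit HBM v1 stack at 1 Gbps per pin (500 MHz DDR)
hbm_stack = peak_bandwidth_gbs(1024, 1.0)     # 128.0 GB/s

# Fury X carries four HBM stacks
fury_x_total = 4 * hbm_stack                  # 512.0 GB/s

print(gddr5_chip, hbm_stack, fury_x_total)
```

The arithmetic makes the design trade-off clear: HBM gets its bandwidth from a vastly wider bus at a modest clock, which is only practical because the stacks sit millimetres from the GPU on the interposer.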


In our recent news coverage, we have had to gather our information from other sources, which has led to us being portrayed as overly critical of AMD, since only the negative issues were being brought forward. Problems such as slow HBM production affecting the overall availability of the R9 Fury range, an undoubtedly annoying water pump buzz, and what can only be described as a poor choice of review link on the AMD Facebook page have been the main stories, with very few pro-AMD articles. Due to the poor availability of R9 Fury X samples we were skipped over for a review unit; however, we purchased our own, and now we can find out for ourselves whether these issues are in fact issues at all.


The XFX box is very simplistic; the diamond pattern in the background keeps the focus on the main points of the box.

Inside the box, we find a warranty card, driver disk, manual and screws.

The card is a lot longer than I thought it was going to be, despite all of the pictures I’ve seen already. It has a metal box frame across the entire card with soft touch panels on each side. It is completely closed off apart from a small gap where the pipes enter the card.


Along the top, we see both Radeon logos and the two 8-pin PCIe power connectors; the Radeon logo on the side illuminates when the card is powered on.


A key feature of the R9 Fury X is the load LEDs, which light up depending on how much load the GPU is currently under.

AMD have given the tubing and fan cable extra attention with the addition of dense weave sleeving.


The sleeving enters the card cleanly, so you cannot see any heat shrink or fraying ends.


The radiator is a standard 120mm design with an additional lip along the top; the OEM for the AIO cooler is Cooler Master.


At the business end of the card, there are no vents for air cooling; however, there is an etched Radeon logo should you forget the manufacturer. The card is equipped with 3 x DisplayPort and 1 x HDMI.




  • 12John34

    Thanks for the review.

  • mr2k9

Why Arkham Knight? Not to mention no settings shown for it either. Perhaps you guys should benchmark The Witcher 3 instead of Arkham Knight. I sure hope AMD will release a much better driver for Fury X. Sad to see a new memory technology (HBM) falling behind an old one (GDDR5). So much promise, yet so much disappointment.

    • rav55

      Arkham Knight has been withdrawn by Warner Brothers as the programming is seriously flawed. It is unplayable on DX11. WB will be releasing AK with a DX12 port and patch.

  • Why AMD why………..

  • grumpytrooper

So basically, if you are gaming at 1440p or 4K this is worth serious consideration. I am a little surprised at the price point; I did expect it to be a little cheaper being an AMD card, but things do look promising for team red.

    Unfortunately for me all of my gaming is done at 1080p so I personally would buy the 980Ti. Still I am interested in seeing where AMD take HBM.

  • (>_<)

    When will you run the 3dMark API Overhead Feature Test?

  • (>_<)

    @ Rikki Wright

    When you last wrote about DX12 benchmarks you pointed out that Radeon 290x was 33% faster than GeForce 980Ti and as fast as Titan X.

    So Fury should be 20% faster than Titan X and 50% faster than 980 Ti.

    Right? You know, you ran the benchmarks already yet you aren’t telling us.

Peter Odonnell wrote this piece 4 months ago.

    www dot

Your own site knows that nVidia GPUs are drastically slower than AMD using DX12, yet you lie to your readers by omitting these very important facts.

    Don’t consumers have a right to make educated buying choices?

  • (>_<)

    With DX12 Fury x is 50% faster than GTX 980 Ti and crushes Titan X.

    nVidia does not have Asynchronous Shader Pipelines nor do they have Asynchronous Compute Engine IP.

    nVidia is great with an obsolete API. But it is absolute trash with Mantle or DX12.