4K Gaming Showdown – AMD R9 290X & R9 280X Vs Nvidia GTX Titan & GTX 780




Introduction



***Please see our latest 4K Gaming article here which adds GTX 780 Ti, GTX 780 Ti SLI and R9 290X CFX to the mix***

With GPUs getting ever more powerful and 4K monitors now available for consumers to buy, we thought we'd use AMD's R9 290X launch as a springboard to look at the 4K gaming performance of AMD and Nvidia's top two single-GPU graphics cards. Of course, since this article was written Nvidia have revealed their intention to release a GTX 780 Ti graphics card, which is worth bearing in mind when looking at these benchmarks, and AMD are also expected to reveal an R9 290 graphics card at some stage this year. So this is by no means a comprehensive or complete look at 4K performance on high-end AMD and Nvidia GPUs, but we think it is an interesting place to start.

Firstly let’s recap the graphics cards we’re using, all four are pictured above and they are:

  • AMD R9 290X – read our review here.
  • Nvidia GTX Titan – read our review here.
  • Nvidia GTX 780 – read our review here.
  • Sapphire Vapor-X OC AMD R9 280X –  read our review here.

Next, the 4K monitor itself: AMD were kind enough to provide us with the Sharp PN-K321 4K monitor for this testing.


The Sharp PN-K321 uses a 32-inch IGZO panel with a resolution of 3840 x 2160. Being a first-generation 4K panel, it is driven as two 1920 x 2160 halves stitched together by an advanced display controller chip. The monitor can run 4K at up to 60Hz, which is best done through DisplayPort.
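As a rough sanity check on why DisplayPort is the preferred connection at 60Hz, here is a back-of-the-envelope bandwidth sketch. The link figures are the commonly quoted effective video data rates, and blanking/protocol overhead is ignored, so treat it as an approximation rather than a spec calculation:

```python
# Approximate uncompressed bandwidth for 4K at 60Hz with 24-bit colour.
# Active pixels only; blanking intervals and protocol overhead are ignored.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

needed_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"4K60 needs roughly {needed_gbps:.1f} Gbit/s")  # ~11.9 Gbit/s

# DisplayPort 1.2 (HBR2, 4 lanes) delivers about 17.28 Gbit/s of video data,
# while HDMI 1.4 manages only around 8.16 Gbit/s, which is why first-generation
# 4K monitors like this one need DisplayPort to reach 60Hz.
dp12_gbps, hdmi14_gbps = 17.28, 8.16
print("DP 1.2 sufficient:", dp12_gbps >= needed_gbps)     # True
print("HDMI 1.4 sufficient:", hdmi14_gbps >= needed_gbps)  # False
```

Halving the refresh rate to 30Hz halves the required bandwidth to roughly 6 Gbit/s, which is why 4K at 30Hz works over older HDMI while 60Hz does not.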


We've used the usual selection of titles from our graphics card reviews, giving us seven games and one synthetic benchmark: Alien vs Predator, BioShock Infinite, Hitman: Absolution, Sleeping Dogs, Unigine Heaven 4.0, Tomb Raider, DiRT Showdown and Metro: Last Light. Without further ado, let's see exactly how these AMD and Nvidia GPUs got on at some 4K gaming.



  • Valeriu Steel Rod Crainic

Nvidia got annihilated, simple as that, and got what had been coming to them for a while. I'll definitely welcome AMD cards into my builds. Quite shameful to see the GTX Titan, the oh-so-hyped graphics card currently on the market, get beaten by the R9 290X, which is priced $350 lower.

    • Anthony Burt

I wouldn't say annihilated, but AMD are looking good this time around, though it only took them 8 months to release something that actually has some balls. You know what everyone is thinking though: what will Nvidia's response be? Price cuts? Not likely to be significant enough to sway anyone who wants the best. A new card? Hope so, but only if it isn't a dual-GPU card.

      • Joe Wingett

        I do believe nVidia are struggling to make a dual-GPU card with their current 700 series, they’re too much the blowtorch GPU for that.

    • http://www.eteknix.com/ Ryan Martin

      $450 :P

    • JMHJ

You forget that the GTX Titan came out just after the GTX 680, and when it arrived it was about 1.5-2 times as powerful as the strongest single GPU on the market. Thus the hype... Of course it starts getting beaten when Nvidia puts out a new series of cards and, even later, AMD a new one too. "Nvidia got annihilated" are the words of an AMD fanboy and make no sense now. Nvidia's prices will probably fall to match AMD's, and in winter/spring 2014 Nvidia will release their 800 series...

      • Valeriu Steel Rod Crainic

So you're sitting behind the screen criticising me for being an AMD fanboy (and I admit I am; I've built computers since 2005, back when having a Radeon 9550 was beyond god-tier), while you immediately defend Nvidia with the fact that they built the most powerful single-GPU card (until AMD dished out the R9s). Sorry, but that is a really weak argument, and if you find no sense in it, I don't care; your lack of reading skills shines. When Nvidia launched the Titan, I knew, and even bet money with my colleagues, that AMD would push something much better and cheaper, and the current situation speaks for itself: Nvidia keeps the hype along with their four-figure premium card, while AMD runs away with the buyers who were patiently waiting. We are still waiting to see how the 780 Ti will be priced, but I could bet another round of money that it will cost more than the R9 290X. Just you watch; I've been in the computer business too long not to spot the trends.

        • JMHJ

I'm not an Nvidia fanboy; I've had cards from both manufacturers and I've been equally pleased with both. Of course the companies will keep beating each other; if they didn't, one of them would probably be gone by now. But you could turn it around and say Nvidia will have the best cards in 6 months when they launch the 800 series (if the 780 Ti doesn't take that spot). It's been that way for a long time: Nvidia pushes the limits and makes the highest-performance cards on the market at a very steep price, while AMD gives you a lot more bang for the buck. The thing is, there is value in both. There will always be people wanting the best of the best who have the money to buy it, and there will always be people who need to get the most for the least money. If you think my point was to defend Nvidia, you misunderstood me; my point was that it's wrong to say they got annihilated by AMD, or that the Titan isn't what it was made out to be (overpriced or not). If the R9 290X had been out a month after the Titan I'd agree with you, but one company being over 6 months ahead of another is quite a lot, which is also the reason for the almost insane price of a Titan. I wouldn't be surprised if the Titan falls in price now, and sometime in Q1 2014 we'll see a Titan 2 which will again be 1.5x as powerful as the 290X, at a mad price no doubt.

          • Joe Wingett

Thing is, currently you can pick up two R9 290Xs for the price of one Titan. So even if nVidia release a new all-killing GPU, which no doubt they will, AMD won't go down, because their prices are better. And that's the real deal here. If you consider price-to-performance in this case, you could be looking at things a whole new way.

          • Phil Deakin

These cards need to be half the price of a Titan, as they will have half the lifespan due to the high temps and extreme power consumption that will take their toll on the card's components...

          • Joe Wingett

Look, I know your bromance with nVidia means you get angry with any card that starts beating them, but still, try to understand. Most of these cards have a two or three year warranty, and if you intend to run a card like this for longer than that then, well, chances are it's going to fail after a while, be it nVidia or AMD. And a 95°C operating temperature is nothing new; just because it's rated to hit 95°C doesn't mean it always will, either. My A10 is rated for 115°C. nVidia's cards tend to run hotter by default, but they are not rated for temperatures as high. Same goes for Intel CPUs. Not knocking either; just understand that the temperatures you are used to dealing with may not be the same as the temperatures you are likely to see on an AMD chip. AMD's cards are rated for higher temperatures, and last longer at those temperatures, than nVidia's equivalents.

          • Phil Deakin

Firstly, 95°C is not the highest; it is the average temp these cards will run at in the average person's PC, just as 80°C is the average for the Titan. The Titan's absolute maximum safe temp is 95°C, as stated on the Nvidia website, whereas I can't seem to find the R9 290X's maximum temp.

Secondly, I do not have a bromance with Nvidia. I just appreciate good-quality products with enough headroom in them to be tweaked without the need for custom cooling, nor do I want them heating up the rest of my components by 10-15°C and diminishing their life.

If this card produced the impressive results it has while using the same amount of power and running at the same temps as the Titan, I would have been more impressed than I currently am. I just hope that when the likes of ASUS, MSI etc. get hold of it they can sort the cooling out like AMD should have done in the first place.

This is a good card, no one can deny it, but in the 8 or so months that the Titan has been out I would have thought they could have figured out how to make this piece of tech a little more economical and cooler.

          • Joe Wingett

I know man, just teasing a little. Thing is though, the card running at 95°C is probably not going to shorten its life, nor reduce your overclocking headroom. Realistically, reference coolers are never going to be that good for overclocking either way, and retailers such as Overclockers will supply cards with custom coolers pre-installed. The other components on the board are likely rated for temperatures up to 120°C or higher. Until the next process node is available, that is, until the next generation of cards, these cards are going to run hot. The only thing in your system you should be concerned about is the CPU, and this being a blower-style fan, you don't really need to worry yourself too much there. Motherboards are also designed to keep the ridiculously hot components away from heat-sensitive ones, either way. The only issue you might have is if you were to run a Haswell i7 on a 120mm radiator in a cooling loop with four of these cards.
But then, you wouldn't do that with the Titan, either, would you?

          • Phil Deakin

            hell no :)

          • Carlton Moore

            Other than Haswell and the ancient Pentium-D, that’s completely backwards on the CPU side. In general, Intel CPUs run cooler than AMD. And first through third generation Core i5/i7 processors overclock better than competing CPUs from AMD. Intel CPUs are historically better on thermals and power consumption. On the GPU side I agree… AMD’s cards are rated for higher temps, and typically run hotter.

          • random name

I really hope one of them just dies, so we won't argue anymore. The GTX 780 Ti will cost more than the R9 290X, and if it's still below Titan performance, they may as well die now.

    • Insane

All I've seen so far is a very small and insignificant performance difference between the R9 290X and the Titan; might also add the GTX 780 there, since it seems to be just a bit weaker than the 290X. The price difference is a problem, but don't expect Nvidia to just sit around watching AMD own the market because they have low prices. A 5-10 FPS difference is not "annihilation". I expected to see a 20+ FPS difference, maybe more, because with all the hype and the "Titan Killer" BS it sounded like it would be capable of doing just that. What I get instead, for now, is something that just sits on par with the competition performance-wise; only the price is a good pro so far.

    • GH0ST_SE7EN

Lol, "annihilated"... 8 months later, by a card that uses more power, runs hotter, and is much louder. Right...

      • tgrech

Maxwell is Q4 2014 minimum; they taped out too late. Maxwell will probably come after AMD's next-gen GPUs if the current info we have is correct.

        • GH0ST_SE7EN

1H 2014 according to WCCFTech. It makes sense if you think about it. Nvidia has had nothing to do besides develop Maxwell since they completed their final Kepler iteration with the GK110 earlier this year. And the 290X is already part of AMD's next-gen Volcanic Islands range. It will be a long while before AMD releases its next-gen GPUs. http://wccftech.com/nvidia-maxwell-gm108-gm107-engineering-samples-spotted-wild/

          • tgrech

It's normally about a one-year period between designs being finished/early engineering samples and the product reaching consumers in an actual card. Also, from what I can see there, these are mobile rebrands and not actual Maxwell chips. AMD has had stacked memory ready and working well for a couple of years, and with Kaveri launching soon, if HSA picks up it wouldn't be totally crazy if AMD released a new fully HSA-compatible, TrueAudio-compatible line using stacked memory to complement the 290X. They could do it in a hurry too, as they could use GCN designs on the newer, denser (but still officially 28nm) process node that Hawaii/Bonaire are on; most of the work would be on the PCB, which is relatively light.

          • bob

These are mobile samples for notebooks. Maxwell is coming 2H 2014, Q3 at best. They're both waiting on TSMC volume production. Maxwell and GCN 2.0 are very similar from a top-down point of view: Nvidia is going the 1/4 FP route, while AMD has moved the geometry units to the SIMDs and doubled the ROP count. The only major difference must be the texture units. AMD is aiming at smaller dies, Nvidia at less power for the same throughput at the cost of a larger die.

    • Carlton Moore

Whatchatalkinbout, Willis? The introduction of the GTX 780 pretty much killed the Titan's appeal for single-card setups: almost the same performance for a 40% lower price. The GTX 780 Ti will probably also beat the Titan, a card that's almost a year old and was overpriced from the start.

    • jigar7

The Titan is not just a gaming card. The CUDA cores are very important for 3D rendering software; it's a mix of both Quadro and gaming cards, whereas ATI's is solely for gaming. Try rendering heavy scenes in Maya/Max/V-Ray... you'll want to break the ATI cards in two! That said, it may be good from the gaming point of view, but that still excludes the gorgeous visual effects that only nVidia provides. Subtle differences, but if you've got the eye for detail, nVidia is the way to go. Too few good games are powered by ATI/AMD; it has always been nVidia. Of course, I'd love it if ATI gave them tough competition; it's better for all consumers. No point in arguing with each other :)

      • Valeriu Steel Rod Crainic

Right, but the people who do 3D rendering are a gross minority, and I've done rendering using a render box packed with FirePros. Granted, it became a habit for me to buy aftermarket cooling, mostly Corsair, but they still did a bloody good job rendering projects with 10mil+ polys.

        • jigar7

nVidia GeForce is for people like me. I do quite a bit of rendering, but I can't afford two different cards, and I do a lot of gaming as well. My colleague chose one of the HD 7000 cards and he is miserable now, because the HD series doesn't do as good a job of rendering as the GeForce cards. Of course it works great with games.
FirePros and Quadros lie in the same category; both are bad at gaming. I've tried L.A. Noire on a FirePro and a Quadro FX4000 at my workplace... both suck at gaming... lol.

  • not a fanboy

This is good, so we don't need to get a headache anymore. Just go AMD: cheaper, stronger. What else do you need?

    • Phil Deakin

If the Titan were to run using that much power and hitting those temps, I'm pretty certain it would crush all cards easily...

The R9 290X is too hot and too power-hungry; the life of this card will be tiny in comparison to other cards...

      • not a fanboy

Yes, it is hot; hopefully we'll soon see the non-reference designs.

      • http://www.eteknix.com/ Ryan Martin

I understand your point, but here's how I see it. You can ONLY buy reference GTX Titans; these cost $1000. Non-reference R9 290X cards will be about $575, and these non-reference designs will eliminate the temperature problems. With regards to power, our results suggest about 15% more than the Titan at absolute peak load. In terms of performance, our results show around 7.5-15% more than the Titan depending on the game. In my opinion, a non-reference R9 290X would hands down be better than a GTX Titan at current pricing: it costs a lot less, has more performance, would have similar temps and noise (that will vary by partner solution) and will have more power consumption but more performance. Anyone who then pulls in the GTX 780 argument needs to acknowledge that it has higher power consumption than the Titan and less performance than the 290X for a higher price. The only reason the current "fanboy war" is able to perpetuate is because of two things:

        1) AMD isn’t letting AIBs release non-reference solutions (yet) so there’s no point of comparison
        2) Nvidia isn’t telling anyone how they are going to respond (in terms of new GPUs) or adjusted pricing.

        Both AMD and Nvidia need to get their act together because as soon as they do the consumer will benefit IMO.
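The figures quoted above can be turned into a quick, hypothetical performance-per-dollar comparison. The prices ($1000 Titan, ~$575 non-reference 290X) and the 7.5-15% performance advantage are the numbers from the comment above; the normalisation of the Titan to 100 is an assumption for illustration, not a new measurement:

```python
# Hypothetical price/performance sketch using the figures quoted above.
titan_price, r290x_price = 1000.0, 575.0  # USD, as quoted
titan_perf = 100.0                         # Titan normalised to 100 (assumed baseline)
r290x_perf = 107.5                         # low end of the quoted 7.5-15% advantage

titan_ppd = titan_perf / titan_price       # performance per dollar
r290x_ppd = r290x_perf / r290x_price

advantage = r290x_ppd / titan_ppd - 1
print(f"R9 290X delivers {advantage:.0%} more performance per dollar")  # ~87%
```

Even taking the low end of the quoted gains, the gap in value is large, which is the crux of the argument being made here.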

        • Klimax

Actually not correct. There is a variation on the Titan:
http://www.gigabyte.cz/products/page/vga/gv-ntitanoc-6gd-b/
A factory-overclocked Titan including a custom cooler (not mounted).
Also of note: if you force the fan speed up substantially (85%), trading noise for cooling, you can push the Titan up to 950 or 1056MHz. (I have seen that, and it improves performance.)
Note: tested with a stock Titan using Gigabyte's OC program.

          • http://www.eteknix.com/ Ryan Martin

I understand; the GTX Titan has a boost clock in a similar fashion to AMD's R9 290X. The point is that Nvidia's limit is 80 degrees, so if you increased that to 90/95 degrees it would pretty much never reach a throttle point (as the Nvidia cooling solution is very good), but AMD's R9 290X does, because its cooler just sucks.

With regards to your Gigabyte example: so what if they sell the GTX Titan with a bundled cooler? Retailers do that too. My point is that users don't want to have to build their own cooling solution. It should come pre-fitted, but Nvidia doesn't allow that, so non-reference designs aren't available unless consumers buy a card and DIY a cooler onto it.

          • Klimax

I raised only the fan speed. (I tried a few times to play with the limit, but it didn't change anything...) Forgot to note: the boosted stable frequency was observed under DiRT Showdown.
My link goes to a factory-overclocked Titan. (Otherwise, OK, understood.)

  • Sean Patrick DeMarco

Two cards or more, waterblocks, and a 4K monitor, or don't bother getting these cards.

  • weeges

I wonder if Nvidia will issue $250-300 credits back to those of us who got our wallets tromped? I still like Nvidia, for now... but that's the fun in competition. We will see!

  • Qbex

Open bench case. LOL at those who think the 290X will perform even remotely close to that in a closed, real-life case.

    • http://www.eteknix.com/ Ryan Martin

You'd be surprised, actually. If you have a well-ventilated case it often performs better than on an open-air test bench, because you have concentrated and channelled airflow that removes heat effectively (providing you have two fans in the front and one at the back that can shift a decent amount of air). Of course, if your case has poor airflow it will perform worse, as you say, turning effectively into an oven/hotbox.

    • BazingCrazyDog

      water cooling

    • Sly Cooper

      Tri-X

  • Andrew

The R9 290X is a 4K gaming GPU; you can see it shining at 2K and up. It was designed to beat the Titan at 1080p and it does, well, not completely; I call losing to the Titan by 5-10 FPS a win, because if your screen runs at 60Hz and the Titan is dishing out 80 FPS while the 290X does 70, what's the point? You pay almost 50% less, and it's also future-proof, 4K-res-wise. The heat is a problem; you've got to go liquid cooling, but if you're getting a 290X you know what you're getting into anyway. Anyone buying a $500 anything knows what they're doing, right?

  • I-C-E-D

Where can I find the wallpaper that's shown on the monitor on the first page?

    • FreAkisHKiD

Yeah, it's a sick wallpaper... I also want to know :P

    • blastx

I don't know if you managed to find the image, but I'll tell you how: crop the image and use a reverse image search engine online, such as tineye.com. Or you can find it here with the author name and everything. But I don't think it's free; it's copyrighted. http://www.shutterstock.com/pic.mhtml?id=87201394

      • I-C-E-D

Yeah, I used it and found Shutterstock :'(

Looks like I'd have to pay to get it... nty

  • Αντωνης

The GeForce 780 Ti is faster than the R9 290X in most tests I have seen. You should redo the test and add the latest GeForce 780 Ti to your benchies.

  • Justin Gabriel

Just out of curiosity: I have a GTX 780 and I am thinking of buying a 4K monitor. Will I have to buy another card and go SLI, or will the 780 do OK?

    • http://www.eteknix.com/ Ryan Martin

The GTX 780 will handle 4K just fine. However, if you want to hit higher framerates (45+) then you'll need to roll some of those settings back to medium/high, whereas if you're happy with ~30 FPS then you can keep high/very high/ultra settings. A lot will vary by game and the settings chosen. If you want every game to be playable at max settings, you'll need SLI 780s.

      • klepp0906

If we're being honest... my rig is extremely overclocked and overvolted, everything is under water, running quad Titans. As luck would have it, Nvidia has screwed me over and has yet to put out working drivers that will allow SLI to work while using Surround in Windows 8.1. Yeah, so I have this $10,000 setup and I get around 25 FPS in my game of choice. Rock on, right?

The reason I'm telling you this is that while I'm running higher than 4K res (4680x2560), I'm also running overclocked, so we can assume they cancel each other out for the most part, give or take. As the previous poster mentioned, with maximum settings you will likely get around 30 FPS, give or take, with one card. That being said, some games will annihilate your framebuffer, so keep an eye on that; adding another 780 won't help there, you're stuck as far as that goes. For comparison's sake, even in WoW I use more than 3GB when using AA :P

In most games without AA you should be completely fine though, and you can always add another 780 in the future. Just remember, due to Nvidia's shoddy marketing tactics you can't run 4; they put a limit in place to force people into buying Titans to run 4 (it worked on me, and now 4 won't work in most games because I'm using Surround and Microsoft's newest OS).

I'm in the middle of building a new rig and I'm going AMD this time; I have 3 290Xs coming. I won't go Nvidia again until far in the future, when I know the customer support and software/drivers have caught up to the ridiculous price premium you pay.

        • shadowhedgehogz

4K is a waste of time and is uber expensive; better to get a 1080p 120Hz screen, or 1440p, and 1-2 high-end cards to go with it. 4K runs like crap on most newer games anyway.

Some people are happy with 30 FPS, but I want silky-smooth fast games, not ones that run like a slug and look a bit sharper.

          • Sly Cooper

            Some people have the money. Judging from your comments, you obviously don’t.

            That does not mean it is a waste of time at all.

        • Gboss

Damn, I'll take your sloppy seconds; you build sick rigs! 3 290X cards will crush...

  • shadowhedgehogz

4K sucks; unless you are happy with like 20 FPS minimums, I wouldn't bother. I'm sticking with 1080p and 120Hz for now. I may go 1440p max, but that would require 1 high-end card, or 2 max, to get "nice FPS".

The proof is in the pudding: look at the benchmarks of a 780 running the UE4 Elemental demo at 1080p, roughly 40 FPS minimum and over 60 average. I'd class that as pretty good, because dips to 40 are tolerable but not the best. Now imagine upping the res to 4K; it would run like ass.

Well, I think I made my point.

  • chrislcookie

Has anyone tried a 4K monitor with 3D? I have not seen any active 3D monitors that are 4K, but I was wondering if anyone has tried using a 60Hz 4K monitor with passive glasses, and what the frame rates were like.

    • Jack Attack

Firstly, I doubt anything would be playable unless you owned a 3-way SLI/CF build, and even then the game/card/movie would have to support it; 3D requires twice the data of the original resolution.
Secondly, even if you could, DP 1.2 or HDMI 2.0 probably couldn't do it at 60Hz, given the amount of data that would have to stream through.

      Keep dreaming big though.

  • emanuel wooten

Why didn't you use the R9 280X Toxic?

    • http://www.eteknix.com/ Ryan Martin

We didn't have it at the time.

  • Fanatoli Guyoff

I have the same system as you with the Intel 8-core Haswell (yes, I spent like $5k on this computer), and it does not run most of the newest games I've tried lately in 4K. I went back to my 1080p monitor for that reason. IDK, I have my doubts about what you say.

  • iamserious

    Lol