4K Gaming Showdown – AMD R9 290X CrossFire Vs Nvidia GTX 780 Ti SLI






Introduction



When I wrote our first “4K Gaming Showdown” article nearly six months ago, it proved very popular with our readers. However, one of the most common pieces of feedback I received went something like this: “As 4K monitors are so expensive, most people buying a 4K monitor will probably also be going with SLI or CrossFire of high-end graphics cards – can you test that too?”. To a large extent I agree. There are cheaper 4K monitors out there, but most of the good quality 60Hz ones (which you really need for a fluid gaming experience) still cost thousands of dollars, and if you’re willing to spend thousands of dollars on a monitor, you’re likely to spend a similar amount on GPUs. I’ve therefore taken this as an opportunity to see what SLI and CrossFire bring to the table for (60Hz) 4K gaming, as requested by our readers. In this article we will be doing a “smackdown” of Nvidia GTX 780 Ti SLI against AMD R9 290X CrossFire in a variety of games. Based on specified MSRPs, GTX 780 Ti SLI ($699 x 2) is a much more expensive option than a pair of AMD R9 290Xs ($549 x 2), but with recent (mining-induced) inflationary pricing on R9 290X graphics cards the gap is actually a lot closer than you might think, which begs the question: which combination should you choose for the best 4K gaming experience?

As you can see above we have a rather complicated mix of cards – sadly we do not have two identical cards for each GPU. That said, we will be running all cards at reference clocks for their respective GPUs, and the solitary reference-cooled card, the Nvidia GTX 780 Ti, will be set to maximum fan speed to prevent thermal throttling, as it is the only card of the four where cooling is a limiting factor. With those modifications made we have two reference-speed R9 290Xs and two reference-speed GTX 780 Tis, all able to perform at their maximum potential without any thermal throttling. Without any further ado, let’s get on with looking at the monitor, the performance and my thoughts on 4K with dual GPUs.

Anyone interested in the reviews of the above graphics cards can find those listed below in order of their appearance in the above picture (top to bottom):

  • Gigabyte GTX 780 Ti GHz Edition 3GB Graphics Card – read our review here.
  • Nvidia GTX 780 Ti 3GB Graphics Card – read our review here.
  • Powercolor R9 290X PCS+ 4GB Graphics Card – read our review here.
  • Gigabyte R9 290X WindForce OC 4GB Graphics Card – read our review here.




  • Alistair Hardy

    So, firstly, i hate you so much right now. Average day in the office, playing with such things. I just stare at databases all day….
    now that’s out of the way
    If i was choosing between the two, I would have to side with green team today. sure it costs more but as you said, you gain for it. I also think nvidia’s drivers are better made in general. (though i haven’t had an Nvidia card since the 8800GTS 512mb)
    you should see (just because you can) how all the setups work with two or even 3 4K monitors OR 1 4K monitor and two rotated 1080 monitors, one on either side

    • http://www.eteknix.com/ Ryan Martin

      Haha it isn’t as fun as you think, although I am hoping we can get a pair (or at least one) GTX Titan Black in for testing – now that would be fun! In response to your last request sadly we only have one 4K monitor, so that isn’t something we can really do. I can’t imagine two 1080p panels either side of 4K would be any good for gaming, maybe productivity but definitely not gaming – not sure you can even eyefinity/surround different res monitors.

      • Alistair Hardy

        I think you can eyefinity different res monitors as i seem to recall doing it myself by mistake once. I’ve got two monitors that run at 1080 and 1050 respectively. Either way, i would think it would work quite well if the sizes matched. seeing as the monitor you used is 1920 high and a normal 1080 monitor is 1920 wide, the resolutions should match, which means you’d only have to make sure they’re the same physical size. I think it would favor racing games more, especially if you have a smaller desk.

  • zombie

    Green team.
    More performance,
    Less heat,
    Less noise.
    If you are going for dual cards and 4K, do it right and pay those few extra bucks.

  • John Plummer

    nvidia sli beats amd by a few points. but the AMD spanks the sli by 20 points!!

  • Αντωνης

    The R9 290X is very fast and cheaper, but has some significant disadvantages like heat and a very noisy fan. The 780 Ti is more expensive but its cooling solution is much better.

    • Black

      And it has a lot higher overclock potential.

  • hizzyshizzlizz

    I would like to point out that the weighted method is fine for most reviews, HOWEVER this review includes all games at the one resolution and both setups, 780 Ti SLI and 290X Crossfire. So if you took all of the GAMES and added up the frame rates, the 290X comes in at 327, 290X Crossfire 591, and 780 Ti SLI 579. Adding in the BENCHMARK, which you cannot play, brings the Crossfire and SLI setups within 2 total frames/sec, but with the 290X still on top. I want you guys to realize that in GAMING, according to this website's own results, the 290X Crossfire is a bit better than the 780 Ti SLI setup. Each setup has its wins and losses, so choose your setup based on the games you play, or go for the VALUE and realize that more times than not the AMD system wins or has a margin that is too small to FEEL.
    Also, in terms of the runt frames for AMD, remember you can always use those shiny new beta drivers that eliminate almost all instances of them.

    I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion.
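The two aggregation methods argued over in this thread can be sketched in a few lines. This is a hypothetical illustration only – the per-game FPS numbers below are placeholders, not the article's actual results – but it shows how summing every game versus taking a weighted index over a subset of tests can flip which setup comes out ahead:

```python
# Two ways of aggregating benchmark results. The FPS values below are
# hypothetical placeholders, not the article's measured data.

def simple_total(fps_by_game):
    """Sum of average FPS across all tested games (the commenter's method)."""
    return sum(fps_by_game.values())

def weighted_index(fps_by_game, weights):
    """Weighted average over a chosen subset of tests (the site's method).
    Games missing from `weights` are excluded from the index entirely."""
    total_w = sum(weights.values())
    return sum(fps_by_game[g] * w for g, w in weights.items()) / total_w

sli = {"Tomb Raider": 62, "BioShock Infinite": 58, "Hitman": 48, "Unigine": 70}
cfx = {"Tomb Raider": 60, "BioShock Infinite": 63, "Hitman": 55, "Unigine": 61}

# Excluding Hitman from the index vs. totalling every game can flip
# which setup "wins" even though the underlying numbers are identical.
w_index = {"Tomb Raider": 1, "BioShock Infinite": 1, "Unigine": 1}
print(simple_total(sli), simple_total(cfx))
print(weighted_index(sli, w_index), weighted_index(cfx, w_index))
```

With these placeholder numbers, CrossFire wins the all-games total while SLI wins the three-test index, which is exactly the kind of disagreement being debated here.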

    • http://www.eteknix.com/ Ryan Martin

      It’s also worth noting that 780 Ti SLI hits a framerate cap caused by a glitch in Hitman Absolution which balances out the 780 Ti SLI advantage in Unigine. We’ve never tested 290X or 780Ti SLI before so not sure what your comment “according to this website’s own results” is meant to mean. I’m also unsure as to what your last comment means “I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion.” We haven’t left anything out, we simply re-ran the same tests in our previous 4K analysis and added BF4 but couldn’t retest on all graphics cards as some of them we no longer have so we left that out of the index.

      • hizzyshizzlizz

        I did not know about the Hitman frame cap for Nvidia, I was just saying that based on your own results in THIS article there was a different performance conclusion than what was stated if you used all of the results. I am of the opinion that they are, for all intents and purposes, equal at 4K. You quoted “I like the testing method of this site, but it bugs me that some of the tests they perform are not included in the performance analysis and because of this there is a MUCH different conclusion”. Here I am talking about how you did your FINAL performance analysis, your weighted average. Certain games are omitted from this particular article’s final weighted performance, even though all of the games were fully tested at 4K, as this was just a VS article and not a regular graphics review article. So in my opinion, because this is not a standard graphics review article, all games should have been included in your final performance, because this shows that they are essentially identical in performance with the ‘winner’ jumping back and forth based on the game being tested.

        What I am trying to say is it looks like you, the reviewer, are being biased towards Nvidia because of how the FINAL performance review is CALCULATED. Once again I did not read anywhere in your article about the artificial frame cap for Nvidia on Hitman, if it was stated I am sorry for that and personally I think the game should be removed if there is a glitch hindering the performance of the game with one graphics card manufacturer or another.

        • http://www.eteknix.com/ Ryan Martin

          Interesting points, but what games have I omitted? Only Battlefield 4 was omitted, because we didn’t have results for all the cards. If anything, most observers would say this review is biased towards AMD because of the number of AMD-optimised titles (Sleeping Dogs, Dirt Showdown, Bioshock Infinite). If we included the same number of Nvidia-optimised titles (like Batman Arkham Origins, Call of Duty Ghosts, etc.) then Nvidia’s lead would be even greater. I’m really struggling to see how this is an Nvidia-biased article. And I will also add that we work very closely with both AMD and Nvidia, so there is no reason for us to actively try and make one look artificially better than the other.

          • hizzyshizzlizz

            According to testing procedure.

            “Those 7 weights are 1 for each test we do excluding Battlefield 4 and Hitman Absolution (so 6 games and 1 benchmark) which are omitted as we do not have results for all the graphics cards in those games.”

            So according to this you did not include Battlefield 4 and Hitman.

            I should note that I do not think that you did this on purpose. You just used what you normally do because you have not tested older equipment on the newer games.

            My point is that this was a specific two man battle between 290X crossfire and 780Ti SLI, at least this is what I read from the title, and because of this fact the two that matter are fully tested on all games at 4K. So why not include all in the performance conclusion or add a section that does or something along those lines. Have the performance conclusion that is the standard but also include that 290X Crossfire does win if ALL games are included.

            I also want to note that I am not biased, I am just an engineer and looked at the full data noticed that they are so close yet the recommendation goes to Nvidia for ultimate performance at 4K, while i would say it is a toss up, too close to call, or however you think it should be worded.

            Each definitely has their pros and cons. I would recommend buying based on the games you play, because they switch sides or are nearly even on so many occasions.

            Once again I am not attacking your article or writing style or your conclusion, I am just trying to point out that your method of calculating final performance might not be the most accurate for this particular article, as this one is a kind of special article covering one resolution and two graphics setups.

          • hizzyshizzlizz

            Adding a clarification.

            I look at the data not the words, your data shows that the SLI setup is better by a statistically significant margin. I know that the last thing you say is it is a tough choice but your final calculated DATA shows that 780Ti is the much better option for maximum performance.

            All is associated to the calculated data because graphs speak volumes, at least to engineers like me.

          • http://www.eteknix.com/ Ryan Martin

            Okay I see your point – you think that the inclusion of Unigine and the exclusion of BF4 and Hitman skews the results towards Nvidia – I guess it does BUT all games will serve to skew the results in either direction unless they are “neutral games”. The way I see it is we have Tomb Raider, Bioshock Infinite, Sleeping Dogs that skew towards AMD while only the Unigine benchmark skews towards Nvidia. I also understand your point that Unigine isn’t strictly a game – in an ideal world we would have had more Nvidia games to choose from to make it balanced (that is something we’re implementing in upcoming reviews) but I guess the testing is what it is right now. I appreciate your feedback and your time, do you think that Unigine should be removed from the summary? I won’t reinstate Hitman though because of the aforementioned FPS glitch on Nvidia cards.

        • Zealot

          I have to agree with hizzyshizzlizz. Let’s start with why Unigine is a bad choice to benchmark: it uses tessellation up the wazoo without it being observable to the human eye. If we can’t see that much tessellation, why is it there in the first place? To simulate a potential workload, I guess. But if it’s not visible, then it’s a useless workload, and therefore I disagree with its use in benchmarks in 2014.
          As for non-DirectX 11 games: drop them. AvP is from 2010 and it’s not even an MMO or MOBA, meaning it’s only interesting to niche fans who never got to get the game when it came out in the first place. Useless game to benchmark in 2014.
          If you want to test non-DirectX11 games, then go with the really popular ones, like World of Warcraft or Starcraft 2 or something. Otherwise no one in their right mind would waste any time playing an old game at 4k with the cream of the crop graphics cards, unless all my friends are still playing it.
          Adding more “nvidia friendly” games is the last thing that you should ever do. Like mentioned above, one should be guided by the popularity of a game when not testing a specific technological achievement in a game. Battlefield 4 is an important game because it’s the “latest & greatest” in a very popular series, not because it’s “AMD friendly”.
          You also left Mantle out of the picture. For people who play a lot of EA games, then choosing the 290X is a complete no-brainer, namely to people who will play Dragon Age when playing something in single-player and Battlefield 4 when playing something in multiplayer. For those who WON’T play either the decision logic should be different. Now, I know there’s only BF4 and Thief supporting Mantle at the moment, but do keep this in mind for 2014 and 2015.

  • fernandinands

    The 780 Ti GHz is the maximum-tier GTX 780 Ti. It’s a GTX 780 Ti performing up to 20% faster than the original 780 Ti.

    • Αντωνης

      Yes, but it is also 20% more expensive than the original. The price tag is extremely high….

    • Black

      And? All cards were run at reference clocks, as stated in the review. Quote: “As you can see above we have a rather complicated mix of cards – sadly we do not have two identical cards for each GPU. That said, we will be running all cards at reference clocks for their respective GPUs”.

      • fernandinands

        Good.

  • Luay

    Respect for reviewing $8,000 worth of hardware. I’m sure it took a very long time, because once it was posted a lot changed for the better, in this week in particular!
    Starting with the display, Samsung is selling a 28″ 60Hz 4K monitor for $700 on Amazon. The PowerColor R9 290X PCS+ was listed for $580 on Newegg earlier, and it overclocks on air like no other 28nm card that exists today. Running a CrossFire pair on a 7-slot ATX board does worry me, but a quality 140mm case fan shooting air at the cards should keep them cool.

    • http://www.eteknix.com/ Ryan Martin

      Holy sh!t that’s good pricing on a 60Hz 4K monitor…link? I might add it into the article, could help a lot of people out.

    • 86james randy

      i own a Samsung U28D590D (bought for $565 (RM1,840) in Malaysia, shop Jayacom) – good 4K LCD!! brilliant colour reproduction (for a TN panel), fast response… oh, and the ability to run 60Hz (oc to 75Hz :P). at the moment im using a single Sapphire R9 290X Tri-X OC (but still getting 35-40 FPS at 4K as i disable MSAA – kinda useless at 4K, in my opinion)

      • e92m3

        Yes, higher resolutions need less AA without question.

        Considering how utterly ridiculous and inefficient the concept of AA is to begin with, I would say you’ve hit the most relevant aspect of 4k…

        Getting rid of trashy AA, especially blurry crap like FXAA and TXAA.

  • Luay

    Two things I missed in this review:
    Another configuration I would have liked to see on your 4K display is the R9 290 CrossFired, and for reasons other than price alone. I strongly feel the 290 will punch in the same weight class as the two more expensive cards, but I won’t take the plunge until someone proves it.
    The 290X and 780 Ti in a dual card setup were putting out more than 60 FPS on a 4K display in most games. However, your 60Hz display cannot show you those frames, and this is the highest end of the high end. Each frame costs a lot of money, and the processing power that generates frames that cannot be displayed is wasted and misused power!
    A paying customer would raise the graphics detail and perhaps add a little AA – I mean, find the optimal settings to put out 60 FPS.

    • http://www.eteknix.com/ Ryan Martin

      Yes, we are currently looking into R9 290 CFX and GTX 780 SLI to update our 4K results; we’re just having trouble sourcing two of each card, whereas we had two R9 290Xs and two GTX 780 Tis at hand to test with. Yes, I agree with you about frame rates going above 60, but obviously it makes more sense for us to show the maximum the cards can do rather than all our graphs stopping at 60 because of V-Sync! The reason the settings are as they are is because during the first stages of our 4K testing we were doing single GPUs, so we had to find a balance between playability and quality.

    • The_Freak

      Worth noting that as all the numbers are averages, a portion of the power would still be put to use, as min FPS would still be dropping below 60 if the average is approx 60. I’d love to see some min FPS numbers, as that’s the number one thing that would affect playability on a 4K monitor :)

      • http://www.eteknix.com/ Ryan Martin

        I’ve never really had someone request minimum FPS figures from us before, however, it is something I will look into for future testing. Thanks for the feedback.

        • e92m3

          Umm… It’s really the only metric of any relevance if we want to talk about playability.

          Average to some extent and peak are completely useless metrics….

          You do not feel that spike in FPS as anything but roughness, it would feel far better and smoother to eliminate upwards spikes…

          Do you think the more variable input latency is a characteristic anyone wants when both solutions provide comparable minimums? It’s pointless, feels bad and throws off the interaction timing.

          Average metrics give very false impressions of how well a card actually performs, unless you do something like minimum weighted average.
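The distinction being drawn in this thread – average FPS versus minimum or "weighted minimum" style metrics – can be sketched quickly. The frame times below are hypothetical, purely to illustrate how a run that averages above 60 FPS can still contain stutter that only a minimum or percentile-low figure reveals:

```python
# Why minimum (or percentile-low) FPS can tell a different story than
# the average. The frame times here are hypothetical illustration data.

def fps_metrics(frame_times_ms):
    """Return (average FPS, minimum FPS, 1%-low FPS) from per-frame times."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending
    avg = sum(fps) / len(fps)
    minimum = fps[0]
    # 1%-low: the average of the slowest 1% of frames (at least one frame)
    n_low = max(1, len(fps) // 100)
    one_pct_low = sum(fps[:n_low]) / n_low
    return avg, minimum, one_pct_low

# A run that averages above 60 FPS but stutters: ninety-nine 16 ms
# frames plus a single 50 ms spike.
times = [16.0] * 99 + [50.0]
avg, mn, low = fps_metrics(times)
print(avg, mn, low)
```

With these numbers the average sits comfortably above 60 FPS while the minimum drops to 20 FPS – exactly the kind of dip an average-only graph hides.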

  • miako

    this is just pitiful.. NVIDIA, really? 2, 4, highest is 9 frames difference with AMD.. two thumbs up for AMD!!! not only is it cheaper, but in two games the difference is 20+ and 18 frames against NVIDIA.. nvidia's 2 to 9 frame difference is nothing.. the difference in performance is not worth the money, plus it is more expensive and consumes more power.. TRULY GREAT JOB AMD!!!

    • http://rightwingreport.org/ Matthew Luce

      We must be seeing different benchmarks. What I saw was NVIDIA matching or outdoing AMD in most of the benchmarks. And “worth the money”? The cards are both $700!

      • 86james randy

        hmm i don’t know… the Powercolor R9 290X PCS+ is selling for $550 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814131548), meanwhile the cheapest GTX 780 Ti is selling for $690+… this price is based on Newegg pricing… :D but there's a good chance you are a Green fan.. and please, not meant to start gpu bashing, just what current pricing is..
        True, GTX 780 Ti performance is 5-8 FPS better than the AMD R9 290X, but… are 5-8 FPS worth the extra $100-140? ? :D

        • http://rightwingreport.org/ Matthew Luce

          hmm… you are right… but nvidia still has better drivers as well as physx. 290x also runs hotter and louder. however, 290x might be a better option if going strictly for price:performance.


    • Thun3r

      I wish… I want the AMD cards but they don’t have… G-Sync, 3D Vision 2, ShadowPlay, dedicated Nvidia PhysX cards (AMD cards did work with dedicated Nvidia cards, but Nvidia disabled that outside of their own a while back). I love Eyefinity and hate Surround, but Nvidia still has more features at this time.

      • http://www.eteknix.com/ Ryan Martin

        AMD has a 3D technology (AMD HD3D), AMD has a G-Sync rival (FreeSync – not available yet, but likely coming this year, and it's an open standard, not proprietary), AMD doesn't have PhysX because it is Nvidia software (AMD has things like TressFX, though neither is really relevant as DX11.2 is the programming standard of choice; AMD also has better OpenCL/GL support), and AMD's Eyefinity is more flexible than Nvidia's Surround (Eyefinity supports more displays and is more mature). ShadowPlay is the one Nvidia advantage out of that list I'd agree with you on; it's a very strong piece of hardware-level software. G-Sync is a win for Nvidia right now because you can buy G-Sync monitors, but FreeSync is not ready yet.

        • e92m3

          ‘Shadowplay’ is no longer an advantage. Since your post, AMD has released their own GPU hardware encoded recording system (the basic functionality has been there for a long time).

          It’s part of Raptr, called ‘Game DVR’, with essentially zero measurable effect on performance, just like nvidia’s GPU encoding.

      • https://twitter.com/streaky81 Streaky

        Nothing you really want to play supports physx and even where it does it’s not exactly killer feature time. Shadow play? Can live without that all day long.


  • Taylor

    In Canada the R9 290X averages for $599.99 – $629.99 while the Nvidia GTX 780 Ti is $769.99 – $799.99. I’ll save the nearly $200 and lose the few frames.

  • Ruggedy

    so when NVIDIA outperforms AMD, it's only by 3-6 frames, but when AMD outperforms, it's by 8-10 frames. considering that most games are optimised for NVIDIA cards as opposed to AMD, I guess I'm gonna be getting me some AMD cards.
    thanks for helping me make my decision .. you're awesome.



  • Lachanche

    You are missing the fact that 4K monitors cap at 60Hz; what really matters is consistency – how often you dip below the 60 FPS threshold.

    • Raff

      Almost never. Lowest is about 45 fps.

  • JodyHighrollah

    Nvidia paid for the display, so no wonder it's shown to win in all aspects, and they didn't even try or speak about Mantle.

    • http://www.eteknix.com/ Ryan Martin

      You’re such a moron. Nvidia did not pay the display, they supplied one which we used and later was returned so other media could use it. In our first 4K article AMD provided us with a display. It makes no difference who provides the display, the display is for testing, not a bribe, I think your statement just reflects your own character and assumptions. Oh boo hoo we didn’t talk about Mantle in our article, big deal – only one game on the list supports it. No we just wrote an entire article about Mantle here, obviously that’s because we love Nvidia so much and because they buy our displays…..-_- : http://www.eteknix.com/testing-amds-mantle-battlefield-4-thief-and-pvz-garden-warfare/

      • spat55

        Well he wasn’t being rude and then you come up with this argument, I don’t think I will trust a word you write from now on. Next time try and be respectful to the people that offer constructive criticism.

        • http://www.eteknix.com/ Ryan Martin

          Accusing me of being bought by Nvidia and having bias in favour of them without any evidence is constructive criticism? I don’t think so.

  • Raff

    Wondering if you turned AA off or kept it on? I watched a video on YouTube where the guy stated in the description that at 4K resolution it is no longer necessary, because hard edges aren't visible any more, which makes sense as there's no need to smooth out already smooth edges.

  • Caleb Faulkner

    Holy shit…. talk about being unprofessional. He probably is an AMD fanboy (which is fine, I am too), and while his one-sentence opinion about your review may have been a little off, it did not warrant you acting like a 15 year old. This is how people lose their jobs, Ryan. Smooth.

    • http://www.eteknix.com/ Ryan Martin

      You’re right, I’ll make a concerted effort to not respond to comments from now on.