
The Best Graphics Solution You Can Buy For Around £1000: Sapphire 295X2s





Introduction and A Closer Look



The battle for performance is one key consideration when buying a high-end graphics card like the AMD Radeon R9 295X2 or a Titan/Titan Black/Titan Z from Nvidia, but for those with a sensible head on their shoulders, pricing has to be factored in too, though as an impulse buy, you may not want to. The margin between AMD and Nvidia in the extreme segment has always been tight, as they both know that a premium can be asked of the consumer, and those wanting the best of the best will quite happily delve into their pockets to have it, and will most likely get an earful from their partners shortly after.

AMD have been quite generous with their Radeon R9 295X2, a mammoth, water-cooled, dual-GPU monster, handing out price cuts left, right and centre of late. While MSI and other brands have taken some money off to give the consumer a better deal, one brand has taken the price cuts to the extreme. Sapphire are a market leader, and for very good reason: with their 295X2 selling at a staggeringly low £599 including VAT at Overclockers UK, we want to see exactly how much performance you get for this amazing price.

We’re not going to talk too much about the card’s aesthetics or cooling performance, as we covered all of that around 9 months ago; if you fancy a brush-up, you can check out our fully-fledged review of the card on its own here. For now, we’re going to jump straight into how two of these Goliath cards operate in CrossFire and whether they really can offer extreme, unrivalled performance for an amazing price.
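To make the "performance for the price" question concrete, cost per frame is a handy yardstick. The sketch below takes only the £599 per-card price from this article; the frame rates and the rival setup's price are illustrative placeholders, not benchmark results:

```python
# Rough price-per-frame comparison for competing GPU setups.
# Only the £599 per-card price comes from the article above; the
# frame rates and the rival's price are illustrative placeholders.

def cost_per_fps(total_price_gbp: float, avg_fps: float) -> float:
    """Pounds spent per average frame per second delivered."""
    return total_price_gbp / avg_fps

setups = {
    "2x Sapphire R9 295X2 (CrossFire)": (2 * 599, 100.0),  # placeholder fps
    "Hypothetical ~£1000 rival setup": (1000, 60.0),       # placeholder fps
}

for name, (price, fps) in setups.items():
    print(f"{name}: £{cost_per_fps(price, fps):.2f} per average fps")
```

A lower figure means better value; plugging in real benchmark averages from the following pages would make the comparison meaningful.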

[Specifications table]





  • stranger

    Oh please, it’s not. Titan X all the way (the WCE will probably easily go over +20% of stock performance), but I guess it’s apples and oranges; for me, single-GPU performance >>>>> multi-GPU performance. Then again, Nvidia is too arrogant with its 3/4 (if not 4/5) market share…

  • Iluv2raceit

    Uh, Rikki – you just lost all credibility in the PC hardware review world. You actually set out to compare apples to oranges? You compared a dual-GPU and quad-GPU setup to a single GPU and dual-GPU setup? Really? Idiot.

  • Nathanael Freihart

    I don’t understand the other commenters. He says it’s the best GPU setup for €1000, and it is. He could perhaps have compared it to triple-SLI 970s, but aside from that he did alright.

    • stranger
      • Nathanael Freihart

        I know these charts, and you literally only chose the ones where the Titan X is superior 😀
        Let’s see if you dare to post the whole chart. Overall the 295×2 kicks the Titan X around.

        • LoG4n27

          Well, the Titan X is a single-GPU card and the 295×2 is a dual-GPU card, so can we really say that the 295×2 is superior anyway?

          • Nathanael Freihart

            Let’s look at it in three ways:
            Best card: yes, the 295×2 is the card with the best performance.
            Best GPU setup for €1000: two 295×2s are the best bang for the given price; they would beat three 970s, two 980s or a Titan X (maybe two Titan Xs could beat it, but that would be double the price).
            Best GPU: the Titan X is the best GPU right now, but it may be replaced soon by the 390X.

          • LoG4n27

            Not to me; the power consumption of the 295×2 is way too high (~1200 watts for CrossFire, 600 watts for one card). A tri-SLI 980 setup “only” consumes 470 watts: http://www.guru3d.com/articles-pages/geforce-gtx-980-sli-review,4.html

            Best card: definitely not the 295×2 to me. OK for the performance, but regarding the noise, the TDP, etc… ouch!

            Best GPU: I do agree with you; we’ll see when Nvidia releases its Pascal GPUs

          • Nathanael Freihart

            It’s really funny how Nvidia fan(boy)s are complaining about power consumption, temps, TDP etc. right now… do you remember the 480? Or let’s compare the 700 series to its competitor, the 200 series, and don’t forget the Titan X. By the way, it’s LOUDER, HOTTER and SLOWER than the 295×2, for DOUBLE the price.
            http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/16
            http://www.anandtech.com/show/7930/the-amd-radeon-r9-295×2-review/17
            Nvidia’s “improvement” in power consumption: http://www.eteknix.com/gigabyte-g1-gaming-geforce-gtx-980-4gb-graphics-card-review/17//
            More than a 290X in “Uber” mode, that’s an OUCH.
            And Pascal: it will take about 1.5 years until it’s released, so the 400 series is right behind it. Besides that, I still won’t bet on it; I could explain why, but that takes some time, and if you want we can do it.

          • LoG4n27

            Don’t start with the Nvidia fanboy stuff; it’s kinda ridiculous, and you’re not talking to a fanboy, so don’t be disrespectful. I remember the 400 series because I had a GTX 470, so thank you. And comparing current cards with a card from 4-5 years ago is not pertinent. So yes, I remember the old 400 series, but we are talking about TODAY, not years AGO.

            1/ Taking a FurMark load as a basis is not very clever, because a GPU will never need that much power in games anyway. Oops, please: http://images.anandtech.com/graphs/graph9059/72533.png
            2/ You can compare a GTX 970 to a 290X, not a 980 (the performance of a 980 and a 290X is not the same). You have to wait for the future 380 if you want to do so. Be objective.
            3/ In the Guru3D link, the TDP of the GTX 980 is 171 watts; the 290X’s TDP is 281 watts. No need to talk about the temps; the latest GeForce series is beating AMD’s GPUs.

            4/ Do you really think that Nvidia will sit quietly for 1.5 years without releasing a card to compete with AMD? That’s kinda cute.

            Anyway, was nice talking to you. Conversation’s over cause you have your ideas, I have mine.

            Cheers,

          • Nathanael Freihart

            I just said that these things are often forgotten, and it’s relevant to this topic, because it was Nvidia’s card that people cooked scrambled eggs on, so they shouldn’t call the 290X “The Oven” or “Alaska room heater” when Nvidia messed up harder not long ago. And again, I compared the GTX 700 series with the R9 200 series, which, as you may remember, didn’t differ much in perf/watt; but now that Nvidia claims to be so efficient, it’s suddenly the most important thing when buying a GPU. Also, the power usage they claim is just plain wrong, and the cards often use double the wattage.
            1) It matters because you have to know which PSU to budget for, and it shows that Nvidia just plain lied.
            2) Again, they said it would use under 165 watts, and a slight OC leads to ~270% of that wattage.
            3) Only with reference coolers, which you normally don’t get/buy. The lowest-temp cooler I found for the 780 Ti was from Inno3D, at 63°C, while the lowest-temp cooler for a 290X, from PowerColor, gets below 50°C, so Nvidia isn’t better on temps either.
            4) No, but all the hype for Pascal is about technologies that aren’t even meant for gaming, and do you think AMD is sitting on its hands all day? At least they won’t lie to their customers.
            Have a good night.

          • LoG4n27

            Hey Nath,

            1/ Yes, sure, but this consumption is not even close to in-game consumption. I admit it’s interesting.
            2/ Yes, OK, but what about AMD’s GPUs? To me, 270 watts is quite reasonable for a 980.
            3/ The 780 Ti was an “ouch”, yes. Maxwell GPUs are better when it comes to temps and wattage.
            4/ The GTX 970 is not a lie. I have it; it’s a very good card (price/quality). We all know about the 3.5 GB + 500 MB controversy, but to me and many 970 users, we don’t care; this card is a beast and it does not change anything. Plus, many reviews have shown that this configuration has no, or at most a veryyyyyyyyy small, impact.

          • Nathanael Freihart

            Hey Log,

            1/ You are right that this power usage will hardly ever appear in games, but the problem is that it sometimes can. Furthermore, the new APIs will make better use of the overhead and of the GPU hardware in general, so cards will run much closer to their limit, which is good in the first place because of higher fps, but it will also push power usage hard (as seen with Mantle). If someone sized their build around the stats Nvidia has given, it could lead to a damaged PSU or even a dead PC.
            2/ I meant that it uses way more than double (ca. 270%) the wattage Nvidia stated, and that difference is not OK.
            3/ The GTX 700 and the R9 200 were pretty close back then. But now that Nvidia claims to have made huge improvements, that’s used as the main argument not to buy an AMD GPU, even though the given stats aren’t right, the comparison is with an old generation of GPUs, and performance and performance/price are way more important anyway.
            4/ It’s a beautiful card, and it’s also a powerful card; I’m not claiming anything else. But it’s built on lies: 1) the memory layout wasn’t as described, 2) the bandwidth was lower than advertised, 3) there were 56 ROPs instead of 64, 4) the L2 cache was 1792 KB instead of 2048 KB, 5) the power consumption in games is more than double what they claimed, and 6) Nvidia didn’t apologise to their customers, still haven’t corrected the wrong stats in their official data sheet, and didn’t make it up to the customers they conned.
            So I can hardly like a company like Nvidia.

          • LoG4n27

            I do agree with you this time. Thanks for all the explanations, very useful Nath.

          • Nathanael Freihart

            Sorry it took so long, but it was a nice conversation while it lasted.
            Have a nice day, Log.

        • stranger

          Well, that’s because it is. Multi-GPU doesn’t work in a lot, I mean a LOT, of cases, so in the worst-case scenario you’re going to be stuck with not the best example of a 290X (and the 4th GPU is pretty much useless), while with the Titan X it’s 100% usage all the time. That’s why, for me, it’s clearly best to get everything you can out of a single GPU (“the WCE will probably easily go over +20% of stock performance”) and go on from there if you want; I think most people with this kind of budget can do that.

          I do hope that the 390X will be faster, even if it costs $800 (because of HBM and all).

      • Nathanael Freihart

        You literally only chose the slides where CrossFire is disabled 🙂 A 295×2 is two 290Xs combined on a single PCB, so if CF is disabled it’s just one 290X. And you can get one under €250, so you should perhaps compare it with four 290Xs.
