PlayStation 4 & AMD Radeon R9 290X GPU Share The Same 8 ACEs


We’ve seen quite a lot of details about the PlayStation 4 GPU and the new Radeon R9 290X Hawaii GPU, but this latest set of slides suggests that the Liverpool GPU inside the Sony PlayStation 4 could be part of the same Volcanic Islands family of GPUs as the R9 290X.


R9 290X Hawaii GPU

Launch games for both Xbox One and PS4 look fantastic, and while the PC Master Race may argue that PC gaming is better (in terms of graphics, certainly in some cases), there is no doubt that the new consoles are leaps and bounds ahead of their current-gen counterparts, and better is simply better. The question is, how much raw power really sits inside the PlayStation 4 GPU? As we all know, release titles are absolutely no indication of the true power of the hardware; just look at PS3 launch titles compared to games released this year!


R9 290X Hawaii GPU

Slides of the R9 290X Hawaii GPU reveal that the Volcanic Islands GPU will also feature 8 Asynchronous Compute Engines. Each of the 8 ACEs can manage up to 8 compute queues, for a total of 64 compute commands. Compare that to the HD 7970, which has only 2 ACEs that can each queue just 2 compute commands for a total of 4, and the Xbox One, which also has only 2 ACEs, though each manages up to 8 compute queues for a total of 16; still a long way short of the PS4’s 64.
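
For illustration, here’s that queue arithmetic as a minimal Python sketch (purely illustrative; the ACE and queue counts are the figures from the slides, and nothing here models real hardware):

```python
# Tally of compute-queue capacity per GPU, using the figures from the leaked slides.
# Format: name -> (ACE count, queues per ACE). Illustrative only, not a hardware API.
gpus = {
    "HD 7970":  (2, 2),
    "Xbox One": (2, 8),
    "PS4":      (8, 8),
    "R9 290X":  (8, 8),
}

for name, (aces, queues) in gpus.items():
    print(f"{name}: {aces} ACEs x {queues} queues each = {aces * queues} total")
# HD 7970: 2 ACEs x 2 queues each = 4 total
# Xbox One: 2 ACEs x 8 queues each = 16 total
# PS4: 8 ACEs x 8 queues each = 64 total
# R9 290X: 8 ACEs x 8 queues each = 64 total
```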


PlayStation 4 Liverpool GPU

PS4 GPU has a total of 2 rings and 64 queues on 10 pipelines

Graphics (GFX) ring and pipeline

  • Same as R10xx
  • Graphics and compute
  • For game

High Priority Graphics (HP3D) ring and pipeline

  • New for Liverpool
  • Same as GFX pipeline except no compute capabilities
  • For exclusive use by VShell

8 Compute-only pipelines

  • Each pipeline has 8 queues, for a total of 64
  • Replaces the 2 compute-only queues and pipelines on R10XX
  • Can be used by both game and VShell (likely assigned on a pipeline basis: 1 for VShell, 7 for the game)
  • Queues can be allocated by game system or by middleware type
  • Allows rendering and compute loads to be processed in parallel
  • Liverpool compute-only pipelines do not have Constant Update Engines (present on R10XX cards)
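
Putting the slide’s breakdown together, here is a rough structural sketch (again illustrative Python; the pipeline names mirror the slide, and none of this reflects Sony’s actual SDK):

```python
# Rough model of Liverpool's front end as described in the slide:
# a GFX ring/pipeline, an HP3D ring/pipeline, and 8 compute-only pipelines
# of 8 queues each (10 pipelines and 64 compute queues in total).
from dataclasses import dataclass

@dataclass
class Pipeline:
    name: str
    compute_queues: int   # compute queues exposed by this pipeline
    note: str

liverpool = [
    Pipeline("GFX",  0, "graphics + compute ring, for the game (same as R10xx)"),
    Pipeline("HP3D", 0, "high-priority graphics ring, VShell only, no compute"),
] + [
    Pipeline(f"CP{i}", 8, "compute-only; usable by game or VShell")
    for i in range(8)
]

print(f"{len(liverpool)} pipelines, "
      f"{sum(p.compute_queues for p in liverpool)} compute queues")
# -> 10 pipelines, 64 compute queues
```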

Sure, this is all technical as hell, but when it comes to hardware like this, bigger numbers often translate to better performance. Great news for developers, but also a good indication that the hardware no doubt has a few tricks saved up that will benefit games in the future.

Thank you Gamenmotion & VGLeaks for providing us with this information.

Images courtesy of Gamenmotion.



    this is insane!

  • Guy Parris

    I smell bullshit

    • Richard Watkinson

      For once, I don’t! Sony have already explained they’ll make a loss on the PS4 but fully expect to make it back through PS+ subscriptions. Even though they had the cash and the reputation to seal certain exclusives for the PS3, they still caught flak for it being (compared to the competition) underpowered and hard to work with. To see them overcompensate this time and try to shove as much power into the machine as possible to guarantee superiority wouldn’t be unrealistic.

      • Guy Parris

        You’re telling me they can fit really powerful components in that tiny little case? Get the fuck out of here. It’s all false advertising bullshit, just like how they said games would run at 60 fps on their system yet developers are still targeting 30 fps. There is no way in hell that the PS4 is this powerful; its shitty 1.6GHz CPU would bottleneck a good GPU! I really can’t see Sony losing hundreds on making the PS4 due to picking powerful components.

        • John smith

          Hey hey hey, no need to curse. Remember, it’s a console; optimization is everything. And the case has nothing to do with it. Just look at mini-ITX PCs. There are a few that can fit pretty powerful components.

          • Guy Parris

            Better optimization, really? You want to play games on PC and see the difference. When I played Darksiders 2 on consoles it was horrible: screen tearing, stuttering, buggy shadows. On PC, none of that; smooth-as-anything frames, no buggy shadows, runs like a dream. Same goes for other games. When consoles start to die out there is no stopping screen tearing; like I said, consoles are so limited and so overpriced. You ever seen Steam sales? Even looking at the 360 with BF4 on YouTube you see the horrible screen tearing; the same will happen with PS4 and Xbox One. They will start out okay, but once they start to suffer, shit comes with it.

          • Stefan

            I’m pretty sure you have no idea what you are talking about.
            Screen tearing occurs at high framerates, not low framerates; stuttering occurs at low framerates.

            The PS3 and Xbox 360 cannot be used as a comparison to PC as they use completely different architectures which aren’t even remotely compatible.
            The PS4 and Xbox One (to some extent) are quite compatible with PC as they use similar architectures and are easier to optimize for, and with their APUs specifically designed to increase performance and negate bottlenecks, the games might perform better than their PC counterparts.
            I know you couldn’t build a PC with better performance than the PS4 for the same retail price, simply because the PS4 is sold at a loss: its components cost more than the revenue from each sale covers, hence why Sony went with pay-to-play online subscriptions through PS+.

          • Guy Parris

            Screen tearing is caused when the framerate goes over your refresh rate; I know what I’m talking about, lol. Not just at high frames, you dick; even if it were only to go over 61 you would get screen tear. It doesn’t have to be 80 fps or 120 fps; as long as it goes over your refresh rate, it screen tears.

          • George Hillier

            Please be quiet. Yes, screen tearing is caused by that, but that would mean the TV the console is connected to would have to be about 30 years old. I’ve never experienced screen tearing on a console, as their fps is a lot lower than their refresh rate. I don’t know what TV and console you were using, but the TV must’ve been the first ever LCD screen. I own a PC, but it’s not for everyone; many people want to be able to put a disc in and play, not build a PC then have to download a game on their crappy connection. Consoles are good, just for another group of people, who actually aren’t “lazy”; they’re just not interested in setting up a custom PC, paying a load of money for it and upgrading every few years.
            Please, sit down, stop flaming, and go back to your Nvidia fanboy corner. Neither is ‘better’; they have their own pros and cons. AMD are better at slightly cheaper but nearly just as powerful cards; Nvidia are better at slightly more expensive, powerful cards. Basically, AMD for “bang for buck”, Nvidia for performance at a higher cost (oh, and also for retards like you who think they know everything).

          • Guy Parris

            Aw, really? Well, I have had screen tearing with the Wii U on Darksiders 2 and with games on the 360. The TV I’m using is a 1080p LED Sony Bravia, not one of the shitty things you’re mentioning. How the hell are they the best bang for the buck when Origin has stopped using them for their PCs because of bad customer feedback and problems? They’re more like best bang for being cheap bastards. I’m no fanboy, as I stated; I just don’t use a company that gives you a very bad experience. AMD have nothing on Nvidia apart from being slightly cheaper, that is it. As for consoles not screen tearing, please have a look at this, eh. Are you people rather thick or just a bunch of noobs????


            If you want another lesson feel free to get in contact.

          • So Here we go

            Just look up “amd beats nvidia” on Google, then look up “nvidia lied” on Google, then look up “nvidia GPUs sued”…… You get the picture. FACT is that Nvidia also loses out to AMD and makes poor GPUs, as well as lying about what its hardware can do (even when they don’t have the hardware up and running), LOL. Just look at the Fermi card at GTC, where Nvidia lied about its GPU tech.

            As GUY PARRIS likes the site called ExtremeTech, let’s not forget “AMD destroys Nvidia at Bitcoin mining”… LINK = . Now, as you should know, the future is GPGPU, and that’s where AMD rules over Nvidia, as the link shows us on page two…

            “The Radeon 7790, a $149 GPU, offers 80% of the GTX Titan’s performance for 15% of its price. The Radeon 7970 is twice as fast at half the price. Even the CUDA-accelerated kernel doesn’t bring Nvidia hardware into the same league as AMD’s” …

          • Guy Parris

            Lol, all you AMD boys were like “aw, aftermarket coolers will drop the temps right down”; that turned out to be bullshit!! They still hit 94°C. Now that’s a cheap-ass card; you’d be lucky if my Gigabyte 770 went over 50°C in BF4! AMD bullshitted about how the 290X can handle heat when the card throttles from being so hot, reviewers got a different version of the 290X because retail versions performed worse, and the thing sounds like a fucking jet engine, man! AMD are not good at all; they’re all hype! The 770 matches the 290X in Batman: Origins; that’s a £270 card matching a £430 one, lol. The 780 and the 780 Ti beat the 290X. Nvidia are the kings, man; they’re unbeatable. They’re beating AMD in nearly every game, even in BF4, a game supposedly made for AMD, and the same goes for the rest of their AMD games; Nvidia wins! AMD are cheap and nasty; if you want a £440 microwave card that will die in a year, then you go for it, lol. AMD are cheap for a reason and they’re in debt for a reason. Nvidia and Intel hold most of the market, and they hold it for a reason: they’re the kings of the PC world and there ain’t fucking anything your cheap AMD ass can do about it apart from talking shit :).

          • You really are quite the troll aren’t you lol

        • Richard Watkinson

          > Sure they can! With the right airflow and design you can put anything anywhere. Companies stick dual 780Ms and quad-core mobile i7s inside laptops; it’s just how you lay out your space!

          > The clock speeds on this card aren’t as high as the desktop version’s (800MHz vs 1GHz), which, with the right power consumption and programming, can be a hefty saving on power draw and temperatures.

          > Sony haven’t released ANY details on the clock speeds of the CPU in the system, apart from the documentation supplied with the devkits that states a maximum frequency of 2.75GHz. Remember that this is an APU setup with shared memory space between the GPU and CPU, which goes a loooong way towards reducing bottlenecks. AMD designed this stuff; you think they don’t know how to stitch their own hardware together?

          > Hundreds is what WE pay. With a heavy supply deal on these GPUs, Sony could get an amazingly close-to-cost deal with AMD.

          Instead of being critical, HOPE that this is true, because it would bode well for cross-platform development. 🙂

          • Guy Parris

            Mini components suck compared to desktop components. AMD are useless when it comes to components; Intel and Nvidia beat them all day long. The reason Nvidia didn’t go with Sony was because they said themselves that consoles are weak and they’re not interested in them; the GPU in their system is half the performance of a 670.

          • Richard Watkinson

            It’s like arguing with a rock created from compounded ignorance.

          • Joe Wingett

            I know that feeling…

          • Guy Parris


            got to loooooooove this.

            “Based on our 15+ years of experience building and selling award winning high-performance PCs, we strongly feel the best PC gaming experience is on Nvidia GPUs.”

          • Guy Parris


            Have you seen this? AMD is cheap for a reason and they’re a pile of garbage. Good luck to Sony and Microsoft for using the cheapest components possible, and good riddance.

          • Guy Parris

            As for temps: PCs will always, always be better with temps on components, there is no question. You can get better cases with better airflow, you can add fans, you can take your pick of which CPU coolers and GPUs have the best cooling. You can even clean the dust out to reduce temps. Consoles are so limited you can’t even open them up to clean them out without voiding the warranty. Consoles will overheat much more than PCs.

        • Peter Donnell

          You, sir, are a moron. To say the issue with power is the form factor doesn’t make any sense, and to deny that the console packs a good level of performance is just ignorant.

          • Guy Parris

            I didn’t say it didn’t pack a good level of performance. I said they will start out good and slowly degenerate as per usual, and there’s nothing stopping all the nasty things like screen tear etc. My rant is about this article; there’s no way a PS4 has this kind of power.

  • Drayton

    So I guess porting PS4 games to PC (and also the opposite) with Mantle will be easier, right?

  • Geoffrey Chen

    Ummm, how much is the 290X? And how much is the PS4? If they were to implement the 290X at all, everything would be downscaled by a lot.

    • Yes, the same number of ACEs doesn’t mean the same amount of compute power. It has the same structure but a reduced-capacity architecture.

      • Geoffrey Chen

        Precisely; they need to change some parts of this article…

  • Diego

    Okay… so Sony put a <$600 GPU in a ~$500 console????? Ya sure, man -.-

  • Sp1k3

    The 290X has about triple the floating point operations per second; don’t expect PS4 games to perform anywhere near the R9 290X on a modern PC. TFLOP/s is probably the only fair way to compare different GPU designs, and by that measure the PS4’s Liverpool GPU sits between a Radeon 7790 and a GTX 660.

  • Guy Parris

    Have you AMD fanboys read this!! Horrible, horrible cheap-ass company that makes cheap-ass components.

    • Joe Wingett

      You, sir, do not understand how businesses or computers work. If one company, namely Origin PC, drops AMD citing problems, chances are they’re really doing it because the profit margins are higher on their other brands, such as nVidia. There is also a good chance that nVidia have entered a contract with Origin, providing cheaper GPUs at the cost of Origin’s ability to use AMD cards. So, until you actually know what you’re talking about, don’t talk.

      • Guy Parris

        Listen, AMD GPUs have always, always suffered from problems. The company is a joke and so are their components.

        • Joe Wingett

          Tell that to Sony or Microsoft. Hell, tell that to most sane people and they’ll laugh at you. My little A10-powered laptop kicks ass, and it’s not exactly suffering problems, despite using asynchronous CrossFire. And I might add, it runs cooler and achieves better framerates than a friend’s Intel/nVidia laptop. Yes, if you’re willing to pay more, you’ll get more power out of an Intel-nVidia rig, currently at least, but the cost difference doesn’t reflect the power. Besides, if you actually understood the reasons behind the report you posted, you would know that running dual dual-GPU cards in SLI or CrossFire is always a bad idea; 690s have as much of a bad habit of melting as the 7990 does. Heck, nVidia can’t even make a dual-GPU card for their current series; it would run too hot.

          • Guy Parris

            Gaming laptops are extremely lame. CrossFire is much worse than SLI. Give them time; I’m sure the 790 will be coming.

          • Joe Wingett

            When you’re mature enough to word a complete answer, tell me more about how you know so much about computers.

          • Guy Parris


            AMD’s 280X gets beaten by the 770 and 780. I wonder if the 290 is actually going to beat the Titan as has been claimed; I really don’t think so. If the 280X can’t even beat the GPU it’s targeted at, which is the 770, then I highly doubt the 290X will beat the Titan, 690 or the 790 when it’s released. Just another bullshit GPU series yet again from AMD; no wonder they’re in a financial state.

          • Joe Wingett

            Actually, you really don’t know what you’re talking about. The only new card in the series is the 290X, which benchmarks indicate is at least equal to the Titan, if not better. And the 790 will not be released; the amount of cooling required on two 780 chips is too high. Besides, a dual-GPU card is only likely to be used in situations with limited slots, and those situations tend to have poor airflow. nVidia know this and know that their target market would rather buy two 780s than a 790. AMD also know this, which is why dual 7990s is a bad idea.

            Look back to the 500/600 series for inspiration. The 7970 came out to beat the 580. It did this with a huge margin. It then carried on beating every card until the 690 came out, a dual-GPU card. AMD produced the 7990 as a result, beating the 690 hands down. Then nVidia made the Titan and used it as the base for their next line of cards. The R9 290X is following suit, as a base for the next line of GPUs.

            And yes, the stock 280x cannot beat an aftermarket-cooled GTX 770. But nobody buys the reference card and keeps the stock cooler or clock speed.

          • Guy Parris

            How do you know it can beat a Titan when it hasn’t even been released or benchmarked yet? I will believe it when I see it. So far it’s the 260, 270 and 280, and they all get owned by nVidia.

          • Joe Wingett

            Because the GPU itself has been completed and released for testing, just not for consumer use yet. And you should add that the nVidia cards that beat them are easily £100 more expensive in almost every case. You should also mention you are comparing non-reference nVidia cards against reference AMD cards.

            And if you don’t understand what that means, you do not belong here.

          • Guy Parris

            Are you 100% sure that they’re £100 more expensive? Here in the UK the 3GB Asus 280X is £310 and the 4GB MSI Gaming 770 is £354. Do some looking before you jump to conclusions and talk rubbish. I understand non-reference cards, and then you get aftermarket cards like Gigabyte, MSI, EVGA etc. which have higher clock speeds and better cooling solutions. Reference cards are your bog-standard cards with nothing done to them, and reference cards suck; they’re bad for temps and noisier.

          • Joe Wingett

            So the highest price of the AMD card versus the lowest price of the nVidia card clearly shows the average?
            And the 7970 happily competes with the 770. It’s the 780 it can’t beat.

          • Guy Parris

            No, because for that £44 more you’re getting 1GB more VRAM, higher clocks, better drivers, better temps, smoother frames, better looks in HD, adaptive vsync, PhysX, ShadowPlay and GPU Boost 2.0. You are so misleading; the 770 beats the 7970 hands down.

            If the 280X can’t even beat the 770, what is the 7970 going to do, hm? Even at 4K the 770 owns the 280X, and that’s AMD’s territory; they’re supposedly better at higher resolutions/multi-monitor setups. Yet Nvidia just come along and steal the show as per usual 🙂 Nvidia will alwaaaaaaaaays be better than AMD because AMD are cheap for a reason.

          • Joe Wingett

            For a start, you don’t seem to notice the fact that at 4K the AMD GPU is beating the 770. Again, you’re comparing a reference card to an aftermarket card. And once more you fail to back up your argument with reputable evidence. I’m not going to reply to you any more; you’re too full of bull. If you can explain what adaptive vsync, ShadowPlay or PhysX actually do, then you must understand that they are not features to target.

            Geez, arguing with a fanboy is a nightmare. Do they teach you how to come out with that crap at nVidiot classes, or do you just inject yourself with advertising garbage?

          • Guy Parris

            They both beat each other in the 4K benchmarks; the point I’m proving is that’s where AMD is supposedly supposed to shine over Nvidia, but they don’t. Where does it say aftermarket and reference? You do know there are aftermarket 280s available now, yeah? Adaptive vsync provides extremely smooth frames, ShadowPlay records gameplay and PhysX adds really cool effects in games. Not features to target, hahaha; who wouldn’t want really smooth frames? It’s just called common sense. False advertising is saying you’re going to fix latency issues and saying your next GPUs are going to destroy the 700 series; yeah, looks like it. At least Nvidia back their reputation up and constantly provide better drivers, and their frames are so silky smooth. Okay, you do that, because whatever you say is irrelevant and you are just getting made out to be a fool. AMD suck; get over it, man. Consoles were doomed from day one when Nvidia rejected them; now look what’s going to happen, they’re going to get terrible GPUs. Imagine how it feels to be ditched by an award-winning company because of their horrible, nasty GPUs. I feel bad for MS, Sony and you. Like it said in the article, Nvidia is the better experience. Aw, I bet the whole customer feedback was bullshit then? Just stick to AMD like a cheapskate.

          • Guy Parris

            I’m not a fanboy; I have just experienced AMD’s horrible GPUs, and I would never touch their CPUs after reviews of the 8350. That’s false advertising right there again: they said “aw, we’re going to blow Intel out of the water” blah blah, and did they???? No, they got pumped by Intel as per usual. Nvidia and Intel hold most of the market. When I had my 7950 it was terrible: I was getting white flashing in games, stuttery frames, artifacts, horrible, horrible drivers; all there were were betas, and a proper driver suite would finally come three months later, lol. Even in Darksiders 2, when you put vsync on through Catalyst Control Centre I still got screen tearing because it kept going over my refresh rate by 1 or 2 fps; that’s how bad they are, they can’t even do vsync right. Nvidia, on the other hand, is lush. You might wonder why I needed to force vsync on: well, in games like Dead Space 3, when you use in-game vsync it caps at a horrible 30 fps, so you had no option but to play with screen tear; then I found out it’s best to force vsync, so okay, I’ll try that. Aw, would you look at that, still screen tearing because the shitty CCC vsync doesn’t work properly. Moved to Nvidia, it caps at 60 fps, lovely; now I can have smooth frames and no horrible screen tear. The thing is, forcing vsync is smoother than in-game vsync, so you’re kinda fucked if you go with AMD then, eh? You’re either stuck with latency issues/laggy frames or screen tear :).

          • RedBearded T

            The 8350? You mean the one that would cost another $100-$200 for the equivalent Intel CPU?

    • Well it is very interesting that you’d use this article to attack AMD when even the author of the article questions the validity of Origin PC’s claims stating “stability and overheating issues? Radeons and GeForces from major board partners often ship with similar coolers, and their GPU temperatures are usually in the same ballpark—whether on entry-level or high-end parts”. I think you’d be an idiot to suggest AMD is “worse” than Nvidia. Even the most bitter fanboy with a shred of intelligence can see both AMD and Nvidia have their respective advantages and disadvantages. Nvidia often benefits from better power efficiency at the high end while AMD benefits from having dramatically more compute. Nvidia offers excellent CUDA acceleration while AMD offers excellent OpenCL and GL acceleration. AMD has more mature multi-display technology while Nvidia has a more mature clock-speed boosting algorithm (GPU Boost 2.0).

      • guy parris

        Haven’t AMD been saying for months that they’re fixing latency issues? Some people are saying they have for single GPUs; looks like it! Please, comparing AMD to Nvidia is like comparing a SNES to a PS4. Nvidia are way better and more advanced than AMD; AMD are the noobs of the PC industry and they don’t know any better. Nvidia uses less power, like you said, but they also provide smoother frames, PhysX, adaptive vsync, better temps, better looks in HD, and better and more frequent drivers. What’s AMD good for? More power consumption, shit drivers, looks worse in HD, higher temps, worse frame times, latency issues, and it takes forever for new drivers or for problems to get fixed. AMD suck. There is no point in comparing the two because AMD is not in Nvidia’s league. AMD are for people on a budget or who don’t want to spend 30 or 40 more on quality components.

          • Frame latency issues exist only in some CrossFire configurations and on some 4K panels; AMD fixed the vast majority of them, and they never existed in single-GPU configurations. All your other points are just unsubstantiated. Better temps: not true. Looks better in HD: not true. Better and more frequent drivers: not true (both output beta drivers at a similar pace). Higher power consumption is only true at the highest end right now (7970 vs 770 and so on); at the mid range both are competitive, and in some cases AMD is better (the HD 7870 is one of the most power-efficient GPUs on the market). Anyway, I don’t intend to respond again; it’s like feeding a troll.

          • Guy Parris

            AMD even suck at 4K.

            See, that’s what you boys keep saying. When I had my Gigabyte 7950 two months ago I still had worse frames, hence the reason I got it refunded and went straight for a 770. The only reason I went to AMD is because people were saying how the frames are smoother, no more latency issues; it’s a load of nonsense. Are you being serious about the drivers? It took them three months to make another suite; all the rest were betas. They do look better in HD, more detailed; AMD looks less crisp. Lower temps by far: right now my GPU idles at 22°C; my 7950 used to idle around 28/29°C. The 770 is not mid-range, lol, what are you going on about? The 770 beats the 7970 by 10 to 12 fps in some cases. The 280X can’t even beat the 7970 and it’s supposed to be a 7970 rebrand; it’s supposed to perform better. FAIL!! The 770 is a high-end card; mid-range is like a 7870, 660 Ti, 7950 etc.

          • Georg Dirr

            You are retarded. The latest 390X Fury figures clearly show AMD is superior at 4K gaming. Go suck an Intel weeny.

        • Georg Dirr

          Lol, the 290X gets the same figures as the Titan X on 3DMark draw calls. I will pay 300 or less for my 290X. Enjoy paying over 700 for your Titan X.

        • Multiplataformgamerz

          “AMD the noobs of the PC industry”, lmao. We are talking about AMD, who took Intel’s sales down over the years, the noob that INVENTED 64-bit processors, an almost 40-year-old noob. Dude, do us a favor and remove your mouth and fingers from your body so you don’t write/talk sht anymore.

    • Major Tom

      Consumer brand loyalty is kinda dumb and has no advantage to you, the consumer. I don’t think, as a fanboy, you should be able to label others a fanboi.

    • Georg Dirr

      Lol, enjoy paying for Intel CEO bonuses. We AMD fans are the future, not money-hungry corporates.

  • Georg Dirr

    The 390X Fury is now the 4K gaming king and cheaper than the Nvidia Titan X. Enjoy. Also, the Fury X can only get better with Vulkan and DX12. Game over, Nvidia; get your golden handshakes and parachutes ready for Nvidia CEOs. Don’t worry, Nvidia tards helped pay for them lavishly.