AMD HD 7000 series to use PCI-E 3.0




The next generation of AMD graphics cards will be branded under the HD 7000 series name, following on from the current HD 6000 series. According to DonanimHaber, the HD 7000 series cards will all use the PCI-E 3.0 interface while remaining backwards compatible with the older PCI-E 1.0 and 2.0 interfaces. Most motherboards today feature the PCI-E 2.0 interface, but some of the latest boards are shipping with PCI-E 3.0 capabilities, including recent models from ASRock.

The new bus doubles the bandwidth of PCI-E 2.0, offering roughly 1GB/s of bandwidth per lane. A PCI-E 3.0 x16 slot will provide 32GB/s of bandwidth compared to 16GB/s for PCI-E 2.0 in x16 mode. The codename of the next-generation 28nm GPUs will be "Southern Islands", and the first PCI-E 3.0 compliant platforms are likely to be based on Intel's Sandy Bridge-E processors. It is unclear whether AMD's motive for moving to PCI-E 3.0 is to avoid a PCI-E 2.0 bottleneck or simply to comply with the new PCI-E standard.
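For readers wondering where those figures come from, here is a minimal Python sketch of the arithmetic. The transfer rates and encoding schemes (8b/10b for PCI-E 1.0/2.0, 128b/130b for PCI-E 3.0) are from the published PCI-SIG specifications; the x16 totals it prints match the bidirectional figures quoted above.

```python
# Minimal sketch of how PCI-E per-lane bandwidth is derived.
# Transfer rates and encoding schemes are from the PCI-SIG specs;
# the x16 totals reproduce the bidirectional figures in the article.

GENERATIONS = {
    # name: (transfer rate in GT/s, payload bits, total bits per symbol)
    "PCI-E 1.0": (2.5, 8, 10),     # 8b/10b encoding
    "PCI-E 2.0": (5.0, 8, 10),     # 8b/10b encoding
    "PCI-E 3.0": (8.0, 128, 130),  # 128b/130b encoding
}

for name, (rate_gt, payload, total) in GENERATIONS.items():
    # Effective GB/s per lane per direction:
    # transfer rate * encoding efficiency / 8 bits per byte
    per_lane = rate_gt * (payload / total) / 8
    x16_bidirectional = per_lane * 16 * 2  # 16 lanes, both directions
    print(f"{name}: {per_lane:.2f} GB/s per lane per direction, "
          f"{x16_bidirectional:.0f} GB/s for x16 bidirectional")
```

The doubling comes not just from the faster transfer rate (8 GT/s vs 5 GT/s) but from the more efficient 128b/130b encoding, which wastes far less of the raw link rate than the 8b/10b scheme used by earlier generations.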



Comments

4 Responses to “AMD HD 7000 series to use PCI-E 3.0”
  1. Eternalchaos says:

Odd, how can AMD release cards with PCI-E 3.0 when not a single one of AMD's motherboards (990FX) supports it? :confused:

  2. gaetan says:

Eternalchaos;22207 wrote: Odd, how can AMD release cards with PCI-E 3.0 when not a single one of AMD's motherboards (990FX) supports it? :confused:

That's how companies work… incomprehensibly :p

  3. Leo Bien Durana says:

    Epic fail for AMD or epic win for Intel? lol

  4. gaetan says:

    Good question, I'll have to think about that 🙂


