Advanced Micro Devices (AMD) went into 2017 with an aggressive product roadmap that both excited bulls and invited skepticism from bears: The company promised to roll out PC and server CPUs that would make it competitive in several markets long dominated by Intel Corp. (INTC), as well as GPUs meant to do the same in the high-end PC and workstation markets dominated by Nvidia Corp. (NVDA).
On the CPU front, AMD, aided by its new Zen CPU core architecture, has largely been making good on its promises, even if questions still remain about how much share it will take in the notebook and server CPU markets. But in GPUs, judging by the recently shared details of high-end desktop chips based on AMD’s new Vega architecture, the company faces more of an uphill battle.
At the SIGGRAPH graphics conference, AMD unveiled the Radeon RX Vega 64 and 56, a pair of desktop GPUs that respectively carry graphics card MSRPs of $499 and $399 and begin shipping on August 14th. Both products are paired with 8GB of high-speed HBM2 graphics memory and should easily outclass AMD’s prior flagship desktop GPU, the Fury X.
With the caveat that teraflops (TFLOPs) are an imperfect way to measure a GPU’s performance (until reviewers share benchmarks in a couple of weeks, they’re the best metric we’ve got), the Vega 64 delivers 12.6 TFLOPs of 32-bit (FP32) processing power and the Vega 56 clocks in at 10.5 TFLOPs, while the Fury X, based on AMD’s older Fiji architecture, delivers 8.6 TFLOPs. The Fury X, which has carried a $649 MSRP, has been an afterthought in the high-end market thanks to the presence of superior Nvidia GPUs based on the Pascal architecture unveiled last year. That has effectively restricted the AMD-Nvidia wars to sub-$300 products, where AMD’s low-power Polaris architecture GPUs have been pretty competitive.
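Those TFLOPs figures follow from a standard back-of-the-envelope formula: stream processors times boost clock times two, since each stream processor can perform one fused multiply-add (two FLOPs) per cycle. A quick sketch using the cards' published stream-processor counts and boost clocks:

```python
# Peak FP32 throughput estimate: stream processors x boost clock (GHz) x 2,
# divided by 1000 to convert GFLOPs to TFLOPs. Each stream processor can
# retire one fused multiply-add (2 FLOPs) per cycle.
def peak_tflops(stream_processors, boost_clock_ghz):
    return stream_processors * boost_clock_ghz * 2 / 1000

# Published specs: (stream processors, boost clock in GHz)
cards = {
    "Vega 64": (4096, 1.546),
    "Vega 56": (3584, 1.471),
    "Fury X":  (4096, 1.050),
}

for name, (sps, clock) in cards.items():
    print(f"{name}: ~{peak_tflops(sps, clock):.2f} TFLOPs")
```

Note this is a theoretical peak; real-world game performance also depends on memory bandwidth, drivers and how well games keep those stream processors fed, which is why Vega's TFLOPs lead over Pascal doesn't translate one-for-one into frame rates.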
In addition to standalone Vega products, AMD announced three “Radeon Pack” bundles that come with $300 worth of discounts on a 34-inch Samsung display and on a combo featuring a high-end AMD Ryzen 7 CPU and motherboard, as well as (in some regions) two free games. Vega 64 and 56 bundles cost $100 more than standalone cards, while $699 gets you a bundle with a liquid-cooled Vega 64 that delivers 13.7 TFLOPs of performance.
Also unveiled: the Radeon Pro WX 9100 and Radeon Pro SSG, a pair of Vega-based workstation graphics cards that will compete against Nvidia’s popular Quadro line. The WX 9100 has a $2,199 MSRP and is said to deliver a 160% improvement in peak performance per GPU clock cycle relative to AMD’s older FirePro W9100 card. The SSG, meant for video-editing and other workloads where a ton of data is being processed, has a $6,999 MSRP and (interestingly) comes with 2TB of flash storage on board. Both cards arrive on September 13th.
The Vega 64 and 56 are being pitched as superior alternatives to Nvidia’s GeForce GTX 1080 and 1070 GPUs, which respectively feature $499 and $379 MSRPs (for now, the 1070 is selling for more than that due to demand from Ethereum miners). A set of leaked benchmarks showing average game frame rates has the Vega 56 outperforming the 1070 by an average of 19% when running popular titles at a 1440p resolution and high-quality settings. With the 1080 outperforming the 1070 by 20% or so in many game tests, this bodes well for the Vega 64’s ability to at least hold its own against the 1080, given the 64’s superior specs relative to the Vega 56.
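The inference above is just a chain of ratios: if the Vega 56 runs roughly 1.19x a 1070 and a 1080 runs roughly 1.20x a 1070, the Vega 56 already sits within about a percent of the 1080, leaving the faster Vega 64 room to match or beat it. A minimal sketch (the 1070-equals-100 baseline is arbitrary; the percentages are the ones cited above):

```python
# Normalize everything to GTX 1070 performance = 100, then compare.
gtx_1070 = 100.0
vega_56 = gtx_1070 * 1.19   # leaked benchmarks: Vega 56 ~19% ahead of the 1070
gtx_1080 = gtx_1070 * 1.20  # typical gap between the 1080 and the 1070

# How far ahead of the Vega 56 the 1080 lands, in percent.
gap_pct = (gtx_1080 - vega_56) / vega_56 * 100
print(f"1080 vs. Vega 56: {gap_pct:+.1f}%")
```

With the Vega 56 already within about 1% of the 1080 on these numbers, any meaningful uplift from the Vega 64's extra stream processors and higher clocks should close the remaining gap.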
And AMD isn’t just banking on competitive average frame rates, but also superior minimum frame rates, which matter a lot when it comes to delivering smooth gameplay at higher resolutions and quality settings. At its launch event, AMD claimed the Vega 64 had an 18% edge relative to the 1080 in a minimum frame rate test involving several games, even as average frame rates were similar.
But Nvidia still has some big advantages. The obvious one: The company offers two GPUs much more powerful than the 1080, the $699 1080 Ti and $1,199 Titan Xp. The 1080 Ti outperforms the 1080 by over 30% in many benchmarks, and should have little trouble outperforming the Vega 64. For gamers wanting to indulge themselves, the Titan Xp can boost frame rates by another 10% or so.
Also: Nvidia’s high-end Pascal GPUs are less power-hungry than the Vega 64 and 56. The Vega 64’s 295-watt graphics card max power draw (TDP) is far above the 1080’s 180 watts, while the Vega 56’s 210-watt TDP easily tops the 1070’s 150 watts. In addition to upping a gamer’s electricity bill, higher power consumption can significantly increase system noise for a non-liquid cooled card, since a card’s fans have to spin faster to keep a GPU from overheating.
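For a sense of what the TDP gap means in practice, here is a rough sketch of the extra electricity cost. The 115-watt delta comes from the TDPs above, while the daily gaming time and the $0.12/kWh rate are illustrative assumptions, not figures from the article:

```python
# Extra annual electricity cost of a Vega 64 vs. a GTX 1080 at full load.
delta_watts = 295 - 180      # TDP gap from the cards' spec sheets
hours_per_day = 3            # assumed daily gaming time (illustrative)
rate_per_kwh = 0.12          # assumed electricity rate in $/kWh (illustrative)

extra_kwh_per_year = delta_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * rate_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year, roughly ${extra_cost:.2f} extra")
```

On these assumptions the difference works out to a bit over $15 a year, so the bigger practical drawbacks are arguably the noise and the beefier power supply a 295-watt card demands.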
Nvidia’s software feature set also acts as a selling point. It includes things like the company’s ShadowPlay technology for livestreaming gameplay with limited overhead, and its Ansel technology for taking ultra-high-res game screenshots.
AMD, to be fair, also has some selling points outside of performance. For example, while both AMD and Nvidia offer technologies meant to match a monitor’s refresh rate with a graphics card’s frame rate, AMD’s solution (known as FreeSync) is cheaper and supported by far more monitors than Nvidia’s (known as G-Sync). But overall, Nvidia arguably retains an edge in software features.
Moreover, it might not be long before Nvidia takes steps to eliminate any price/performance edge Vega claims. The fact Nvidia can still sell the 1080 Ti and Titan Xp to gamers demanding top-notch performance could give it some room to cut 1080 and 1070 prices without badly damaging its overall high-end margins. And having unveiled a new flagship server GPU based on its next-gen Volta architecture in May (it’s due to ship later this year), the company might be just a few months away from launching Volta-based PC GPUs. AMD’s Navi architecture, the successor to Vega, meanwhile isn’t expected until late 2018 or early 2019.
Ultimately, while AMD deserves credit for once more making itself competitive in a portion of the high-end market, it’s hard for the company to overcome the big gap between its GPU R&D budget and Nvidia’s. AMD spent $1 billion on R&D in 2016, and a large portion of that went towards developing CPUs. Nvidia spent $1.46 billion on R&D in fiscal 2017 (which ended in January), and, though the company also invests in things like Tegra app processors and Drive PX autonomous driving systems, the lion’s share appears to have gone towards GPU development, which of course also underpins product lines such as Tegra and Drive PX.
It’s hard to contend with such a spending disadvantage in an R&D-intensive business without avoiding some battles and compromising in others. The specs of the first Vega desktop GPUs make that clear.