Will AMD "Save" the GPU Market or Maintain the Status Quo?


Co-published on Publish0x.

The graphics card market has been through the wringer over the past year or so. Jon Peddie Research reported that Q3 2022 saw the biggest quarter-to-quarter drop since 2009: GPU shipments fell 10.3% quarter-over-quarter and 25% year-over-year. This comes after GPU shipments also suffered a massive year-over-year drop in Q2 2022.

The root cause is simple: consumers are not buying graphics cards in large numbers. Even though prices dropped throughout the year, plenty of RTX and Radeon cards remained in stock. With Lovelace and RDNA3 on the horizon at the time, AMD and NVIDIA prepared further price cuts in September to move inventory.

With the current world economic situation, many consumers are looking for the best bang for their buck rather than spending exorbitant amounts of money on the best of the best. All eyes were on NVIDIA and AMD to offer cards that provide great value.

NVIDIA's Lovelace

Back in September, I expressed my suspicions about NVIDIA's Lovelace lineup. The 4080 12GB model, in particular, was a 4080 in name only. With significantly fewer CUDA cores and a smaller, slower memory pool, the 4080 12GB was really a 4070. Even worse, NVIDIA planned to launch the card at a $900 asking price, nearly twice the 3070's launch MSRP. My suspicions proved correct when NVIDIA "unlaunched" the 4080 impostor on October 14.

On October 12, 2022, NVIDIA launched its flagship Lovelace card, the RTX 4090. Independent third-party reviews, such as Hardware Unboxed's, found the card extremely impressive: around 60% better rasterization performance and up to twice the ray-tracing performance of the RTX 3090Ti. Unfortunately, it came with a hefty price tag of $1,600 for the Founders Edition, and even more for AIB models. Not to mention, NVIDIA took a non-trivial PR hit when reports of melting 16-pin 12VHPWR connectors began to circulate.



About a month later, the RTX 4080 launched to a lukewarm-to-negative reception. While performance was still impressive, at roughly 25% better than the 3090Ti at 4K, the $1,200 MSRP is an extremely hard pill to swallow. Unlike the 4090, which sold out at launch, the 4080 is struggling to move off shelves, and that is in spite of NVIDIA shipping fewer 4080s than 4090s. There are even cases of stores accumulating 4080 stock rather than selling through it.



NVIDIA's modus operandi is pretty clear. It wants consumers to buy the 4090, and one way to do that is to price the card one tier below it horribly. On top of that, NVIDIA wants consumers to buy up the remaining Ampere stock so the Lovelace cards do not end up competing with their previous-generation counterparts. The 3090Ti Founders Edition, for example, still asks $1,100, just below the 4080's price.

AMD's RDNA3

On November 3, AMD announced its RDNA3 cards, the 7900XTX and 7900XT. These will be the first consumer GPUs to use a chiplet design, which should theoretically reduce production costs compared to monolithic counterparts. Both cards are slated to launch on December 13 at $1,000 and $900, respectively.

Because these cards have yet to hit the market, there are no independent third-party reviews to confirm AMD's first-party benchmarks. AMD claims the 7900XTX will deliver a 50% to 70% rasterization uplift and a 50% to 60% ray-tracing uplift over the 6950XT, depending on the game. Daniel Owen did some napkin math and determined that for the 7900XTX to match the 4090 in rasterization, it would need to improve on the 6950XT by 75%; in ray tracing, it would need a whopping 165% improvement. Assuming a 60% average uplift, the 7900XTX would therefore land roughly 10% behind the 4090 in rasterization and nowhere close in ray tracing.


(Embedded video: Daniel Owen's analysis; the relevant part starts at 11:48.)


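To make that napkin math concrete, here is a minimal sketch of the arithmetic in Python. The 75% and 165% targets come from Daniel Owen's analysis above; the 60% and 55% uplifts are assumptions drawn from AMD's claimed ranges, not measurements.

```python
# Napkin math: where would the 7900XTX land relative to the 4090?
# Baseline is the 6950XT at 1.0x. All figures are estimates from the
# discussion above, not measured results.

RASTER_4090_VS_6950XT = 1.75  # 4090 is ~75% ahead of the 6950XT in raster
RT_4090_VS_6950XT = 2.65      # ~165% improvement needed to match in ray tracing

XTX_RASTER_UPLIFT = 1.60      # assumed 60% average of AMD's 50-70% raster claim
XTX_RT_UPLIFT = 1.55          # assumed midpoint of AMD's 50-60% ray-tracing claim

raster_gap = XTX_RASTER_UPLIFT / RASTER_4090_VS_6950XT
rt_gap = XTX_RT_UPLIFT / RT_4090_VS_6950XT

print(f"Rasterization: 7900XTX at {raster_gap:.0%} of the 4090")  # ~91%, i.e. ~10% behind
print(f"Ray tracing:   7900XTX at {rt_gap:.0%} of the 4090")      # ~58%, not close
```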
Interestingly, AMD confirmed that the 7900 series cards are competitors to the 4080, not the 4090. The pricing bears this out: the 7900XTX is $600 less expensive than the 4090 and $200 less expensive than the 4080. On top of that, the 7900XTX and 7900XT pack more video memory despite their lower price tags, at 24GB and 20GB respectively, which may come in handy for texture-heavy games.
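To put the memory-for-money angle in plain numbers, here is a quick sketch of MSRP per gigabyte of VRAM. The MSRPs are the launch prices cited above, the 4080's 16GB capacity is its official spec, and the figures are illustrative rather than a full value analysis.

```python
# Launch MSRP per gigabyte of VRAM (prices from the text above).
cards = {
    "RTX 4090":   (1600, 24),
    "RTX 4080":   (1200, 16),
    "RX 7900XTX": (1000, 24),
    "RX 7900XT":  (900, 20),
}

for name, (msrp, vram_gb) in cards.items():
    print(f"{name:>10}: ${msrp / vram_gb:.0f} per GB of VRAM")
```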

Digital Trends did its own investigation of how the 7900XTX stacks up against the 4080, and the results are positive. In Watch Dogs: Legion and Cyberpunk 2077, the 7900XTX beat the 4080 by 14% and 18%, respectively, with the 7900XT falling behind by the slimmest of margins. Both 7900 series cards handily beat the 4080 in Call of Duty: Modern Warfare 2. That said, the 4080 still beats the RDNA3 cards in ray tracing, handily in Cyberpunk 2077 and by a narrower margin in Hitman 3. The silver lining is that, unlike RDNA2, which ran very poorly with ray tracing on, the 7900XTX appears to deliver around RTX 3090/3090Ti levels of RT performance, which is "acceptable".

Closing Thoughts

Will AMD "save" the GPU market? Based on the MSRPs and the guesswork performance numbers, it appears AMD is willing to be aggressive against NVIDIA's offerings. When I wrote about the Lovelace cards in September, I was pessimistic and expected AMD to undercut NVIDIA's cards by only $100. I was quite surprised when AMD announced the 7900XTX's $1,000 MSRP: not only is the card $200 less expensive than the RTX 4080, it beats it handily in rasterization.

AMD's comments on its 7900 series indicate that it is prioritizing power efficiency and cost effectiveness. Unlike the 4090, which can draw up to 450W (though to be fair, it usually does not go that high), the 7900XTX "only" has a maximum TBP of 355W. On top of that, AMD has taken a page out of the Ryzen playbook and is using a chiplet design. Chiplets improve yield, and AMD can shrink specific parts to a newer node while keeping components that don't need it on older, cheaper processes. This reduces costs and speeds up production.
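To see why chiplets help yield, here is a hedged sketch using the textbook Poisson yield model. The defect density and die areas below are illustrative assumptions chosen for the comparison, not AMD's or TSMC's actual figures.

```python
import math

# Classic Poisson yield model: yield = exp(-defect_density * die_area).
# Defect density and die areas are illustrative assumptions only.
DEFECTS_PER_MM2 = 0.001  # assumed 0.1 defects per cm^2

def die_yield(area_mm2: float) -> float:
    """Fraction of dies expected to be defect-free at a given area."""
    return math.exp(-DEFECTS_PER_MM2 * area_mm2)

monolithic = die_yield(520)   # one big hypothetical ~520 mm^2 die
compute_die = die_yield(300)  # smaller hypothetical graphics compute die
cache_die = die_yield(37)     # small hypothetical memory/cache die

print(f"Monolithic 520 mm^2 die: {monolithic:.1%} yield")   # ~59%
print(f"300 mm^2 compute die:    {compute_die:.1%} yield")  # ~74%
print(f"37 mm^2 cache die:       {cache_die:.1%} yield")    # ~96%
```

Each defective small die wastes far less silicon than a defective monolithic one, and the small cache dies can stay on a mature, cheaper node.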



My one big gripe is how AMD priced the 7900XT. I don't think $900 is low enough: with the 7900XTX likely offering more than an 11% performance uplift for 11% more money, you might as well get the 7900XTX. It should be $850 at most. It's a bit ironic that AMD prices its products very competitively against NVIDIA's Lovelace, yet also adopts NVIDIA's strategy of upselling the flagship card by pricing the tier below poorly.
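For what it's worth, the 11% figure is just the price ratio. Here is a minimal sketch of the value comparison; the 25% uplift is a hypothetical stand-in, not a benchmark result.

```python
# Value check: the 7900XTX costs $1000/$900 - 1 = ~11.1% more than the
# 7900XT. If its performance uplift exceeds that, the XT is strictly
# worse value per dollar.
XT_PRICE, XTX_PRICE = 900, 1000

price_premium = XTX_PRICE / XT_PRICE - 1
print(f"XTX price premium: {price_premium:.1%}")  # ~11.1%

assumed_uplift = 0.25  # hypothetical XTX-over-XT uplift, for illustration
xt_value = 1.0 / XT_PRICE                     # performance per dollar, XT = 1.0x
xtx_value = (1.0 + assumed_uplift) / XTX_PRICE
print(f"XTX is better value: {xtx_value > xt_value}")  # True for any uplift > ~11.1%
```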

Overall, I don't think AMD will "save" the GPU market... yet. NVIDIA's mindshare is a tough nut to crack, and AMD sometimes gets in its own way (just look at Zen 4 CPU sales, for example). However, I think Radeon will follow a trajectory similar to Ryzen's. RDNA3 will make some noise as AMD's first foray into chiplet GPUs. With that experience under its belt, AMD can make an even bigger improvement with RDNA4, much like Zen 2 was a huge leap over Zen 1. Perhaps AMD will even bring its 3D V-Cache technology to Radeon, too. Who knows?


Comments

The current 200+ watt graphics cards are a desperate attempt by NVIDIA to maintain sales and market share.

Modern integrated graphics (PCs, laptops, mobile) are fast and capable enough to supply sufficient performance for basically every customer, including typical gaming.

Around 2010, integrated graphics ate up the market for low-end graphics chips, and by now, integrated solutions have eaten up the midrange as well.

As NVIDIA failed to deliver a marketable SoC (see the fiasco of its Tegra line), the only way for it to go is upward, with crazy complexity and power consumption and, of course, price. It's basically a death spiral: decreasing demand is unable to sustain the giant engineering effort required to design graphics chips.

The question is, how many years until integrated graphics will be powerful enough to compete with the highest-end graphics cards, and will NVIDIA be able to pull a rabbit out of the hat against them?


That was why NVIDIA tried to acquire ARM, but the deal fell through and backfired badly on the company. Not to mention, Ethereum's move to proof of stake tanked the already-declining demand for GPUs at the time of the Merge.

And as you've said, integrated graphics have become pretty good. You can get a Minisforum mini-PC with the Ryzen 6900HX, a chip more powerful than the Steam Deck's Van Gogh chip, and that device performs pretty well. Even Intel is improving, though it's not as good as AMD yet. Battlemage will at least exist in laptops, but I hope Intel keeps going with desktop GPUs and gets the drivers all sorted out.
