Investing in GPUs for AI – AMD GPUs vs NVIDIA GPUs

It was only six months ago that we wrote about “The Artificial Intelligence Stock That Rocked Wall Street”, and the title of that article was hardly an exaggeration. We were referring to the +29% jump in the price of NVIDIA (NASDAQ:NVDA) after an earnings surprise, which took the share price to $88 and gave the company a market cap of around $45 billion. A 29% jump for a company that size is rarely seen unless it is explained by a corporate event.

You could hardly be blamed for thinking that the move was too dramatic and deciding to “wait until it settles down” before building a position in what is the closest thing to a pure-play artificial intelligence (AI) stock out there right now. Fast forward to today, and as we type this, shares are breaking the $125 barrier after another surprise earnings call which saw another +20% jump, bringing NVIDIA’s market cap to $88.5 billion. NVIDIA is now more than half the size of Intel (NASDAQ:INTC).
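For readers who like to sanity-check the math, the pre-jump prices implied by those moves can be worked out with a quick back-of-the-envelope sketch. The inputs below are just the figures quoted in this article, not official NVIDIA data:

```python
# Back-of-the-envelope check on the share price moves quoted above.
# Inputs are this article's figures, not official NVIDIA data.

def pre_jump_price(price_after: float, jump_pct: float) -> float:
    """Share price implied immediately before a percentage jump."""
    return price_after / (1 + jump_pct / 100)

# First earnings surprise: a +29% jump took the stock to $88
print(f"Price before the +29% jump: ${pre_jump_price(88, 29):.2f}")   # ~ $68.22

# Second earnings surprise: a +20% jump took the stock to $125
print(f"Price before the +20% jump: ${pre_jump_price(125, 20):.2f}")  # ~ $104.17

# Total move across both surprises, from ~$68 to $125
total_gain = (125 / pre_jump_price(88, 29) - 1) * 100
print(f"Gain over the period: {total_gain:.0f}%")
```

In other words, the stock roughly traded from the high $60s to $125 over those six months, which is the sort of run that rocks Wall Street.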

If you already hold shares in NVDA or you are thinking about opening a position, one of the first things you have to be wondering about is the competitive space for AI chips. Here are some relevant questions we might ask:

  1. What is NVIDIA’s current market share for GPUs? Who are the main players?
  2. Are there other types of chips that can be used as substitutes for GPUs in the context of AI?
  3. Are there any startups building AI chips that could challenge NVIDIA’s dominance?

As for the third question about startups, a year ago we wrote about “5 Startups Building Artificial Intelligence Chips” and it seems about time we took another look at what’s going on in this space. In this article, though, we’re going to try to answer questions 1 and 2. First, let’s start by covering some of the basics in layman’s terms while using as few acronyms as possible.

What is a GPU?

Can anyone remember saving up money from odd jobs as a kid to pick up one of these, followed by a frustrating weekend on the living room floor trying to install it while hoping “the static didn’t fry it”?


No? We’re just huge nerds then? Well, for those of you not in the know, the box above contains a video card, or graphics card, that you would use to play video games (warez) that you copied from someone else (pirate BBSs). In those days you needed to soup up your desktop computer to play proper video games, and the most important thing next to processor speed was the graphics card. These graphics cards use chips from NVIDIA called Graphics Processing Units (GPUs), and the very first GPU came out in 1999 under various brands (such as the brand seen above). The GPUs on these cards are referred to as “discrete GPUs”, which means they are installed separately from the computer’s main system or motherboard. A discrete GPU uses its own RAM instead of the system RAM, so it delivers better performance.

GPU Substitutes for AI Hardware

What NVIDIA has going for it is a first-mover advantage. There are quite a few substitutes for GPUs, such as the Field Programmable Gate Array (FPGA), which is an alternative to GPUs… but not really. Here’s a paper put together in February 2017 by a bunch of engineers over at Intel titled “Can FPGAs Beat GPUs in Accelerating Next-Generation Deep Neural Networks?” which concludes that “FPGAs may become the platform of choice for accelerating next-generation DNNs”. Intel has probably been thinking that for a while, because back in 2015 they plunked down $16.7 billion to buy FPGA manufacturer Altera (the largest acquisition Intel has ever made). Then, the following year, Microsoft did this:

Microsoft has revealed that Altera FPGAs have been installed across every Azure cloud server, creating what the company is calling “the world’s first AI supercomputer”.

The above quote came from an aptly titled article by Michael Feldman, “Microsoft Goes All in for FPGAs to Build Out AI Cloud“, which pretty much says that Microsoft and Intel are betting on a future where FPGAs are the dominant AI hardware, not GPUs. Intel is also making other AI-related investments such as their acquisition of Nervana, one of the startups we covered in our last article on 5 startups building AI chips.

So now that you have some major tech players betting on FPGAs as the future of AI hardware, Google then decides to go and release its own AI hardware, the “Tensor Processing Unit” (TPU), which is optimized for its AI software framework (TensorFlow), quite possibly the most advanced AI technology out there. And that’s not even the whole story. There are many other types of chips out there, each with its own application for machine learning. Take the below graph from an expert over at Moor Insights & Strategy who looks at this stuff constantly:

The above chart was taken from an article by Moor Insights & Strategy published on Forbes a few months ago. The article is titled “Machine Learning Landscape: Where AMD, Intel, NVIDIA, Qualcomm And Xilinx AI Engines Live” and it does a great job of showing how there won’t necessarily be a single winner. With potentially limitless applications for AI, nothing says there has to be one winner. The fact is though that NVIDIA seems to be the one with all the positive earnings surprises and that could be because they appear to be dominating the discrete GPU market.

Who Sells Discrete GPUs Today?

Over the years there were quite a few players, but as time went on, the number of companies selling discrete GPUs came down to two – AMD and NVIDIA. Below you can see a chart which shows the market share breakdown of discrete GPUs over time:

AMD GPUs vs Nvidia GPUs

It’s important to emphasize that the above numbers do not reflect the total market share for GPUs, only discrete GPUs. While discrete GPUs were initially used for gaming, over time they have become useful for emerging technologies like VR and, in particular, AI. Now that we know there are two players in the game, we want to try and understand how formidable a competitor AMD is. We’re not going to compare products; rather, we’re going to look at each company’s stated commitment to developing AI hardware.


We know that there are two main players selling discrete GPUs. As investors, we want to understand whether both companies are equally committed to developing GPUs specifically for AI and marketing to that audience. Are they both committing the R&D needed to develop new products and achieving the profitability needed to further fund that R&D? Here’s a look at a small graphic we put together which answers some of these questions:

That last part we really found to be remarkable. AMD does not appear to have any interest in showing how their product can be used for AI. Actually, that’s not entirely true. When we went to AMD’s website, we really had to dig before we could find anything about their commitment to AI, but what we finally came across was this presentation on their Investor Relations page which talks about how their graphics card can be used for “Machine Leaning”:

As our readers are quick to point out to us, if you want to be taken seriously you can’t make obvious typos in important places. The fact that nobody has caught that yet suggests the marketing team just isn’t on point and that not too many people read that page. Now, there could very well be something called “Machine LEANING” and we’re just ignorantly unaware of it, so we’ll leave that door open. The real problem here is that the only mention of future plans for AMD to develop GPUs for AI hardware is a single deck.

The key takeaway appears to be that AMD is not a contender when it comes to selling GPU-based AI hardware, and while plenty of non-AI growth is expected from areas like VR and general gaming, AI is where investors see some unrivaled growth opportunities. That’s why we’re all looking for the “Cisco of artificial intelligence”. These are the picks-and-shovels plays that create wealth, and we want to have some skin in the game.

Most people probably compare AMD vs NVIDIA by looking at the quality of their product offerings. Gamers are an incredibly passionate bunch when it comes to choosing technologies and having an allegiance to a vendor, such as the Russian man who killed his “friend” over an AMD vs NVIDIA argument. As investors, we don’t care so much about these rivalries. Sure, we can run market surveys that try to figure out “which GPU is best”, but it all comes down to this: which company is showing its investors a commitment to developing AI hardware internally, aggressively marketing that commitment externally, and then showing the results come earnings time?

The co-founder and CEO of NVIDIA recently told TechCrunch that “someday, we would likely just become an AI computing company”. There are of course people who will plaster the words “AI” and “deep learning” all over their websites and marketing collateral and then not follow through by “showing you the money”. In the past six months, NVIDIA has “shown us the money” twice and simply rocked Wall Street with its results. Is this getting a bit frothy? If you’re measuring “frothiness” by volatility, then perhaps. If you’re asking which company is making it most obvious how far it is committing to AI, and showing results today, then the answer has to be NVIDIA.

NVIDIA is one of the core holdings in our tech stock portfolio along with four other AI stocks. The complete list of disruptive tech stocks and ETFs we’re holding can be found in the “Nanalyze Disruptive Tech Portfolio Report,” now available for all Nanalyze Premium annual subscribers.

4 thoughts on “Investing in GPUs for AI – AMD GPUs vs NVIDIA GPUs”
  1. Have you really searched AMD’s website? They have a full branch of machine learning GPUs coming, most notably the Radeon MI25, which is claimed to outperform NVIDIA’s latest discrete GPUs.

    1. Hi Vincent,

      Thank you for the comment. These machine learning GPUs may be coming, yet few mentions seem to be made of them in any of AMD’s collateral. You shouldn’t have to “really search” if their marketing team were making customers and shareholders aware of this.

  2. Nice sales pitch at the end. Do you guys really know about any of the technology or markets you’re talking about, or is this the sophomore-writing-level spam mill that I think it is?
    Yeah, it’s the latter. I feel bad for your readers, assuming there are any.
