Transcript for:
Nvidia Earnings Call Insights

Nvidia just had their most important earnings call ever. Whether you're a long-time Nvidia shareholder or looking to buy Nvidia stock for the first time, you should know that this isn't just about revenues and profit margins. Nvidia's earnings decide the speed and scale of the entire AI revolution.

So, in this video, I'll break down everything you need to know about Nvidia's latest earnings call and what it means for the entire stock market. Your time is valuable, so let's get right into it. To say this was one of the most anticipated earnings calls of all time would be a massive understatement, because of what it represents for the entire tech sector. Let me explain the reason for all this hype.

Ever since OpenAI released ChatGPT in November of 2022, the stock market's returns have been largely driven by the Magnificent Seven. That's because companies like Amazon, Meta Platforms, and even Tesla already have huge data center infrastructures, they have global products and services that benefit from AI, and they have very deep pockets. That puts them in the perfect position to invest aggressively in AI and outspend the startups and smaller businesses that compete with them, especially at a time when interest rates and inflation are making everything more expensive. Ha, boy do I feel that. So, that's where investors put their money.

As a result, the Magnificent 7 outperformed the other 493 companies in the S&P 500 by a massive margin. But even among those 7, one company far outshined the rest, and that's Nvidia since they're supplying all the AI chips that everyone else is investing in. In a gold rush, you want to be the company that sells picks and shovels.

If Nvidia's revenue drops below expectations, that means companies are investing less into AI infrastructure than we thought, and the AI gold rush is slowing down for the rest of the market. If Nvidia's revenues grow, that means the opposite: this AI revolution could be bigger than we thought, or at least happen faster, which drives up forward estimates for the rest of the market. So in a very real way, Nvidia has been driving the returns for the rest of the S&P 500. That's one reason why their earnings are always so important. But I said this earnings call specifically was the most important ever, and that's because of Blackwell, Nvidia's next-generation AI chip.

I believe the Blackwell ecosystem will be the primary picks and shovels in this entire AI gold rush, so its speed and scale will largely be determined by how fast Nvidia can get Blackwells into data centers and at what cost. So that's what I'll cover in this video. I'll walk you through Nvidia's earnings results focusing on data centers.

I'll explain what's going on with Blackwell, their hugely important next-generation AI chip, I'll talk about Nvidia's biggest risks right now, like their supply constraints and their competition, and of course, what's in store for Nvidia stock in the short, medium, and long term. There's a lot to talk about, so let's dive right into Nvidia's most important earnings call. Nvidia posted record revenues of $30 billion for the quarter, which is up 15% quarter over quarter and a whopping 122% year over year. Nvidia also posted earnings per share of 68 cents, which is 171% higher than a year ago, after accounting for their 10-for-1 stock split. Things get even crazier when we focus on Nvidia's data center revenues, which now account for about 88% of their total revenues today.

That's why I'm not covering their other business units. Nvidia's data center revenues came in at $26.3 billion, up 16% quarter-over-quarter, and an insane 154% year over year. Just to be clear, that means Nvidia's data center revenues grew by 2.5 times in one year after they already became a trillion dollar company, and they're on pace to make over $100 billion in AI accelerator revenue this year alone.
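If you want to sanity-check those numbers yourself, here's a quick back-of-the-envelope calculation in Python. It's just my own sketch using the figures quoted above, nothing from Nvidia beyond that.

```python
# Back-of-the-envelope check on the data center figures quoted above.
quarterly_dc_revenue = 26.3  # billions of dollars, latest quarter
yoy_growth = 1.54            # 154% year-over-year growth

# 154% growth means revenue is 1 + 1.54 = 2.54x what it was a year ago.
growth_multiple = 1 + yoy_growth
print(f"Year-over-year multiple: {growth_multiple:.2f}x")  # ~2.5x

# Annualizing the latest quarter gives the run rate behind the
# "over $100 billion in AI accelerator revenue" pace.
annual_run_rate = quarterly_dc_revenue * 4
print(f"Annualized run rate: ${annual_run_rate:.1f} billion")  # ~$105 billion
```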

Compare that to Nvidia's two biggest quote unquote competitors. AMD isn't even close to Nvidia's data center sales, and Intel is, well… As a result, Nvidia holds over a 90% share of the data center GPU market. And that's before their Blackwell GPU sales ramp up. That huge market share is why Nvidia has so much pricing power, even as a hardware company.

They reported gross margins of 75.1%, putting Nvidia's profit margins on par with most software companies. And that's huge since, according to MarketUS, the global artificial intelligence market is expected to grow almost 12x over the next eight years, which is a compound annual growth rate of 36.8%. But many of the companies building next-generation AI applications are not publicly traded.
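By the way, if you want to sanity-check that 12x figure, it falls right out of the CAGR. This is just my own arithmetic using the growth rate quoted above:

```python
# A ~36.8% compound annual growth rate sustained for eight years
# multiplies the market size by roughly (1 + 0.368) ** 8.
cagr = 0.368
years = 8
multiple = (1 + cagr) ** years
print(f"{multiple:.1f}x over {years} years")  # ~12.3x, i.e. "almost 12x"
```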

Think about the 90s and early 2000s, when companies like Amazon and Google went public relatively early in their growth. Today, the next Amazon, the next Google, the next Nvidia stays private for much longer. I spent a lot of time digging into this, and the Fundrise Innovation Fund is a great way to invest in some of the best tech companies before they go public. Venture capital is usually only for the ultra-wealthy, but Fundrise's Innovation Fund gives regular investors access to some of the top private, pre-IPO companies on Earth, without breaking the bank.

The Fundrise Innovation Fund has an impressive track record, already investing over $100 million into some of the largest, most in-demand AI and data infrastructure companies. So, if you want access to some of the best late-stage AI companies before they IPO, check out the Fundrise Innovation Fund using my link below today. Alright, so Nvidia reported record revenues and earnings per share, both of which also beat analysts' expectations.

On top of that, they issued stronger than expected guidance for the current quarter. So why did Nvidia stock drop 8% after such good earnings? Well, even though their 75.1% gross margins are actually up 5 points year over year, they're down 3.3 points from the previous quarter. And this is where things get really interesting.

Nvidia's margins dropped because of something in Blackwell's designs that I've actually been calling out for a few quarters now. Something that's really important for investors to understand. This is Blackwell, the successor to Nvidia's current generation of GPUs called Hopper. And compared to Hopper, these new Blackwell chips perform a whopping 4 times better at AI training and an insane 30 times better at AI inference. That's why every company wants these things since this huge leap in performance means data centers get a lot more compute power from a Blackwell system of the same cost, size, weight,

power, or whatever is limiting their current data center footprint. But Blackwell GPUs are actually two separate dies connected by an ultra-high speed 10TB per second link. This connection is so fast that it actually tricks the two dies into thinking they're a single chip.

It's a really clever design that gets around a lot of the physical limitations of the machines that build these chips. A lot, but not all of them. It turns out that the rumored three-month delay for Blackwell isn't because of a design flaw at all, but due to the accuracy limitations of some of the machines at TSMC, the company that builds Nvidia's chips. The placement of these ultra-high-speed chip-to-chip links needs to be insanely precise to hit those 10TB per second speeds. If the link placement is off even by a tiny amount, the whole chip could fail as its different metal layers and materials warp and expand when they heat up during normal operations.

As a result, the percentage of working Blackwell chips that TSMC could produce, or the yield, went down. Fewer chips for the same amount of production time and materials means lower profit margins, causing Nvidia's margins to drop to 75.1% for the quarter. And as for that rumored three-month delay, Nvidia says it already made the necessary changes to improve yields, and Blackwell production is scheduled to ramp up in the fourth quarter and into next year.
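To see why yield feeds straight into gross margin, here's a simplified example. Every number in it is hypothetical, chosen only to show the mechanics, not Nvidia's or TSMC's actual costs or yields.

```python
# Hypothetical illustration: how die yield affects cost per good chip
# and therefore gross margin. All numbers are made up for clarity.
wafer_cost = 30_000      # dollars to process one wafer (hypothetical)
dies_per_wafer = 50      # candidate dies per wafer (hypothetical)
selling_price = 3_000    # price per good die (hypothetical)

def gross_margin(yield_rate: float) -> float:
    """Gross margin when only yield_rate of the dies are sellable."""
    good_dies = dies_per_wafer * yield_rate
    cost_per_good_die = wafer_cost / good_dies
    return 1 - cost_per_good_die / selling_price

print(f"90% yield -> {gross_margin(0.90):.1%} gross margin")
print(f"70% yield -> {gross_margin(0.70):.1%} gross margin")
# Same wafers, same materials, fewer sellable chips -> lower margin.
```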

Nvidia expects to ship several billion dollars worth of Blackwell chips in Q4 of this year. And one big reason that data centers really care about Blackwell in particular is direct-to-chip liquid cooling. Today, around 90% of all server racks are air-cooled. That means that a lot of data centers, including hyperscalers like Amazon, Google, and Microsoft, are making massive infrastructure changes to support liquid cooling for their Blackwell systems specifically.

Industry estimates suggest that up to 80% of cooling will become direct-to-chip liquid cooling over time. So this is a huge shift in how data centers operate around the world.

That's why Blackwell is such an important chip and this was such an important earnings call. Alright, now that we've walked through Nvidia's data center earnings, what's going on with Blackwell and why it's so important to data centers, let's talk about the risks. And if you feel I've earned it, consider hitting the like button and subscribing to the channel.

That really helps me out and lets me know to put out more content like this. Thanks, and with that out of the way, let's talk about Nvidia's risks, starting with their competition. Nvidia has two kinds of competition, and ironically, neither of them actually competes with Nvidia. Here, let me explain. The first kind of competition comes from chip makers like Intel and AMD, but Nvidia's revenue from data centers is 5 times bigger than Intel's and AMD's put together.

And beyond that, a lot of Intel and AMD's data center revenues actually come from selling CPUs, not GPUs or other kinds of accelerators, so they don't really compete with Nvidia as much as serve the part of the data center market that doesn't run on Nvidia's ecosystems. Given Nvidia's 90-plus percent market share for GPUs, I'd say that's a pretty small piece of the market.

The second kind of competition comes from hyperscalers that make their own accelerators, mainly Microsoft, Google, and Amazon. But there are two key reasons they don't compete with Nvidia either. The first and most obvious reason is they don't sell their chips to anyone else. Microsoft's Azure Maia, Amazon's Trainium and Inferentia chips, and Google's Tensor Processing Units only exist within their own clouds to help their own customers with AI workloads

specific to their use cases. Nvidia's GPUs are much more general purpose, and over 55% of Nvidia's data center revenues come from companies outside these big three cloud providers. And that number is only growing over time.

But the second reason that Microsoft, Amazon, and Google don't really compete with NVIDIA is that's just not how data centers work in the first place. Let me give you an example. Most people only have one phone or one car at a time.

So, every sale for one company is a missed customer for their competition. Every car Tesla sells is one GM doesn't. Every phone Google sells is one that Samsung doesn't. But data centers don't work like that, especially when it comes to AI. When Google buys a GPU from AMD or uses their homegrown Tensor processing units, that's not one less sale for Nvidia, since data centers handle a wide variety of workloads for many different kinds of businesses.

Data centers are portfolios of different hardware and software solutions, and as demand for a specific kind of workload grows, data center operators will buy more of the right hardware to support it. But in the end, their goal is always to meet the changing needs of their customers and optimize their costs. Only a small portion of all workloads around the world involve AI today, so for most of those workloads, Blackwell would be total overkill.

That's why data centers buy chips from AMD or Intel for certain tasks, they build their own chips for others, and they use Nvidia's GPUs for heavy-duty training and inference. And no matter what Intel or AMD might say, inference is becoming a heavy-duty AI application as the industry moves from predicting the next word in a sentence to generating images and entire videos. So, while companies might make chips that benchmark well against Nvidia for large language models, their performance remains to be seen for text-to-video models like OpenAI's Sora or complex protein structure and interaction models like AlphaFold 3 by Google DeepMind.

My point is, Nvidia's biggest risk isn't competition at all. It's complexity. Nvidia didn't change Blackwell's design to compete with AMD or Amazon.

They changed it because even the best chip fabrication company on the planet had trouble placing their ultra-high speed chip-to-chip connections. And that's just one piece of one part of the ecosystem. Investors may not realize this, but when Nvidia comes out with a new chip architecture like Hopper, Blackwell, or Rubin, they're not designing one chip.

They're designing five separate chips that make up an entire computing platform: a new GPU, an NVLink switch chip that connects multiple GPUs together, a network interface card, and two separate smart switch chips to connect multiple racks together, one for Ethernet and one for InfiniBand. If any one of those chips has an issue, Nvidia's entire data center ecosystem has an issue.

That's why I spend so much time understanding Nvidia's products, not just their profits, and why I focus on the science behind this stock. But credit where credit is due, look how fast Nvidia was able to address this issue with Blackwell and how little it dropped their overall margins, from 78% to 75% for a single quarter. So, the only other risk worth mentioning is Nvidia's supply

constraints. Look, every company on earth is constrained by one of two things. They're either supply constrained or demand constrained. And as long as Nvidia keeps making massive leaps in compute power, they're going to be supply constrained. But Jensen was crystal clear that Nvidia is ramping up supply to meet the insane demand for Blackwell and backfill demand for Hopper.

It seems like, very clearly, this was a production issue and not a fundamental design issue with Blackwell. But the deployment in the real world, what does that look like tangibly? And is there a sort of delay in the timeline of that deployment and thus revenue from that product? Let's see.

The fact that I was so clear and it wasn't clear enough kind of tripped me up there right away. And so let's see. We made a mask change to improve the yield. Functionality of Blackwell is wonderful.

We're sampling Blackwell all over the world today. We have started volume production. Volume production will ship in Q4. In Q4, we will have billions of dollars of Blackwell revenues, and we will ramp from there.

Either way, I'd rather see Nvidia run out of supply than run out of demand, and as a shareholder, I trust Nvidia's leadership to keep innovating, expanding their total addressable market, and working closely with TSMC and their other partners to meet as much demand as they can. Speaking of which, let's talk about what's in store for Nvidia stock over the short, medium, and long term, because there's plenty for shareholders to look forward to. In addition to Blackwell sales ramping up over the next few quarters, they're also ramping up their Ethernet-based networking solutions, like their Spectrum-4 switch, and their BlueField data processing units. Both of these are chips that do calculations to spread AI workloads out across multiple GPUs and then bring the final outputs back together.

3.7 billion dollars of Nvidia's revenue came from networking products this past quarter, which is actually 30% more than all of AMD's data center revenues for the quarter. And beyond the networking sales themselves, Nvidia is expanding its total addressable market big time by supporting Ethernet as well as InfiniBand. That means even more data centers can integrate Blackwell and Hopper systems down the road.

Over the medium term, Nvidia is joining the likes of Apple, Google, and Meta platforms when it comes to share buybacks. When a company buys back some of its shares, the value of each outstanding share goes up, since it represents a slightly bigger piece of the company. Nvidia's board of directors approved another $50 billion in share buybacks, in addition to the $7.5 billion of shares they're still authorized to buy back from the last repurchase approval.
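Here's a toy example of that mechanic. The share count and profit numbers are made up purely for illustration; they're not Nvidia's actual figures.

```python
# Toy example: how a buyback concentrates ownership and earnings.
# All figures below are hypothetical.
net_income = 5_000   # dollars of annual profit (hypothetical)
your_shares = 10     # shares you personally own (hypothetical)

def per_share_stats(shares_outstanding: int) -> tuple[float, float]:
    """Return (your ownership fraction, earnings per share)."""
    ownership = your_shares / shares_outstanding
    eps = net_income / shares_outstanding
    return ownership, eps

before = per_share_stats(1_000)       # before the buyback
after = per_share_stats(1_000 - 50)   # company retires 50 shares

print(f"Before buyback: {before[0]:.2%} ownership, ${before[1]:.2f} EPS")
print(f"After buyback:  {after[0]:.2%} ownership, ${after[1]:.2f} EPS")
# Same business, fewer shares outstanding -> each share is a bigger slice.
```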

I like share buybacks way more than dividends because they show more long-term confidence in the business, and those shares can be sold again at better valuations, while dividends never make it back to the company for future growth. Besides buybacks, Nvidia is gearing up for the AI PC market with new GeForce RTX graphics cards and NIMs. At a high level, Nvidia's NIM microservices stitch together different functions to create a higher-level service. For example, Nvidia's Avatar Cloud Engine, or ACE, is built on microservices to do things like translate speech to text and text to speech, match a character's lips and facial expressions to what they're saying, follow a specific set of rules, context, and guardrails, and so on.

I think Nvidia's NIMs will be a real revenue multiplier for Nvidia over the long term, since companies pay a per-GPU subscription for these services. And in the long term, Jensen actually announced three different GPUs at Computex 2024: the Blackwell Ultra chips, which ship in 2025, the architecture after Blackwell, which is called Rubin, which will ship in 2026, and then Rubin Ultra, which will ship in 2027. All of these GPUs are architecturally compatible, which means that a Hopper compute tray can be replaced with a Blackwell or a Rubin tray down the road. So, when a data center buys chips from Nvidia today, they're already investing in the infrastructure that supports new chips, making future sales much more likely for Nvidia.

That also means that older chips will benefit whenever Nvidia or one of their customers writes a new acceleration library or software application, so the whole Nvidia hardware ecosystem gets better over time, which leads to more adoption, which leads to more developers, and so the cycle repeats full circle. This is why it's so important to understand the science behind the stocks.

And if you want to see how I picked some of the highest-performing tech stocks before they made massive moves earlier this year, check out this video next. Or if you want to see all my Nvidia coverage, including interviews with their executives, this playlist is for you. Either way, thanks for watching, and until next time, this is Ticker Symbol: You.

My name is Alex, reminding you that the best investment you can make is in you.