Nvidia DGX: Workhorse of AI Will Drive NVDA to $2 Trillion

This week I wrote an article about the valuation controversy surrounding shares of NVIDIA (NVDA). My goal was to describe both the 40% jump in annual sales forecasts this quarter and the multi-trillion dollar TAM (total addressable market) of AI applications that NVIDIA is serving, which justifies those revenue projections.

I’ll share many of those details here and follow up with more color on where the leap in sales forecasts is coming from: the DGX servers used to build the “massively parallel architectures” for AI workloads.

The video attached to this article explains what a DGX “box” is, how it works, and the evolution of configurations that make them sell for between $100,000 and $150,000.

In 2018 at the Gamescom conference in Cologne, Germany, CEO Jensen Huang joked that you could buy the first generation DGX server -- with 8 Tesla Volta GPUs capable of 1 PetaFLOPS (1 quadrillion floating-point operations per second) -- for “just 3,000 easy payments of $19.95.”

Below I will share links to the NVIDIA AI experts and partners that I profiled in today’s video who break down the power of the DGX “superpods.”

The Valuation Debate

First, here’s a quick intro to the controversy…

As NVIDIA cruised into the stratosphere of $1 trillion valuations, one of the most common remarks heard across business and social media was “Oh my, look at that sky-high valuation!”

There was first talk of NVDA trading at a 100X price-to-earnings (P/E) ratio. Then I saw posts about a 200X P/E.

Where were investors and journalists -- and armchair analyst-traders -- getting these ridiculous numbers?

The Rearview Mirror

Mostly from looking backwards. Because when NVIDIA reported quarterly earnings on May 24, those looking in the rearview mirror ran off the road in bewilderment.

Maybe they got scared by the “smart trucks” (large quantitative investors) accelerating behind them.

But if they had only listened to me in my umpteenth explanation of why you buy NVDA at all valuations between 10 and 15 times sales, they would have been looking forward.

For years I have said two things over and over again:

1. Advances in Gaming GPU R&D were teaching NVIDIA engineers what they needed to know to build the foundations of world-changing Artificial Intelligence (AI).

2. Datacenter demands for NVIDIA AI “massively parallel architectures” would overtake Gaming revenues in the early 2020s.  

The Quarter That Blew the Roof Off

On March 20, I published a special edition of Zacks Confidential (ZC) where I outlined why NVIDIA was still the reigning “King of AI” and how the acceleration of, and access to, tools from ChatGPT had just put the stock back on the launch pad -- even though we already had 100%+ gains from buying the October lows.

At the time, I was looking at consensus estimates for NVDA sales of about $30 billion for this fiscal year (ends in January) and around $37 billion for next year.

But optimism was already creeping into those 12-to-24-month forward estimates as ChatGPT spread like wildfire and NVIDIA was inking deals left and right with partners like Microsoft (MSFT).

Little did we know that Jensen & Co. would blow the roof off with a big raise in sales guidance on May 24. Here’s how I described the situation in TAZR Trader…

NVIDIA: It's the TAM, Stupid!
Posted on 5/24/23

TAZR Traders

"How extremely stupid not to have thought of that."

--Thomas Henry Huxley, upon hearing of Darwin's theory of evolution by natural selection

NVIDIA blew some doors off tonight -- and not just for the bears.

Yes, they raised guidance significantly. But so many investors and amateur analysts are still struggling to grasp the significance of this juggernaut that I have called the King of AI for 6 years...

Get Your "MPA" in Deep Learning
March, 2017

NVIDIA Gaming Drives the Deep Learning-AI Revolution
August 21, 2018

Who Cares NVIDIA Makes Great Gaming Graphics?
August 23, 2018

NVDA Valuation in a Giant TAM

In mid 2022, CEO Jensen Huang guesstimated a TAM just for NVIDIA of $300 billion in hardware and $300 billion in software. This is the defining projection for the CUDA hard+soft stack I've been preaching about for 6 years.

I think Jensen has been so conservative because it would blow the minds of even professional analysts to think about what is possible.

Imagine NVDA growing to just 3X a $600 billion TAM where it would still be the dominant player...

That would be a $1.5 trillion valuation minimum. See you soon above $1T.

(end of May 24 TAZR commentary excerpt)

Architect of the AI Supercomputer

Five years ago, I used to talk about how much NVIDIA’s revenue could grow as they sold DGX “boxes” for $68,000. That’s the “mini” supercomputer in a 2-cubic-foot space packing 400 to 500 billion transistors for machine learning (ML) applications -- depending on which generation of GPU card is used, from Volta to Ampere to Hopper. A single A100 GPU costs about $10,000.

A single DGX box delivers between 1 and 5 petaFLOPS of computing power. A petaFLOPS is one quadrillion (10^15) floating-point operations per second, so you can imagine how many boxes you need to build a supercomputer like the one NVIDIA built at Tesla (TSLA) for its autonomous driving R&D.
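To make the scale concrete, here’s a quick back-of-envelope sketch using the figures from this article (these are the article’s round numbers, not official NVIDIA spec-sheet values):

```python
# Estimate the aggregate compute of a cluster of DGX boxes.
# Figures are the article's round numbers, not official NVIDIA specs.

PETA = 10**15              # one petaFLOPS = a quadrillion floating-point ops/sec

dgx_petaflops = 5          # upper end of the 1-5 petaFLOPS per box cited here
boxes_in_cluster = 100     # a modest corporate-scale cluster

total_flops = boxes_in_cluster * dgx_petaflops * PETA
print(f"Cluster compute: {total_flops / PETA:.0f} petaFLOPS")
```

A hundred boxes at the high end already puts a private lab at 500 petaFLOPS, which is why “stacking” these cubes is the whole game.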

Here’s a good description from software consulting firm Run:ai of what a user gets for their investment in a single DGX server…

“Beyond the powerful hardware they provide, DGX systems come out of the box with an optimized operating system and a complete pre-integrated environment for running deep learning projects. They provide a containerized software architecture that lets data scientists easily deploy the deep learning frameworks and management tools they need with minimal setup or configuration.”

To build a corporate, university, or scientific R&D lab supercomputer, you need between 100 and 1,000 of these dynamic cubes, “stacked” together to form “massively parallel architectures.”

The CUDA Stack Rises

Now a DGX box costs anywhere from $100,000 to $150,000 -- and that’s before you factor in the value-enriching capabilities of the full CUDA hardware + software stack.

And every major enterprise wants 100 to 1,000 of them now that ChatGPT took center stage. They either need to capitalize on the “chatterbox that ate the internet,” or exceed it. Now imagine the Fortune 500 adopting the mindset of Elon Musk about AI innovation for their companies as he did for Tesla.

THIS is the math you need to understand to grasp how and why Wall Street analysts underestimated NVIDIA sales potential, even after ChatGPT.

And they are just catching up with these potential topline moves…

Fiscal year 2024 (ends January ‘24): $42.64 billion = 58% growth

Fiscal year 2025 (ends January ‘25): $54.70 billion = 28% growth
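The growth rates above check out against the estimates. The FY2023 base of roughly $27 billion is my assumption (NVIDIA’s reported fiscal 2023 revenue), not a figure stated in this article:

```python
# Sanity-check the growth rates implied by the consensus estimates.
fy2023 = 26.97   # $B -- assumed prior-year base, not stated in the article
fy2024 = 42.64   # $B -- consensus estimate cited above
fy2025 = 54.70   # $B -- consensus estimate cited above

g24 = fy2024 / fy2023 - 1   # ~0.58, the 58% growth figure
g25 = fy2025 / fy2024 - 1   # ~0.28, the 28% growth figure
print(f"FY24 growth {g24:.0%}, FY25 growth {g25:.0%}")
```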

These are dramatic turnarounds from where revenue projections were only a few months ago when I wrote my ChatGPT report.

In fact, the day I published that ChatGPT report on March 20 was the start of the greatest computing event of the year, the NVIDIA GPU Tech Conference, or GTC. Jensen & Co. are always full of surprises that wow the tech crowd, and this year was no exception as NVIDIA launched DGX Cloud, where users can rent access to single-instance servers for $37,000 per month.

Then in late May, they unveiled DGX GH200, a large-memory AI supercomputer that uses NVIDIA NVLink to combine up to 256 NVIDIA GH200 Grace Hopper Superchips into a single data-center-sized GPU. Alphabet’s (GOOGL) Google Cloud, Meta Platforms (META) and Microsoft are already the first customers.

This was one of the primary drivers in the dramatic leap in sales estimates for this year and next.

The Trillion-Parameter Instrument of AI

Here’s how Jensen & Co. introduced the newest superpod…

Open up enormous potential in the age of AI with a new class of AI supercomputer that fully connects 256 NVIDIA Grace Hopper™ Superchips into a singular GPU. NVIDIA DGX™ GH200 is designed to handle terabyte-class models for massive recommender systems, generative AI, and graph analytics, offering 144 terabytes (TB) of shared memory with linear scalability for giant AI models.

In the video that accompanies this article, I share insights from two prominent technology consultants and partners for NVIDIA GPU architectures that explain the power of these AI tools.

The first is Lambda Labs, which helps customers build custom NVIDIA DGX “SuperPOD” solutions. A preferred NVIDIA partner, Lambda claims to help corporate clients "Guarantee ROI for your NVIDIA DGX™ deployment with Lambda’s deep learning expertise." Here's their mission description...

Accelerate development with purpose-built software for ML/AI

All DGX systems come with the DGX software stack, including AI frameworks, scripts, and pre-trained models. It also comes with cluster management, network/storage acceleration libraries, and an optimized OS.

The second is Run:ai, which has a highly useful article breaking down the evolution of DGX generations since 2017.

Be sure to watch the video where I explain machine learning/deep learning neural networks and offer more resources to learn from.

Where to Buy NVDA Shares for the Run to $2 Trillion

It won’t happen this year. Maybe next year. But I think the days of NVDA trading below $750 billion are gone.

And I don’t think the stock will fill the gap back down below $320. A sea-change moment occurred between March and May where corporations and investors realized that NVIDIA AI platforms weren’t a luxury anymore.

They became essential for competition in the world of big data mining and modeling. So buy all dips toward $350.

Disclosure: I own shares of NVDA for the Zacks TAZR Trader portfolio.