NVIDIA (NVDA - Free Report) is not only reinventing industries but actually creating new ones with its GPU-based deep learning and artificial intelligence technologies.
Today's buzzword for that is "disruption," but Austrian economist Joseph Schumpeter called the human engine of innovation "creative destruction" nearly a century ago and that sounds a lot more like what NVIDIA is doing in this decade -- and probably the rest of this century.
From revolutionizing medical imaging and customizing consumer experiences to autonomous cars and powerful new forms of genetic research, the 4th industrial revolution is expanding the economics of the possible and forging unpredictable growth.
The graphics processing unit (GPU) semiconductor that NVIDIA founder Jensen Huang invented for gaming almost two decades ago was built out of necessity, to power the compute-intensive demands of rendering real-time images and light moving in space.
Instead of one, or a few, "core" processors that can bottleneck streams of data, GPU technology uses thousands of smaller processors working simultaneously. This is known as parallel processing.
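The difference is easy to see even in plain Python. Here's a toy sketch (my own illustration, not NVIDIA code) that brightens image pixels two ways: one value at a time, the way a single CPU core grinds through a loop, and all at once with NumPy, mimicking how a GPU assigns one lightweight thread per pixel:

```python
import numpy as np

# Serial: one "core" handles each pixel in turn -- the CPU bottleneck.
def brighten_serial(pixels, amount):
    out = []
    for p in pixels:
        out.append(min(p + amount, 255))
    return out

# Parallel-style: NumPy applies the same operation across the whole
# array at once, the way a GPU runs one lightweight thread per pixel.
def brighten_vectorized(pixels, amount):
    return np.minimum(np.asarray(pixels) + amount, 255)

pixels = [10, 100, 200, 250]
assert brighten_serial(pixels, 20) == list(brighten_vectorized(pixels, 20))
```

On real hardware the vectorized version maps naturally onto thousands of GPU cores, which is why the same idea scales from gaming graphics to deep learning.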
The relentless advance of NVIDIA GPU chips and platform architecture like CUDA (Compute Unified Device Architecture) has created an R&D powerhouse in the past few years on par with Intel (INTC - Free Report) and Microsoft (MSFT - Free Report) and the revolutions they ushered in.
The technology is generically known as "massively parallel architecture": thousands of GPU chips, each with billions of transistors, connected together in racks of computing power. But NVIDIA has pushed the technology to new heights once again, as revealed at this week's Gamescom gaming conference in Cologne, Germany.
To learn more about the power mechanics of the GPU chip, see my video from last year...
One of my favorite ways of seeing how GPU chips are forging the 4th industrial revolution is this graphic from NVIDIA, showing how the exponential power of GPU computing is leapfrogging brute-force CPU calculation and re-igniting Moore's Law...
Jensen showed us an updated version of this graphic on Monday with four of NVIDIA's core GPU workhorses that have created the exponential advance. And he also implied tongue-half-in-cheek that the big reveal -- Turing RTX technology -- just leapfrogged his own designs by a decade. I'm still trying to figure out how serious he was.
Turing and the $68,000 AI Workhorse You Can (Almost) Own
Here's what I told my followers on Monday evening...
As I type this on Monday afternoon, NVIDIA CEO Jensen Huang is on stage at Gamescom -- the world’s largest gaming expo -- being held this year in Cologne, Germany. He's "wowing" the gamer/developer crowd with amazing views, stats and demos on NVIDIA's record-breaking new "deep learning" architecture called Turing, featuring RTX image and light reconstruction/simulation powers that computers teach themselves.
The keynote had plenty of gaming highlights to keep the crowd's attention, with advanced visuals on the new Battlefield V by Electronic Arts (EA - Free Report). I didn't see anything about the forthcoming Call of Duty: Black Ops 4 from Activision (ATVI - Free Report), but then I'm not much of a gamer.
I'm more interested in what the technology can do to create incredible games, and so much more.
In my over-simplification of what this technology represents, it's like bringing Hollywood's photo-realistic CGI capabilities, which normally take months of creation and rendering time, to video gaming in real time.
“Turing opens up a new golden age of gaming, with realism only possible with ray tracing, which most people thought was still a decade away,” said the founder of NVIDIA. “The breakthrough is a hybrid rendering model that boosts today’s computer graphics with the addition of lightning-fast ray-tracing acceleration and AI. RTX is going to define a new look for computer graphics. Once you see an RTX game, you can’t go back.”
When Turing was initially unveiled the week before, here's how the company described this breakthrough...
The greatest leap since the invention of the CUDA GPU in 2006, Turing features new RT Cores to accelerate ray tracing and new Tensor Cores for AI inferencing which, together for the first time, make real-time ray tracing possible.
These two engines — along with more powerful compute for simulation and enhanced rasterization — usher in a new generation of hybrid rendering to address the $250 billion visual effects industry. Hybrid rendering enables cinematic-quality interactive experiences, amazing new effects powered by neural networks and fluid interactivity on highly complex models.
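For a sense of what "ray tracing" actually computes, here's a minimal illustration (my own toy Python, nothing like the optimized code the RT Cores run): the workhorse test of whether a ray of light strikes a sphere, solved as a simple quadratic. A real renderer fires millions of these rays per frame, which is exactly the arithmetic Turing accelerates in hardware.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t --
    the basic intersection test at the heart of ray tracing.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None          # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired down the z-axis at a unit sphere 5 units away hits at t = 4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```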
Jensen had some fun in the keynote by showing a picture of one of their supreme supercomputer configurations called DGX, which costs $68,000. He offered the audience a special deal of "3,000 easy payments of $19.95."
Then the consummate technology creator-salesman, on par with Steve Jobs, went on to show three new RTX-branded cards that will be plenty powerful for gaming fanatics and will sell for $500 to $1,000 each.
Jensen is certainly passionate about the hard work his company has put into GPU semiconductor technologies and all the amazing visual performance and speed it brings to the world of gaming. As he said in Monday's keynote...
"Architecture is the single greatest lever in computer graphics."
But he is also very excited about what else he and his teams learn as they continue to push the limits of the possible in machine learning and deep learning (i.e., advanced neural networks with many layers of processing).
He sums it up with an idea that should blow your mind: deep learning will enable computers to write their own software (my paraphrase).
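To make "many layers" concrete, here's a toy forward pass through a three-layer network in Python with NumPy (random, untrained weights, purely my own illustration). Training would adjust these weight matrices from data rather than from hand-written rules, which is the sense in which the software "writes itself":

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity: negative signals are zeroed out.
    return np.maximum(x, 0)

# Three stacked layers: each is just a matrix multiply plus a nonlinearity.
# "Deep" learning means learning these weight matrices from data instead of
# hand-coding the rules.
layers = [rng.standard_normal((4, 8)),
          rng.standard_normal((8, 8)),
          rng.standard_normal((8, 2))]

def forward(x):
    for w in layers:
        x = relu(x @ w)
    return x

out = forward(rng.standard_normal((1, 4)))
print(out.shape)  # -> (1, 2)
```

Each layer is one big matrix multiplication, which is precisely the parallel arithmetic GPUs (and now Turing's Tensor Cores) were built to chew through.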
And it's fascinating to think that the $100 billion gaming industry is responsible for driving this innovation in deep learning and AI. Essentially, the architectures that Jensen & Co. are creating for gaming help them expand the R&D mechanics of next-gen advanced computing for 100 other industries.
For more on that industry of serious fun, check out my podcast where I break it all down with my Millennial-gamer colleague, Ryan McQueeney...
By the way, Intel doesn't plan to be left out of these powerful trends either. Acquiring Mobileye, the Israeli maker of autonomous driving tech, for $15 billion last year was a key move to keep up in the innovation races. And this month, in a bid to add robust deep learning capabilities across various operating verticals, Intel acquired Vertex.AI, a Seattle-based startup. Per the home page of Vertex.AI, the private company will join Intel’s Artificial Intelligence Products Group.
And Advanced Micro Devices (AMD - Free Report) is a close competitor to NVIDIA, offering its own GPU designs for gamers and AI researchers. To learn more about how both NVIDIA and AMD chipsets have been used in computers that conquered our favorite games of strategy (chess and Go) and probability (poker), check out these podcasts and articles...
By the way, while we don't know what Intel paid for Vertex.AI, we do know what Google paid for DeepMind, the creators of the AlphaGo engines that beat the world's best Go player: $600 million.
The product roll-outs this week from NVIDIA are definitely exciting for both gamers and more pragmatic AI researchers across industries from manufacturing, automotive, and medicine to retail, energy, and finance.
But of course, there will be many more interested parties in this advanced technology. Last month, NVIDIA announced a new partnership with DARPA, the Defense Department's advanced research lab. From the July 24 PR...
NVIDIA to Collaborate with DARPA to Develop Systems for Post-Moore’s Law Era
NVIDIA has been selected by the Defense Advanced Research Projects Agency (DARPA) to work with a team of university and industry researchers to enable near application-specific integrated circuit performance without sacrificing programmability for data-intensive algorithms.
DARPA, the U.S. Defense Department’s research and development arm, awarded the NVIDIA team a four-year contract worth up to $23 million under the new Software Defined Hardware (SDH) program, part of DARPA’s Electronics Resurgence Initiative (ERI).
(end of PR excerpt)
Now, this new collaboration doesn't suggest anything to do with military weapons or surveillance. But we would be naive not to think about this kind of R&D going on in many places around the world as nations find themselves in a race for the next generations of defense, intelligence, and war-fighting technologies.
For more on the potential dark side of AI, check out these 2 podcasts and articles...
In the end -- rather, in the beginning of this revolution -- NVIDIA deep learning and AI technologies will create a future that looks and feels very different from the one you experience today.
I remain an optimist. Here's to hoping that Jensen and the gamer armies keep the battlefields in the cyber realm, while they bring forth AI computing power that makes life better for everyone on the planet.
Disclosure: I own shares of NVDA for the Zacks TAZR Trader portfolio.
Kevin Cook is a Senior Stock Strategist for Zacks Investment Research where he runs the Healthcare Innovators and TAZR Trader services. Click Follow Author above to receive his latest stock research and macro analysis.