Thirty years ago, Taiwanese immigrant Jensen Huang founded Nvidia with the dream of revolutionizing PCs and gaming with 3D graphics. In 1999, after laying off the majority of its workers and nearly going bankrupt, the company succeeded when it launched what it claims was the world's first graphics processing unit (GPU). Then Jensen bet the company on something entirely different: AI. Now that bet is paying off in a big way, as Nvidia's A100 chips quickly become the coveted training engines for ChatGPT and other generative AI. But as the chip shortage eases, other chip giants like Intel are struggling. And with all its chips made by TSMC in Taiwan, Nvidia remains vulnerable to mounting U.S.-China trade tensions. We went to Nvidia's Silicon Valley headquarters to talk with Huang and get a behind-the-scenes look at the chips powering gaming and the AI boom.
Chapters:
02:04 — Chapter 1: Popularizing the GPU
07:02 — Chapter 2: From graphics to AI and ChatGPT
11:52 — Chapter 3: Geopolitics and other concerns
14:31 — Chapter 4: Amazon, autonomous cars and beyond
Produced and shot by: Katie Tarasov
Edited by: Evan Lee Miller
Additional Camera: Andrew Evers
Supervising Producer: Jeniece Pettitt
Graphics by: Jason Reginato
» Subscribe to CNBC: https://cnb.cx/SubscribeCNBC
» Subscribe to CNBC TV: https://cnb.cx/SubscribeCNBCtelevision
» Subscribe to CNBC Classic: https://cnb.cx/SubscribeCNBCclassic
About CNBC: From 'Wall Street' to 'Main Street' to award-winning original documentaries and Reality TV series, CNBC has you covered. Experience special sneak peeks of your favorite shows, exclusive video and more.
Connect with CNBC News Online
Get the latest news: https://www.cnbc.com/
Follow CNBC on LinkedIn: https://cnb.cx/LinkedInCNBC
Follow CNBC News on Facebook: https://cnb.cx/LikeCNBC
Follow CNBC News on Twitter: https://cnb.cx/FollowCNBC
Follow CNBC News on Instagram: https://cnb.cx/InstagramCNBC
Subscribe to CNBC PRO: https://cnb.cx/2NLi9AN
#CNBC
How Nvidia Grew From Gaming To A.I. Giant, Now Powering ChatGPT
This is what hundreds of
millions of gamers in the world play on. It's a
GeForce. This is the chip that's
inside. For nearly 30 years, Nvidia's chips have been
coveted by gamers shaping what's possible in graphics
and dominating the entire market since it first
popularized the term graphics processing unit
with the GeForce 256. Now its chips are powering
something entirely different. ChatGPT has started a very
intense conversation. He thinks it's the most
revolutionary thing since the iPhone.
Venture capital interest in
AI startups has skyrocketed. All of us working in this
field have been optimistic that at some point the
broader world would understand the importance
of this technology. And it's actually
really exciting that that's starting to happen. As the engine behind large
language models like ChatGPT, Nvidia is finally
reaping rewards for its investment in AI, even as
other chip giants suffer in the shadow of U.S.-China
trade tensions and an ease in the chip shortage that's
weakened demand. But the California-based
chip designer relies on Taiwan Semiconductor
Manufacturing Company to make nearly all its chips,
leaving it vulnerable. The biggest risk is really
kind of U.S.-China relations and the potential impact to
TSMC. That's, if I'm a
shareholder in Nvidia, that's really the only
thing that keeps me up at night. This isn't the first time
Nvidia has found itself teetering on the leading
edge of an uncertain emerging market. It's neared bankruptcy a
handful of times in its history when founder and
CEO Jensen Huang bet the company on impossible
seeming ventures. Every company makes mistakes
and I make a lot of them, and some of them
put the company in peril. Especially in the
beginning, because we were small and we're up
against very, very large companies and we're trying
to invent this brand new technology. We sat down with Huang at
Nvidia's Silicon Valley headquarters to find out
how he pulled off this latest reinvention and got
a behind-the-scenes look at all the ways it powers far
more than just gaming. Now one of the world's top
ten most valuable companies, Nvidia is one of the rare
Silicon Valley giants that, 30 years in, still has its
founder at the helm. I delivered the first one of
these inside an AI supercomputer to OpenAI
when it was first created. 60-year-old Jensen Huang, a
Fortune Businessperson of the Year and one of Time's
most influential people in 2021, immigrated to the
U.S. from Taiwan as a kid and studied
engineering at Oregon State and Stanford. In the early 90s, Huang met
fellow engineers Chris Malachowsky and Curtis
Priem at Denny's, where they talked about dreams of
enabling PCs with 3D graphics, the kind made
popular by movies like Jurassic Park at the time. If you go back 30 years, at
the time, the PC revolution was just starting and there
was quite a bit of debate about what is the future of
computing and how should software be run. And there was a large camp
and rightfully so, that believed that CPU or
general purpose software was the best way to go. And it was the best way to
go for a long time. We felt, however, that
there was a class of applications that wouldn't
be possible without acceleration. The friends launched Nvidia
out of a condo in Fremont, California, in 1993. The name was inspired
by NV, for "next version," and
Invidia, the Latin word for envy. They hoped to speed
up computing so much, everyone would be green
with envy. At more than 80% of
revenue, its primary
business remains GPUs. Typically sold as cards that
plug into a PC's motherboard, they
accelerate (add computing power to) central
processing units, or CPUs, from companies like AMD and
Intel. You know, they were one
among tens of GPU makers at that time. They are the
only ones, them and AMD actually, who really
survived because Nvidia worked very well with the
software community. This is not a chip
business. This is a business of
figuring out things end to end. But at the start, its future
was far from guaranteed. In the beginning, there
weren't that many applications for it,
frankly, and we smartly chose one particular
combination that was a home run. It was computer
graphics and we applied it to video games. Now Nvidia is known for
revolutionizing gaming and Hollywood with rapid
rendering of visual effects. Nvidia designed its first
high performance graphics chip in 1997. Designed, not manufactured,
because Huang was committed to making Nvidia a fabless
chip company, keeping capital expenditure way
down by outsourcing the extraordinary expense of
making the chips to TSMC. On behalf of all of us,
you're my hero. Thank you. Nvidia wouldn't be here today,
and nor would the other thousand fabless
semiconductor companies, if not for
the pioneering work that TSMC did. In 1999, after laying off
the majority of workers and nearly going bankrupt to do
it, Nvidia released what it claims was the world's
first official GPU, the GeForce 256. It was the first
programmable graphics card that allowed custom shading
and lighting effects. By 2000, Nvidia was the
exclusive graphics provider for Microsoft's first Xbox. Microsoft and the Xbox
happened at exactly the time that we invented this thing
called the programmable shader, and it defines how
computer graphics is done today. Nvidia went public in 1999
and its stock stayed largely flat until demand went
through the roof during the pandemic. In 2006, it
released a software toolkit called CUDA that would
eventually propel it to the center of the AI boom. It's essentially a
computing platform and programming model that
changes how Nvidia GPUs work, from serial to
parallel compute. Parallel computing is: let
me take a task and attack it all at the same time using
much smaller machines. Right? So it's the
difference between having an army where you have one
giant soldier who is able to do things very well, but
one at a time, versus an army of thousands of
soldiers who are able to take that problem and do it
in parallel. So it's a very different
computing approach. Nvidia's big steps haven't
always been in the right direction. In the early
2010s, it made unsuccessful moves into smartphones with
its Tegra line of processors. You know, they quickly
realized that the smartphone market wasn't for them, so
they exited right from that. In 2020, Nvidia closed a
long awaited $7 billion deal to acquire data center chip
company Mellanox. But just last year, Nvidia
had to abandon a $40 billion bid to acquire Arm,
citing significant regulatory challenges. Arm is a major
CPU company known for licensing its signature Arm
architecture to Apple for iPhones and iPads, Amazon
for Kindles and many major carmakers. Despite some setbacks, today
Nvidia has 26,000 employees, a newly built
polygon-themed headquarters in Santa Clara, California,
and billions of chips used for far more than just
graphics. Think data centers, cloud
computing, and most prominently, AI. We're in every cloud made by
every computer company. And then all of a sudden
one day a new application that wasn't possible before
discovers you. More than a decade ago,
Nvidia's CUDA and GPUs were the engine behind AlexNet,
what many consider AI's Big Bang moment. It was a new,
incredibly accurate neural network that obliterated
the competition during a prominent image recognition
contest in 2012. Turns out the same parallel
processing needed to create lifelike graphics is also
ideal for deep learning, where a computer learns by
itself rather
than relying on a programmer's code. We had the good wisdom to go
put the whole company behind it. We saw early on, about
a decade or so ago, that this way of doing software
could change everything, and we changed the company from
the bottom all the way to the top and sideways. Every chip that we made was
focused on artificial intelligence. Bryan Catanzaro was the
first and only employee on Nvidia's deep learning team
six years ago. Now it's 50 people and
growing. For ten years, Wall Street
asked Nvidia, why are you making this investment and
no one's using it? And they valued it at $0 in
our market cap. And it wasn't until around
2016, ten years after CUDA came out, that all of a
sudden people understood this is a dramatically
different way of writing computer programs and it
has transformational speedups that then yield
breakthrough results in artificial intelligence. So what are some real world
applications for Nvidia's AI? Healthcare is one big
area. Think far faster drug
discovery and DNA sequencing that takes hours instead of
weeks. We were able to achieve the
Guinness World Record in a genomic sequencing
technique to actually diagnose these patients:
one of the patients in the trial went on to
have a heart transplant, a 13-year-old boy who's
thriving today as a result, and then also a
three-month-old baby that was having epileptic
seizures, for whom we were able to prescribe an anti-seizure
medication. And then there's art powered
by Nvidia AI, like Refik Anadol's creations that
cover entire buildings. And when crypto started to
boom, Nvidia's GPUs became the coveted tool for mining
the digital currency. Which is not really a
recommended usage, but that has created, you know,
problems because, you know, crypto mining has been a
boom-or-bust cycle. So gaming cards go out of
stock, prices get bid up, and then when the crypto-mining
boom collapses, then there's a big crash on the gaming
side. Although Nvidia did create a
simplified GPU made just for mining, it didn't stop
crypto miners from buying up gaming GPUs, sending prices
through the roof. And although that shortage
is over, Nvidia caused major sticker shock among some
gamers last year by pricing its new 40-series GPUs far
higher than the previous generation. Now there's too
much supply and the most recently reported quarterly
gaming revenue was down 46% from the year before. But Nvidia still beat
expectations in its most recent earnings report,
thanks to the AI boom, as tech giants like Microsoft
and Google fill their data centers with thousands of
Nvidia A100s, the engines used to train large
language models like ChatGPT. When we ship them, we don't
ship them in packs of one. We ship them in packs of
eight. With a suggested price of
nearly $200,000. Nvidia's DGX A100 server
board has eight Ampere GPUs that work together to
enable things like the insanely fast and uncannily
humanlike responses of ChatGPT. I have been trained on a
massive dataset of text which allows me to
understand and generate text on a wide range of topics. Companies scrambling to
compete in generative AI are publicly boasting about how
many Nvidia A100s they have. Microsoft, for example,
trained ChatGPT with 10,000. It's very easy to use their
products and add more computing capacity. And once you add that
computing capacity, computing capacity is
basically the currency of the valley right now. And the next generation up
from Ampere, Hopper, has already started to ship. Some uses for generative AI
are real-time translation and instant text-to-image
renderings. But this is also the tech
behind eerily convincing and some say dangerous deepfake
videos, text and audio. Are there any ways that
Nvidia is sort of protecting against some of these
bigger fears that people have or building in
safeguards? Yes, I think the safeguards
that we're building as an industry about how AI is
going to be used are extraordinarily important. We're trying to find ways
of authenticating content so that we can know if a video
was actually created in the real world or virtually. Similarly for text and
audio. But being at the center of
the generative AI boom doesn't make Nvidia immune
to wider market concerns. In October, the U.S. introduced sweeping new
rules that banned exports of leading edge AI chips to
China, including Nvidia's A100. About a quarter of
your revenue comes from mainland China. How do you
calm investor fears over the new export controls? Well, Nvidia's technology is
export controlled; it's a reflection of the
importance of the technology that we make. The first
thing that we have to do is comply with the
regulations, and it was a turbulent, you know, month
or so as the company went upside down to re-engineer
all of our products so that it's compliant with the
regulation and yet still be able to serve the
commercial customers that we have in China. We're able
to serve our customers in China with the regulated
parts and delightfully support them. But perhaps an even bigger
geopolitical risk for Nvidia is its dependence on TSMC
in Taiwan. There's two issues. One, will China take over
the island of Taiwan at some point? And two, is there a
viable, you know, competitor to TSMC? And as of right now, Intel
is trying aggressively to get there. And you know,
their goal is by 2025. And we will see. And this is not just an
Nvidia risk. This is a risk for AMD, for
Qualcomm, even for Intel. This is a big reason why the
U.S. passed the Chips Act last
summer, which sets aside $52 billion to incentivize chip
companies to manufacture on U.S. soil. Now TSMC is
spending $40 billion to build two chip fabrication
plants, fabs, in Arizona. The fact of the matter is
TSMC is a really important company and the world
doesn't have more than one of them. It is imperative
upon ourselves and them for them to also invest in
diversity and redundancy. And will you be moving any
of your manufacturing to Arizona? Oh, absolutely. We'll use
Arizona. Yeah. And then there's the chip
shortage. As it largely comes to a
close and supply catches up with demand, some types of
chips are experiencing a price slump. But for
Nvidia, the chatbot boom means demand for its AI
chips continues to grow, at least for now. See, the biggest question
for them is how do they stay ahead? Because their
customers can be their competitors also. Microsoft can try and
design these things internally. Amazon and
Google are already designing these things internally. Tesla and Apple are
designing their own custom chips, too. But Jensen says
competition is a net good. The amount of power that the
world needs in the data center will grow. And you
can see in the recent trends it's growing very quickly
and that's a real issue for the world. While AI and ChatGPT have
been generating lots of buzz for Nvidia, it's far from
Huang's only focus. And we take that model and
we put it into this computer and that's a self-driving
car. And we take that computer
and we put it into here, and that's a little robot
computer. Like
the kind that's used at
Amazon. That's right. Amazon and
others use Nvidia to power robots in their warehouses
and to create digital twins of the massive spaces and
run simulations to optimize the flow of millions of
packages each day. Driving units like these in
Nvidia's robotics lab are powered by the Tegra chips
that were once a flop in mobile phones. Now they're
used to power the world's biggest e-commerce
operations. Nvidia's Tegra chips were also used in
Tesla Model 3s from 2016 to 2019. Now Tesla uses its
own chips, but Nvidia is making autonomous driving
tech for other carmakers like Mercedes-Benz. So we call it Nvidia
Drive. And basically Nvidia Drive's a scalable platform, whether you want to use it
for simple ADAS, assisted driving for your emergency
braking warning, pre-collision warning or
just holding the lane for cruise control, all the way
up to a robotaxi where it is doing everything, driving
anywhere in any condition, any type of weather. Nvidia is also trying to
compete in a totally different arena, releasing
its own data center CPU, Grace. What do you say to
gamers who wish you had kept focus entirely on the core
business of gaming? Well, if not for all of our
work in physics simulation, if not for all
of our research in artificial intelligence,
what we did recently with GeForce RTX would not have
been possible. Released in 2018, RTX is
Nvidia's next big move in graphics with a new
technology called ray tracing. For us to take computer
graphics and video games to the next level, we had to
reinvent and disrupt ourselves, basically
simulating the pathways of light and simulating
everything with generative AI. And so we compute one
pixel and we imagine with AI the other
seven. It's really quite amazing. Imagine a jigsaw puzzle and
we gave you one out of eight pieces and somehow the AI
filled in the rest. Ray tracing is used in
nearly 300 games now, like Cyberpunk 2077, Fortnite
and Minecraft. And Nvidia GeForce GPUs in
the cloud allow full-quality streaming of 1,500-plus
games to nearly any PC. It's also part of what
enables simulations, modeling of how objects
would behave in real world situations. Think climate
forecasting or autonomous drive tech that's informed
by millions of miles of virtual roads. It's all
part of what Nvidia calls the Omniverse, what Huang
points to as the company's next big bet. We have 700-plus customers
who are trying it now, from the car industry to
logistics warehouse to wind turbine plants. And so I'm
really excited about the progress there. And it
represents probably the single greatest container
of all of Nvidia's technology: computer
graphics, artificial intelligence, robotics and
physics simulation all into one. I have great hopes for
it.
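Huang's jigsaw analogy (compute one pixel in eight, let AI fill in the rest) can be sketched in a few lines of Python. This is a toy illustration of the sparse-render-then-reconstruct idea only: the function names and the trivial "shader" are invented for the example, and a nearest-computed-pixel fill stands in for the neural network, so this is not Nvidia's actual DLSS pipeline.

```python
# Toy sketch of "compute one pixel, imagine the other seven":
# the renderer does the expensive work for a sparse subset of
# pixels, and a reconstruction step fills in the rest.

def render_sparse(width, shade, stride=8):
    """Compute only every `stride`-th pixel (the 'real' shading work)."""
    return {x: shade(x) for x in range(0, width, stride)}

def reconstruct(width, sparse, stride=8):
    """Fill every missing pixel from the nearest computed one.
    In the real system, a trained neural network does this step."""
    return [sparse[(x // stride) * stride] for x in range(width)]

if __name__ == "__main__":
    shade = lambda x: x % 256           # stand-in for a pixel shader
    sparse = render_sparse(64, shade)   # only 8 of 64 pixels computed
    full = reconstruct(64, sparse)      # all 64 pixels on screen
    assert len(sparse) == 8 and len(full) == 64
```

The payoff in the real system is that the expensive shading runs for only a fraction of the pixels on screen, while the upscaler, running on the GPU's tensor cores, produces the full frame.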
Comments
legend has it Jensen was born with that leather jacket.....
The CEO really knows how to drive the stock price 😁😁😁
Completely blown away by the ability of AI to program itself... meaning non-core coders could be left jobless sooner than we thought. Very impressive by nVidia, now I see why its stock is trending upwards. AI is gonna revolutionize everything!
Yes. Hundreds of millions have an RTX 4090 at home. Sold at RRP.
Shoutout to CNBC for all these short mini documentaries - really insightful, please keep 'em coming!
What an insightful look into Nvidia’s history and vision. I think Jensen and the team are visionaries who are not only able to see the next revolution ahead of time, but able to masterfully execute incrementally and position themselves for success. When they started putting tensor cores into their graphics cards, gamers were very skeptical. But using deep learning to do super sampling became a major benefit and they were able to leverage all of that development to not only improve gaming, but get into and dominate the machine learning landscape.
0:00 Intro 2:07 Popularizing the GPU 7:03 From graphics to AI and ChatGPT 11:54 Geopolitics and other concerns 14:32 Amazon, autonomous cars and beyond Add it to your timelaps so that people's times can be saved ❣️
They went to AI Giant by saying AI 75+ times on their earnings call.
AI can help improve games' graphics, so it made sense for them to invest in that direction. Now they have a new business opportunity, just like Amazon, which invested in data storage capacity for its own purposes and ended up creating AWS, a successful service completely unrelated to its original business.
I work in data center infrastructure projects. NVIDIA is really out there. Most of the high scale business centered graphics processing is made with NVIDIA. I worked with projects of AI predictive security outbreak detection, genetic research, protein structure simulation, hospital machine learning diagnosis software, material stress simulation, weather big data... They really excel at those kind of processing needs.
Just a thought: if someone were given the power a few decades ago to decide what industries humans should focus their efforts on, gaming hardware would probably come last. Interesting how that hardware is now revolutionizing our world
Change the 4090 to 1080 and that's the GPU hundreds of millions of gamers play on. Edit: My point is that the cards are too damn expensive for people and it's out of reach for those previous card holders.
just when you think the price of gpu would finally go back to normal, locally deployed AI is here to strike you again.
Wow, this was excellent. Really love this kind of video!
Nvidia created the first graphics card but not the first graphics-dedicated chips. That honor goes to Commodore and the Amiga line of computers, from which Nvidia and others took a lot of inspiration. The Amiga line of computers was specifically known for its high levels of graphics and audio capabilities, unlike all other computers on the market. The Amiga's proprietary design was around long before Nvidia was even founded. What was unique about the Amiga was the fact that, unlike other computer makers, which did both their graphics and their computations on a single CPU, the Amiga utilized its own proprietary chipset that separated the Motorola 68000 CPU from its Agnus graphics processing chip. It used dedicated chips for both graphics and audio, all on its 32-bit motherboard architecture. So in actuality, Commodore invented the GPU, but since it was a chip mounted to the motherboard and not a separate installable card as an upgrade, Nvidia later made the claim of being the first when in fact it was not, in terms of the dedicated graphics processing unit. *update* And as a friend reminded me: the very first Commodore Amiga 1000 / 2000 had their graphics chips connected to the motherboard via a "daughter" card plugged into its Zorro slots. So even in regards to the "card" idea, Nvidia was not first.
We desperately need more competition in the GPU market. Interesting video though learning about Nvidia and it will be exciting to see where things go
I love CNBC for this! I'm not into anything except Technology. Thank you!
Founder is such a chilled out guy