NVIDIA: From Small Gaming Chipmaker to AI Giant

2024-07-09 09:30:33

Little known to the general public but adored by gamers, this graphics card maker has become a tech giant, rubbing shoulders with Microsoft and Apple at the top of the market-capitalization rankings. Nvidia remains the benchmark for video games, but it is also a key player in artificial intelligence.

It’s the kind of story Americans love. This time it doesn’t begin in the mythical garage (a setting a little too romanticized, as Apple co-founder Steve Wozniak has pointed out), but in a restaurant. Three engineers – Jensen Huang, Chris Malachowsky and Curtis Priem – met in a restaurant located in what is today the heart of Silicon Valley.

The topic of discussion: Developing a computer chip that would make video game graphics faster and more realistic. That’s how Nvidia was born in 1993. 31 years later, the company is valued at more than $3.2 trillion and is still led by Jensen Huang.

Nvidia’s CEO has been able to leverage artificial intelligence to boost his business. In 2006, researchers at Stanford University found that GPUs (graphics processing units) had another use: they could massively accelerate parallel mathematical operations that ordinary processors handled far more slowly.

A risky bet

Six years later, AlexNet, an AI model capable of classifying images, was trained using just two of Nvidia’s programmable GPUs. Computer scientists quickly adopted these graphics cards not to play games, but to accelerate their own software.

The bet was risky for Nvidia. While traditional GPUs were primarily designed for rendering graphics in video games and visual applications, programmable GPUs can be customized to perform a variety of computational tasks beyond simple graphics processing.
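
To give a concrete, purely illustrative idea of what “programmable” means here, the sketch below uses Nvidia’s CUDA toolkit to add two large arrays of numbers on a GPU: thousands of lightweight threads each handle one element in parallel, the same pattern that lets GPUs accelerate the matrix arithmetic behind neural networks. The kernel name, array sizes and values are assumptions made for this example, not anything drawn from Nvidia’s own code.

// Minimal sketch of general-purpose GPU computing with CUDA (illustrative only).
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements; many thousands run in parallel.
__global__ void vector_add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // one million elements (arbitrary size)
    size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) arrays.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(d_a, d_b, d_out, n);

    // Copy the result back and check one value.
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", h_out[0]);     // expected: 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    free(h_a); free(h_b); free(h_out);
    return 0;
}

The same idea, applied to matrix multiplications rather than simple additions, is what made these chips attractive for training neural networks such as AlexNet.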

The rest is history: artificial intelligence began to be integrated into all kinds of programs and devices, and ChatGPT opened a new chapter with generative artificial intelligence.

Hailed as the “next industrial revolution,” AI is drawing growing interest from companies, especially big tech. As a result, Nvidia chips power virtually every data center running AI workloads. According to a recent report by CB Insights, Nvidia holds about 95% of the market for machine-learning GPUs.

For example, OpenAI’s ChatGPT relies on 10,000 Nvidia graphics processing units clustered in a supercomputer owned by Microsoft. Each AI chip costs about $10,000…

For now, Nvidia is so far ahead that it faces no real competition, and its dominance seems assured. Will it last? Other major semiconductor companies are becoming far more aggressive: AMD and Intel are also building GPUs dedicated to AI applications.

Recently, AMD claimed that its new Instinct MI300X chips are faster than Nvidia’s H100 and have the potential to deliver better computing performance.

Google has its tensor processing units (TPUs), used not only for search results but also for some machine learning tasks. Finally, Amazon has a custom chip for training AI models.

The main risk for Nvidia is that companies could make their own AI chips. This has happened in the past. For example, for years, Intel was Apple’s preferred chip supplier. But in 2020, the iPhone maker decided to use its own chips in its devices, ending a 15-year partnership with Intel.

Nvidia’s dominance isn’t just whetting the appetite of other chipmakers. Regulators are also stepping in: the French Competition Authority could charge it over alleged anticompetitive practices.

