Nvidia has introduced its newest artificial intelligence (AI) chip, the B200, which the company says can complete certain tasks 30 times faster than its predecessor.
IN SHORT
Nvidia’s Annual Developer Conference Highlights
- Nvidia CEO Jensen Huang introduced the company’s latest chip, the B200, which is 30 times faster at some tasks than its predecessor.
- The company also introduced a new set of software tools to help developers sell AI models more easily to companies using Nvidia’s technology.
- The announcements will help determine whether Nvidia can maintain its 80% share of the market for AI chips.
- Nvidia’s shares have surged 240% over the past 12 months, making it the U.S. stock market’s third-most valuable company.
- Major customers, including Amazon.com, Alphabet’s Google, Microsoft, OpenAI, and Oracle, are expected to use the new chip in the cloud computing services they sell and in their own AI offerings.
- Nvidia is shifting from selling single chips to selling total systems, with the latest iteration housing 72 of its AI chips and 36 central processors.
- Analysts expect Nvidia’s market share to drop several percentage points in 2024 as new products from competitors come to market and Nvidia’s largest customers make their own chips.
- Nvidia also dived deeper into software for emulating the physical world with 3-D models, with partnerships with Ansys, Cadence, and Synopsys.
- The company introduced a new line of chips designed for cars with new capabilities to run chatbots inside the vehicle.
- Nvidia also outlined a new series of chips for creating humanoid robots.
San Jose, California, March 18: During his company’s annual developer conference on Monday, Nvidia (NVDA.O) Chief Executive Jensen Huang made a number of announcements aimed at maintaining the chip maker’s leadership in the artificial intelligence space.
On a hockey arena stage in the heart of Silicon Valley, Huang introduced Nvidia’s latest chip, which is 30 times speedier at some tasks than its predecessor.
He also detailed a new set of software tools to help developers sell AI models more easily to companies that use technology from Nvidia, whose customers include most of the world’s biggest technology firms.
Huang also said that Nvidia’s software would be able to stream 3-D worlds to Apple’s new Vision Pro headset.
“I hope you realise this is not a concert, this is a developers’ conference,” Huang joked as he took the stage in a packed arena usually reserved for ice hockey games and concerts.
Huang also outlined a new series of chips for creating humanoid robots, inviting several of the robots to join him on stage.

Founded in 1993, Nvidia was originally known for making the type of computer chips that process graphics, particularly for computer games.
Nvidia is the third-most valuable company in the US, behind only Microsoft and Apple.
Its shares have surged 240% over the past year and its market value touched $2tn (£1.57tn) last month.
Nvidia’s new flagship chip, called the B200, takes two squares of silicon the size of the company’s previous offering and binds them together into a single component.
While the B200 “Blackwell” chip is 30 times speedier at tasks like serving up answers from chatbots, Huang did not give specific details about how well it performs when chewing through huge amounts of data to train those chatbots, which is the kind of work that has powered most of Nvidia’s soaring sales. He also gave no pricing details.
Nvidia said major customers including Amazon, Google, Microsoft and OpenAI are expected to use the firm’s new flagship chip in cloud-computing services and for their own AI offerings.
Other announcements include a new line of chips for cars which can run chatbots inside the vehicle. The company said Chinese electric vehicle makers BYD and Xpeng would both use its new chips.
The event, dubbed the “AI Woodstock” by Wedbush analyst Dan Ives, has become a can’t-miss date on big tech’s calendar due to Nvidia’s singular role in the AI revolution that has taken the world by storm since the introduction of ChatGPT in late 2022.
Nvidia told the audience of developers and tech executives it was releasing an even more powerful processor and accompanying software, on a platform called Blackwell – named after David Blackwell, the first Black academic inducted into the National Academy of Sciences.
Blackwell GPUs are AI “superchips”, four times as fast as the previous generation at training AI models, Nvidia said.
“The rate at which computing is advancing is insane,” Huang said.
Many analysts expect Nvidia’s market share to drop several percentage points in 2024 as new products from competitors come to market and Nvidia’s largest customers make their own chips.
“Rivals like AMD, Intel (INTC.O), startups, and even Big Tech’s own chip aspirations threaten to chip away at Nvidia’s market share, particularly among cost-conscious enterprise customers,” said Insider Intelligence analyst Jason Bourne.
Though Nvidia is widely known for its hardware offerings, the company has also built a substantial portfolio of software products.
The new software tools, called microservices, improve system efficiency across a wide variety of uses, making it easier for a business to incorporate an AI model into its work, just as a good computer operating system can help apps work well.
In addition to AI software, Nvidia dived deeper into software for emulating the physical world with 3-D models. For work on designing cars, jets and products, Huang also announced partnerships with design software companies Ansys (ANSS.O), Cadence (CDNS.O) and Synopsys (SNPS.O). Shares of the three companies jumped around 3% in extended trade following Huang’s comments.