Open Editor's Digest for free
Roula Khalaf, editor of the Financial Times, picks her favorite stories in this weekly newsletter.
This article is an on-site version of the Lex newsletter. Sign up here to get the full newsletter sent directly to your inbox every Wednesday and Friday
Dear reader,
Before Jensen Huang took the stage at Nvidia's massive developer conference, commentators were calling the event the Woodstock of artificial intelligence. Huang clearly had something else in mind. With flashing lights, references to Taylor Swift and friendship bracelets at the after-party, this was the Eras Tour for the semiconductor crowd.
The third most valuable company in the United States attracted a massive crowd to San Jose this week, all eager to hear what's next for generative AI chips. Of course, even with more than 10,000 people cheering, it's not easy to turn a technology conference into an exciting cultural event. It's harder still when you're a chip design company. Tesla can throw metal balls at car windows and Apple shoots slick videos. But Nvidia's two-hour keynote consisted mostly of Huang, wearing his signature leather jacket, speaking in front of slides. There was, as he warned at the start, a lot of mathematics.
The focus of Nvidia's GTC (short for GPU Technology Conference) was the unveiling of its next generation of AI chips. The Blackwell chips, most notably the B200, are made on TSMC's advanced 4nm process and are designed to be more powerful and efficient than the Hopper chips that made Nvidia a trillion-dollar company. That should enable developers to create more powerful AI tools.
Nvidia says that every major cloud provider plans to offer the chips when they become available later this year. Nor will they be significantly more expensive. One analyst suggested $50,000 per chip, but Huang said in an interview that pricing would be in a similar range to Hopper chips. That will make it hard for rivals to compete.
The new Blackwell architecture spans a suite of products aimed at customers ranging from small businesses to large data centers. One of the most anticipated of these processors is the GB200 “superchip”, made up of two GPUs and a CPU.
Other announcements included Nvidia Inference Microservices, also known as Nim (Nvidia product names are numerous and hard to keep track of). These are pre-packaged generative AI models that run on Nvidia's Cuda software.
There was also Quantum Cloud, a service that lets researchers experiment with quantum computing software, and an expanded Omniverse, a virtual second Earth in which companies can model potential changes to their business and test how they would play out in the real world. In addition, there were deepened partnerships with Microsoft, Google and Amazon Web Services. These are notable given that all three companies also hope to compete in AI chips themselves.
Nvidia knows that billions of dollars are being funneled into a market it dominates. But those who say it can't maintain its lead should remember that after the company was founded in 1993, its early graphics chips were quickly overtaken by competitors. The company was able to reorganize in a way that allowed it to launch new products faster than its peers and build software alongside its hardware. This created an ecosystem that gave customers a reason to stick with the company. It has experience of seeing off hungry competitors.
Unfortunately for those of us who made the trip to sunny San Jose, what Nvidia didn't do was explain where AI goes from here. That is a question on which trillions of dollars in market capitalization depend. It probably explains why Nvidia's share price barely moved this week, despite the slew of new products on offer.
In-depth reading
Lex doesn't think competitors have much hope of catching up with Nvidia any time soon. Earnings in February showed just how far the company has extended its lead.
In January, we looked at Dutch chipmaking equipment maker ASML for clues to a new chip cycle. The capacity its customers believe they will need over the next two years exceeds some analysts' expectations, pointing to a cyclical recovery in the chip sector.
This week in Lex
Default rates on US credit cards are rising as Americans run out of savings. Regulators want to cap late fees. Small, independent credit card companies will be the hardest hit.
Unfortunately for Unilever, there doesn't seem to be a hidden gem in its business mix. Spinning off ice cream will carve out a division that has few synergies with the rest, but it is unlikely to unlock much value for shareholders.
Thames Water may be struggling under its £18.3 billion debt load, but if it can find new investors and focus on conserving cash, it can still avoid nationalization.
Things I enjoyed this week
The elevation of “algorithms” into a mysterious force against which we are powerless is something technology companies do little to dispel. But it is not true. Henry Farrell, a professor at Johns Hopkins University, explains why we're not all programmable zombies.
The New Yorker has found that several US chains, including McDonald's, are supplied with seafood from Chinese companies that use forced labor from North Korea. Something to think about when ordering your next Filet-O-Fish meal.
Enjoy your weekend.
Elaine Moore
Deputy head of Lex
If you'd like to receive regular Lex updates, add us to your FT Digest and you'll get an instant email alert every time we publish. You can also see every Lex column via the webpage.
Newsletters recommended for you
Unhedged – Robert Armstrong analyzes the most important market trends and discusses how the best minds on Wall Street are responding to them. Sign up here
Chris Giles on Central Banking – Your essential guide to money, interest rates, inflation, and what central banks are thinking. Sign up here