Elon Musk, Chief Executive Officer of SpaceX and Tesla and owner of X, speaks at the Milken Conference 2024 Global Conference Sessions at The Beverly Hilton in Beverly Hills, California, U.S., May 6, 2024.
David Swanson | Reuters
Elon Musk claims he can grow Tesla into “a leader in AI & robotics,” an ambition he’s said will require a lot of pricey Nvidia processors to build up the automaker’s infrastructure.
On Tesla’s first-quarter earnings call in April, Musk said the electric vehicle company will increase the number of active H100s — Nvidia’s flagship artificial intelligence chip — from 35,000 to 85,000 by the end of this year. He also wrote in a post on X a few days later that Tesla would spend $10 billion this year “in combined training and inference AI.”
But emails written by Nvidia senior staff and widely shared inside the company suggest that Musk presented an exaggerated picture of Tesla’s procurement to shareholders. Correspondence from Nvidia staffers also indicates that Musk diverted a sizable shipment of AI processors that had been reserved for Tesla to his social media company X, formerly known as Twitter.
Tesla shares slipped as much as 1% on the news in premarket trading.
By ordering Nvidia to let privately held X jump the line ahead of Tesla, Musk pushed back the automaker’s receipt of more than $500 million in graphics processing units, or GPUs, by months, likely adding to delays in setting up the supercomputers Tesla says it needs to develop autonomous vehicles and humanoid robots.
“Elon prioritizing X H100 GPU cluster deployment at X versus Tesla by redirecting 12k of shipped H100 GPUs originally slated for Tesla to X instead,” an Nvidia memo from December said. “In exchange, original X orders of 12k H100 slated for Jan and June to be redirected to Tesla.”
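For rough scale only, here is a minimal back-of-the-envelope sketch, not drawn from the emails or the reporting above, of how a 12,000-unit H100 shipment compares with the “more than $500 million” figure. The per-GPU prices used are assumptions for illustration; Nvidia does not publish H100 contract pricing.

```python
# Back-of-the-envelope check: rough value of a redirected 12,000-GPU shipment.
# The per-GPU prices below are assumptions for illustration only; actual
# contract pricing between Nvidia and its customers is not public.

REDIRECTED_GPUS = 12_000  # figure cited in the December Nvidia memo

for assumed_price_usd in (30_000, 40_000, 45_000):  # assumed price per H100
    total = REDIRECTED_GPUS * assumed_price_usd
    print(f"${assumed_price_usd:,} per GPU -> ${total / 1e9:.2f} billion total")

# At the upper end of these assumed prices, 12,000 GPUs lands in the same
# ballpark as the "more than $500 million" cited for the delayed shipment.
```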
A more recent Nvidia email, from late April, said Musk’s comment on the first-quarter Tesla call “conflicts with bookings” and that his April post on X about $10 billion in AI spending also “conflicts with bookings and FY 2025 forecasts.” The email referenced news about Tesla’s ongoing, drastic layoffs and warned that headcount reductions could cause further delays with an “H100 project” at Tesla’s Texas Gigafactory.
The new information from the emails, read by CNBC, highlights an escalating conflict between Musk and some agitated Tesla shareholders who question whether the billionaire CEO is fulfilling his obligations to Tesla while also running a collection of other companies that require his attention, resources and hefty amounts of capital.
A spokesperson for Nvidia declined to comment for this story. Musk and representatives for X and Tesla did not respond to requests for comment.
Critics have said Musk is only a part-time CEO of Tesla, the company responsible for the vast majority of his wealth. Musk is also the CEO of aerospace company SpaceX, the founder of brain-computer interface startup Neuralink and tunneling venture The Boring Co. He also owns X, which he acquired for $44 billion in late 2022, when it was still called Twitter. He launched his AI startup, xAI, in 2023.
X and xAI are tightly intertwined. In a post on X in November, Musk wrote, “X Corp investors will own 25% of xAI.” Additionally, xAI uses capacity in X data centers to run some of the training and inference for the large language models behind its chatbot, called Grok, CNBC has learned.
Musk has pitched Grok, originally named Truth GPT, as a politically incorrect chatbot with “a rebellious streak” and a would-be competitor to OpenAI’s ChatGPT and other generative AI services.
While Musk juggles his many ventures, Tesla shareholders have reason for concern. The company is in the midst of a troubling sales decline due in part to its aging lineup of electric vehicles and increased competition. Its reputation has also suffered in the U.S., according to the Axios Harris Poll 100 survey, which attributed some of the slippage to Musk’s “antics” and “political rants.”
Tesla’s stock price is down 29% this year.
Rather than discuss EV sales or the massive restructuring underway at Tesla, Musk has been encouraging investors to focus on future products that he’s been promising for years but has yet to deliver. That includes AI software to turn existing cars into self-driving vehicles, dedicated robotaxis that can make money for their owners, and a driverless transportation network.
“If somebody doesn’t believe Tesla’s going to solve autonomy, I think they should not be an investor in the company,” Musk said on the April earnings call. “We will, and we are.”
To get there, he’s said, Tesla requires plenty of Nvidia’s GPUs, which are specialized for AI training and inference workloads. Those chips are in limited supply due to soaring demand from Google, Amazon, Meta, Microsoft, OpenAI and others.
‘Consuming every GPU that’s out there’
Nvidia, now the third-most-valuable company in the world with a $2.8 trillion market cap, has said it’s hard to keep up with demand. Between the cloud service providers and the companies developing AI models, customers “are consuming every GPU that’s out there,” Nvidia CEO Jensen Huang said on an earnings call in May, after the chipmaker reported its third straight quarter of more than 200% revenue growth.
Huang also said, on an earnings call in February, that Nvidia does its best to “allocate fairly and to avoid allocating unnecessarily,” adding, “Why allocate something when the data center’s not ready?”
In naming customers that are already using Nvidia’s next-generation Blackwell platform, Huang mentioned xAI on the May call alongside Tesla and six of the biggest tech companies on the planet.
Jensen Huang, co-founder and chief executive officer of Nvidia Corp., during the Nvidia GPU Technology Conference (GTC) in San Jose, California, US, on Tuesday, March 19, 2024.
David Paul Morris | Bloomberg | Getty Images
Musk likes to tout his infrastructure spending at both companies.
At Tesla, Musk has promised to build a $500 million “Dojo” supercomputer in Buffalo, New York, and a “super dense, water-cooled supercomputer cluster” at the company’s factory in Austin, Texas. The technology would potentially help Tesla develop the computer vision and LLMs needed for robots and autonomous vehicles.
At xAI, which is racing to compete with OpenAI, Anthropic, Google and others in developing generative AI products, Musk is also seeking to build “the world’s largest GPU cluster” in North Dakota, with some capacity online in June, according to an internal Nvidia email from February.
The memo described a “Musk mandate” to make all 100,000 chips available to xAI by the end of 2024. It noted that the LLM behind xAI’s Grok was relying on Amazon and Oracle cloud infrastructure, with X providing additional data center capacity.
The Information previously reported some details of xAI’s data center ambitions.
On May 26, xAI said it closed a $6 billion financing round led by many of the same investors who funded Musk’s Twitter takeover. The company was incorporated in March 2023, but Tesla didn’t disclose its formation at the time, and Musk didn’t publicly introduce the startup until four months later.
Conflicts of interest
While Musk has said for years that Tesla is a leader in AI, he wrote in a post on X in January that he’d want more control over the company before pushing further in that direction.
“I am uncomfortable growing Tesla to be a leader in AI & robotics without having ~25% voting control. Enough to be influential, but not so much that I can’t be overturned,” he said in the post.
Tesla’s latest proxy filing indicates Musk holds 20.5% of the company’s outstanding shares, a figure that includes options awarded to him as part of his unprecedented 2018 CEO pay package. A Delaware court has ordered that compensation rescinded. Post-trial proceedings are ongoing, and the ruling is subject to appeal.
If he is unable to reach his desired ownership mark, Musk said in the January post, he “would prefer to build products outside of Tesla.” He’s already doing that at xAI.
Musk’s comments at the time rankled some longstanding bulls, including the company’s largest retail shareholder, Leo Koguan, and Gerber Kawasaki’s Ross Gerber, who characterized his demand as “blackmail.”
Joel Fleming, a securities litigator at Equity Litigation Group, said that by letting his private companies skip ahead of Tesla in procuring critical hardware, Musk is making his conflicts of interest readily apparent.
“When you have someone like Mr. Musk who is a fiduciary to multiple companies, the law recognizes this creates conflict,” Fleming said. “If you owe fiduciary duties to two or more companies that are competing over the same things, you may end up channeling corporate opportunity away from one company to another.”
Fleming, who frequently represents public company investors in shareholder disputes, said that in such situations, other executives would be in the best position to make decisions, while those who are conflicted should abstain.
“That has not historically been the path that Mr. Musk has chosen for himself,” Fleming said.
Musk hasn’t been shy about intermingling corporate resources among his companies.
For example, following his buyout of Twitter, Musk enlisted dozens of Autopilot software engineers and other technical and administrative employees from Tesla to help him make sweeping changes at the social media company. Some employees even work for two Musk companies at once.
At xAI, Musk has also attracted employees away from Tesla, including machine-learning scientist Ethan Knight and at least four other former Tesla employees who had been involved in Autopilot and big data projects there before joining the startup.
A former Tesla supply chain analyst, who asked not to be named in order to discuss sensitive matters, told CNBC that Musk has always treated his companies as an extension of his persona and believes he can do whatever he wants with them. That includes Tesla’s 2016 acquisition of SolarCity, where he was chairman and a top shareholder.
However, the person said, redirecting a large shipment of chips from Tesla to X is extreme, given the scarcity of Nvidia’s technology. The decision means the automaker willingly gave up precious time that could have been used to build out its supercomputer cluster in Texas or New York and advance the models behind its self-driving software and robotics.