OpenAI and chip maker Advanced Micro Devices announced a multibillion-dollar partnership to build AI data centers powered by AMD processors, posing one of the most direct threats yet to industry leader Nvidia.
Under the terms of the agreement, OpenAI agreed to buy six gigawatts of AMD chips, starting with the MI450 chip next year. The ChatGPT maker will purchase the chips either directly or through its cloud-computing partners. AMD Chief Executive Lisa Su said the deal would generate tens of billions of dollars in new revenue for the chip maker over the next five years.
The two companies didn't disclose the projected total cost of the initiative, although AMD said each gigawatt of computing capacity costs tens of billions of dollars.
If OpenAI reaches specific deployment milestones, it will be granted warrants, issued in phases, for up to 160 million AMD shares, or roughly 10% of the chip maker, at a price of one cent per share. AMD's stock price must also rise for the warrants to be exercised.
AMD’s stock opened Monday morning up 33%.
The deal is AMD’s biggest victory yet in its effort to challenge Nvidia’s dominance of the artificial-intelligence chip market. Although AMD’s processors are widely used in gaming, personal computers and traditional data-center servers, the company hasn’t made much of a dent in the fast-growing market for the pricier supercomputing chips that sophisticated AI systems require.
OpenAI will use the AMD processors for inference, the computations that let AI applications such as chatbots respond to user queries. In a joint interview with Su, OpenAI Chief Executive Sam Altman said the need for inference computing has surged with the growth of large language models and other tools.
“It’s hard to overstate how difficult it’s become” to obtain enough computing power, Altman said. “It takes some time, but we want it really quickly.”
The two CEOs said the agreement would tie their businesses together and give each an incentive to invest in the AI infrastructure build-out. “Both of our companies benefit, and I’m happy that OpenAI’s incentives are linked to AMD’s success and vice versa,” Su said.
Nvidia remains the chip provider of choice for AI firms, but the company faces competition from nearly every corner of the industry. Cloud giants such as Google and Amazon design and sell their own AI chips, while OpenAI recently signed a $10 billion agreement with Broadcom to develop its own proprietary chip. Nvidia is launching its much-anticipated Vera Rubin chip next year, promising it will be more than twice as powerful as its current generation, the Grace Blackwell.
In the latter part of next year, OpenAI will begin deploying 1 gigawatt of MI450 chips to power its AI models.
Altman said that with demand for AI services, and the computing and infrastructure they require, far exceeding supply, the futures of many companies will become more intertwined.
“We are in a phase of the build-out where the entire industry’s got to come together and everybody’s going to do super well,” Altman said. “You’ll notice this on chips. You’ll see this in data centers. You’ll see this further down the supply chain.”
Over the past month, Altman has been on a dealmaking spree, securing hundreds of billions of dollars in computing power through novel financing arrangements. By locking up enough data-center capacity, he hopes to win the race to create superintelligence, AI systems that are as intelligent and intuitive as humans.
Nvidia said in late September that it plans to invest $100 billion in OpenAI over the next decade. Under the circular structure, OpenAI intends to use the money from Nvidia to buy Nvidia chips and deploy up to 10 gigawatts of computing capacity in AI data centers. The deal showed how the market’s seemingly boundless enthusiasm for Nvidia’s stock is propping up the AI industry as a whole.
The Nvidia deal is still pending. The two companies have signed a letter of intent but haven’t yet made a regulatory filing outlining its terms.
AMD and OpenAI said Monday’s agreement was “definitive,” and they intend to submit information to securities regulators right away, according to people with knowledge of the situation.
On a call with investors Monday morning, Su said the agreement was “a clear validation of our technology road map” and that AMD will generate tens of billions of dollars in revenue by 2027.
Asked about the deal’s unusual warrant structure, Su described it as “pretty innovative” and said, “I wouldn’t say it came lightly.”
Altman also recently signed a $300 billion megadeal to buy an additional 4.5 gigawatts of cloud-computing power over five years from Oracle, the software giant founded by billionaire Larry Ellison.
“If you’re us or our whole industry, you have to think that, based on everything we’re seeing in our research and product metrics, the demand for AI at a sustainable revenue rate will continue to rise sharply,” Altman said.
The dealmaking frenzy sweeping much of the technology sector has fueled concerns that an AI bubble is forming. Companies including OpenAI, Meta, Alphabet and Microsoft are investing in semiconductors, data centers and electrical power at rates far exceeding the biggest build-outs in history, such as the 19th-century railroad boom and the construction of the modern electrical and fiber-optic grids.
OpenAI’s president and co-founder Greg Brockman stated, “I’m much more concerned about us failing because of too little compute than too much.”
At an event in Abilene, Texas, in late September, officials from OpenAI and Oracle outlined their plan to invest billions of dollars in AI data centers, which they said would help meet demand from ChatGPT’s rapidly growing base of 700 million weekly users.
OpenAI has committed to infrastructure spending totaling hundreds of billions of dollars, but it is unclear how it will pay for it all. The Wall Street Journal has reported that the company recently told investors it would likely spend around $16 billion this year just to rent computing servers, a figure that would rise to $400 billion in 2029. OpenAI is expected to bring in $13 billion in revenue this year, and Altman said the business is focusing more on lucrative tasks that can be completed using its tools.
Nvidia is estimated to hold over 70% of the AI chip market, according to Mizuho Securities, but AMD and other rivals have recently sought to offer more affordable alternatives. Nvidia’s most powerful combined chips for AI data centers can cost up to $60,000 each.