OpenAI, the maker of ChatGPT, has announced a new partnership with chip giant Broadcom to design and build specialized computer processors for artificial intelligence, marking its latest move in a series of high-profile deals. The collaboration follows recent agreements with Nvidia and AMD to supply computer chips, as OpenAI seeks to bolster its infrastructure to meet the soaring global demand for AI services.
What is the OpenAI-Broadcom partnership about?
The financial terms of the Broadcom deal were not disclosed, leaving uncertainty about the costs of OpenAI’s expanding partnerships. The company said the collaboration will deliver 10 gigawatts of computing power by next year, roughly the electricity needed to power a major city. By developing custom processors tailored to its AI models, OpenAI aims to improve the speed and efficiency of its technology while reducing its reliance on off-the-shelf chips from companies like Nvidia and AMD.
“Partnering with Broadcom is a critical step in building the infrastructure needed to unlock AI’s potential and deliver real benefits for people and businesses,” said Sam Altman, OpenAI’s co-founder and chief executive. The custom chips will be deployed in data centers operated by OpenAI and its partners, with a new facility already under construction in Abilene, Texas, and additional sites planned in Texas, New Mexico, Ohio, and the Midwest.
Broadcom will not take an equity stake in OpenAI or provide stock as part of the deal. Instead, the partnership focuses on technical collaboration to create processors optimized for OpenAI’s AI workloads, potentially giving the company greater leverage in negotiations with other chipmakers.
Analysts not so pleased with OpenAI's spending spree
The rapid pace of OpenAI’s partnerships has nonetheless sparked skepticism among analysts. Despite strong growth and investor enthusiasm, the AI industry, including OpenAI, has yet to turn a profit. The lack of transparency around the financing of these large-scale projects has fueled concerns about a potential bubble in AI spending. Critics also warn that the immense power demands of AI chips and data centers could strain electricity providers, raising questions about the sustainability of such infrastructure.
As OpenAI continues its aggressive expansion, the tech world is watching closely to see whether these investments will yield breakthroughs in AI performance or contribute to growing concerns about unchecked spending and resource consumption in the industry.