
OpenAI teams up with Broadcom to design and produce its first AI chip, set for mass production in 2026.
Introduction: A new chapter in the AI hardware revolution
Hardware plays the role of the engine that powers artificial intelligence (AI). In September 2025, a major piece of news emerged: OpenAI announced that it will develop and mass-produce its first AI chip in collaboration with Broadcom in 2026. The announcement could prove to be a turning point for the AI industry, because a major AI company designing its own chip will not only meet its own needs but could reshape the entire AI hardware market.
In this article, we discuss every aspect of OpenAI's AI chip in detail, from the background of the project and the major partners involved to technical details, business impact, and future prospects. We also analyze how this move could reduce OpenAI's dependence on Nvidia, what impact it may have on the AI industry, and what it signals for the future of AI hardware.
Outline of the partnership
The partnership between OpenAI and Broadcom is an important event for the AI industry. According to the Financial Times report, it will lead to the production of OpenAI's first AI chip in 2026. Broadcom, a leading American semiconductor company, is joining the project with its expertise in AI chip design and development. This is not just a supply agreement but a strategic alliance in which the two companies are developing a cutting-edge AI chip together. Broadcom already has experience designing custom chips with tech giants like Google, and the company's AI revenue is expected to improve significantly in 2026, with the $10 billion order received from OpenAI a major contributor.
Broadcom’s role
Broadcom is acting as the main design partner in this project and will play an important role in both chip design and production. The company specializes in AI accelerator design and will help OpenAI refine various components of the chip, especially those that move data on and off the chip quickly, which is particularly important for AI systems where thousands of chips work together. Broadcom already works with companies like Alphabet, Amazon, and Meta, which are creating their own custom chips, showing that it is a reliable and experienced partner in AI hardware design.
OpenAI's role
OpenAI is acting as the technical lead in this project and will define the functional requirements and features of the chip. The company has formed a chip team of about 20 people, led by top engineers who previously built Tensor Processing Units (TPUs) at Google. The team includes experienced professionals such as Thomas Norrie and Richard Ho, both of whom have extensive experience in AI hardware design. OpenAI's main focus is a high-performance chip suited to the computing needs of its AI models; the company will oversee the chip's design and functionality, while Broadcom turns the design into reality and ensures production.
History of OpenAI’s chip plans
OpenAI's plan to develop its own AI chip is nothing new. In 2023, Reuters reported that OpenAI was considering various options to reduce its dependence on Nvidia; at the time, the company evaluated several ways to diversify its chip supply and reduce costs. In February 2024, Reuters reported that OpenAI was moving forward with its plan to reduce its dependence on Nvidia by developing its first generation of in-house AI silicon, with sources saying the ChatGPT maker was finalizing the design of its first in-house chip over the following months and planning to send it to TSMC for manufacturing.
Then, in October 2024, Reuters published an exclusive report revealing that OpenAI was building its first in-house chip in collaboration with Broadcom and TSMC, while also adding AMD chips alongside Nvidia chips to meet growing infrastructure demands.
Chip design and capabilities
OpenAI's first AI chip will be a dedicated AI accelerator designed specifically for AI workloads. According to the Financial Times, the chip will focus mainly on AI inference. Inference is the process in which a trained AI model makes predictions or decisions based on new data, whereas training is the process in which the model learns from data. Today, demand for AI chips is driven mostly by training, but analysts estimate that as more AI applications are deployed, the need for inference chips may exceed the need for training chips, and OpenAI is designing its chip to meet that growing need. The company has not yet made the chip's exact technical specifications public, but it is expected to focus on high performance, energy efficiency, and scalability.
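To make the training-versus-inference distinction concrete, here is a minimal, hypothetical PyTorch sketch (not OpenAI code, and not a workload running on the new chip): the training loop updates the model's weights from labeled data, while the inference step simply runs the already-trained model forward on new inputs, which is the kind of work an inference-focused accelerator targets.

```python
import torch
import torch.nn as nn

# A tiny toy model; a real AI workload would be vastly larger.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# --- Training: the model learns from labeled data (weights are updated) ---
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
inputs = torch.randn(32, 8)            # synthetic training batch
labels = torch.randint(0, 2, (32,))    # synthetic labels

for _ in range(10):                    # a few gradient steps
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()                    # compute gradients
    optimizer.step()                   # update weights

# --- Inference: the trained model predicts on new data (no weight updates) ---
model.eval()
with torch.no_grad():                  # no gradients needed at inference time
    new_data = torch.randn(4, 8)
    predictions = model(new_data).argmax(dim=1)
    print(predictions)
```

The asymmetry is visible even in this toy example: training repeatedly computes gradients and updates weights, while inference is a single forward pass per request, which is why a chip optimized purely for inference can make different design trade-offs.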
Manufacturing process
The chip will be manufactured by Taiwan Semiconductor Manufacturing Co (TSMC), the world's largest semiconductor manufacturing company. TSMC is known for its expertise in advanced manufacturing processes and is expected to produce OpenAI's chip on its most advanced nodes. Broadcom will act as an intermediary between OpenAI and TSMC, helping to secure manufacturing capacity and manage the production process. This arrangement is common practice in the semiconductor industry, where design companies work with specialized foundries for manufacturing.
Integration and deployment
In the first phase, OpenAI will use the chip internally rather than supply it to external customers, meaning it will first be deployed in OpenAI's own data centers to power the company's models and services. The chip will be designed to integrate seamlessly with OpenAI's existing infrastructure, including compatibility with cloud services, software frameworks, and other hardware components. OpenAI will continue to use chips from Nvidia and AMD, but with its own silicon the company will be able to diversify its reliance across suppliers.
Current state of the AI hardware market
Currently, Nvidia dominates the AI hardware market with more than 80% market share. The company's GPUs have become the industry standard for both AI training and inference and are widely used by major AI companies and research institutions. Nvidia's dominance has created challenges, including supply constraints, high prices, and limited availability of alternatives, and these challenges have pushed major tech companies toward developing their own custom AI chips. OpenAI is not the only company going this route; as noted above, Alphabet, Amazon, and Meta have launched similar initiatives.
Cost savings opportunity
By developing its own AI chips, OpenAI hopes to achieve significant cost savings over the long term. The computing cost of training and running AI models is a major expense for the company: OpenAI projected a loss of $5 billion in 2024 against revenue of $3.7 billion, and compute costs (the expenses of hardware, electricity, and cloud services) are the company's biggest expense. By developing its own optimized chips, OpenAI can improve its computing efficiency and reduce its overall compute costs, savings that could eventually improve the company's financial performance and allow it to offer more affordable AI services.
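As a purely illustrative back-of-the-envelope calculation (the compute share and savings rate below are hypothetical assumptions for illustration, not figures reported by OpenAI or the Financial Times), this short Python sketch shows how even a modest reduction in compute costs would flow through numbers of this magnitude:

```python
# Figures from the article (2024 projections, in billions of USD).
revenue = 3.7
loss = 5.0
total_expenses = revenue + loss          # implied total spending

# Hypothetical assumptions, for illustration only.
compute_share = 0.6                      # assume 60% of expenses are compute
savings_rate = 0.2                       # assume custom chips cut compute cost by 20%

compute_cost = total_expenses * compute_share
savings = compute_cost * savings_rate
print(f"Implied total expenses: ${total_expenses:.1f}B")
print(f"Assumed compute cost:   ${compute_cost:.2f}B")
print(f"Hypothetical savings:   ${savings:.2f}B per year")
```

The point is not the specific numbers, which are assumed, but that at this scale of spending even single-digit percentage improvements in compute efficiency translate into very large absolute sums.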
Financial implications for Broadcom
The partnership with OpenAI is a significant financial opportunity for Broadcom, which has upgraded its AI revenue outlook for fiscal 2026 on expectations of significant improvement. Broadcom's shares rose 4.5% after the news, reflecting investor optimism about the deal. For Broadcom, the partnership is also a chance to diversify and expand its AI business: the company already works with Google and other major tech companies, and the addition of OpenAI will further strengthen its customer portfolio.