OpenAI allegedly wants TSMC 1.6nm for in-house AI chip debut

Another job for Broadcom, then


OpenAI's first custom-designed silicon chips allegedly will be manufactured by Taiwan Semiconductor Manufacturing Company (TSMC), the same outfit churning out processors for Nvidia, Apple, AMD, Intel, and others.

United Daily News Group, one of Taiwan's largest media orgs, this week said industry sources claim OpenAI has booked capacity for TSMC's A16 process node, which is targeting mass production in the second half of 2026.

A16 is a 16 Angstrom or 1.6-nanometer manufacturing process. "Compared with N2P," TSMC's 2nm-class process node, "A16 offers 8-10 percent speed improvement at the same Vdd [working voltage], 15-20 percent power reduction at the same speed, and 1.07-1.10X chip density," according to the factory giant.

The chip manufacturing goliath did not respond to a request for comment.

Apple is said to have been the first major customer to have reserved A16 production capacity.

The iPhone maker in June announced a partnership with OpenAI to integrate ChatGPT into iOS 18, iPadOS 18, and macOS Sequoia via a service called Apple Intelligence. While Apple says that many of the machine learning models used by Apple Intelligence run on-device, the iBiz also plans to deploy a service called Private Cloud Compute, running on Apple Silicon, to use server-based AI models to handle complex requests.

There's no indication presently that OpenAI anticipates Apple, which has its own Neural Engine hardware for accelerating AI workloads, will end up using OpenAI silicon when it becomes available.

OpenAI did not respond to a request for comment.

OpenAI has reportedly explored investing in its own chip fabs to the tune of $7 trillion, but those ambitions now appear to have been scaled back. Instead of negotiating with TSMC to build a dedicated wafer factory, which is ridiculous when you think about it, the AI model maker is believed to be pursuing the fabrication of its own machine-learning acceleration ASIC, with the help of Broadcom and Marvell, on a TSMC node.

That seems more realistic and normal: a software company partnering with a chip designer to have a custom, application-specific processor fabbed by a contract manufacturer. Broadcom helped Google design the web giant's TPUs, after all.

UDN expects TSMC to fab Broadcom- and Marvell-designed ASICs on its 3nm node and its subsequent A16 process. Ergo, it's not really a surprise that TSMC would fab a 1.6nm chip for OpenAI, with Broadcom or similar aiding in the design and testing.

OpenAI is said to be working on a funding deal that would see the company valued at $100 billion. The AI super-lab, which as of June had a reported annualized revenue of $3.4 billion, now says it has more than 200 million weekly active users of ChatGPT, double the number cited last November.

And it claims that 92 percent of Fortune 500 companies are using OpenAI's products to some degree. Also, API usage is said to have doubled since the release of GPT-4o mini in July. On the other hand, OpenAI took more than $10 billion in pledged support from Microsoft to get to this point, among other investments, and may dive $5 billion into the red this year due to its non-trivial neural net training and staff costs. ®
