Arm, the chip design firm whose architecture underpins the vast majority of the world's mobile processors, is now building and selling its own AI chips — putting it in direct competition with the customers it has long supplied with intellectual property.

For decades, Arm Holdings operated a licensing model: it designed processor architectures and collected royalties when partners like Apple, Qualcomm, and Nvidia manufactured chips based on those designs. That model made Arm one of the most influential companies in semiconductors without ever fabricating a single chip. The move into proprietary AI hardware represents the most consequential strategic pivot in the company's history.

Meta, OpenAI, and Cloudflare Among First Customers

According to Wired, Meta, OpenAI, Cerebras, and Cloudflare are among the first customers signed onto Arm's new AI CPU product. The presence of OpenAI on that list is particularly notable — the AI lab is simultaneously one of the world's largest consumers of compute and an organisation actively working to reduce its dependence on any single chip supplier, including Nvidia.

Cloudflare's inclusion signals that the hardware is designed to serve inference workloads at the network edge, not just large-scale datacenter training runs. The presence of Cerebras, which builds its own unconventional AI silicon, suggests a complementary deployment or integration scenario rather than a straightforwardly competitive one.

Why This Challenges Arm's Existing Business Model

Arm's licensing model depends on maintaining trust with a broad ecosystem of chip manufacturers. By producing its own silicon, the company risks alienating partners who now face direct competition from the very firm that licenses them their core architecture. The tension is not unlike what Amazon faced when it began competing with merchants on its own marketplace, or what Microsoft encountered when it entered the hardware business with Surface.

Arm's implicit counterargument is that the AI era demands a different approach. Custom silicon optimised for AI inference and training requires tighter integration between architecture and implementation than a pure licensing model allows. The company appears to be betting that the revenue opportunity in AI hardware outweighs the reputational risk with existing licensees.

The Competitive Landscape Arm Is Entering

The AI chip market is currently dominated by Nvidia, whose H100 and B200 GPUs remain the preferred hardware for large-scale model training. However, a growing number of challengers — including AMD, Intel, Google (with its TPUs), Amazon (with Trainium and Inferentia), and a raft of AI chip startups — are competing for a share of inference workloads in particular.

Arm's entry into this space brings distinct advantages. Its architecture already runs efficiently on low-power devices, and its deep relationships with cloud providers and hyperscalers give it distribution channels that most startups lack. The company also holds a structural advantage: it understands the architecture at a level no external licensee can fully match.

The specific technical characteristics of the new AI CPU — clock speeds, memory bandwidth, power envelope, and target workloads — have not been fully detailed in available reporting. What is confirmed, according to Wired, is that the product exists and has secured marquee customers.

What the Move Signals About Arm's Ambitions Under SoftBank

SoftBank, which took Arm private in 2016 and retained a majority stake following the company's Nasdaq relisting in September 2023, has consistently signalled its intention to position Arm as a central player in AI infrastructure. SoftBank founder Masayoshi Son has spoken publicly about artificial general intelligence and Arm's role in enabling it. A proprietary AI chip line is consistent with that stated ambition.

Arm's current market capitalisation sits above $150 billion, reflecting investor expectations of significant AI-driven revenue growth beyond traditional licensing fees. Producing and selling chips directly would, if successful, open an entirely new and potentially higher-margin revenue stream.

What This Means

Arm's move from licensor to chip manufacturer signals that the AI hardware race has reached a point where even foundational infrastructure companies feel compelled to compete directly — and any firm that builds on Arm's architecture now does so knowing its supplier is also a rival.