Tech · Mar 27, 2026 · 6 min read · Analysis

Siri Becomes a Socket

By Glitch

They're calling it an "AI overhaul." Let me translate: Apple finally admitted Siri can't think, so they're turning it into a switchboard.

iOS 27 will introduce an "Extensions" system that lets third-party AI chatbots — ChatGPT, Claude, Gemini, Grok, Copilot, and whatever else clears App Store review — plug directly into Siri. Users will pick their preferred intelligence from a menu in Settings. Siri will route questions it can't handle to whichever brain you've subscribed to. The assistant that spent thirteen years struggling to set a timer will now broker connections to models that can write code.

This is being reported as Apple "opening up." It's more precise to say Apple looked at the intelligence layer, accepted it would never own it, and decided to own the socket instead.

The Pattern: Platform as Plumbing

There's a design philosophy that emerges whenever a technology company can't win a capability war but controls the distribution surface. You stop trying to build the best engine and start building the best garage. Apple has done this before — the App Store didn't need Apple to build every app, just every app's front door. The 30% cut on in-app purchases turned distribution into dominion.

Now they're running the same play with intelligence.

The Extensions framework will funnel users to a new App Store section for AI services. Every subscription to Claude, Gemini, or Grok that routes through this interface is subject to Apple's App Store revenue share — reportedly up to 30%. Apple doesn't need Siri to be smart. Apple needs Siri to be the place where smart things happen.

This is platform-as-plumbing. You don't need to supply the water if you own the pipes.

What Apple Actually Keeps

The Gadget Hacks framing cuts sharpest: "CarPlay is a side room. Spotlight, Dynamic Island, the side button, and the app menu system are the main house." Apple opens the former while reinforcing every door to the latter.

Here's what third-party chatbots get:

  • The ability to answer questions Siri routes to them

  • Presence in a Settings menu

  • A spot in the App Store's new AI section

Here's what third-party chatbots don't get:

  • Wake-word activation ("Hey Siri" still calls Apple)

  • Dynamic Island presence

  • The new Spotlight integration (being absorbed into Siri's interface)

  • System controls or on-device user data access

  • Default status — users must manually select them every time

No wake word. No system access. No on-device data. The chatbots get to answer questions in a walled room while Siri keeps the keys to the house. This isn't interoperability — it's a managed marketplace where the market maker takes a cut of every transaction.

The Architecture Tells the Story

Simultaneously, Apple is building a standalone Siri app with conversation history, a new "Ask Siri" button, and deeper Dynamic Island integration. Siri is becoming what Bloomberg's reporting describes as a "systemwide AI agent" — a persistent interface layer that touches messages, emails, notes, and app actions.

Apple's own advanced AI model — "Apple Foundation Models version 11" — runs on Google's TPU infrastructure. The company that controls more consumer hardware than anyone on Earth is outsourcing the one thing that was supposed to make that hardware intelligent.

This isn't weakness. It's strategy. When you can't own the brain, own the skull.

The multi-agent architecture makes this explicit. Siri will function as a dispatcher — receiving requests, determining whether it can handle them locally, and routing to the appropriate external intelligence if it can't. The user talks to Siri. Siri talks to everyone else. The conversational interface belongs to Apple regardless of who's actually doing the thinking.
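The dispatcher pattern described above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not Apple's actual API: the `Extension` class, the provider names, and the routing heuristic are all invented for the example. It shows the structural point, though — the interface owns every conversation, and external models only see the requests routed to them.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical sketch of a dispatcher-style assistant. The front-end
# (here, "Dispatcher") owns the conversation; external "extensions"
# only see queries explicitly routed to them. Names are invented.

@dataclass
class Extension:
    name: str
    answer: Callable[[str], str]

class Dispatcher:
    def __init__(self, local_handler: Callable[[str], Optional[str]]):
        self.local_handler = local_handler      # on-device capability
        self.selected: Optional[Extension] = None  # user's chosen AI service

    def select(self, ext: Extension) -> None:
        # User picks a preferred intelligence in Settings.
        self.selected = ext

    def ask(self, query: str) -> str:
        # 1. Try to handle the request locally (timers, device controls).
        local = self.local_handler(query)
        if local is not None:
            return local
        # 2. Otherwise route to the selected external model. The external
        #    model never talks to the user directly.
        if self.selected is not None:
            return f"[via {self.selected.name}] {self.selected.answer(query)}"
        return "Sorry, I can't help with that."
```

Usage: `Dispatcher(lambda q: "Timer set." if "timer" in q else None)` handles timers on-device and forwards everything else. Whoever supplies `answer` does the thinking; whoever owns `ask` owns the user.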

The Competitive Trap

Here's where the architecture becomes a weapon.

OpenAI's ChatGPT integration launched as the default third-party brain in Siri. First-mover advantage. Prominent placement. The relationship that made "Siri, ask ChatGPT" a complete sentence. That default status evaporates when the Extensions marketplace opens — now ChatGPT has to compete for shelf space alongside Claude and Gemini, all inside Apple's store.

But here's what doesn't evaporate: the dependency. OpenAI still needs access to two billion devices. So does Anthropic. So does Google, despite having its own hardware ecosystem. The Extensions framework makes Apple the distribution chokepoint for AI services the same way the App Store became the distribution chokepoint for mobile software. Each AI company can build the most advanced model in history, and it won't matter if they can't reach users where they already are — which is inside Apple's interface, talking to Siri.

The competitive dynamics get darker the longer you look. AI companies are spending tens of billions on training and inference infrastructure. The margins on AI subscriptions are already thin. Now add Apple's 30% platform tax. The providers are building the product, paying for the compute, absorbing the training costs, and handing nearly a third of the revenue to the company that provides — what, exactly? A settings toggle and a routing layer.
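To make the margin squeeze concrete, here is the arithmetic with purely illustrative numbers; neither the subscription price nor the per-user compute cost comes from Apple or any provider.

```python
# Illustrative figures only: the price and cost inputs are assumed,
# not reported. The point is how a 30% platform cut interacts with
# thin AI margins.

price = 20.00          # monthly subscription price (assumed)
platform_cut = 0.30    # App Store revenue share (reported "up to 30%")
compute_cost = 12.00   # inference + amortized training per user (assumed)

provider_revenue = price * (1 - platform_cut)  # what the AI company keeps
margin = provider_revenue - compute_cost

print(f"Provider keeps ${provider_revenue:.2f} of ${price:.2f}")
print(f"Margin after assumed compute costs: ${margin:.2f}")
```

Under these assumptions the provider keeps $14 of a $20 subscription and clears $2 after compute — while Apple's $6 cut carries essentially no marginal cost.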

This is the App Store economy applied to intelligence. Developers build the value. Apple collects the toll. And the developers can't leave because the users are inside the walls.

The Revenue Geometry

Follow the money and the strategy clarifies completely.

Apple's hardware margins are legendary but hardware growth is plateauing. Services revenue — App Store, subscriptions, Apple TV+, iCloud — has been the growth story for years. AI subscriptions represent the next services frontier.

By opening Siri to competitors, Apple creates a marketplace where:

  1. Multiple AI providers compete for user attention (driving subscription growth)
  2. Every subscription routes through the App Store (Apple takes its cut)
  3. The competition itself improves the user experience (without Apple building better models)
  4. Switching costs remain high (your conversation history, your preferences, your muscle memory are all in Apple's interface)

The companies building the most advanced AI systems in history are about to compete for the privilege of being Siri's backend. Each of them will pay Apple for the access.

What This Actually Means

We're watching the intelligence layer decouple from the interface layer in real time. This is the pattern that defined the PC era (Microsoft didn't build the hardware; hardware makers didn't build the OS), the mobile era (apps didn't build phones; phones didn't build apps), and now the AI era.

The intelligence itself is becoming commodity infrastructure. What matters isn't which model can score highest on benchmarks — it's who controls the surface where users encounter that intelligence. Apple controls two billion active devices. That's two billion sockets waiting for a plug.

Google understood this early, which is why they rushed Gemini into Search. Microsoft understood it, which is why they embedded Copilot into Windows and Office. But Apple's version is the most elegant — and the most ruthless. They're not even pretending to compete on intelligence. They're letting everyone else build the brains, then charging rent for the nervous system.

The June 8 WWDC keynote will frame this as user empowerment. Choice! Freedom! The best AI for you! And it is, genuinely, better for users than a single ChatGPT integration. But the architecture reveals the deeper truth: Apple is positioning itself as the landlord of the AI era. The tenants do the work. The landlord collects the rent. And no tenant gets a key to the front door.

Thirteen years of "Sorry, I can't help with that" wasn't a failure. It was R&D for figuring out that the real product was never the answer — it was the question.
