Apple sued over Siri ads, paid $1.5B/year to Google, lost its top AI exec to DeepMind — and the new Siri still might not ship with iOS 26.4.

New AI Siri iOS 26.4: What's Actually Coming (And What Apple Keeps Delaying)

New AI Siri — Status as of March 5, 2026:

  • Target release: iOS 26.4, expected March–April 2026 — but Bloomberg's Mark Gurman now says key features slipped to iOS 26.5 (May) or iOS 27 (September)
  • The engine: Google Gemini powering new Siri via Apple's Private Cloud Compute — a $1.5 billion/year deal Apple signed in January 2026
  • Three new core features: Personal Context (reads your emails, messages, files), On-Screen Awareness (sees what's on your screen), In-App Actions (does things inside apps without you opening them)
  • Secret weapon: "World Knowledge Answers" — Apple's internal codename for a Perplexity-style AI answer engine being built into Siri, Safari, and Spotlight
  • What it's NOT: A full chatbot — no long-term memory, no back-and-forth conversation mode in iOS 26.4. That's iOS 27 territory

Apple announced the new Siri at WWDC 2024. It was supposed to ship with iOS 18. Then iOS 18.4. Then iOS 26. Internal testing revealed the enhanced Siri worked properly only about two-thirds of the time — an error rate so high that software chief Craig Federighi effectively killed the original version and ordered the team to rewrite Siri from scratch on a new LLM-based infrastructure. Apple was also sued for false advertising: the company settled a class-action lawsuit in December 2025 after being forced to acknowledge that the AI features it had aggressively marketed — the ones that drove millions of iPhone 16 upgrades — did not exist.

Meanwhile, Apple chose Google as its AI partner in January 2026 — a multi-year deal worth approximately $1.5 billion per year — after reportedly passing on OpenAI for this role and evaluating Anthropic and Perplexity as alternatives. The senior Siri executive Stuart Bowers left Apple and went directly to Google DeepMind. Meta had poached several key Apple AI engineers with packages as high as $200 million. Apple's own AI head, John Giannandrea, retired. His replacement — Amar Subramanya, a former Google and Microsoft AI researcher — has been in the job barely three months.

Against all of that backdrop, the new Siri is still expected to debut sometime in spring 2026 — possibly with iOS 26.4, possibly slipping further. This is what's actually coming, what's already been pushed back, what the secret "World Knowledge Answers" project means for search, and why the stakes for Apple couldn't be higher.

Why Siri Needed a Complete Rebuild (Not Just an Update)

The original Siri — the one you've been using since the iPhone 4S in 2011 — was not built on large language models. It was a rules-based, intent-recognition system. You said words, Siri parsed them against a fixed decision tree, and it triggered an action. It worked for "set a timer for 10 minutes." It failed at everything nuanced.

Apple rebuilt Siri from the ground up on a new LLM-based infrastructure after the original engineering team's attempts to bolt AI capabilities onto the legacy system produced unacceptable reliability. Siri in iOS 26.4 will behave more like Google Gemini than like the current Siri — though without full chatbot capabilities. Apple plans to continue running some features on-device and to use Private Cloud Compute to protect user privacy: personal data stays on-device, requests are anonymized, and AI features can still be disabled entirely.

The new architecture is a hybrid: Apple will keep personal data on-device while routing world-knowledge queries through its Private Cloud Compute servers running Gemini. Siri will not function as a chatbot — the updated version will not feature long-term memory or extended back-and-forth conversations, and Apple plans to use the same voice-based interface with limited typing functionality in iOS 26.4. The full chatbot mode — internally codenamed "Campos" — is a separate project targeting iOS 27 in September.
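
As a minimal sketch of the hybrid split described above, a request is classified as personal (handled on-device) or world-knowledge (sent to Private Cloud Compute). The keyword heuristic, class, and labels below are invented for illustration; Apple has not published its actual routing logic.

```python
from dataclasses import dataclass

# Hypothetical: a crude keyword heuristic stands in for the real
# (unpublished) on-device classifier.
PERSONAL_KEYWORDS = {"my", "mom", "email", "message", "file", "photo"}

@dataclass
class SiriQuery:
    text: str

def route(query: SiriQuery) -> str:
    """Decide where a request is processed under the hybrid model."""
    words = set(query.text.lower().split())
    if words & PERSONAL_KEYWORDS:
        # Personal-context requests stay on-device, handled by
        # Apple's own foundation models.
        return "on-device"
    # World-knowledge requests are anonymized and routed through
    # Private Cloud Compute, where Gemini generates the answer.
    return "private-cloud-compute (Gemini)"

print(route(SiriQuery("find the email where Eric mentioned ice skating")))
print(route(SiriQuery("who won the 1998 World Cup")))
```

The point of the sketch is the split itself: the same spoken interface fans out to two very different back ends depending on whether the request touches personal data.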

The Three Core Features Coming to Siri in iOS 26.4

1. Personal Context — Siri Finally Knows Who "Mom" Is

Personal Context lets Siri retrieve specific information from sources such as notes and emails. This is the feature Apple demoed at WWDC 2024 — the one that was supposed to ship 18 months ago — where a user asks "find the book recommendation Mom sent me" and Siri hunts through Messages, Mail, and Notes simultaneously to surface it. Siri will be able to keep track of emails, messages, files, photos, and more, learning about you over time to help you complete tasks. Examples include: "Show me the files Eric sent me last week," "Find the email where Eric mentioned ice skating," and "Find the books that Eric recommended to me."

Awareness of your personal context enables Siri to help you in ways that are unique to you. Need your passport number while booking a flight? Siri can use its knowledge of the information on your device to help find what you're looking for — without compromising your privacy. The privacy architecture matters here: your emails and messages are processed on-device by Apple's own foundation models, not sent to Google's servers. Only world-knowledge queries hit the Gemini layer.
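
A cross-source lookup of the kind described above can be sketched as follows. The sources, sample data, and substring matching are invented for the example; on a real device this retrieval is handled by Apple's on-device foundation models, not keyword search.

```python
# Hypothetical on-device index: each source holds items Siri can search
# without anything leaving the phone.
SOURCES = {
    "Messages": ["Mom: you should read 'Project Hail Mary'"],
    "Mail":     ["Eric: attaching the Q3 files"],
    "Notes":    ["groceries: milk, eggs"],
}

def find(term: str):
    """Return (source, item) pairs mentioning the term — all on-device."""
    return [(src, item)
            for src, items in SOURCES.items()
            for item in items
            if term.lower() in item.lower()]

print(find("read"))  # surfaces the book recommendation Mom sent in Messages
```

The one-query-across-all-apps behavior is the new part: today's Siri can search within a single app, but not fan a single request out over Messages, Mail, and Notes at once.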

2. On-Screen Awareness — Siri Sees What You're Looking At

On-Screen Awareness lets Siri act on whatever is currently visible to the user. Apple Intelligence gives Siri onscreen awareness, so it can understand and take action with things on your screen. If a friend texts you their new address, you can say "Add this address to their contact card" and Siri will take care of it.

The practical applications here go further than that example. Looking at a photo of a restaurant menu and want to know the calories? Ask Siri. Reading a news article and want a summary of the author's other work? Ask Siri about what's on screen. Watching a video and want to know who the actor is? Siri sees the frame. Apple has shown an iPhone user asking Siri to provide information about their mother's flight and lunch plans, using information it gained from their messages and emails — while simultaneously referencing what's on screen. It's the combination of personal context and screen awareness that makes this genuinely new ground for any built-in assistant.

3. In-App and Cross-App Actions — Siri Does Things Without You Opening Anything

App actions without opening apps: another promised change was Siri's ability to perform actions inside apps without launching them. Apple demonstrated use cases where users could ask Siri to locate a photo, edit it, and save it to a specific folder using voice commands alone.

Seamlessly take action in and across apps with Siri. You can make a request like "Send the email I drafted to April and Lilly" and Siri knows which email you're referencing and which app it's in. This is the feature that turns Siri from a search interface into an actual agent — one that executes multi-step tasks across your entire phone silently in the background. Deeper app integration means that Siri will be able to do more in and across apps, performing actions and completing tasks that simply aren't possible with the current personal assistant.
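
The photo example above can be sketched as a chained sequence of actions. The helpers (locate_photo, crop, save_to_folder) are hypothetical stand-ins for the intents real apps would expose through Apple's App Intents framework; the data is invented for the example.

```python
def locate_photo(query, library):
    """Find the first photo whose tags mention the query."""
    return next(p for p in library if query in p["tags"])

def crop(photo):
    """Return an edited copy; the original stays untouched."""
    return {**photo, "edited": True}

def save_to_folder(photo, folder):
    folder.append(photo)
    return photo

library = [{"name": "IMG_0042.jpg", "tags": ["beach", "sunset"], "edited": False}]
favorites = []

# "Find my beach photo, crop it, and save it to Favorites" — executed as
# a chain of actions without the user ever opening the Photos app.
photo = locate_photo("beach", library)
photo = crop(photo)
save_to_folder(photo, favorites)
```

Each step's output feeds the next step's input, which is what separates an agent that executes multi-step tasks from an assistant that merely launches the right app.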

The Secret Weapon: "World Knowledge Answers" — Apple's Answer to Perplexity

Apple has a new feature in the pipeline: its own generative AI search engine, internally dubbed World Knowledge Answers (WKA). Apple has used the Siri rebuild as an opportunity to create a new search product that competes with the likes of Perplexity and OpenAI's ChatGPT. The feature is scheduled for March as part of the new Siri, and it may eventually make its way into the Safari browser and Spotlight search tool.

This is where the stakes become existential for both Apple and Google. Apple's iPhone is Google's most valuable search distribution channel — instead of giving users a list of blue links, Siri will provide summarized, conversational answers, mirroring Google's AI Overviews and ChatGPT responses. Traditional rankings may matter less — getting content included in AI summaries becomes key. If World Knowledge Answers ships and gains adoption, every query that currently goes through Safari to Google could go through Siri's WKA instead. Google pays Apple approximately $20 billion per year to be Safari's default search engine. WKA doesn't cancel that deal — but it fundamentally changes what that deal means.

The Gemini Paradox:

Apple is paying Google $1.5 billion per year to power Siri — while simultaneously building an AI search product (WKA) designed to route queries away from Google Search. Apple declined to comment on whether ChatGPT's integration within Apple Intelligence would change given the Gemini deal, and Apple isn't making any changes to its OpenAI agreement either. So in 2026, Siri may simultaneously use OpenAI GPT-4o for creative tasks, Google Gemini for world-knowledge queries, and Apple's own on-device models for personal context — all without the user knowing which engine answered their question. Google gets paid by Apple to answer questions that Apple is also building a competing system to answer. That's the deal.

What's Delayed: The Features Slipping Past iOS 26.4

Reports now say Apple's most advanced Siri features may not ship with iOS 26.4. Internal testing has revealed quality problems and performance issues, particularly with the parts powered by machine learning. Bloomberg's Mark Gurman said key context-aware functions and deeper app integrations have been postponed to later releases — iOS 26.5 in May or even iOS 27 in September.

| Feature | Expected Release | Status |
| --- | --- | --- |
| Personal Context (basic) | iOS 26.4 (March/April 2026) | ✅ On track per Apple code leaks |
| On-Screen Awareness | iOS 26.4 (March/April 2026) | ✅ On track per Apple code leaks |
| In-App Actions (basic) | iOS 26.4 (March/April 2026) | ✅ On track per Apple code leaks |
| World Knowledge Answers (WKA) | iOS 26.4 (March/April 2026) | ⚠️ Targeted for March per Gurman; uncertain |
| Deeper cross-app integration | iOS 26.5 (May 2026) | ❌ Slipped from iOS 26.4 |
| Full chatbot mode ("Campos") | iOS 27 (September 2026) | ❌ Confirmed iOS 27 — not iOS 26.4 |
| Long-term memory across sessions | iOS 27 (September 2026) | ❌ Not in iOS 26.4 |
| WKA integration into Safari/Spotlight | iOS 27 or later | ❌ Phase 2 rollout |

Apple has confirmed to CNBC that it is on track to release a more intelligent version of Siri in 2026. This confirmation came directly after Bloomberg reported Apple had postponed the overhaul from iOS 26.4 to iOS 26.5 or iOS 27. The two statements aren't contradictory — "coming in 2026" covers a lot of ground. Apple isn't breaking its promise by shipping iOS 26.4 with the three core features while pushing the chatbot mode to September. It's threading the needle between "not lying again" and "not shipping something broken again."

Which iPhones Will Get the New Siri?

The new Siri will presumably run on all devices that support Apple Intelligence, though Apple hasn't explicitly confirmed hardware requirements. On iPhone, Apple Intelligence requires at least A17 Pro-class silicon; the dedicated Neural Engine in these newer chips is what makes on-device contextual AI features practical and reliable.

Supported Devices (Apple Intelligence Required):

  • iPhone: iPhone 16 series, iPhone 16e, iPhone 17 series — all supported. iPhone 15 Pro and Pro Max supported for most features. iPhone 15 and older: not supported.
  • iPad: iPad Pro M1 and later, iPad Air M1 and later
  • Mac: Any Mac with Apple Silicon (M1 and later)
  • Language: English (US, UK, Australia) at launch; additional languages in phases through 2026

If you bought an iPhone 15 (non-Pro) expecting Apple Intelligence — this is where the false-advertising lawsuit becomes personal. Apple's advertising campaign built significant consumer expectations for AI features before they existed, and the company settled the resulting class-action in December 2025. The settlement doesn't get you the features — it gets you a partial refund.

The Privacy Architecture: What Google Does and Doesn't See

The Apple-Google deal immediately raised questions about user data. The answer is nuanced but genuinely privacy-preserving — at least by design:

  • Your emails, messages, photos, files: Processed entirely on-device by Apple's own foundation models. Google never sees these. Personal context queries never leave your phone.
  • World-knowledge queries: Routed through Apple's Private Cloud Compute servers running Gemini. Because these requests pass through Private Cloud Compute, Google doesn't receive identifiable data from them: Apple strips identifying information before any query hits the cloud layer.
  • ChatGPT queries: The system-wide ChatGPT integration remains — Siri determines when to send requests to ChatGPT. Users are prompted before any data or photos are sent to OpenAI's servers, and IP addresses are obscured. This remains opt-in.
  • Opting out entirely: Apple will continue to allow AI features to be disabled. Every AI feature in iOS 26.4 can be turned off in Settings → Apple Intelligence & Siri.
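
The anonymization step described above can be illustrated with a toy scrubber. The regex patterns and request shape below are invented for the example and are not Apple's actual pipeline; the point is that identifying details are removed before a world-knowledge query leaves the device.

```python
import re

# Hypothetical patterns for identifying details that must not leave the device.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def anonymize(query: str) -> dict:
    """Strip identifying details, then package the request without a user ID."""
    scrubbed = EMAIL.sub("[email]", query)
    scrubbed = PHONE.sub("[phone]", scrubbed)
    # No user ID, IP address, or device identifier travels with the request.
    return {"query": scrubbed, "user_id": None}

print(anonymize("is jane.doe@example.com a known spam sender?"))
```

Real-world anonymization also covers routing-level identifiers (Apple obscures IP addresses, for instance), which a content-only scrubber like this cannot show.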

iOS 26.4 Release Date: When Will This Actually Ship?

Developer beta release was expected in the week of February 23, 2026. Apple usually follows developer previews with a public beta one to two weeks later, then a full release in late March or early April. Apple's Siri overhaul is targeted for a March 2026 launch alongside iOS 26.4, though testing setbacks have added uncertainty. Final timing depends on test results and any further issues uncovered during the beta period.

The new version of Siri is expected to be introduced either in iOS 26.4 in March or April 2026, or pushed further to iOS 27 in September — the two endpoints of a spectrum that Apple's own PR and Bloomberg are currently occupying simultaneously. The safest bet: iOS 26.4 ships in late March or April 2026 with the three core features (Personal Context, On-Screen Awareness, In-App Actions) and possibly WKA in limited form. The full chatbot experience lands in iOS 27 in September.

New AI Siri vs. Google Assistant vs. ChatGPT

| Capability | New AI Siri (iOS 26.4) | Google Assistant / Gemini | ChatGPT (iOS app) |
| --- | --- | --- | --- |
| Reads your emails/messages | ✅ On-device, private | ✅ (Google account required) | ❌ Separate app only |
| On-screen awareness | ✅ System-wide | ✅ Android only | ❌ In-app only |
| Acts inside other apps | ✅ (basic, iOS 26.4) | ✅ Android deep integration | ❌ |
| AI answer engine (no links) | ✅ WKA (March 2026) | ✅ Gemini in Search | ✅ (best-in-class) |
| Full chatbot mode | ❌ iOS 27 (Sept 2026) | ✅ Now | ✅ Now |
| Long-term memory | ❌ iOS 27 (Sept 2026) | ✅ Gemini app | ✅ Pro/Max plans |
| Privacy (data stays on device) | ✅ Best-in-class | ❌ Cloud-dependent | ❌ Cloud-dependent |
| Cost | Free (built into iOS) | Free / Gemini Pro $19.99 | Free / Plus $20/month |

Frequently Asked Questions

What Is the New AI Siri in iOS 26.4?

iOS 26.4 is expected to bring three core Siri upgrades: Personal Context (retrieving specific information from notes, emails, messages), On-Screen Awareness (completing tasks based on what's currently on screen), and In-App Actions (controlling apps by voice without opening them). Powered by a combination of Apple's on-device models and Google Gemini via Private Cloud Compute.

When Does iOS 26.4 Come Out?

Developer beta was expected in the week of February 23, 2026, with a public beta following one to two weeks later and a full release in late March or early April 2026. Apple has not confirmed a specific date. Based on historical patterns and beta testing timelines, March 26 to April 9, 2026 is the likely release window.

Is the New Siri Powered by Gemini or ChatGPT?

Both — and Apple's own models. Apple confirmed a multi-year partnership with Google to use Gemini models and cloud infrastructure for future Apple Foundation Models, including the new Siri. ChatGPT (GPT-4o) remains integrated for world-knowledge queries users choose to route there. Apple's current ChatGPT integration is opt-in, with users prompted before any data is sent to OpenAI servers, and IP addresses obscured. Apple says this arrangement isn't changing despite the Gemini deal.

Will the New Siri Have Memory?

Siri in iOS 26.4 will not feature long-term memory or back-and-forth conversations in the chatbot sense. Personal Context means Siri can access your device data — that's persistent by nature. But the ChatGPT-style "remember what we discussed last month" memory across general conversations is a feature targeted for iOS 27's full chatbot mode, not iOS 26.4.

What Is "World Knowledge Answers"?

World Knowledge Answers (WKA) is Apple's internal name for a generative AI search product that competes directly with Perplexity and ChatGPT — designed to give direct answers rather than links. It's expected to debut alongside the new Siri in iOS 26.4 and may later integrate into Safari and Spotlight search. Think Perplexity AI, but built natively into every iPhone.

What iPhones Support the New AI Siri?

All Apple Intelligence-compatible devices: iPhone 16 and later (all models), iPhone 15 Pro and Pro Max, iPad Pro M1+, iPad Air M1+, and all Apple Silicon Macs. Some features require A17-class chips for the neural engine processing involved. iPhone 15 base model and iPhone 14 and older do not support Apple Intelligence or the new Siri.

Is the New Siri a Full Chatbot?

No — Siri in iOS 26.4 will not function as a full chatbot. It will retain the same voice-based interface with limited typing, without back-and-forth conversation history or long-term memory. The full chatbot version of Siri, internally codenamed "Campos," is a separate project targeting iOS 27 in September 2026 — with capabilities comparable to Gemini and ChatGPT.

Does Google See My Data When Siri Uses Gemini?

Gemini runs inside Apple's Private Cloud Compute, so Google doesn't receive identifiable data from these queries. Personal context queries (your emails, messages, files) are processed entirely on-device — Google never has access to them. Only world-knowledge queries (things that don't require your personal data) route through the Gemini layer, with Apple anonymizing requests first. You can also disable all AI features entirely in Settings.

Is the New Siri Free?

Yes — Apple Intelligence and the new Siri features are free for all compatible devices on iOS 26.4 and later. No subscription, no paid tier. The Gemini licensing cost ($1.5B/year to Google) is absorbed by Apple — it does not get passed to users. ChatGPT integration remains free for basic use; ChatGPT subscribers can connect their accounts for premium features, but neither is required.
